    What's New in Spotfire® 7.14



    Introduction

    Spotfire® 7.14 contains support for cascading filters when working with external relational data, additional editing capabilities in the data table workflow, numerous improvements to our OLAP and big data connectors (SAP HANA, Oracle Essbase, Microsoft SQL Server Analysis Services), an all-new Salesforce connector with support for federated authentication, and the ability to automatically set the coordinate reference system when importing Shape files.

    Note that Spotfire® 7.14 is a mainstream version. Fixes to critical issues discovered after the release will only be made to the most current version and to any long-term supported versions. For more information on the difference between mainstream versions and long-term supported versions, see the documentation.

    Visual Analytics

    Cascading filters in-db

    Spotfire® now also supports cascading filters when working with external data in relational databases. This makes it easier to find specific values in, for example, list box filters, since the contents of one filter are affected by what is filtered out by other filters.

    [Screenshot: data connection settings with the 'Enable cascading filters for in-database data tables for this connection' check box]

    The behavior can be switched on per data connection by selecting the 'Enable cascading filters for in-database data tables for this connection' check box in the data connection settings (see the screenshot above). This works for all relational database connectors.

    Set coordinate reference system from Shape file

    Spotfire® is now able to set the coordinate reference system automatically by recognizing the projection format (.prj) file associated with a Shape file.

    More detailed, updated list of supported coordinate reference systems 

    Spotfire® now supports even more coordinate reference systems and provides more details for each of them, to help you select the correct CRS faster.

    Data Wrangling

    Insert rows, columns and data transformations before other nodes

    With this release of Spotfire, you have access to yet another improvement of the data table editing workflow in the source view: the ability to add rows, columns, and data transformations anywhere within an existing data table structure. This can save you days of data wrangling time when maintaining and developing analysis files. For example, you can now insert last month's sales data rows at the beginning of your data structure, before the joins with other data. Previously, the risk was high that you would have to rebuild the data table from scratch to obtain the desired structure, or use workarounds like data table data sources (data table from current analysis).

    In the example below, we have an existing analysis with a sales transactions data set (Video_Game_Sales_Numbers_0-8000) joined with a dimensions data set. The dimensions provide more information about each transaction. We also have two data transformations on the Added Columns node and one operation on the final data table.

    [Screenshot: source view of the example analysis]

    Now, when more sales transactions come in over time, we would like to add them as rows to our data. With Spotfire® 7.14 this is easy to do with the new editing capability, which allows us to insert rows (and columns and data transformations) where we need them. In this example, the new sales transactions rows must be added to the Video_Game_Sales_Numbers_0-8000 data source. There are two access points in the source view for this. The image below shows the first alternative, on the arrow connecting the nodes.

    [Screenshot: the insert access point on the arrow connecting two nodes]

    The image below shows the access point in the node operations list. In addition to inserting rows and columns, this access point can also be used to insert data transformations.

    [Screenshot: the insert access point in the node operations list]

    The end result is a final data table with more sales transactions. The existing join with the dimensions table, the two data transformations, and the final data table operation all remain intact.

    [Screenshot: the final data table with the new sales transactions added]

    Data Access

    A new and improved Salesforce connector

    Native, self-service support for analyzing data from Salesforce was introduced in Spotfire® 7.5. This release brings support for the two most frequently requested enhancements for this connector: federated authentication and removal of the need for an ODBC driver. The new connector also uses the Salesforce Bulk API for quick access to millions of Salesforce records. You are now also able to load more than 2000 rows from Salesforce reports. You may notice that the new connector lacks the .com extension in its name:

    [Screenshot: the new Salesforce connector in the list of connectors]

    The new connector's data source UI has a blue link that is used to log in with federated authentication:

    [Screenshot: the Salesforce data source dialog with the federated authentication link]

    Once the blue link is selected, your default web browser will open the Salesforce login page:

    [Screenshot: the Salesforce login page in the default web browser]

    If your organization is using a custom domain, you use that option when logging in:

    [Screenshot: the custom domain login option]

    If you sign in with, for example, Google, you can select which account you are using:

    [Screenshot: selecting which Google account to sign in with]

    Spotfire® needs access to certain information to be able to load your Salesforce data:

    [Screenshot: the Salesforce page asking to grant Spotfire® access]

    As always when loading data from Salesforce, it is recommended to deselect all columns (available as a right-click option in the column list) and then pick only the columns you need for your analysis. Once done, define one or more prompts to limit data on, for example, State, as in the example below using the Account table.

    [Screenshot: column selection and a prompt on State for the Account table]

    You can always go back and add or remove columns later on. This is done from the Source view:

    [Screenshot: adding or removing columns from the source view]

    You can control whether users should be prompted when opening the analysis (or when reloading the Salesforce data) by selecting or clearing the Prompt for new settings before loading check box in the Data Table Properties:

    [Screenshot: the 'Prompt for new settings before loading' check box in Data Table Properties]

    For useful information about compatibility and using the updated connector to open Salesforce connections that you created in earlier versions of Spotfire, see this article.

    Aggregate calculated measures in Oracle Essbase

    With this Spotfire® release, you can create visualizations that aggregate calculated Essbase measures. This was previously not possible because not all calculated measures are additive. However, if you know that you are working with additive measures, you can now configure your Oracle Essbase Spotfire® data source to allow aggregated measures. In previous versions of Spotfire® you would get an error message if you tried to aggregate calculated measures in a visualization:

    [Screenshots: error messages shown in earlier versions when aggregating calculated measures]

    In Spotfire® 7.14 you can now allow aggregation of calculated measures by selecting the check box in the settings panel of the Data Selection in Connection dialog, as seen in the image below.

    [Screenshot: the check box that allows aggregation of calculated measures in the Data Selection in Connection dialog]

    Once allowed, your visualization will display data as expected.

    [Screenshot: a visualization displaying aggregated calculated measures]

    Import spatial objects with connectors for Oracle, Microsoft SQL Server, and PostgreSQL

    The connectors for Oracle, Microsoft SQL Server, and PostgreSQL now support geographical data types. This allows you to connect to and extract row-level geographic data into Spotfire's in-memory data engine with just a few configuration steps. The image below shows an example using all spatial data types in SQL Server:

    [Screenshot: an example using all spatial data types in SQL Server]

    The SAP HANA connector now supports all connection settings

    In addition to the new connection timeout setting for SAP HANA, you can now set any connection string parameter from within Spotfire, for example, the fetch size.

    [Screenshot: custom connection string properties in the SAP HANA connection dialog]

    Please note that it is not possible to enter properties that are already available in the dialog. If you, for example, try to enter a user name, you will be notified of this:

    [Screenshot: the notification shown when entering a property that is already available in the dialog]

    Microsoft Analysis Services command timeout support

    You can now analyze more data and ask more complex questions in SSAS by raising the maximum MDX query timeout.

    [Screenshot: the command timeout setting for Microsoft SQL Server Analysis Services]

    Microsoft Analysis Services username and password authentication support

    In addition to Windows authentication, you can now authenticate with a username and password against your Analysis Services instances.

    [Screenshot: username and password authentication in the Analysis Services connection dialog]

    Microsoft Azure Analysis Services support

    With the added support for username and password authentication, you can now connect Spotfire® directly to Microsoft Azure Analysis Services.

    Amazon RDS SQL Server support in Cloud Business Author

    TIBCO Cloud Spotfire® and the Spotfire® on-premises platform can now connect to Amazon RDS SQL Server data. This means you can store analysis files in the Spotfire® (Cloud) library and let them query Amazon RDS SQL Server directly from the web-based clients, Spotfire® Business Author and Consumer. You use the Microsoft SQL Server connector to connect to Amazon RDS SQL Server.

    A new connector query log

    As a Spotfire® administrator, you probably use the user action logs for an overall view of the queries generated by the Spotfire® ecosystem against your data sources. In addition to this, you might also be asked to investigate certain query-related issues reported by end users. An example could be visualizations that "take forever" to render. In this case, the user action logs might not be the ideal tool to work with, as they only provide a view of historical data and not a real-time view of currently running queries.

    With this release of Spotfire, you now have access to a query log dedicated to connectors. By loading the log into Spotfire®, you can locate the MDX or SQL query in question, copy it, and run it in your favorite database tool. This allows you to instantly determine whether it is the complexity of the Spotfire® visualizations that needs to be adapted to better suit the data engine of the data source, or whether you should ask the DBA to tune the database.

    The log file collects queries from Spotfire® Analyst, Node Managers, and Automation Services. Each row in the log represents a query that was generated by a data connector running on the Spotfire® instance and sent to an external data source. By default, the logging level is set to OFF. The log contains the following columns:

    Level: The logging level.
    HostName: The name of the computer running the Spotfire® service.
    TimeStamp: The date and time, in the local time of the computer running the service, when the query was generated in Spotfire.
    UTCTimeStamp: The date and time, in UTC, when the query was generated in Spotfire.
    QueryId: The unique identifier of the query, as assigned by Spotfire.
    UserName: The Spotfire® username of the logged-in user.
    Status: Specifies whether the query succeeded, failed, or was canceled by the user.
    DurationMs: The amount of time, in milliseconds, that the query took to execute in the external data source.
    RowCount: The number of rows in the query result.
    ColumnCount: The number of columns in the query result.
    DataSourceType: The type of Spotfire® connector that was used in the connection.
    DatabaseServer: The URL or IP address of the server of the external data source.
    Database: The name of the database in the external data source.
    DatabaseUser: The database user that was used to log in to the external data source.
    Analysis: The name of the Spotfire® analysis file.
    Visualization: The name of the visualization in the analysis that generated the query.
    Operation: The type of operation that generated the query.
    DataSourceInfo: Connector-specific information regarding the data connection.
    Parameters: Any parameters in the query.
    QueryString: The full query string sent from Spotfire® to the external data source.

    As always in Spotfire, logging is controlled from the Help menu > Support Diagnostics and Logging:

    [Screenshot: Support Diagnostics and Logging in the Help menu]

    Go to the Logging tab and select the DEBUG or TRACE log level. Note the path where your log file is stored, because you will open and analyze the log file in Spotfire® later on. The log file is named Spotfire.Dxp.QueryLog.log.

    [Screenshot: the Logging tab showing the log level and the log file path]

    Go to File > Add data tables > Add > File... and select your log file. You will then see the Import Settings dialog:

    [Screenshot: the Import Settings dialog]

    Go to the Advanced settings and select Allow newline characters in quoted fields:

    [Screenshot: the 'Allow newline characters in quoted fields' setting under Advanced settings]

    Once the data is loaded, you can visualize, for example, the number of times each query has been pushed to the underlying data source:

    [Screenshot: a visualization of the number of times each query was pushed to the underlying data source]

    If you add the log file data table to an existing analysis you can analyze queries while you are using your analysis file:

    [Screenshot: analyzing query log data alongside an existing analysis]
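
    If you prefer a quick summary of the log outside of Spotfire, a small script can tally queries per visualization. The sketch below (Node.js) is only illustrative and makes assumptions about the file layout: it expects a header row containing the column names listed above and a semicolon as the field delimiter, and it does not handle newline characters inside quoted fields. Adjust the parsing to match your actual log file.

        // Minimal sketch: count logged queries and total duration per visualization.
        // Assumptions (verify against your own Spotfire.Dxp.QueryLog.log): a header row
        // with the column names listed above, and a semicolon as the field delimiter.
        // Newlines inside quoted QueryString fields are not handled here.
        var fs = require("fs");

        var logPath = "Spotfire.Dxp.QueryLog.log"; // the path shown on the Logging tab
        var delimiter = ";";                       // adjust to match your log file

        var lines = fs.readFileSync(logPath, "utf8").split(/\r?\n/).filter(function (l) { return l.length > 0; });
        var header = lines[0].split(delimiter);
        var visualizationIndex = header.indexOf("Visualization");
        var durationIndex = header.indexOf("DurationMs");

        var summary = {};
        lines.slice(1).forEach(function (line) {
            var fields = line.split(delimiter);
            var visualization = fields[visualizationIndex] || "(unknown)";
            var durationMs = Number(fields[durationIndex]) || 0;
            if (!summary[visualization]) {
                summary[visualization] = { queries: 0, totalMs: 0 };
            }
            summary[visualization].queries += 1;
            summary[visualization].totalMs += durationMs;
        });

        console.table(summary);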

    Administration

    Nodes & Services

    The following updates have been made to the Nodes & Services app on the Administration page:

    On the "Resource pools" page, when adding instances to a resource pool, the dialog now shows the total number of existing instances and the name of the resource pool.

    The "Untrusted nodes" page now includes port information for untrusted nodes.

    Scheduling & Routing

    The following updates have been made to Scheduling & Routing:

    The CLI command config-scheduled-updates-retries has a new option, stop-updates-destination-unavailable. Using this option, you can indicate whether scheduled updates should be retried if the destination is offline or unavailable. By default, this option is set to "true", so scheduled updates are not retried when the destination is unavailable.

    When creating a rule, if you do not first enter a rule name, the Rule name field is auto-populated with the name of the file, group, or user that you select. You can then edit the name as you see fit.

    For further information on changes in functionality and a list of items that will be deprecated, see the Spotfire® Server 7.14 release notes.

    Developer

    JavaScript API: New authentication mechanism supports external/web authentication

    It is now possible to use the JavaScript API with a Spotfire® Server that is configured with any external/web authentication. For example, you can now create a mashup with a .dxp file located in the TIBCO Cloud Spotfire® library.

    The code sample below shows a simple mashup and illustrates the differences that come with 7.14 compared with previous versions:

    <html>
    <head>
        <meta charset="utf-8"/>
        <meta http-equiv="X-UA-Compatible" content="IE=edge">
        <title>Simple mashup example</title>
        <script src="https://spotfire.tcsdev.tcie.pro/spotfire/js-api/loader.js"></script>
    </head>
    <body>
        There are three changes you need to make to previous JS-API tools.
        <ol>
            <li>Change the script src to https://spotfire-environment.example.com/spotfire/js-api/loader.js</li>
            <li>Change the script to use the new spotfire.webPlayer.createApplication API</li>
            <li>Create the callbacks onReadyCallback, onError, onCreateLoginElement</li>
        </ol>
    	
        <div id="renderAnalysis"></div>
    </body>
    <script>
        var app;
        var doc;
        var webPlayerServerRootUrl = "https://spotfire-next.cloud.tibco.com/spotfire/wp/";
        var customizationInfo = { showToolBar: false, showStatusBar: false, showPageNavigation: false };
        var analysisPath = "/Samples/Expense Analyzer Dashboard";
        var parameters = '';
        var reloadInstances = true;
        var apiVersion = "7.14";
     
        // This is basically an asynchronous version of the spotfire.webPlayer.Application constructor
        spotfire.webPlayer.createApplication(
            webPlayerServerRootUrl,
            customizationInfo,
            analysisPath,
            parameters,
            reloadInstances,
            apiVersion, // New. String specifying the API version.
            onReadyCallback, // New. Callback with signature: function(response, app)
            onCreateLoginElement // New. Optional function reference to create a custom login element wrapper.
            );
     
        function onReadyCallback(response, newApp)
        {
            app = newApp;
            if(response.status === "OK")
            {
                // The application is ready, meaning that the api is loaded and that the analysis path is validated for the current session (anonymous or logged in user)
          console.log("OK received. Opening document to page 0 in element renderAnalysis")
                doc = app.openDocument("renderAnalysis", 0);
            }
            else
            {
                console.log("Status not OK. " + response.status + ": " + response.message)
            }
        }
     
        function onError(error)
        {
            console.log("Error: " + error);
        }
     
        function onCreateLoginElement()
        {
            console.log("Creating the login element");
            // Optionally create and return a div to host the login button
            return null;
        }
    </script>
    </html>
     

    API to insert data operations

    It is now possible to add data operations (AddRowsOperation, AddColumnsOperation or DataTransformationOperation) to any location within the data table structure (SourceView).

