Krista Dobley Posted February 29

I have a dashboard with many "load on demand" data sources against large data warehouses, plus a "Load" action control button that lets the user trigger a load using the values from a multi-line input field. It now takes ~30 minutes to return data for all of them. Are there lines I can add to an IronPython script to choose the order in which the on-demand loads run against my data sources (I'd load the quickest ones first)? Or, even better, could a user select which data sources they want to reload against the data they post in the input box? Thank you!
David Boot-Olazabal Posted March 1

Hi Krista,

I'm not sure what your script looks like, but I have found this one, which defines the tables to be loaded:

from Spotfire.Dxp.Data import DataTable
from System.Collections.Generic import List

tables = List[DataTable]()
tables.Add(Document.Data.Tables["test"])
Document.Data.Tables.Refresh(tables)

If you repeat the 'tables.Add' line for each of your on-demand tables, the script may simply follow that order when it kicks off the data load for those tables. I am assuming that you have deselected the "Load automatically" and "Allow caching" boxes for all your on-demand tables.

If you want to give users more freedom to select the tables that should be updated, a workaround might be possible. In that case, you would create a table holding all the on-demand table names. By marking specific table names, the IronPython script would then use those values to kick off the on-demand load for the selected tables only. As I have not tested this myself, I am not sure exactly how you would implement it in the IronPython script. I also do not know whether you can build something into the code that presets the loading of a specified set of on-demand tables based on the user who opens the analysis.

On a different note, if it is permitted in your case, I would encourage you to have a look at the long-running on-demand queries. As you mentioned, it takes quite a while to load all the on-demand tables against a large data warehouse, which rather defeats the benefits of on-demand loading. Since data warehouses are not updated as frequently as transactional databases, would it be an option to handle the bigger queries with scheduled updates or Automation Services jobs, and then use those results in the analysis?
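To Krista's first question, the snippet above could be extended into an ordered refresh, fastest tables first. This is only a sketch for a Spotfire IronPython action script: the table names are placeholders, and note that Spotfire may process refreshes asynchronously, so refreshing one table per call (rather than all in one call) is what suggests the order to the engine, not a hard guarantee.

```python
# Sketch: refresh on-demand tables in a chosen order (fastest first).
# Table names are placeholders -- replace with your own on-demand tables.
# Assumes "Load automatically" and "Allow caching" are deselected.
from Spotfire.Dxp.Data import DataTable
from System.Collections.Generic import List

# List the tables in the order you want them refreshed
orderedNames = ["QuickTable1", "QuickTable2", "SlowTable1"]

for name in orderedNames:
    tables = List[DataTable]()
    tables.Add(Document.Data.Tables[name])
    # One table per Refresh call suggests the requested order;
    # passing all tables in a single call lets Spotfire decide.
    Document.Data.Tables.Refresh(tables)
```

This would be attached to the same "Load" action control button, so the ordered refresh runs after the user fills in the input field.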
Kind regards, David
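The marking-based workaround David describes could be sketched as below. This is untested and all names are illustrative assumptions: a helper table "TableList" with a string column "TableName" listing the on-demand tables, and a marking named "Marking" that the user applies to the rows they want reloaded.

```python
# Sketch: refresh only the on-demand tables whose names the user has
# marked in a helper table. "TableList", "TableName" and "Marking" are
# placeholder names for this illustration.
from Spotfire.Dxp.Data import DataTable, DataValueCursor
from System.Collections.Generic import List

listTable = Document.Data.Tables["TableList"]
marking = Document.Data.Markings["Marking"]
cursor = DataValueCursor.CreateFormatted(listTable.Columns["TableName"])

# Collect the table names from the marked rows
selected = []
markedRows = marking.GetSelection(listTable).AsIndexSet()
for row in listTable.GetRows(markedRows, cursor):
    selected.append(cursor.CurrentValue)

# Refresh each selected on-demand table
toRefresh = List[DataTable]()
for name in selected:
    if Document.Data.Tables.Contains(name):
        toRefresh.Add(Document.Data.Tables[name])
if toRefresh.Count > 0:
    Document.Data.Tables.Refresh(toRefresh)
```

Wired to the "Load" button, this would let each user reload only the data sources they care about instead of waiting on all of them.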