
"Suggestions to implement best industry practice and efficient loading time"

Elsa Zehra


We have a single standard dashboard copy that fetches data dynamically from SQL Server through a stored procedure. There are around 25 clients, hence 25 different data sets that vary in size. There is a lot of drilling down, toggling, joining between data tables, drill-down filtering, and viewing data at the transactional level. Let's say we have around 50 million records of live data.

What would be the best industry practice for this scenario? Additionally, how fast should the loading time be? Any leads?


  • 2 weeks later...


From the description it is a bit hard to determine what the best solution would be, since Spotfire has several different ways to solve this scenario and more details would be helpful. You could pull the data into memory and apply row-level security in-memory, pull the data on demand from the database, or keep the data external in the database. These approaches can also be mixed.
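As a rough illustration of the first option, applying row-level security in-memory amounts to filtering the loaded rows against a user-to-client mapping before anything reaches the dashboard. The sketch below is plain Python with entirely hypothetical table and column names (`client_id`, `USER_CLIENTS`), not any Spotfire or SQL Server API:

```python
# Minimal sketch of in-memory row-level security (hypothetical data and names).
# All rows are loaded once; each user sees only rows for their allowed clients.

ROWS = [
    {"client_id": 1, "order_id": 101, "amount": 250.0},
    {"client_id": 1, "order_id": 102, "amount": 75.5},
    {"client_id": 2, "order_id": 201, "amount": 990.0},
]

# Hypothetical mapping of dashboard users to the clients they may see.
USER_CLIENTS = {
    "alice": {1},
    "bob": {2},
}

def visible_rows(user, rows=ROWS):
    """Return only the rows the given user is allowed to see."""
    allowed = USER_CLIENTS.get(user, set())
    return [r for r in rows if r["client_id"] in allowed]
```

With the on-demand alternative, the same restriction would instead be pushed into the query (e.g. a stored-procedure parameter), so the database only ever returns the permitted subset.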

The efficiency of the data loading is hard to predict, since many of the factors that affect data loading in Spotfire are outside of its control. Some of these factors are network latency, query speed, and the size of the data.
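Because of that, it usually pays to measure where the time actually goes rather than guess. A small timing harness like the following (plain Python, standard library only, with hypothetical stage functions standing in for the real stored-procedure call and post-processing) can separate fetch time from aggregation time:

```python
import time

def timed(label, fn, *args, **kwargs):
    """Run fn, print how long it took, and return its result."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.3f}s")
    return result

# Hypothetical stages of a dashboard refresh: fetch rows, then aggregate.
def fetch_rows():
    # Stand-in for the real database query (e.g. the stored procedure call).
    return [{"client_id": i % 25, "amount": float(i)} for i in range(100_000)]

def aggregate(rows):
    """Total the amounts per client."""
    totals = {}
    for r in rows:
        totals[r["client_id"]] = totals.get(r["client_id"], 0.0) + r["amount"]
    return totals

rows = timed("fetch", fetch_rows)
totals = timed("aggregate", aggregate, rows)
```

Timing the stages separately shows whether the bottleneck is the query itself (a database/network problem) or the in-dashboard processing (a data-model problem), which determines which of the approaches above is worth pursuing.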

Please message me if you want to discuss more.



