Elsa Zehra Posted February 17, 2023

We have a single standard dashboard that fetches data dynamically from SQL Server through a stored procedure. There are around 25 clients, and therefore 25 different data sets of varying size. The dashboard involves a lot of drill-down, toggling, joins between data tables, filtering, and viewing data at the transaction level. Say we have around 50 million records of live data. What is the best industry practice for handling this scenario, and how fast should the dashboard be expected to load? Any leads?
Peter McKinnis Posted March 3, 2023

Elsa,

From the description it is hard to say what the best solution would be, since Spotfire offers several different ways to handle this scenario and more details would help. You could pull the data into memory and apply row-level security in-memory, pull the data on demand from the database, or keep the data external in the database. You can also mix these approaches.

Data-loading efficiency is hard to predict, because many of the factors that affect loading in Spotfire are outside of its control: network latency, query speed, and the size of the data, among others. Please message me if you want to discuss further.

Thanks,
Peter
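To make the "pull only what's needed" option above concrete, here is a minimal sketch of per-client filtering pushed down to the database via a parameterized query, which is the same idea as a stored procedure that takes a client parameter. It uses Python's standard-library sqlite3 purely as a stand-in for SQL Server, and the table and client names are hypothetical.

```python
import sqlite3

# Hypothetical stand-in for the shared SQL Server source: one
# transactions table holding rows for every client.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (client_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [("acme", 100.0), ("acme", 250.0), ("globex", 75.0)],
)

def load_for_client(client_id):
    """Parameterized query: only the requesting client's rows leave the
    database, so the dashboard never loads all 50M rows into memory.
    This mimics a stored procedure with a client parameter."""
    cur = conn.execute(
        "SELECT client_id, amount FROM transactions WHERE client_id = ?",
        (client_id,),
    )
    return cur.fetchall()

rows = load_for_client("acme")
print(len(rows))  # only acme's 2 rows are pulled into memory
```

The alternative (loading everything and filtering in-memory) trades higher initial load time and memory use for faster subsequent drill-downs; with 50 million live rows, filtering at the source as sketched here usually scales better.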