We have a fairly large SQL Server database (50+ GB) on our main server, and it is inefficient for our Dash app to query the main server on every request (some queries approach a runtime of 300 seconds).
So we have taken a different path: we run scheduled tasks at specified off-peak hours (when server load is low) and export the dataframes as CSV files to the Dash server's filesystem.
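For context, the export job looks roughly like the sketch below: run the heavy query once off-peak and persist the result as CSV. The table name `sales` and an in-memory SQLite connection are stand-ins for our real SQL Server tables and connection (which would use pyodbc or SQLAlchemy); only the pattern matters.

```python
import sqlite3

import pandas as pd


def export_table_to_csv(conn, table: str, csv_path: str) -> int:
    """Run the expensive query once, off-peak, and persist the result as CSV."""
    df = pd.read_sql_query(f"SELECT * FROM {table}", conn)
    df.to_csv(csv_path, index=False)  # the Dash server reads this file later
    return len(df)


# Demo with an in-memory database standing in for the main server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 9.5), (2, 3.25)])
rows = export_table_to_csv(conn, "sales", "sales.csv")
```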
But when the app loads, there is a big lag of about 20 seconds, which I believe is caused by the initial parsing of the CSV files. After that, the app runs smoothly.
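To make the suspected cause concrete, here is a minimal sketch of the two loading patterns involved: re-parsing the CSV on every call versus parsing it once at module import and reusing the in-memory DataFrame. The file name `sales.csv` is hypothetical; in a real Dash app the cached version would be the module-level DataFrame the callbacks read from.

```python
import pandas as pd

# Hypothetical exported file standing in for one of our nightly CSV dumps.
pd.DataFrame({"id": [1, 2], "amount": [9.5, 3.25]}).to_csv("sales.csv", index=False)


def get_data_per_request() -> pd.DataFrame:
    # Slow pattern: repeat the text-parsing cost on every call/request.
    return pd.read_csv("sales.csv")


# Faster pattern: pay the parsing cost once, at module import.
CACHED = pd.read_csv("sales.csv")


def get_data_cached() -> pd.DataFrame:
    return CACHED  # no parsing cost after startup
```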
Is our approach wrong, and how would you handle this problem?
Is there a better way than saving dataframes to CSV and parsing them on every webapp client request?
If not CSV, then what type of database would you use in order to maximize our webapp's speed?
Thank you in advance for your time.