My data source is a pandas DataFrame that reads data from a PostgreSQL server; it fetches around 2 million rows.
I have a few dropdowns that filter the incoming DataFrame and generate a scatter plot along with a Dash table.
Loading the data itself takes around 31 seconds.
Generating the scatter plot and Dash table takes more than 120 seconds.
Is there a way to make this faster by pre-rendering?
Yeah, I once built something like this: pre-render all the top-level figures (with no dropdowns selected) and store the figures in a dict. When a user selects a top-level filter, the graphs appear immediately; only when they drill down does it take a bit longer. (In my application most users didn't need to drill down, so it made sense to speed up the common case.) The other downside is that the plots are not completely current, of course. (In my case it was all based on overnight data, so that wasn't a problem.)
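The pre-rendering idea above could be sketched roughly like this, a minimal sketch with made-up data: pre-compute one result per top-level dropdown value at startup, so the callback only does a dict lookup instead of re-filtering 2 million rows. The `region` column, the values, and the plain function standing in for a Dash callback are all assumptions for illustration. (In a real app you'd store pre-built Plotly figures, not just DataFrames.)

```python
import pandas as pd

# Hypothetical example data standing in for the 2M-row Postgres fetch.
df = pd.DataFrame({
    "region": ["north", "south", "north", "south"],
    "x": [1, 2, 3, 4],
    "y": [10, 20, 30, 40],
})

# Pre-compute one filtered frame per top-level dropdown value at startup.
# In the real app you could build the figure/table here too and cache it.
prerendered = {
    region: sub.reset_index(drop=True)
    for region, sub in df.groupby("region")
}

def on_dropdown_change(region):
    """Stand-in for a Dash callback: just look up the pre-computed result."""
    return prerendered[region]

print(on_dropdown_change("north")["y"].tolist())  # [10, 30]
```

The trade-off is exactly the one described above: the cached results are only as fresh as the last time you rebuilt the dict.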
I also feel that 120 seconds to generate a plot and table is too much, though. Are you by any chance using Python for loops instead of the equivalent numpy/pandas functions?
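For reference, here is the row-by-row pattern that tends to blow up runtimes at 2 million rows, next to its vectorized equivalent. This is a toy example with made-up data, purely to illustrate the difference being asked about:

```python
import pandas as pd

# Toy frame standing in for the real 2M-row data.
df = pd.DataFrame({"x": range(10), "y": range(10)})

# Slow pattern: filtering with a Python-level loop over rows.
slow = pd.DataFrame([row for _, row in df.iterrows() if row["x"] > 5])

# Fast pattern: a vectorized boolean mask, evaluated in C.
fast = df[df["x"] > 5]

print(fast["x"].tolist())  # [6, 7, 8, 9]
```

Both give the same rows, but the loop version pays Python-interpreter overhead per row, which is usually where multi-minute pandas runtimes come from.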
I am using pandas. Not sure how we can pre-render.