Any way to speed up Dash/Plotly charts when working with a large dataset?

I have a Django app that integrates Dash (Plotly) charts. It works as expected, but the same chart takes a varying amount of time to render depending on the filter value. When the filter selects a small catalogue (about 1,000 products × 30 days of data), the chart is relatively fast. When it selects a big catalogue, which can be up to 40 times larger than the smallest one, the chart takes an unacceptably long time to update.

I have already done some optimisations: selecting only the needed columns from the database, comparing the performance of different SQL queries and keeping the fastest one, and adding indexes on key columns.
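For reference, this is roughly the kind of query-side trimming I mean, sketched here against an in-memory SQLite table (my real schema, column names, and database are different; everything below is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales (product_id INTEGER, day TEXT, "
    "catalogue_id INTEGER, revenue REAL, notes TEXT)"
)
# Index on the columns the filter hits, so lookups don't scan the whole table.
conn.execute("CREATE INDEX idx_sales_catalogue_day ON sales (catalogue_id, day)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?, ?, ?)",
    [(p, f"2024-01-{d:02d}", 1, 9.99, "x") for p in range(3) for d in range(1, 31)],
)

# Select only the columns the chart needs (not SELECT *), and aggregate
# in the database rather than in Python.
rows = conn.execute(
    "SELECT day, SUM(revenue) FROM sales "
    "WHERE catalogue_id = ? GROUP BY day ORDER BY day",
    (1,),
).fetchall()
print(len(rows))  # 30 aggregated rows instead of 90 raw ones
```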

The problem is that if someone selects a period longer than 30 days, the larger catalogues become impossible to handle. And even the smallest catalogue will eventually produce too large a dataset if a user selects, say, a 2-year period.
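To put rough numbers on those sizes (assuming one row per product per day, which matches the granularity above):

```python
PRODUCTS_SMALL = 1_000  # smallest catalogue
SCALE_LARGE = 40        # biggest catalogue is up to 40x larger

def rows(products: int, days: int) -> int:
    """Approximate row count for one chart query."""
    return products * days

small_month = rows(PRODUCTS_SMALL, 30)                # 30,000 rows
large_month = rows(PRODUCTS_SMALL * SCALE_LARGE, 30)  # 1,200,000 rows
small_two_years = rows(PRODUCTS_SMALL, 730)           # 730,000 rows
print(small_month, large_month, small_two_years)
```

So even the "easy" case balloons past the large-catalogue size once the period stretches to two years.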

Is there a threshold for the amount of data a Dash/Plotly app can handle? What is the recommended approach, and how is data of this size handled in a typical app? I am thinking of apps like Google Analytics, which pull years of data and are still fast. There must be some way to handle it.