Slow performance when dealing with large datasets


I’m quite new to the Dash library, and I’ve hit a limitation when working with a large amount of data. I’m using Dash components to share data between callbacks, and I’ve reached the point where all the work I’ve put into my dashboard isn’t enough, because the page’s performance is poor.
Is there an alternative to mitigate this? I’m working with 5 million rows, and that isn’t even the upper bound; it can vary depending on the user input.

I’m also running on a VM with plenty of computational power, but it still takes a while to update the other Dash components.

Is dcc.Store the most effective way to share data between callbacks?

I think that’s the main issue in my code. I need a fast way to share data: the postprocessing step is very expensive, so I want to run it only once and then share the result with the other callbacks.
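The “run the expensive step only once” idea can be sketched with plain memoization. This is a stdlib illustration of the pattern, not Dash-specific code, and `expensive_postprocess` is a hypothetical stand-in for the real postprocessing pipeline:

```python
from functools import lru_cache

CALLS = {"count": 0}  # track how often the expensive step really runs

@lru_cache(maxsize=8)
def expensive_postprocess(user_input: str) -> tuple:
    """Hypothetical stand-in for the costly postprocessing step.

    Because results are memoized on user_input, repeated callbacks
    that receive the same input reuse the cached result instead of
    recomputing it.
    """
    CALLS["count"] += 1
    # pretend this reduces millions of rows to a small summary
    return tuple(sorted(set(user_input.split())))

# two callbacks asking for the same input hit the cache the second time
first = expensive_postprocess("a b a c")
second = expensive_postprocess("a b a c")
assert first is second and CALLS["count"] == 1
```

In a real app the same effect is usually achieved with a caching layer (e.g. Flask-Caching’s memoize) keyed on the callback inputs, so every callback that needs the processed data pays the cost only on the first request.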

Hello @seventy77,

You should look into serverside outputs here:

This keeps you from having to send the data back and forth over the network.
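The gist of a serverside output is that the big object stays in server memory and only a small key travels through the browser. Here is a stdlib sketch of that pattern; the dict-backed store and the `put`/`get` helpers are illustrative, not the dash-extensions API:

```python
import uuid

# Server-side store: large objects stay here; the browser only sees keys.
_STORE: dict = {}

def put(result) -> str:
    """Stash a large result server-side and return a small token."""
    key = uuid.uuid4().hex
    _STORE[key] = result
    return key

def get(key: str):
    """Retrieve the large result in a downstream callback."""
    return _STORE[key]

# "Callback 1" produces 5 million rows but only ships a token to the client.
big_result = list(range(5_000_000))
token = put(big_result)
assert len(token) == 32          # the token is tiny...
assert get(token) is big_result  # ...but resolves to the full data server-side
```

With dcc.Store alone, those 5 million rows would be serialized to JSON, sent to the browser, and sent back on every callback that reads them; keeping the data server-side avoids all of that traffic.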

Hi Jinny :slight_smile: Does this also work on localhost? Because right now it’s taking a while even on localhost. I don’t know how long it would take once I share it through a link.

Take a look here and see if any of this helps you.

Also, if you are using inputs, especially typed ones, make sure you are using debounce (debounce=True on dcc.Input), so the callback fires once when the user finishes typing instead of on every keystroke.
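The effect of debouncing can be sketched with a stdlib timer: a new keystroke cancels the pending call, so the expensive callback runs only after the input goes quiet. The `Debouncer` class below is an illustration of the idea, not Dash’s actual implementation:

```python
import threading

class Debouncer:
    """Run `fn` only after `wait` seconds with no new calls."""

    def __init__(self, fn, wait: float):
        self.fn, self.wait = fn, wait
        self._timer = None

    def __call__(self, *args):
        if self._timer is not None:
            self._timer.cancel()  # a new keystroke resets the clock
        self._timer = threading.Timer(self.wait, self.fn, args)
        self._timer.start()

fired = []
debounced = Debouncer(fired.append, wait=0.05)

# simulate rapid keystrokes: only the final value triggers the callback
for text in ("5", "50", "500", "5000"):
    debounced(text)

threading.Event().wait(0.2)  # let the quiet period elapse
assert fired == ["5000"]
```

With 5 million rows behind each update, firing the callback once per final value instead of once per keystroke is a substantial saving on its own.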