
Improving App Performance [ Help Please ]

Hi Everyone,

I’m interested in understanding the best practice for improving performance in Dash when the DataFrames are not so big.

My main issue is related to the app’s callback flow, considering the following:

  1. Dash works with independent callbacks that run in parallel, which means no time is spent waiting for other callbacks to finish.
  2. But each callback involves additional overhead.

For example, I have this part of the app related to the historical share price of a company:

Here I have different Dash elements that work with the same information (ten years of historical share prices), but all these figures imply:

  1. One dcc.Graph
  2. Five gauges with value, marks, max, min, and grid ranges.
  3. Five range sliders with value, max, min, and marks.
  4. Six graph indicators for the deltas.
  5. One led with the last price.

What is the best approach:

  1. Use one callback for each element so they run in parallel (18 callbacks and 18 requests for the data)?
  2. Put everything together in one callback and update all the outputs at the end of the process?

Take into account that the app has other modules that also run in parallel to those described above.

An additional minor question: does it make sense to store the historical data in a dcc.Store to avoid making the request 18 times?

Thanks in advance for your time. :smiley:

For the best performance, I would use a server side cache to store the shared data (unless the data is very small). Each callback would then access this cache and send only the needed data to the client.

Thanks @Emil for your answer :smiley:

The data is just 1260 rows: one column plus the date index, holding the closing share prices for the last five years. I assumed that is small (but I don’t know if it’s “very small” :joy:).

I saw your solution using server-side caching; it looks pretty interesting, but I haven’t tried to implement it yet.

Regarding using multiple callbacks versus one callback for outputs that are related to each other, do you have an opinion? I read in the documentation that each callback adds overhead, but I can’t figure out whether that time is compensated by running them in parallel.

Thanks again for your time.

Hey @Eduardo

Have you looked at the callback graph to see which callbacks are taking the longest?

Hey @AnnMarieW

Good idea :smiley:

Do you know how to enlarge the graph to see its content? :thinking:

I usually select a configuration that works best from the dropdown, then zoom using the touchpad.

It doesn’t work on my PC with Windows :woozy_face:

When I enlarge the browser zoom, the graph gets smaller. :upside_down_face:

I had a similar performance problem where I had multiple charts based on the same shared 100 days of Daily OHLC stock data.

I resolved it by having one callback that loads the data from the source API into a dcc.Store with memory storage, which holds the ‘shared’ data; the individual chart callbacks then read the shared data from the dcc.Store. I found dcc.Store to work efficiently for this purpose, as it eliminated the need to make multiple API requests for the same data.

Thanks @johndy

Let me give you a tip:

If you are using dcc.Store, consider implementing the server-side caching callback; it’s very easy to do because it uses the same dcc.Store, and you only need to make two minor changes in your code. Just import the library:

from dash_extensions.enrich import Dash, ServersideOutput, Output, Input, State, Trigger

And replace the Output that targets the dcc.Store with a ServersideOutput.

It uses the same Inputs, and all the data is managed on the server side.

Thanks @Emil for such a great tool !! :smiley:


Thanks for the great tip @Eduardo. I also had to change app = dash.Dash()… to app = Dash() to get it working. This brought an incremental improvement, since the data no longer made the round trip to the client (my files are relatively small). But what really made the difference for my app: before, I was using a client-side dcc.Store Output, which only supports JSON, so I had to convert the data objects from/to JSON at the beginning/end of each callback. That step is no longer necessary with ServersideOutput, and even with small data loads I saw a huge improvement. Thanks again to you and @Emil.


Thanks for sharing this information. It was very useful.