Show and Tell - Server Side Caching

This is really awesome @Emil! Keep up the good work!


Hi @Emil,
I was trying to repeat the experiment where you plotted millions of rows in a matter of seconds. It takes the code below about 5 seconds to plot ~500,000 rows with px.scatter(). Do you know why that might be taking so long?

I used this data here.

And below, you can find the reproducible code.

import datetime
import dash
import dash_core_components as dcc
import dash_html_components as html
import pandas as pd
import plotly.express as px

from dash.dependencies import Output, Input, State
from flask_caching.backends import FileSystemCache
from dash_extensions.callback import CallbackCache

df_org = pd.read_csv("green_tripdata_2019-01.csv")

app = dash.Dash(prevent_initial_callbacks=True)
server = app.server
app.layout = html.Div([
    html.Button("Run benchmark (with cache)", id="btn_wc"),
    dcc.Dropdown(id="dd_wc", options=[{"label": x, "value": x} for x in df_org["passenger_count"].unique()], value=1),
    dcc.Store(id="time_wc"), dcc.Loading(dcc.Store(id="store_wc"), fullscreen=True, type="dot"), html.Div(id="log_wc"),
    dcc.Graph(id="mygraph"),  # graph targeted by the figure output below
])

cc = CallbackCache(cache=FileSystemCache(cache_dir="cache"))

@cc.cached_callback([Output("store_wc", "data"), Output("time_wc", "data")],
                    [Input("btn_wc", "n_clicks")])
def query_wc(n_clicks):
    df = df_org[["passenger_count", "trip_distance", "total_amount"]]
    return df, datetime.datetime.now()

@cc.callback([Output("log_wc", "children"), Output("mygraph", "figure")],
             [Input("store_wc", "data")], [State("dd_wc",'value'), State("time_wc", "data")])
def calc_wc(df, value, time):
    toc = datetime.datetime.now()
    df_filtered = df[df["passenger_count"] == value]
    fig = px.scatter(df_filtered, x='trip_distance', y='total_amount')
    return ("ELAPSED = {} seconds".format((toc - time).total_seconds())), fig


if __name__ == '__main__':
    app.run_server()

Running your example, I get ELAPSED = 0.056303 seconds with caching and ELAPSED = 4.035909 seconds without. The time to do the actual plotting (not measured, but it seems to be around a few seconds on my laptop) comes on top, but I didn’t include that in the benchmark as it is not affected by the caching mechanism.

Hello @Emil!

I looked at your dash-extensions code and it looks pretty neat!

The CallbackCache component seems to give a great performance improvement on large(r) datasets! Nice work :clap: I really like the @cc.callback() syntax; it is the same syntax I came up with in dash-uploader (@du.callback()). I think it’s good if the community standardizes on some syntax!


Thank you for explaining. But I don’t fully understand what can be affected by the caching. Can the plotting be affected by the caching mechanism?

If i pass a figure into Store with @cc.cached_callback, doesn’t that affect plotting?

I was thinking about the benefits of the Trigger / CallbackGrouper components (from dash-extensions) a bit more, but since they were not directly the topic of this thread, I created another one: @app.callback improvements? (Trigger component, same Output multiple times, callback without Output).

I also thought about the names of the callbacks for CallbackCache: Would it make sense to have @cc.callback be the cached_callback() by default? Then, have @cc.callback_nocache for the case when a cache is not needed, since the class name indicates that there would be some cache used? Or, have a keyword argument for @cc.callback(), such as cached=True, which would control the caching? It’s really a matter of taste, though!


Hi @Emil

I have used your cached callback. Actually, I was fetching a large data frame from SQL Server and plotting graphs from it in my Dash app. But I want the graphs to be updated every 30 seconds. So I used your cached callback as you suggested. And I ran a scheduler in another file to keep fetching data separately from SQL and save it as a parquet file on my system. The cached callback reads the data from it and then displays it on all 10 graphs in my Dash app. And it was running perfectly.

I would love it if this callback were documented on Dash’s official website.


Hi, I got time to make some testing and benchmarking for the CallbackCache. Here are the results:

Testing CallbackCache

Now I tested with the example code of adamschroeder using his example data, and this is what I see in the dev tools:

Each time I pressed the “Run benchmark (with cache)” button, there would be two HTTP POST requests to _dash-update-component (one after the other, since they are chained). You’ll have one HTTP POST request per callback that is triggered.

  • The first _dash-update-component is fast and response size is very small, about 400Bytes, content is
    {"response": {"store_wc": {"data": "28de8ed6118d3dc73f39311bf5de1910"}, "time_wc": {"data": "83c5acc6308b231e3e56493fcf3774e1"}}, "multi": true}
    These correspond to the IDs of the dcc.Store components. With these IDs the data is read from the cache, if I understand correctly.

    I think here is the gain: Since only ID is saved to the browser, you do not have to send data from browser (dcc.Store) to server each time a callback is called.

  • The second _dash-update-component is slow (with a large TTFB1; about 5-7 seconds). This contains the Graph data as JSON, which is drawn in the browser. The response size is about 1.8 MB.

As a diagram, this would be something like (3x small JSON, 1x large JSON, df to JSON only once)
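My mental model of that first request, sketched with a plain dict standing in for the FileSystemCache (the real dash-extensions internals may differ): the server serializes the output, stores it under a short key, and sends only the key to the browser's dcc.Store.

```python
import hashlib
import pickle

# A plain dict standing in for the server side FileSystemCache.
_cache = {}

def serverside_store(value):
    """Serialize a value, cache it server side, and return only its key."""
    blob = pickle.dumps(value)
    key = hashlib.md5(blob).hexdigest()  # a short ID like the one seen in dcc.Store
    _cache[key] = blob
    return key

def serverside_load(key):
    """Resolve a key received from the client back into the cached value."""
    return pickle.loads(_cache[key])
```

So the browser only ever round-trips the ~32-character key; the big payload never leaves the server until a figure has to be rendered.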

So, what is taking the time in the second part?

The creation of the figure with px.scatter(df_filtered) takes about 5 seconds. The funny thing is that df_filtered.to_json takes only ~0.5 seconds, so actually 95% of the time used to create a plotly figure is something other than just creating a JSON object out of it. (Some optimization possible in px.scatter(), perhaps? Moreover, it could be possible to memoize calc_wc, too!)

Problem with caching?

I tried the mentioned code + this change

cc = CallbackCache(cache=FileSystemCache(cache_dir="cache"), instant_refresh=True)

@cc.cached_callback([Output("store_wc", "data"), Output("time_wc", "data")],
                    [Trigger("btn_wc", "n_clicks")])
def query_wc():
    import time
    time.sleep(5)  # Added sleep
    df = df_org[["passenger_count", "trip_distance", "total_amount"]]
    return df, datetime.datetime.now()

and I was hoping to see the time to be 5 seconds on the first callback call and ~0 seconds during the next ones. Unfortunately, it did not work like that. I tried both instant_refresh options (True/False). You can see this yourself in the dev console, looking at the Timing of the first _dash-update-component. It could be a bug, a configuration issue, or just how it should work. My best guess is the last; the CallbackCache does not memoize the callback, but just serves as a “Store” on the server side. That said, this could be memoized easily also without dash-extensions (if there is no such functionality?), so I assume there could be additional speed gains, if needed.
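For reference, such memoization can be done with the standard library alone. A minimal sketch (this is not part of dash-extensions; query_wc_memoized is a hypothetical stand-in for the expensive query):

```python
from functools import lru_cache

call_count = {"n": 0}

@lru_cache(maxsize=None)
def query_wc_memoized():
    # Stand-in for the expensive query (the 5 s sleep in the test above).
    call_count["n"] += 1
    return "expensive dataframe"

query_wc_memoized()
query_wc_memoized()  # second call is served from the memo; the body does not run again
```

In a real app, the memoized function's arguments would need to be hashable (e.g. the query parameters), and a size/TTL-bounded cache such as Flask-Caching's memoize would be safer than an unbounded lru_cache.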

Compare to without CallbackCache

The callbacks without CallbackCache would be roughly something like this

@app.callback([Output("store_wc", "data"), Output("time_wc", "data")],
              [Input("btn_wc", "n_clicks")])
def query_wc(n_clicks):
    df = df_org[["passenger_count", "trip_distance", "total_amount"]]
    return df.to_json(), datetime.datetime.now()

@app.callback([Output("log_wc", "children"), Output("mygraph", "figure")],
              [Input("store_wc", "data")], [State("dd_wc", "value"), State("time_wc", "data")])
def calc_wc(df, value, time):
    df = pd.read_json(df)
    toc = datetime.datetime.now()
    df_filtered = df[df["passenger_count"] == value]
    fig = px.scatter(df_filtered, x='trip_distance', y='total_amount')
    return "ELAPSED = {} seconds".format((toc - time).total_seconds()), fig

As a diagram, this would be something like (1x small JSON, 3x large JSON; df to JSON 2 times, JSON to df 1 time)


  • The first callback takes ~5.3 seconds and the response is 6.7 MB! (this is the data as a JSON string)
  • The second callback takes ~10.5 seconds, since now it first sends the data as JSON from the browser to the server, and then gets the figure as the response.
  • Summary: The CallbackCache now saves about 2/3 of the time when there is a dcc.Store involved with two chained callbacks + a Graph.
  • By caching / memoizing the callback functions it could be possible to make this even faster.

I hope this makes the gains clearer to everyone!

:bell: Note: The tests were done on localhost without throttling (the network speed is much faster than in a normal situation). In a real-world application, sending the big JSON packages back and forth would be even slower, and the difference between using CallbackCache and not using it would be even more dramatic.

- Niko

1 The TTFB is defined as

The browser is waiting for the first byte of a response. TTFB stands for Time To First Byte. This timing includes 1 round trip of latency and the time the server took to prepare the response


Will it be better to use flask-caching directly?

Great analysis! I love your diagram in particular :blush:. I’ll take it as the starting point of my answer. Let’s denote the arrows (1,2,3,4). To sum up what happens,

  1. Client sends button click (small).
  2. Client receives the full data, i.e. around 6.7 MB in this example.
  3. Client sends the full data, 6.7 MB.
  4. Client receives the figure, around 1.8 MB.

As you note, (1) is fast. Due to the large payload (6.7 MB), both (2) and (3) are slow. And since the data are not used for anything by the client itself, this data transfer is in fact unnecessary. The figure transfer in (4) is also rather slow (as the figure is large), but unlike (2) and (3) it is necessary. Since the figure is rendered client side, without sending the figure JSON to the client, the client would not know what to draw.
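To make the payload asymmetry concrete, here is a rough size comparison with synthetic data (plain stdlib json; the sizes will differ from the real trip data):

```python
import json

# Synthetic rows standing in for the trip data.
rows = [{"trip_distance": i * 0.1, "total_amount": i * 0.5} for i in range(10_000)]

full_payload = json.dumps(rows)  # what (2) and (3) carry without server side caching
key_payload = json.dumps({"data": "28de8ed6118d3dc73f39311bf5de1910"})  # with caching

print(len(full_payload), "bytes vs", len(key_payload), "bytes")
```

The full payload is orders of magnitude larger than the key, and without caching it crosses the wire twice.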

The caching mechanism targets the unnecessary transfers (2, 3), but it cannot do anything about (4). Hence based on your results, I would say that the cache works as intended.

You can use Flask-Caching directly to avoid reevaluating the function multiple times, but it won’t save you the data round trip, which is the key point of the CallbackCache.

It would not make sense to use the cached callback per default. As noted in my previous post, it only makes sense to use the cache for callbacks that return data, which is not used by the client.

It might be more intuitive to use the same callback decorator for all callbacks and add a cache keyword argument. However, I think this argument should take a cache object as input rather than a Boolean. This would make it possible to use different caches for different callbacks, e.g. a disk cache for large data blocks and a memory cache for smaller ones.
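As a sketch of that design (the names here are hypothetical, not the actual dash-extensions API): a decorator that takes a cache backend object, so different callbacks can plug in different backends.

```python
class DictCache:
    """Stand-in for a pluggable cache backend (disk cache, memory cache, ...)."""
    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value

def callback(cache=None):
    """Hypothetical decorator: cache=None -> plain callback, cache=obj -> cached."""
    def decorate(fn):
        if cache is None:
            return fn
        def wrapper(*args):
            key = repr(args)
            if cache.get(key) is None:  # note: a real impl should handle None results
                cache.set(key, fn(*args))
            return cache.get(key)
        return wrapper
    return decorate

disk_like = DictCache()
calls = {"n": 0}

@callback(cache=disk_like)
def query(n):
    calls["n"] += 1  # counts how often the body actually runs
    return list(range(n))
```

With this shape, swapping DictCache for a disk-backed or Redis-backed object is a one-line change per callback.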


Regarding this, I posted my thoughts on the naming/default behaviour of a “cache” in the separate thread.

Based on input from @np8 and @chriddyp, I have come up with a new syntax (available in dash-extensions 0.0.28). The performance should be the same, but the syntax is simpler (at least that is the intention). Here is the benchmark example using the new syntax,

import datetime
import dash_core_components as dcc
import dash_html_components as html
import numpy as np
import pandas as pd

from dash_extensions.enrich import Dash, ServersideOutput, Output, Input, State, Trigger

# Drop down options.
options = [{"label": x, "value": x} for x in [1, 10, 100, 1000, 10000, 100000, 1000000, 10000000, 100000000]]
# Create app.
app = Dash(prevent_initial_callbacks=True)
server = app.server
app.layout = html.Div([
    html.Button("Run benchmark (with cache)", id="btn"), dcc.Dropdown(id="dd", options=options, value=1),
    dcc.Store(id="time"), dcc.Loading(dcc.Store(id="store"), fullscreen=True, type="dot"), html.Div(id="log")
])

@app.callback([ServersideOutput("store", "data"), ServersideOutput("time", "data")],
              Trigger("btn", "n_clicks"), State("dd", "value"))
def query(value):
    df = pd.DataFrame(data=np.random.rand(int(value)), columns=["rnd"])
    return df, datetime.datetime.now()

@app.callback(Output("log", "children"), Input("store", "data"), State("time", "data"))
def calc(df, time):
    toc = datetime.datetime.now()
    mean = df["rnd"].mean()
    return "ELAPSED = {}s (and mean is {:.3f})".format((toc - time).total_seconds(), mean)

if __name__ == '__main__':
    app.run_server()

So what has changed? Instead of having to register the callbacks on the Dash app object, you now just have to use the custom objects from dash_extensions.enrich. The cached_callback decorator has been abandoned. You now just use the normal callback decorator and indicate which outputs should stay server side by using ServersideOutput instead of Output.


To change the cache path, you should create a new FileSystemStore backend, i.e. something like this,

from dash_extensions.enrich import Dash, FileSystemStore

output_defaults=dict(backend=FileSystemStore(cache_dir="some_path"), session_check=True)
app = Dash(output_defaults=output_defaults)

As an initial debugging step, you could check if files are in fact being written to "some_path". If not, it would indicate that you are still having permission issues.
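A quick way to run that check (plain stdlib; "some_path" is just whatever directory you configured in FileSystemStore):

```python
import os

def check_cache_dir(cache_dir):
    """Return the cached file names, or None if the directory was never created."""
    if not os.path.isdir(cache_dir):
        return None  # likely a permission or path issue
    return os.listdir(cache_dir)
```

After triggering a callback with a ServersideOutput, you would expect at least one file to show up in the returned list.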

Hi @Emil,

First, great job!!

I notice that in this version, you import Dash from dash_extensions.enrich, and you use app = Dash(…) instead of app = dash.Dash(…)

When I do the same, my app stops displaying / my callbacks don’t fire correctly anymore. Could you enlighten me on the reason for this change and if there are any alternatives? Thanks!

Thanks! Did you remember to import the Input, Output and State objects from enrich also? And what versions are you using?

@Emil Yes I have!

my versions:

dcc: 1.10.2
html: 1.0.3
dash_extensions: 0.0.31

Not even the app layout displays when I use app=Dash(...) instead of dash.Dash(...).

Also, is there a reason why you use prevent_initial_callbacks?

PS: When the callbacks don’t fire correctly (prevent_initial_callbacks=False), I actually get JSON serialization errors and the layout still doesn’t show

I tend to use prevent_initial_callbacks as initial callbacks with None values often need special handling (which seems unnecessary when you can just use the prevent_initial_callbacks flag). Hmm, I don’t see why the callbacks shouldn’t work. Do you get any error, or are the callbacks just not firing?


After inspecting in the browser, the callbacks do fire but nothing is displayed. It might only have to do with the interaction with the layout. I have tried without using a css template but the problem persists.

Is there any way I can still use your module while importing dash.Dash()? It seems like I’m able to use ServersideOutput without it, and the callbacks still fire and the layout does show.