How to run 2 callbacks in parallel? My attempt using background_callbacks/redis/celery not working?

Is it possible to run 2 callbacks in parallel?
I had hoped that using redis/celery would mean that a call to a slow background callback would not block another fast callback.

I set up a dummy backend using FastAPI in main.py

import asyncio

import arrow
from fastapi import FastAPI

app = FastAPI()


@app.get("/ping/slow")
async def ping_slow() -> str:
    notnow = arrow.utcnow().format("YYYY-MM-DD HH:mm:ss.SSS")
    # sleep asynchronously so the uvicorn event loop is not blocked
    await asyncio.sleep(5)
    return notnow


@app.get("/ping/fast")
async def ping_fast() -> str:
    now = arrow.utcnow().format("YYYY-MM-DD HH:mm:ss.SSS")
    return now

I run this from a shell (shell 1); the API is then reachable at http://localhost:8000.

uvicorn main:app
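
For reference, the two endpoints can be checked independently from the command line (assuming curl is installed), e.g.

curl http://localhost:8000/ping/slow &
curl http://localhost:8000/ping/fast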

I set up a dash app.

import logging

import dash
import dash_mantine_components as dmc
import requests
from celery import Celery
from dash import CeleryManager, Input, Output
from flask import Flask

logger = logging.getLogger(__name__)

server = Flask(__name__)
celery_app = Celery(
    __name__,
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
)
celery_app.conf.update(
    task_time_limit=600,
    task_track_started=True,
)
celery_manager = CeleryManager(celery_app)

# Set up fast and slow requests for testing
slow_url = "http://127.0.0.1:8000/ping/slow"
fast_url = "http://127.0.0.1:8000/ping/fast"


def get_response(url):
    try:
        response = requests.get(url)
        if response.status_code == 200:
            return response.json()
        else:
            return None
    except Exception as e:
        logger.error(e)
        return None


@dash.callback(
    Output("ping-fast-text", "children"),
    [Input("ping-fast-button", "n_clicks")],
    prevent_initial_call=True,
)
def ping_fast(n_clicks):
    response = get_response(fast_url)
    return f"Click: {n_clicks} Timestamp: {response}"


@dash.callback(
    Output("ping-slow-text", "children"),
    [Input("ping-slow-button", "n_clicks")],
    prevent_initial_call=True,
    background=True,
    running=[(Output("ping-slow-button", "disabled"), True, False)],
)
def ping_slow(n_clicks):
    response = get_response(slow_url)
    return f"Click: {n_clicks} Timestamp: {response}"


layout = dmc.Paper(
    [
        dmc.Title("Hello World"),
        dmc.Group(
            [
                dmc.Button(id="ping-fast-button", children="Ping-fast"),
                dmc.Text(id="ping-fast-text"),
            ],
            p=10,
        ),
        dmc.Group(
            [
                dmc.Button(id="ping-slow-button", children="Ping-slow"),
                dmc.Text(id="ping-slow-text"),
            ],
            p=10,
        ),
    ]
)


app = dash.Dash(
    __name__,
    server=server,
    background_callback_manager=celery_manager,
)

app.layout = layout
app.config.suppress_callback_exceptions = True

# set debug UI settings on/off when running under gunicorn
# (get_settings() comes from the project's own configuration module, not shown here)
if get_settings().debug:
    logger.info("DEBUGGING configuration ON")
    app.enable_dev_tools(dev_tools_ui=True, dev_tools_hot_reload=False)

# expose application's object server so wsgi server can access it
server = app.server

if __name__ == "__main__":
    debug = bool(get_settings().debug)
    app.run_server(host="0.0.0.0", port=get_settings().development_port, debug=debug)

Now I start redis, and then run the dash app and the celery worker in separate shells:

(shell 2)

redis-server

(shell 3)

gunicorn --bind 0.0.0.0:8001 web.app:server --workers 4

(shell 4)

celery -A app.celery_app worker --loglevel=info
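
(I am not sure the -A argument points at the right module here; since gunicorn loads web.app:server, the worker presumably needs to target the same module, e.g.

celery -A web.app:celery_app worker --loglevel=info

but that path is an assumption about the project layout.)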

I had hoped that when I clicked ‘ping-slow’, the ‘ping-fast’ button would remain responsive and continue to update. Instead, it only updates after the ping-slow callback has completed.


Is it possible to fix this?
I’ve tried different settings for the workers in the dash app and for the concurrency in the celery worker, but it makes no difference.


You should be able to run multiple callbacks in parallel if you use multiple workers or threads. I crafted a slightly simpler MWE,

from dash import Dash, html, Input, Output
from time import sleep

app = Dash()
app.layout = html.Div([
    html.Button("Fast", id="btn_fast"),
    html.Button("Slow", id="btn_slow"),
    html.Div(id="log_fast"),
    html.Div(id="log_slow")
])
server = app.server


@app.callback(Output("log_fast", "children"), Input("btn_fast", "n_clicks"))
def update1(n_clicks):
    return f"You clicked {n_clicks} times on the fast button."


@app.callback(Output("log_slow", "children"), Input("btn_slow", "n_clicks"))
def update1(n_clicks):
    sleep(5)
    return f"You clicked {n_clicks} times on the slow button."

which yields the behaviour you note if you use a single thread/process,

gunicorn concon:server

However, if you use either multiple processes,

gunicorn concon:server --workers 2

or threads,

gunicorn concon:server --threads 2

the callbacks run in parallel; at least on my machine. Do you see a different behaviour? If so, please post additional environment information (OS, Python version, Dash version, …).
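
For reference, if you want to keep the slow callback as a background callback but without the celery/redis stack, a diskcache-based variant is enough for local testing. This is only a sketch (it assumes the diskcache package is installed; the cache path and identifiers are mine):

import diskcache
from time import sleep

from dash import Dash, DiskcacheManager, Input, Output, html

# Diskcache-backed manager: background callbacks run in a separate process,
# so the main server process stays free to answer other callbacks.
cache = diskcache.Cache("./cache")
manager = DiskcacheManager(cache)

app = Dash(background_callback_manager=manager)
app.layout = html.Div([
    html.Button("Slow", id="btn_slow"),
    html.Div(id="log_slow"),
])
server = app.server


@app.callback(
    Output("log_slow", "children"),
    Input("btn_slow", "n_clicks"),
    background=True,
    prevent_initial_call=True,
)
def update_slow(n_clicks):
    sleep(5)
    return f"You clicked {n_clicks} times on the slow button."


if __name__ == "__main__":
    app.run_server(debug=True)

With that, the regular callbacks should stay responsive even with a single worker, because the slow work happens outside the request process.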


Thanks+++ That does work! 🙂
I had tried something similar before, but without success.

FWIW, I got the more complicated celery/redis set-up working too.
The code (and your nice MWE) is in this repo.


One benefit of the complexity is that I get to keep the cache warm by using celery beat.
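
For example, the beat schedule looks roughly like this (the warm_cache task name here is just illustrative; the real task lives in the repo):

from celery.schedules import crontab

# Re-fetch the slow endpoint every five minutes so the cache stays warm.
celery_app.conf.beat_schedule = {
    "warm-cache-every-5-minutes": {
        "task": "warm_cache",
        "schedule": crontab(minute="*/5"),
    },
}


@celery_app.task(name="warm_cache")
def warm_cache():
    get_response(slow_url)

plus a separate beat process, e.g. celery -A web.app:celery_app beat --loglevel=info.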
