I am trying to get my Dash application to work with Celery and Redis for background callbacks, in a dockerized environment.
Following the documentation here: Background Callback Caching | Dash for Python Documentation | Plotly
I have setup the connection to redis via celery in my main dash application file as follows:
import dash
from dash import CeleryManager, DiskcacheManager

if "REDIS_URL" in ENV:
    # Use Redis & Celery if REDIS_URL set as an env variable
    from celery import Celery

    celery_app = Celery(
        __name__,
        broker=ENV["REDIS_URL"],
        backend=ENV["REDIS_URL"],
    )
    background_callback_manager = CeleryManager(
        celery_app, cache_by=[lambda: launch_uid], expire=60
    )
else:
    # Diskcache for non-production apps when developing locally
    import diskcache

    cache = diskcache.Cache("./cache")
    background_callback_manager = DiskcacheManager(
        cache, cache_by=[lambda: launch_uid], expire=60
    )

app = dash.Dash(
    __name__,
    background_callback_manager=background_callback_manager,
    suppress_callback_exceptions=True,
    use_pages=True,
)
server = app.server

# (...) Application code here...

if ENV["ENV"] == "dev":
    if __name__ == "__main__":
        app.run_server(debug=True)
else:
    if __name__ == "__main__":
        app.run_server(host="0.0.0.0", port=8080, debug=False)
The main application is launched in its Docker container (built from the Dockerfile) with:
CMD gunicorn --workers=8 --threads=2 --bind 0.0.0.0:8080 app:server
For deployment, my docker-compose file has the Dash application above plus a Redis container (and, irrelevant to this topic, a Mongo container and an nginx container).
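For context, the relevant part of the compose file looks roughly like this (a sketch based on the description above; the service names and the exact REDIS_URL value are assumptions, not the actual file):

```yaml
services:
  dash:
    build: .
    ports:
      - "8080:8080"
    environment:
      # must point at the redis service's hostname, not localhost
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      - redis
  redis:
    image: redis:7
  # mongo and nginx services omitted as irrelevant here
```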
In local dev everything works fine using the diskcache (REDIS_URL not provided), but using Redis with Celery does not work: no errors are raised, the Redis container shows no connections, and the background callback just never finishes…
Could there be a need to also set up a dedicated Celery worker container on top of the Dash and Redis ones? Or does the main Dash container need a celery worker command added to it?
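For reference, the background-callback example in the Dash documentation does start a separate Celery worker process alongside the web server. Assuming the main module above is app.py exposing celery_app, that command would look something like:

```shell
# Run from the same codebase/image as the Dash app, either as an
# extra container in docker-compose or as an additional process
# in the dash container (gunicorn alone never consumes the queue).
celery -A app:celery_app worker --loglevel=INFO
```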