Dash_extensions: Error using CeleryManager for long_callback with ServersideOutputTransform

@Emil
I am building a Dash app hosted on Heroku. I was originally using a standard dcc.Store component to store a large dataframe as a dictionary. This output is computed (runtime close to an hour) by a backend CeleryManager via a long callback, configured with the argument background=True in an app.callback. Eventually I needed to store larger dataframes in the dcc.Store component, which caused storage-capacity errors, and of course I also wanted the app to be faster when users retrieve the data in other callbacks. So I decided to use the dash_extensions package and a ServersideOutput transform to store more data in the dcc.Store component, with faster transfers and memoization. However, I now get errors when using the CeleryManager to store this data. The Celery tasks are registered on the DashProxy app with the line app.register_celery_tasks() at the bottom of the app.py file. The Celery worker does run the hour-long computation to derive the data, but just before returning the dataframe I receive the error below:

On the other hand, when I comment out all the arguments that make it a long callback (i.e. background=True, running=..., and progress=...) and instead just quickly read a sample dataframe and return it from the function, the callback works perfectly fine and I am able to use the ServersideOutput properly. However, I need the long-callback functionality over the CeleryManager because the data is derived from a computation on user-specific input.
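For clarity, the two configurations differ only in the decorator arguments; here is a simplified sketch of the failing shape (the full callback is further down):

# Fails with "Working outside of request context":
@app.callback(
    ServersideOutput('df_pred-store', 'data'),
    Input('continue', 'n_clicks'),
    background=True,  # executed by the Celery worker
    prevent_initial_call=True)
def run_inference(n_clicks):
    ...

# Works: the same ServersideOutput, but with background=True,
# running=..., and progress=... commented out, so the function
# runs inside the normal Flask request context.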

"Exception on /_dash-update-component [POST]
Traceback (most recent call last):
File “/app/.heroku/python/lib/python3.10/site-packages/flask/app.py”, line 2525, in wsgi_app
response = self.full_dispatch_request()
File “/app/.heroku/python/lib/python3.10/site-packages/flask/app.py”, line 1822, in full_dispatch_request
rv = self.handle_user_exception(e)
File “/app/.heroku/python/lib/python3.10/site-packages/flask/app.py”, line 1820, in full_dispatch_request
rv = self.dispatch_request()
File “/app/.heroku/python/lib/python3.10/site-packages/flask/app.py”, line 1796, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
File “/app/.heroku/python/lib/python3.10/site-packages/dash/dash.py”, line 1273, in dispatch
ctx.run(
File “/app/.heroku/python/lib/python3.10/site-packages/dash/_callback.py”, line 422, in add_context
raise LongCallbackError(
dash.exceptions.LongCallbackError: An error occurred inside a long callback: Working outside of request context.

This typically means that you attempted to use functionality that needed
an active HTTP request. Consult the documentation on testing for
information about how to avoid this problem.
Traceback (most recent call last):
File “/app/.heroku/python/lib/python3.10/site-packages/dash/long_callback/managers/celery_manager.py”, line 157, in run
user_callback_output = fn(*maybe_progress, *user_callback_args)
File “/app/.heroku/python/lib/python3.10/site-packages/dash_extensions/enrich.py”, line 1259, in decorated_function
unique_id = _get_cache_id(f, output, list(filtered_args), output.session_check, output.arg_check)
File “/app/.heroku/python/lib/python3.10/site-packages/dash_extensions/enrich.py”, line 1276, in _get_cache_id
all_args += [_get_session_id()]
File “/app/.heroku/python/lib/python3.10/site-packages/dash_extensions/enrich.py”, line 413, in _get_session_id
if not session.get(session_key):
File “/app/.heroku/python/lib/python3.10/site-packages/werkzeug/local.py”, line 316, in get
obj = instance._get_current_object() # type: ignore[misc]
File “/app/.heroku/python/lib/python3.10/site-packages/werkzeug/local.py”, line 513, in _get_current_object
raise RuntimeError(unbound_message) from None
RuntimeError: Working outside of request context.

This typically means that you attempted to use functionality that needed
an active HTTP request. Consult the documentation on testing for
information about how to avoid this problem.
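From the traceback, the failure appears to come from _get_session_id() in dash_extensions/enrich.py touching the Flask session object while the callback body runs inside the Celery worker, where no request context exists. If I am reading enrich.py correctly, that session lookup is only performed when the output's session_check flag is set, so one (untested) idea would be to disable it on the output:

# Untested idea: drop the Flask-session component of the cache key so
# _get_session_id() is never called from inside the Celery worker.
ServersideOutput('df_pred-store', 'data', session_check=False)

I have not verified this, and presumably it would also mean the cached result is no longer keyed per user session, which may or may not be acceptable here.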

Here is the code of my app.py file; the function of interest is run_inference. It takes three possible inputs: one runs the large computation, while the other two load sample data fairly quickly. I can eventually separate these out using the MultiplexerTransform (see the sketch after the code), so the latter two inputs will work great, but I will still need a long callback for the long calculation:

import os
import io
import base64
import json
import logging
import pandas as pd
import pickle
import numpy as np
import plotly.express as px
import plotly.graph_objs as go
from plotly.subplots import make_subplots
from urllib.parse import quote as urlquote
from urllib.parse import urlparse
import dash
import dash_uploader as du
from dash import CeleryManager, dash_table, ctx
from dash.dash_table.Format import Format
from dash.exceptions import PreventUpdate
import dash_bootstrap_components as dbc
from app_session import infer
import time
import matplotlib.pyplot as plt
import matplotlib.colors as mcolors
from dash_extensions.enrich import callback, DashProxy, Output, Input, State, ServersideOutput, html, dcc, ServersideOutputTransform, RedisStore, FileSystemStore

L200_UPLOAD_DIRECTORY = './in_L200'
OUTPUT_DIRECTORY = './out_pred'
DepMap_DIRECTORY = './depmap_files'
DepMap19Q4_DIRECTORY = './depmap_files/df_crispr_19q4.csv'
DepMap19Q4_GENES_DIRECTORY = './depmap_files/19q4_genes.csv'
DepMap19Q4_GENES_PICKLE = './depmap_files/19q4_genes.pkl'
DepMap19Q4_SUMMARY_DIRECTORY = './depmap_files/19q4_sum_stats.csv'
DepMap19Q4_ESSENTIAL_DIRECTORY = './depmap_files/common_essentials.csv'
DepMap19Q4_NONESSENTIAL_DIRECTORY = './depmap_files/nonessentials.csv'

global dm_data
dm_data = pd.read_csv(DepMap19Q4_DIRECTORY, index_col=0)

if 'REDIS_URL' in os.environ:
    # Use Redis & Celery if REDIS_URL is set as an env variable
    logging.warning('Redis')
    from celery import Celery
    from celery_worker import celery_app
    background_callback_manager = CeleryManager(celery_app)

    my_backend = FileSystemStore(cache_dir='./cache')

else:
    # Local development (note: currently this branch also uses Celery
    # rather than Diskcache)
    logging.warning('disk')
    from celery import Celery
    from celery_worker import celery_app
    background_callback_manager = CeleryManager(celery_app)

    my_backend = FileSystemStore(cache_dir='./cache')

app = DashProxy(__name__, use_pages=True, suppress_callback_exceptions=True,
                background_callback_manager=background_callback_manager,
                external_stylesheets=[dbc.themes.BOOTSTRAP],
                transforms=[ServersideOutputTransform(backend=my_backend)],
                prevent_initial_callbacks=True)
server = app.server
server = app.server
#----------------------------------APP LAYOUT---------------------------------------------------------------------------------

app.layout = html.Div([
    dbc.NavbarSimple(
        children=[
            dbc.NavItem(dbc.NavLink("Home", href="/")),
            dbc.NavItem(dbc.NavLink("Predict", href="/predict")),
            dbc.DropdownMenu(
                children=[
                    dbc.DropdownMenuItem("Overview", href="/overview"),
                    dbc.DropdownMenuItem("Genes", href="/genes"),
                    dbc.DropdownMenuItem("Z-Scores", href="/zscore"),
                ],
                nav=True,
                in_navbar=True,
                label="Explore",
            ),
            dbc.NavItem(dbc.NavLink("Pathways", href="#")),
        ],
        brand="GO-LoCo",
        brand_href="/",
        color="primary",
        dark=True,
        style={'padding-left': '100px',
               'padding-right': '100px'}
    ),

    dash.page_container,

    dcc.Store(id='L200-data-store', storage_type='session'),
    dcc.Store(id='load-df_pred-store', storage_type='session'),
    dcc.Store(id='input_filename', storage_type='session'),
    dcc.Store(id='df_pred-store', storage_type='session'),
    dcc.Store(id='experiment_labels', storage_type='session'),
    dcc.Location(id='url_name'),
    dcc.Download(id='download-pred'),

    html.Div(id='df_pred-store-html'),

    dbc.Modal(
        [
            dbc.ModalHeader("Congratulations"),
            dbc.ModalBody("Your genome wide inference has successfully completed. Select the button below to download your prediction. Navigate to the Explore tabs to visualize your results."),
            dbc.ModalFooter(
                dbc.Button(
                    "Download and Continue", id="completed", color="primary", className="ms-auto", n_clicks=0
                )
            ),
        ],
        id="completed-popup",
        is_open=False,
    ),

    html.Div(id='dummy_div_app'),
])

#-----------------------------------------FUNCTIONS---------------------------------------------------------------

def save_file(name, content, directory):
    """Decode and store a file uploaded with Plotly Dash."""
    data = content.encode("utf8").split(b";base64,")[1]
    with open(os.path.join(directory, name), "wb") as fp:
        fp.write(base64.decodebytes(data))

def uploaded_files(directory):
    """List the files in the upload directory."""
    files = []
    for filename in os.listdir(directory):
        path = os.path.join(directory, filename)
        if os.path.isfile(path):
            files.append(filename)
    return files

def file_download_link(filename):
    """Create a Plotly Dash 'A' element that downloads a file from the app."""
    location = "/download/{}".format(urlquote(filename))
    return html.A(filename, href=location)

def parse_contents(contents, filename):
    content_type, content_string = contents.split(',')
    decoded = base64.b64decode(content_string)
    try:
        if 'csv' in filename:
            # Assume that the user uploaded a CSV file
            df = pd.read_csv(io.StringIO(decoded.decode('utf-8')))
        elif 'xls' in filename:
            # Assume that the user uploaded an Excel file
            df = pd.read_excel(io.BytesIO(decoded))
    except Exception as e:
        print(e)
        return
    return df

def myround(x, base=5):
    return base * round(x / base)
#----------------------------------------CALLBACKS----------------------------------------------------------------

@app.callback(
    Output('warning-popup', 'is_open'),
    [Input('submit-inference', 'n_clicks'), Input('close', 'n_clicks')],
    [State('upload-L200-data', 'filename'), State('warning-popup', 'is_open')],
    prevent_initial_call=True
)
def toggle_warning_modal(n1, n2, L200_filename, is_open):
    if L200_filename is None:
        if n1 or n2:
            return not is_open
    return is_open

@app.callback(
    Output('continue-popup', 'is_open'),
    [Input('submit-inference', 'n_clicks'), Input('continue', 'n_clicks')],
    [State('upload-L200-data', 'filename'), State('continue-popup', 'is_open')],
    prevent_initial_call=True
)
def toggle_continue_modal(n1, n2, L200_filename, is_open):
    if L200_filename is not None:
        if n1 or n2:
            return not is_open
    return is_open

@app.callback(
    Output('L200-data-store', 'data'),
    Output('file_upload_label', 'children'),
    Input('upload-L200-data', 'contents'),
    State('upload-L200-data', 'filename'),
    prevent_initial_call=True)
def store_l200_data(l200_contents, l200_filename):
    if l200_contents is None or l200_filename is None:
        raise PreventUpdate
    else:
        data = parse_contents(l200_contents, l200_filename)
        return data.to_dict('records'), "File Uploaded: " + l200_filename

@app.callback(
    Output('load-df_pred-store', 'data'),
    Output('pred_upload_label', 'children'),
    Input('upload-df_pred-data', 'contents'),
    State('upload-df_pred-data', 'filename'),
    prevent_initial_call=True)
def store_df_pred_data(df_pred_contents, df_pred_filename):
    if df_pred_contents is None or df_pred_filename is None:
        raise PreventUpdate
    else:
        data = parse_contents(df_pred_contents, df_pred_filename)
        return data.to_dict('dict'), "File Uploaded: " + df_pred_filename

@app.callback(
    Output('warning-popup-2', 'is_open'),
    [Input('submit-prediction', 'n_clicks'), Input('close_2', 'n_clicks')],
    [State('upload-df_pred-data', 'filename'), State('warning-popup-2', 'is_open')],
    prevent_initial_call=True
)
def toggle_warning_modal_2(n1, n2, df_pred_filename, is_open):
    if df_pred_filename is None:
        if n1 or n2:
            return not is_open
    return is_open

@app.callback(Output('completed-popup', 'is_open'),
              Input('df_pred-store', 'data'),
              Input('completed', 'n_clicks'),
              State('completed-popup', 'is_open'),
              prevent_initial_call=True)
def toggle_completed_modal(n1, n2, is_open):
    if n1 or n2:
        return not is_open
    return is_open

@app.callback(
    ServersideOutput('df_pred-store', 'data'),
    Output('experiment_labels', 'data'),
    Output('input_filename', 'data'),
    Input('continue', 'n_clicks'),
    Input('submit-prediction', 'n_clicks'),
    Input('load-prediction', 'n_clicks'),
    State('upload-L200-data', 'filename'),
    State('L200-data-store', 'data'),
    State('upload-df_pred-data', 'filename'),
    State('load-df_pred-store', 'data'),
    running=[(Output('submit-inference', 'disabled'), True, False),
             (Output('eta_label', 'hidden'), False, True),
             (Output('progress_bar', 'style'), {'visibility': 'visible', 'height': '20px'}, {'visibility': 'hidden', 'height': '20px'})],
    progress=[Output('progress_bar', 'value'), Output('progress_bar', 'label'), Output('eta_label', 'children')],
    background=True,  # for deployment
    prevent_initial_call=True)
def run_inference(set_progress, con_n_clicks, sub_n_clicks, load_n_clicks, L200_filename, L200_data, df_pred_filename, df_pred_data):
    if con_n_clicks == 0 and sub_n_clicks == 0 and load_n_clicks == 0:
        return dash.no_update
    button_id = ctx.triggered_id
    if button_id == 'load-prediction':
        df_pred = pd.read_csv(os.path.join(OUTPUT_DIRECTORY, 'prediction.csv'))
        experiments = df_pred.columns.tolist()
        experiments = [i.replace(' (CERES Pred)', '') for i in experiments if '(CERES Pred)' in i]
        return df_pred.reset_index().to_dict('dict'), experiments, ''
    elif button_id == 'submit-prediction':
        if df_pred_filename is None or df_pred_data is None:
            return dash.no_update
        elif df_pred_filename is not None and df_pred_data is not None:
            experiments = df_pred_data.keys()
            experiments = [i.replace(' (CERES Pred)', '') for i in experiments if '(CERES Pred)' in i]
            return df_pred_data, experiments, ''
    elif button_id == 'continue':
        if L200_filename is None or L200_data is None:
            return dash.no_update
        elif L200_filename is not None and L200_data is not None:
            set_progress((0, "0 %", "Estimated Time Remaining: " + "inf"))
            genes2analyze_dir = os.path.join(DepMap_DIRECTORY, '19q4_genes.csv')
            scope = pd.read_csv(genes2analyze_dir)
            scope = scope['target_genes'].tolist()
            df_l200 = pd.DataFrame(L200_data)
            a = infer(df_l200)
            genes = []
            ceres_pred = np.zeros(shape=(len(scope), len(a.experiments)))
            logging.warning(ceres_pred.shape)
            z_scores = np.zeros(shape=(len(scope), len(a.experiments)))
            logging.warning(z_scores.shape)
            x_avgs = []
            x_stds = []
            t1 = time.perf_counter()
            for i, gene in enumerate(scope):
                logging.warning(gene)
                pred = a.infer_gene(gene, aws_s3=True)  # True for deployment
                x_avg, x_std, z_score = a.calc_z_score(gene, pred)
                genes.append(gene)
                ceres_pred[i] = pred
                z_scores[i] = z_score
                x_avgs.append(x_avg)
                x_stds.append(x_std)
                p = 100 * (i + 1) / len(scope)
                p = np.round(p)
                t2 = time.perf_counter()
                t_avg = (t2 - t1) / (i + 1)
                seconds = t_avg * (len(scope) - i)
                eta = time.strftime("%H:%M:%S", time.gmtime(seconds))
                set_progress((p, f'{p} %', "Estimated Time Remaining: " + eta))
            df_pred = pd.DataFrame()
            set_progress((100, "100 %", "Estimated Time Remaining: " + "00:00:00"))
            df_pred['gene'] = genes
            df_pred = pd.merge(df_pred, a.gene_categories, on='gene', how='left')
            df_pred['gene_category'] = df_pred['gene_category'].replace(np.nan, 'conditional essential')
            df_pred['avg'] = x_avgs
            df_pred['std'] = x_stds
            for i, exp in enumerate(a.experiments):
                df_pred[exp + ' (CERES Pred)'] = ceres_pred[:, i]
                df_pred[exp + ' (Z-Score)'] = z_scores[:, i]
            df_pred = df_pred.set_index('gene')
            set_progress((100, "100 %", "Estimated Time Remaining: " + "00:00:00"))
            df_pred = pd.read_csv(os.path.join(OUTPUT_DIRECTORY, 'prediction.csv'))  # for testing
            return df_pred.reset_index().to_dict('dict'), a.experiments, L200_filename

@app.callback(Output('download-pred', 'data'),
              Input('completed', 'n_clicks'),
              State('df_pred-store', 'data'),
              State('input_filename', 'data'),
              prevent_initial_call=True)
def update_table(n_clicks, pred_data, filename):
    df_pred = pd.DataFrame(pred_data)
    filename = filename.replace('.csv', '')
    filename = filename.replace('.xlsx', '')
    filename = filename.replace('.xls', '')
    filename = filename + '_prediction.csv'
    return dcc.send_data_frame(df_pred.to_csv, filename)

app.register_celery_tasks()

if __name__ == '__main__':
    app.run_server(debug=True, port=8053)
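As mentioned above, here is a rough, untested sketch of how I plan to split the fast inputs out of run_inference using the MultiplexerTransform, which (as I understand it) lets several callbacks target the same output:

# Untested sketch: add MultiplexerTransform (imported from
# dash_extensions.enrich) so that multiple callbacks can write to
# 'df_pred-store', then give the fast path its own ordinary callback.
# app = DashProxy(__name__, ...,
#                 transforms=[ServersideOutputTransform(backend=my_backend),
#                             MultiplexerTransform()])

@app.callback(
    ServersideOutput('df_pred-store', 'data'),
    Output('experiment_labels', 'data'),
    Output('input_filename', 'data'),
    Input('load-prediction', 'n_clicks'),
    prevent_initial_call=True)
def load_sample_prediction(n_clicks):
    # Fast path: read the sample prediction directly; no background
    # manager is needed because this returns in a few seconds.
    df_pred = pd.read_csv(os.path.join(OUTPUT_DIRECTORY, 'prediction.csv'))
    experiments = [i.replace(' (CERES Pred)', '') for i in df_pred.columns if '(CERES Pred)' in i]
    return df_pred.reset_index().to_dict('dict'), experiments, ''

# ...plus a separate background=True callback, still over the
# CeleryManager, handling only the hour-long 'continue' path.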

Here is my celery_worker.py file:
import os
from celery import Celery
import redis
import logging

if 'REDIS_URL' in os.environ:
    redis_url = os.environ.get('REDIS_URL')
    logging.warning('connect to remote redis server')
else:
    redis_url = 'redis://127.0.0.1:6379'
    logging.warning('connecting to redis locally')

celery_app = Celery(__name__,
                    BROKER_URL=redis_url,
                    CELERY_RESULT_BACKEND=redis_url,
                    BROKER_POOL_LIMIT=0,
                    include=['app'])
celery_app.autodiscover_tasks(['app'])
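(Side note: I believe the more idiomatic Celery 5 spelling of this configuration uses the lowercase broker/backend arguments, though the uppercase keyword arguments above have been working for me:)

celery_app = Celery(
    __name__,
    broker=redis_url,   # message broker URL
    backend=redis_url,  # result backend URL
    include=['app'],
)
celery_app.conf.broker_pool_limit = 0  # disable the broker connection pool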

Here is my requirements.txt file:
dash==2.8.0
dash-extensions==0.1.11
dash-bootstrap-components==0.13.1
dash-core-components==2.0.0
dash-html-components==2.0.0
dash-renderer==1.9.1
dash-table==5.0.0
Django==3.2.14
gunicorn==20.1.0
joblib==0.16.0
numpy==1.23.4
pandas==1.3.5
pickleshare==0.7.5
plotly==5.9.0
Cython>=0.28.5
scikit-learn==0.22.1
tqdm==4.46.1
urllib3==1.26.9
boto3==1.24.89
botocore==1.27.89
dash-uploader==0.6.0
packaging==21.3
s3transfer==0.6.0
celery==5.2.7
rq==1.11.1
redis==4.3.4
requests==2.27.1
diskcache==5.4.0
psutil==5.9.0
matplotlib==3.5.2
matplotlib-inline==0.1.3
pymongo==4.3.3

And my Procfile:
web: gunicorn app:server
worker: celery -A celery_worker worker --loglevel=INFO --concurrency=1

Please help. Thank you.

I tested a more basic implementation of my problem above, with fewer imports and only two callbacks: one that runs normally and one that runs in the background with a CeleryManager. It turns out that the long callback works fine if the output is a normal Output and is not set to a ServersideOutput. However, if it is set to a ServersideOutput, as demonstrated below, the same error as above persists.

app.py:

import os
import pandas as pd
from dash import CeleryManager
from dash_extensions.enrich import DashProxy, Output, ServersideOutput, Input, State, callback, html, dcc, ServersideOutputTransform, FileSystemStore, RedisStore

OUTPUT_DIRECTORY = './out_pred'

# Use Redis & Celery if REDIS_URL is set as an env variable
from celery import Celery
from celery_worker import celery_app
background_callback_manager = CeleryManager(celery_app)

#my_backend = FileSystemStore(cache_dir='./cache')
my_backend = RedisStore(redis_url=os.environ['REDIS_URL'], namespace='heroku')

app = DashProxy(__name__,
                background_callback_manager=background_callback_manager,
                transforms=[ServersideOutputTransform(backend=my_backend)])

server = app.server

app.layout = html.Div([
    dcc.Store(id='pred_data_store_1', storage_type='session'),
    dcc.Store(id='pred_data_store_2', storage_type='session'),
    html.Div([html.Button("Run Short", id="btn_short")]),
    html.Div([html.Label(id="run_short_label")]),
    html.Div([html.Button("Run Long", id="btn_long")]),
    html.Div([html.Label(id="run_long_label")]),
])

@callback(ServersideOutput('pred_data_store_1', 'data'),
          Output("run_short_label", "children"),
          Input("btn_short", "n_clicks"))
def run_short(n_clicks):
    if n_clicks:  # guard against the initial None
        df_pred = pd.read_csv(os.path.join(OUTPUT_DIRECTORY, 'prediction.csv'))
        return df_pred.reset_index().to_dict('dict'), "successfully stored data with short callback"
    else:
        return '', ''

@callback(ServersideOutput('pred_data_store_2', 'data'),  # works fine with a normal Output
          Output("run_long_label", "children"),
          Input("btn_long", "n_clicks"),
          background=True)
def run_long(n_clicks):
    if n_clicks:  # guard against the initial None
        df_pred = pd.read_csv(os.path.join(OUTPUT_DIRECTORY, 'prediction.csv'))
        return df_pred.reset_index().to_dict('dict'), "successfully stored data with long callback"
    else:
        return '', ''

app.register_celery_tasks()

if __name__ == '__main__':
    app.run_server(debug=True, port=8052)
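If the session_check idea from earlier is right, I would expect (untested) that only the decorator of run_long needs to change:

# Untested: disable the per-session part of the cache key so the
# Celery worker never needs Flask's session object.
@callback(ServersideOutput('pred_data_store_2', 'data', session_check=False),
          Output("run_long_label", "children"),
          Input("btn_long", "n_clicks"),
          background=True)
def run_long(n_clicks):
    ...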

And here is my celery_worker.py file:

import os
from celery import Celery
import logging
from dash_extensions.enrich import DashProxy, Output, ServersideOutput, Input, State, callback, html, dcc, ServersideOutputTransform, FileSystemStore, RedisStore

if 'REDIS_URL' in os.environ:
    redis_url = os.environ.get('REDIS_URL')
    logging.warning('connect to remote redis server')
else:
    redis_url = 'redis://127.0.0.1:6379'
    logging.warning('connecting to redis locally')

celery_app = Celery(__name__,
                    BROKER_URL=redis_url,
                    CELERY_RESULT_BACKEND=redis_url,
                    BROKER_POOL_LIMIT=0,
                    include=['app'])
celery_app.autodiscover_tasks(['app'])