I am developing a Dash application that processes large datasets from Google Cloud. The data transfer itself works fine; however, the download takes a considerable amount of time, so users don't see any update until about 20 seconds after triggering a query. I would like to add some visual feedback using dbc.Spinner() from dash-bootstrap-components or dcc.Loading(), but simply wrapping the components to be updated in those didn't work.
For context:
I have a dictionary of dataframes e.g.:
DATASETS = {
    'df1': df_1,
    'df2': df_2,
}
which I populate like:
from datetime import datetime as dt

from dash.exceptions import PreventUpdate
import pandas as pd

@app.callback(
    Output('ddive-data-dummy', 'children'),
    [Input("ddive-dataload-btn", "n_clicks")],
    [State("ddive-filters", "children")],
)
def update_datasets(n_clicks, filter_str):
    if not n_clicks:
        raise PreventUpdate
    # empty the dataframes (assigning via the key, so the dict entry
    # itself is replaced rather than a loop-local variable)
    for dataset in DATASETS:
        DATASETS[dataset] = pd.DataFrame()
    for dataset in DATASETS.keys():
        print(dataset)
        filters = {}
        if dataset in ("annotations", "reviews"):
            level = "WEEK"
        else:
            level = "MONTH"
        # query and client are defined elsewhere (construction omitted)
        DATASETS[dataset] = load_from_query(query, client)
    return dt.utcnow()
Also, for completeness, the definition of load_from_query():
Is there a good way to put the components that are being updated into a loading state, so that the user gets visual feedback on the graphs that are updating?
I have a similar question, although I see that nobody replied to kaiharuto.
I perform long calculations on epidemiological data and would like to write the progress of the calculation to a label on the screen, so the user knows how long to wait for the result. Is there a way to flush output during a callback?
In previous systems (classic ASP pages) I used Response.Flush. Is there something similar in Dash?
I wanted to try, but in reality I don't think I can. The progress bar must be updated by a callback, but in my case I would like to update the progress according to the real time spent by the process inside the long callback.
With the progress bar as it is now, I can either estimate the duration of the process and show a fake progression, or nothing: I cannot have an update driven by a value computed inside the long callback, because its output only comes out when it has completed.
example:
dbc.Progress(id="progress", value=0)
...

@app.callback(Output("progress", "value"), ..., other inputs and outputs)
def myLongCalc(valueProg, ...):
    ...
    # [ step 1 of the calculation ]
    # <here I would like to update the progress bar>
    # [ step 2 of the calculation ]
    # <here I would like to update the progress bar>
    # [ step 3 of the calculation ]
    # <here I would like to update the progress bar>
If you can calculate the progress of your process (from 0 to 100), then you can write that number to a file, and with another callback driven by dcc.Interval you can read the number back from the file and send the value to the dbc.Progress component.
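A minimal sketch of that file handoff (the file path and helper names are mine; the Dash wiring is shown in comments since it depends on your layout). Note it only works if the polling callback can run while the long callback is still busy, i.e. with more than one worker:

```python
import os
import tempfile

# Hypothetical path shared between the long callback and the poller.
PROGRESS_FILE = os.path.join(tempfile.gettempdir(), "calc_progress.txt")

def write_progress(pct):
    """Called from inside the long callback after each step (0-100)."""
    with open(PROGRESS_FILE, "w") as fh:
        fh.write(str(pct))

def read_progress():
    """Called from the dcc.Interval callback that feeds dbc.Progress."""
    try:
        with open(PROGRESS_FILE) as fh:
            return int(fh.read().strip() or 0)
    except (FileNotFoundError, ValueError):
        return 0

# Dash side (sketch):
# @app.callback(Output("progress", "value"),
#               Input("interval", "n_intervals"))
# def poll_progress(_):
#     return read_progress()
```

The long callback calls write_progress(33), write_progress(66), ... between its steps, and the interval callback picks the value up on its next tick.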
The proposed approach relies on multiple callbacks executing simultaneously. If you have sufficient workers available, it might work. In general, however, I would recommend running the long tasks asynchronously, which yields a more robust solution.
I mean that they should be run in a separate process, e.g. using Celery. As far as I remember, that is also what Dash Enterprise does for this kind of use case.