
Adding Visual Feedback to Long Callbacks with DB Download in Dash

I am developing an application in Dash that processes large datasets from Google Cloud. The data transfer itself works fine; however, since the download takes a considerable amount of time, users don’t see any update until about 20 seconds after triggering a query. I would like to add some visual feedback using dbc.Spinner() from dash-bootstrap-components or dcc.Loading(), but simply wrapping the components to be updated in those didn’t work.

For context:

I have a dictionary of dataframes e.g.:


DATASETS = {
    'df1': df_1,
    'df2': df_2,
}

which I populate like:

    @app.callback(
        Output('ddive-data-dummy', 'children'),
        [Input("ddive-dataload-btn", "n_clicks")],
        [State("ddive-filters", "children")],
    )
    def update_datasets(n_clicks, filter_str):
        if not n_clicks:
            raise PreventUpdate

        # empty the dataframes; assign through the dict key, since
        # rebinding the loop variable would leave DATASETS unchanged
        for dataset in DATASETS:
            DATASETS[dataset] = pd.DataFrame()

        for dataset in DATASETS.keys():
            print(dataset)
            filters = {}

            if dataset in ("annotations", "reviews"):
                level = "WEEK"
            else:
                level = "MONTH"
            DATASETS[dataset] = load_from_query(query, client)

        return dt.utcnow()

Also, for completeness, the definition of load_from_query():


def load_from_query(query: str, client: Client) -> DataFrame:
    df = client.query(query).result().to_dataframe(
        bqstorage_client=bqstorageclient, progress_bar_type='tqdm'
    )
    return df

Is there a good way to set the components that are updating into a loading state, so that the user has visual feedback on the graphs that are updating?

I have a similar question, although I see that nobody has replied to kaiharuto.

I perform long calculations on epidemiological data and would like to write the progress of the calculation to a label on the screen, to give the user an idea of how long to wait for the result. Is there a way to flush output during a callback?

In previous systems (classic ASP pages) I used Response.Flush. Is there something similar in Dash?

thanks

Hi @annunal ,

I think dbc.Progress could be your solution; see:

https://dash-bootstrap-components.opensource.faculty.ai/docs/components/progress/

Thanks Eduardo. It can work.

I will try it

I wanted to try it, but in reality I don’t think I can. The progress bar must be updated by a callback, but I would like to update the progress according to the real time spent inside the long callback.

With the progress bar as it is now, I could estimate the duration of the process and show a fake progression, but I cannot have an update driven by a value computed inside the long callback, because the callback’s output only appears once it has completed.

example:

    dbc.Progress(id="progress", value=0)

    @app.callback(Output("progress", "value"), ...)  # plus the other inputs and outputs
    def my_long_calc(value_prog, ...):

        # [ step 1 of the calculation ]
        # <here I would like to update the progress bar>

        # [ step 2 of the calculation ]
        # <here I would like to update the progress bar>

        # [ step 3 of the calculation ]
        # <here I would like to update the progress bar>

        return ??, ...

If you can calculate the progress of your process (from 0 to 100), then you can write that number to a file, and with another callback, triggered by a dcc.Interval, you can read the number from the file and send the value to the dbc.Progress.

Clever.

Maybe in a cookie, then, instead of a file. I will try it and let you know.

The proposed approach relies on multiple callbacks executing simultaneously. If you have sufficient workers available, it might work. However, in general I would recommend running the long tasks asynchronously, which yields a more robust solution.

Hi @Emil

I didn’t understand your answer. What do you mean by running “long tasks asynchronously”?

Thanks!

I mean that they should be run in a separate process, e.g. using Celery. As far as I remember, that is also what Dash Enterprise does for this kind of use case.