Multipage App with different datasets for each page

Hello Dash-Plotly community friends:

I need some guidance with a Dash multi-page app that uses a different dataset for each page. It takes forever to load on my local computer, and on deployment to Cloud Foundry it just displays “Loading …” forever. The log file shows no error other than waiting for the container to start the instance.

There are 3 pages with 3 different datasets, each approximately 13 MB.

The datasets are cached and loaded on each page using Polars and, in the end, converted to a Pandas DataFrame for plotting. There are 5 callbacks on each page.

Any help would be highly appreciated.

Thank you,

An enthusiastic Dash Python User.

Hi, it’s difficult to tell without knowing your setup. But 13 MB should not take long to read.

How do you cache them? And why do you convert it from Polars to Pandas?

You could plot it directly from Polars?

@AIMPED Thank you for your prompt response. Appreciate it.

My folder structure looks like this:

  • app.py
  • pages/
    • test1.py
    • itest1.py
    • test3.py
  • data/
    • test1.parquet
    • itest1.parquet
    • test3.parquet

Reading each Parquet file inside its page takes a long time to update. Every time the user clicks an option in the NavBar, the data is loaded first, and then the page renders a bar chart, a dash-ag-grid table, and a click-event scatter plot. This happens on every page.

So I created a data_manager.py file, where I cache the data using Flask and read it using Polars. This module by itself is blazing fast. However, I am struggling to render it in app.py using the Dash multi-page option. I call the layouts from every page and try to register them in app.py, without success. The Dash way of registering pages inside the pages themselves gives an instantiation error for the Dash object, so dash.register_page is done in app.py.

Anyway, I can only get the NavBar to render in the browser, along with 12 errors about duplicate IDs when there are none. I have checked every page but couldn’t find any id that is the same. Reluctantly, I tried ChatGPT, Bard AI, and Bing AI for a solution, but I did not succeed.

Hence, if you or anyone in the plotly community can show me an outline of how to proceed, I would be very grateful.

Thank you.

Hi @Bijoy

Currently I’m creating an app where I do exactly the same. Give me some time and I’ll create an MRE from that.

I don’t think you can do that.

This is probably because you are referring to app somewhere in your code, maybe in an @app.callback() decorator.

Can you share your code?

Hi @Bijoy here is what I did (more or less) in my app.

I memoize the data at app startup and use this afterwards on the different pages. There might be more elegant / faster / better ways to achieve something similar, though :see_no_evil:
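The load-once-at-startup idea can be sketched as a module-level registry that the pages import, instead of re-reading files inside callbacks. This is a minimal stand-in assuming the frames fit comfortably in memory; the names and loader are illustrative:

```python
# Hypothetical shared module: datasets are loaded lazily on first access,
# kept for the lifetime of the process, and every page reads the same object.
_DATASETS: dict = {}

def get_dataset(name: str, loader):
    """Return the cached dataset, loading it only on first access."""
    if name not in _DATASETS:
        _DATASETS[name] = loader(name)
    return _DATASETS[name]
```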


Hi @AIMPED ,

Thank you for taking the time to post your experience and the code. Appreciate it very much.

Unfortunately, it did not work for me. I did register each page in the pages folder separately. The dcc.Store gave me a “quota exceeded” error.

The only way it works, albeit quite slowly, is to decouple the data from the pages with a data_manager.py file that is memoized and cached using Flask. I then load the files individually in the pages folder.

On the local machine it is slow to load (20 to 30 seconds). When clicking between pages, it renders very slowly. The dash-ag-grid table is the best performer, as one can check from the callbacks graph.

On Cloud Foundry using Kubernetes, it runs very slowly; the time it takes to load is not worth mentioning.

So I switched to Solara, Polars & Altair. The difference was dramatic: now it loads in 0.07 seconds, and changing between pages takes no more than 0.8 seconds.

My real-world application has 10 pages, each with a different database pull of nearly 10 million rows and 45 columns on average. Each page has 3 charts and a huge Ag-Grid table. All of them seem to work well so far.

Sorry, "dash-plotly" didn’t work for me. Maybe I am not a power user of the tool. That being said, I am still a fan and a believer that the tool will do wonders.

Best,
Bijoy

Here is more information on using Polars with Dash on a dataset with 65 million rows and 25 columns.


Thank you very much AnnMarie! This article really helps. I do use Dash Plotly for most of my projects.

Appreciate your help and thanks again for reaching out to me.