Server Caching in MultiPage app


I have a couple of huge datasets which are specific to different types of users. Normally, if they were smaller, I would just fetch them from the database, but due to their size I am looking for a more optimized approach, because the network is a choke point. The datasets change every hour.
I would need some sort of dynamic caching, where I cache datasets under specific names and then, at other points in the application, retrieve those datasets, transform them via pandas, and boom - we got this.
Easier said than done. I researched flask_caching with Plotly Dash, and it would all be good if I were able to access the Flask cache in the callbacks of a multi-page app (in registered pages).

import dash
from dash import html, dcc, Output, Input, callback
from dash.exceptions import PreventUpdate
from flask_login import current_user
from utils.login_handler import require_login
from components.AIO_DataTable import DataTableAIO
from components.grids import grid_1x3_even
from data.docs_per_operator import generalDf


def layout():
    if not current_user.is_authenticated:
        return html.Div(["Please ", dcc.Link("login", href="/login"), " to continue"])
    user = current_user.get_id()

    return html.Div([
        html.Div(id='caching-test', children='cache', className='ms-auto'),
        html.Button(id='cache-button', children='cache some', className='ms-auto'),
    ])

@callback(Output('caching-test', 'children'),
          Input('cache-button', 'n_clicks'))
def cachingTest(n_clicks):
    if n_clicks is None:
        raise PreventUpdate
    # If the df exists in the cache, load it from there; if not, create it
    df = generalDf(1, [2023, 2022], [1, 2, 3, 4, 5, 6])
    datab = DataTableAIO(df, 10, 'cacheTest')
    return datab

Now, when loading the df via generalDf, I would like to check whether it already exists in the cache: if yes, load it from the cache; if not, put it in the cache and use it next time.
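The "check cache, else load and store" logic described above can be sketched with the standard library alone; in the real app this role would be played by flask_caching (e.g. its memoize decorator with a one-hour timeout), but a plain dict with timestamps shows the same pattern. The name get_dataset and the TTL constant are my own choices, not from the original post:

```python
import time

_CACHE = {}
TTL_SECONDS = 3600  # the datasets change every hour

def get_dataset(name, loader):
    """Return the dataset cached under `name`, reloading it when stale."""
    entry = _CACHE.get(name)
    now = time.time()
    if entry is None or now - entry[0] > TTL_SECONDS:
        # Cache miss or expired entry: call the expensive loader and store it
        _CACHE[name] = (now, loader())
    return _CACHE[name][1]

# Hypothetical usage with the loader from the question:
# df = get_dataset('operator-1', lambda: generalDf(1, [2023, 2022], [1, 2, 3, 4, 5, 6]))
```

The loader is passed as a callable so it only runs on a cache miss; repeated calls within the hour are served from memory.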

I am having a hard time trying to access the cache from the main file where I set it up.

from flask import Flask
from flask_caching import Cache
import dash
from dash import html

server = Flask(__name__)
cache = Cache(server, config={
    'CACHE_TYPE': 'filesystem',
    'CACHE_DIR': 'cache',
})

app = dash.Dash(
    __name__, server=server, use_pages=True, suppress_callback_exceptions=True
)
app.layout = html.Div([dash.page_container])

if __name__ == "__main__":
    app.run(port=5000)
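One common way (not from the original post) to make the cache importable from registered pages without circular imports is to create the Cache object in its own module and bind it to the server later with init_app. The module name extensions.py is a hypothetical choice; a rough sketch of the wiring:

```python
# extensions.py -- holds the shared Cache object, created without an app
from flask_caching import Cache

cache = Cache(config={
    'CACHE_TYPE': 'filesystem',
    'CACHE_DIR': 'cache',
})

# app.py -- bind the cache to the Flask server after both exist:
#     from extensions import cache
#     server = Flask(__name__)
#     cache.init_app(server)

# pages/some_page.py -- registered pages import the same object:
#     from extensions import cache
#
#     @cache.memoize(timeout=3600)
#     def load_df(operator_id, years, months):
#         return generalDf(operator_id, years, months)
```

Because both app.py and the page modules import from extensions.py (and extensions.py imports neither of them), the circular-import problem goes away.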

Any suggestions will be helpful.

Alright, it seems like I misunderstood the applications of the cache. I solved the issue by creating a custom dataset manager, which manages datasets in the .parquet format, as I found it the fastest for my case. Here you can read some more about this stuff link

Here is the result for a small dataset:

Here is the result for a bigger one:

It seems to be around 18 times faster. I would still like to add frontend caching, because the table takes some time to load; the loaded table is 27000 rows x 28 columns. This way even poorly designed queries will run relatively fast and limit database resource consumption.