I have a module utils/load_data.py where multiple files are being read from the cloud and stored in-memory.
I then import these in each page as required, e.g. from utils.load_data import variable_1.
Sometimes the underlying data has to be updated - how do I make that update apply to each page? Using the importlib module and its .reload() method has not been successful so far.
These are datasets that are too big to store on the user side, roughly 300 MB in total. Basically, only the user (and their user-specific filters) is currently stored in dcc.Store components. Right now, since we are still experimenting a lot, the data gets updated a couple of times a day; in production it will be updated roughly weekly.
I understand that updating global variables is not ideal, but I have not come up with any better solution (and I work with even bigger data, a few gigabytes in size). I have been using a scheduler for about a year in our internal setup with 10-20 users, and in all that time there has been no problem with it.
I would be glad if somebody came up with a better, architecturally cleaner solution.
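For reference, the scheduler approach above can be sketched with nothing but the standard library. This is a minimal, illustrative version (in a real deployment you might prefer APScheduler or a cron job, and the names and interval here are assumptions, not the poster's actual setup):

```python
import threading

# Module-level store that all pages/callbacks read from.
DATA = {"value": 0}

# Weekly cadence, matching the production update frequency mentioned above.
REFRESH_SECONDS = 7 * 24 * 3600


def reload_data():
    # Stand-in for re-reading the cloud files into the shared store.
    DATA["value"] += 1


def schedule_refresh(interval=REFRESH_SECONDS):
    """Reload immediately, then re-arm a timer to reload every `interval` seconds."""
    reload_data()
    timer = threading.Timer(interval, schedule_refresh, args=(interval,))
    timer.daemon = True  # do not block interpreter shutdown
    timer.start()
    return timer
```

One caveat worth noting: a threading-based scheduler like this refreshes only the worker process it runs in, so with multiple gunicorn workers each process needs its own refresh (or an external signal such as a file-modification check).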