I’ve created an app that reads a number of CSV files from an external website and then plots them based on user input. For example, if the user wants data from 2000–2018, and each year is a separate CSV file, the app loads the files year by year and merges them into a single dataframe.
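For context, the loading step is essentially the following (simplified sketch — the URL template and column names are hypothetical placeholders, not my real ones):

```python
import pandas as pd

def load_years(url_template, start_year, end_year):
    """Read one CSV per year and merge them into a single DataFrame.

    url_template is a format string like
    "https://example.com/data_{year}.csv" (hypothetical scheme --
    pd.read_csv accepts URLs as well as local paths).
    """
    frames = []
    for year in range(start_year, end_year + 1):
        df = pd.read_csv(url_template.format(year=year))
        df["year"] = year  # remember which file each row came from
        frames.append(df)
    return pd.concat(frames, ignore_index=True)
```

So the cost grows with the year span: one HTTP download and parse per year, then one concatenation.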
The CSV files can end up being very large, so to speed things up I’m using Redis caching as described in the Dash user guide (which has been extremely helpful btw!). I originally tried storing the data as JSON on the page, but there is so much data that serialising it back and forth for every plot was very slow, so Redis caching seemed the way forward.
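My caching setup follows the Dash user guide's Flask-Caching pattern — roughly this (a wiring sketch, assuming `load_years` is my slow CSV-merge function; `REDIS_URL` is the env var the Heroku Redis add-on sets):

```python
import os
import dash
from flask_caching import Cache

app = dash.Dash(__name__)
cache = Cache(app.server, config={
    "CACHE_TYPE": "redis",
    "CACHE_REDIS_URL": os.environ.get("REDIS_URL", "redis://localhost:6379"),
})

@cache.memoize(timeout=3600)
def get_dataframe(start_year, end_year):
    # The expensive download + merge only runs on a cache miss;
    # repeat requests for the same year span are served from Redis.
    return load_years(CSV_URL_TEMPLATE, start_year, end_year)
```

The first request for a given year span still pays the full download cost, which is exactly the request that blows past 30 seconds.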
However, when I push my code to Heroku and run the app, if the user chooses a large year span the request takes more than 30 seconds and Heroku’s router times it out. Heroku suggests using background worker processes (https://devcenter.heroku.com/articles/python-rq) for anything that takes more than a few seconds.
I can’t get these background processes to work together with the Redis caching. I feel like I need something that keeps checking whether the background job has finished, without blocking inside a callback that will get killed if it runs past the 30-second limit.
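The pattern I think I need (but can’t get working) looks something like this enqueue-then-poll sketch with RQ, where the triggering callback returns immediately and a `dcc.Interval` callback checks the job on each tick — function names like `load_years` are placeholders for my own code:

```python
import os
import redis
from rq import Queue
from rq.job import Job

redis_conn = redis.from_url(os.environ.get("REDIS_URL", "redis://localhost:6379"))
queue = Queue(connection=redis_conn)

def start_job(start_year, end_year):
    # Runs in the callback that handles the user's request: it only
    # enqueues, so it returns well inside Heroku's 30 s router limit.
    # A separate worker dyno (started with `rq worker`) does the work.
    job = queue.enqueue(load_years, start_year, end_year)
    return job.get_id()  # stash this in a dcc.Store / hidden div

def poll_job(job_id):
    # Runs in a dcc.Interval callback every few seconds.
    job = Job.fetch(job_id, connection=redis_conn)
    if job.is_finished:
        return job.result  # the merged dataframe (or a cache key)
    return None            # not done yet; let the interval keep ticking
```

Is this roughly the right shape, and if so, how do I make it cooperate with the Flask-Caching layer — should the worker write into the same Redis cache and the poll callback just return the cache key?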
Any thoughts are much appreciated!