I’m connecting to Firebase (a real-time database), creating a global pandas DataFrame and using this to generate various plots.
All I want to do is re-run one query every x minutes to keep this DataFrame fresh, but every example I’ve seen implies you either need to:
a) Run a separate query in every callback
b) Convert the data to JSON and hide it in a div
I thought this would be straightforward to do without resorting to the above… Am I missing something?
There is also:
c) Re-run the query every x minutes and save it to a .csv file in a separate process. Replace your dataframe variable (e.g. df) in your code with a function that re-reads that file.
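For option c), here’s a minimal sketch of the reader side (the file path and column names are hypothetical; a separate process is assumed to rewrite the file every x minutes):

```python
import pandas as pd

# hypothetical path that the separate refresh process rewrites every x
# minutes, e.g. with run_query().to_csv(CSV_PATH, index=False)
CSV_PATH = "data_snapshot.csv"

def get_df():
    # re-read the latest snapshot on every call instead of
    # holding on to a stale global DataFrame
    return pd.read_csv(CSV_PATH)
```

Callbacks then call get_df() wherever they previously used the global df, so they always see the latest snapshot.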
d) Cache the query with Flask-Caching, see https://plot.ly/dash/performance. Replace your dataframe variable (e.g. df) in your code with a function like (pseudocode, I haven’t run this myself):
@cache.memoize(timeout=60 * 5)  # cache the query result for 5 minutes
def query_data():
    # [...] run query
    return df.to_json()  # serialize so that it can be easily written to a file for caching

def get_df():
    return pd.read_json(query_data())

df = get_df()
Sorry for reviving this old thread, but I have a very similar problem where the data of interest is coming from Kafka. So I had a look at this code https://github.com/renardeinside/rtvis-proj/blob/master/visualizer/app/server.py which is linked in the “Show and Tell” thread. But it is also using a global DF that is updated. Just for clarification: this is not the way to go, right?
Thanks for the clarification, Chris! Yes, your (btw great) user guide made me realise that I should do it in a different way. But the linked project from “Show & Tell” made me wonder whether it would be okay in some cases. The solution based on Redis and Celery looks interesting, thanks!
@chriddyp I wonder how I can use MySQL instead of Redis in the example you gave. Any hint?