Hey,
I’m hoping someone can help me understand this behavior. I have a Dash app deployed on Render with a background callback that uses Celery and Redis.
If I understand it correctly, the app pushes the job (keyed by a hash of the callback function) and the serialized inputs to Redis, a Celery worker executes the function, and the outputs are written back to Redis for the app to pick up.
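For reference, the wiring follows the standard pattern from the Dash docs; a simplified sketch (the `REDIS_URL` variable and names here are placeholders, not my exact code):

```python
import os
from celery import Celery
from dash import Dash, CeleryManager

# Celery app using the same Redis instance as broker and result backend
celery_app = Celery(
    __name__,
    broker=os.environ["REDIS_URL"],
    backend=os.environ["REDIS_URL"],
)

background_callback_manager = CeleryManager(celery_app)

app = Dash(__name__, background_callback_manager=background_callback_manager)
```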
The background callback takes an upload as input:
```python
@callback(
    [Outputs],  # output list omitted here for brevity
    Input('upload_preprocessed_data', 'contents'),
    background=True,
    prevent_initial_call=True,
)
def upload_preprocessed(contents):
    if contents is None:
        raise PreventUpdate

    get_memory("1.")
    preprocessed_data = functions.zip_reader_preprocess(contents)
    if preprocessed_data is None:
        return no_update, True, "Invalid Data", None, None, None, None, None, None

    data1, data2, data3, data4, data5 = preprocessed_data
    get_memory("2.")
    return True, False, "Upload Successful.", data1, data2, data3, data4, data5
```
The memory reported at the first `get_memory` call is 259 MB and at the second it is 273 MB, an increase of roughly 14 MB. `contents` itself is around 5 MB.
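For context, `get_memory` just reports the memory of the worker process at that point; a rough sketch of such a helper (the psutil approach here is only illustrative, not the exact code):

```python
# Illustrative memory logger (psutil-based)
import os
import psutil

def get_memory(label: str) -> None:
    # Resident set size (RSS) of the current process, in MB
    rss_mb = psutil.Process(os.getpid()).memory_info().rss / 1024 ** 2
    print(f"{label} memory: {rss_mb:.0f} MB")
```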
However, the Redis metrics show a much larger spike, and I’m struggling to pin down where that memory is going. With larger uploads the callback ends in a memory error.
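In case it helps narrow things down, the Redis side can also be inspected directly; a minimal sketch using redis-py (the connection URL is a placeholder):

```python
# Minimal sketch for inspecting Redis memory with redis-py
import redis

r = redis.Redis.from_url("redis://localhost:6379/0")

# Overall memory as Redis reports it
print(r.info("memory")["used_memory_human"])

# Per-key cost of the largest keys (MEMORY USAGE is a standard Redis command)
for key in r.scan_iter(count=100):
    size = r.memory_usage(key) or 0
    if size > 1_000_000:  # keys larger than ~1 MB
        print(key, size)
```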
Any ideas on what might be causing this?