Hi
I am following the example below on using Flask-Caching with a filesystem backend to store a big dataframe that is used repeatedly across different callbacks. Basically I'd like to cache that big dataframe so that on the next callback I don't need to read it in again.
And I always get the following error:

```
delete key 'main.get_longterm_dts_data_cache_memver' -> [WinError 32] The process cannot access the file because it is being used by another process: 'cache-directory\3177d6ff18b798f403b4140f08f9fe90'
```
I don't know why the file would be in use by another process, and I have no idea how to fix it.
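For what it's worth, the [WinError 32] part itself seems to be ordinary Windows file locking: a file cannot be deleted while any handle to it is still open. Here is a minimal sketch that reproduces the same error (assuming Windows; the file name is made up):

```python
import os
import tempfile

# Windows refuses to delete a file while another handle keeps it open
path = os.path.join(tempfile.gettempdir(), 'winerror32-demo.tmp')
handle = open(path, 'wb')
try:
    os.remove(path)  # PermissionError: [WinError 32] ... being used by another process
except PermissionError as exc:
    print(exc)
finally:
    handle.close()
os.remove(path)  # succeeds once the handle is closed
```

So my guess is that some handle to the cache file is still open when Flask-Caching tries to delete it, but I can't see where that handle would come from.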
Can anyone help? Thank you so much!
```python
import pandas as pd
from jupyter_dash import JupyterDash
from flask_caching import Cache

app = JupyterDash(
    __name__,
    external_stylesheets=[
        "https://stackpath.bootstrapcdn.com/bootswatch/4.5.0/flatly/bootstrap.min.css"
    ],
    suppress_callback_exceptions=True,
)

# Filesystem cache shared by all callbacks
cache = Cache(app.server, config={
    'CACHE_TYPE': 'filesystem',
    'CACHE_DIR': 'cache-directory'
})
TIMEOUT = 60


@cache.memoize(timeout=TIMEOUT)
def get_longterm_dts_data_cache(wellname, dts_longterm_filename_list, start_date_prod, end_date_prod,
                                need_multile_dts_file, dts_data_frequency, dts_well_depth_min,
                                dts_depth_grouping, dts_data_export, dts_missing_percent_resample):
    # Serialize the big DTS dataframe so Flask-Caching can store it on disk
    return df_DTS_all_longterm.to_json(date_format='iso', orient='split')


# In the callback: only the first filename under the dropdown list needs processing
df_DTS_all_longterm = pd.read_json(
    get_longterm_dts_data_cache(wellname, dts_longterm_filename_list, start_date_prod, end_date_prod,
                                need_multile_dts_file, dts_data_frequency, dts_well_depth_min,
                                dts_depth_grouping, dts_data_export, dts_missing_percent_resample),
    orient='split')
```
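For reference, this is my reading of the pattern from the Dash performance docs that I'm trying to follow; `load_dts_data()` is a hypothetical stand-in for my expensive DTS file read, and the plain `Flask` server is just to make the sketch self-contained:

```python
import pandas as pd
from flask import Flask
from flask_caching import Cache

server = Flask(__name__)
cache = Cache(server, config={'CACHE_TYPE': 'filesystem',
                              'CACHE_DIR': 'cache-directory'})
TIMEOUT = 60

def load_dts_data(wellname):
    # Hypothetical stand-in for the expensive DTS file read
    return pd.DataFrame({'well': [wellname], 'temperature': [85.2]})

@cache.memoize(timeout=TIMEOUT)
def query_data(wellname):
    # The expensive read happens inside the memoized function, so
    # repeated callbacks get the JSON string back from the disk cache
    return load_dts_data(wellname).to_json(date_format='iso', orient='split')

def dataframe(wellname):
    # Cheap to call from any callback once the cache is warm
    return pd.read_json(query_data(wellname), orient='split')
```

One difference I notice in my own code above is that `df_DTS_all_longterm` is both the result of `pd.read_json(...)` and the dataframe serialized inside the memoized function, whereas in the docs pattern the expensive load happens inside the memoized function itself. I'm not sure whether that is related to the error.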