I'm running my Python app in a Docker container on GCP; my data is stored in Cloud Storage and I transform it with Pandas.
My idea is to let the user download the contents of two dataframes as a single Excel file, with each dataframe on its own sheet.
I had this code for a local version and it worked without a problem.
```python
@app.callback(Output("download_data", "data"),
              [Input("download_data_button", "n_clicks")])
def download_two_dataframes(n_clicks):
    if "download_data_button" in ctx.triggered_prop_ids:
        file_name = "my_excel.xlsx"
        file_path = os.path.join(myPath, file_name)
        first_dataset = self.get_first_dataset
        second_dataset = self.get_second_dataset
        with pd.ExcelWriter(file_path) as writer:
            # index=False (the boolean, not the string "False") drops the row index;
            # the encoding kwarg was removed from to_excel in recent pandas versions
            first_dataset.to_excel(writer, sheet_name="first_dataset", index=False)
            second_dataset.to_excel(writer, sheet_name="second_dataset", index=False)
        data = dcc.send_file(file_path, filename=file_name)
        return data
```
I have looked through the documentation of both Dash and GCP but haven't found a way to adapt it. I can retrieve the information from Cloud Storage; the problem is the `to_excel` call, since it writes to the local filesystem and the container is stateless.
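One workaround I'm considering is building the workbook entirely in memory and handing the raw bytes to `dcc.send_bytes`, so nothing ever touches the container's filesystem. A minimal sketch of what I mean (the helper name is just my own invention):

```python
import io

import pandas as pd


def two_dataframes_to_excel_bytes(first_dataset: pd.DataFrame,
                                  second_dataset: pd.DataFrame) -> bytes:
    """Serialize two dataframes into a single in-memory .xlsx workbook."""
    buffer = io.BytesIO()
    with pd.ExcelWriter(buffer) as writer:
        # Same two-sheet layout as the file-based version, but the target
        # is a BytesIO buffer instead of a path inside the container.
        first_dataset.to_excel(writer, sheet_name="first_dataset", index=False)
        second_dataset.to_excel(writer, sheet_name="second_dataset", index=False)
    return buffer.getvalue()
```

In the callback I'd then return `dcc.send_bytes(two_dataframes_to_excel_bytes(df1, df2), "my_excel.xlsx")` instead of `dcc.send_file`, since `dcc.send_bytes` accepts raw bytes plus a filename. But I'm not sure this is the right approach.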
Any idea on how it can be fixed?