Dash - GCP Cloud Run: Download a DataFrame bigger than 32 MB

Hello everyone.

I have a Python Dash app deployed in GCP Cloud Run.
In this app, the user can configure some filters in order to get specific data (internally, the data is handled as pandas DataFrames). Additionally, the user is allowed to download that DataFrame as a CSV file to their PC.
Inside the callback of the download component, the method dcc.send_data_frame(my_df.to_csv, filename="Myfile.csv") is used to download the DataFrame.
The problem is that, when the DataFrame is bigger than 32 MB, Cloud Run denies the download, since it has a limit on the size of the requests/responses it can handle.
Is there a way to allow this download without limiting the amount of data the user can download? I have read that streaming the reading of the data can work, but I haven't found how one can do this in a download. Does anyone know?
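To make the question concrete, this is the kind of streaming I have read about: serving the CSV in chunks from a plain Flask route instead of buffering it in one response. A minimal sketch of my understanding (here `server` stands in for the Dash app's underlying Flask instance, i.e. `app.server`, and `get_filtered_df` is a hypothetical helper that rebuilds the filtered DataFrame):

```python
import pandas as pd
from flask import Flask, Response

server = Flask(__name__)  # with Dash, this would be app.server

def get_filtered_df():
    # Hypothetical stand-in for the DataFrame built from the user's filters
    return pd.DataFrame({"a": range(5), "b": range(5)})

@server.route("/download-csv")
def stream_csv():
    my_df = get_filtered_df()

    def generate(chunk_size=2):
        # Emit the header first, then the rows in chunks,
        # so the response is streamed instead of built in memory at once
        yield my_df.iloc[:0].to_csv()
        for start in range(0, len(my_df), chunk_size):
            yield my_df.iloc[start:start + chunk_size].to_csv(header=False)

    return Response(
        generate(),
        mimetype="text/csv",
        headers={"Content-Disposition": "attachment; filename=Myfile.csv"},
    )
```

What I don't know is whether a route like this actually gets around the Cloud Run response-size limit, and how to trigger it from the Dash download button instead of dcc.Download.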