
Download large data from dashboard as csv

Hi. I am using html.A to download data from a dashboard.

csv_string = df.to_csv(index=True, encoding='utf-8')
csv_string = "data:text/csv;charset=utf-8," + urllib.parse.quote(csv_string)
return html.A(children='download data', id=f'dl_bar_{metric}', download=f"spx_markouts_{metric}_by_{'_'.join(groups)}.csv", href=csv_string)

However, the data can be quite large, with the csv_string exceeding 1 million characters. That seems to result in network errors.

Is there a way to increase the html length allowed or is there another way to download data?
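For context, a percent-encoded data URI grows quickly. A rough sketch of the overhead, using a stand-in for the real df.to_csv output:

```python
import urllib.parse

# Stand-in for df.to_csv(): many short CSV rows.
csv_string = "".join(f"{i},{i * 2}\n" for i in range(50000))
encoded = urllib.parse.quote(csv_string)
uri = "data:text/csv;charset=utf-8," + encoded

# Commas become "%2C" and newlines "%0A", so the encoded href is
# noticeably longer than the raw CSV, and browsers and servers cap
# how long a URL in an href may be.
print(len(csv_string), len(uri))
```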

Could you use one of the compression modes of df.to_csv to reduce the size of the data? See the compression parameter at https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.to_csv.html

Thanks for your reply. I tried your proposal, but it seems that to_csv does not support in-memory compression; it only appears to work when a filename, rather than a file-like object, is passed.

RuntimeWarning: compression has no effect when passing file-like object as input.

I think the issue is described here
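As an aside, even if to_csv won't compress to a string directly, the standard-library gzip module can do the compression entirely in memory, independent of pandas. A sketch with a stand-in CSV string:

```python
import gzip

# Stand-in for csv_string = df.to_csv(index=True, encoding='utf-8')
csv_string = "a,b\n1,2\n3,4\n"

# gzip.compress works purely in memory: no file handle or
# filename needed, and no pandas compression support involved.
compressed = gzip.compress(csv_string.encode("utf-8"))
restored = gzip.decompress(compressed).decode("utf-8")
```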

So, I went ahead and naively tried to save to a local file and download that:

def get_dl_link(df, metric, groups):
    filename = f"spx_markouts_{metric}_by_{'_'.join(groups)}.csv"
    df.to_csv(filename, index=True, encoding='utf-8', compression='gzip')

    return html.A(children='download data', id=f'dl_bar_{metric}', download=filename)

This creates the local file, but clicking on the link wouldn't download anything (the html.A has no href pointing at the file).

I also tried to compress myself using bz2

csv_string = csv_string.encode('utf-8')
compressed = bz2.compress(csv_string)
return html.A(children='download data', id=f'dl_bar_{metric}', download=f"spx_markouts_{metric}_by_{'_'.join(groups)}.bz2", href=compressed)

However, this throws:

TypeError: Object of type 'bytes' is not JSON serializable
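The TypeError happens because Dash serializes component props to JSON, and raw bytes are not JSON-serializable. One possible workaround (a sketch, not a tested Dash fix; the application/x-bzip2 MIME type is an assumption) is to base64-encode the compressed bytes into a string data URI:

```python
import base64
import bz2

csv_string = "a,b\n1,2\n3,4\n"  # stand-in for df.to_csv(...)
compressed = bz2.compress(csv_string.encode("utf-8"))

# base64-encoding turns the raw bytes into a plain str,
# which JSON (and therefore a Dash component prop) can carry.
b64 = base64.b64encode(compressed).decode("ascii")
href = "data:application/x-bzip2;base64," + b64
```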