
Chrome crashes on uploading large files


I have an Upload component to input CSV/text files. If I upload a large file (I tested with a 170 MB CSV), Chrome crashes. I tried setting the Upload component's max_size property to -1. The same upload works fine in Firefox.

It looks like Chrome limits the size of uploads. Is there a way around this? I only need a few lines of the CSV at a time, but the file has to be completely uploaded before it can be read.

Another approach I'm considering is using something like a FileSelector to pick the file path; then I could use pandas to read the CSV in chunks. But I see that the browser strips the path of the selected file during upload.
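For reference, if the file path were available on the server, pandas can read a CSV in fixed-size chunks so only a slice of it is in memory at a time. A minimal sketch of that idea (the in-memory buffer here just stands in for a real file path):

```python
import io
import pandas as pd

# An in-memory buffer stands in for a real file path on disk.
csv_source = io.StringIO("a,b\n" + "\n".join(f"{i},{i * 2}" for i in range(100)))

total_rows = 0
for chunk in pd.read_csv(csv_source, chunksize=25):
    # Each chunk is an ordinary DataFrame of at most 25 rows, so memory
    # use stays bounded no matter how large the file is.
    total_rows += len(chunk)

print(total_rows)  # 100
```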

Any ideas?

Hm, I wasn’t aware of this. How did you figure this out?

Yeah, that’s a good idea. We’ll probably need a separate component for this. These components can be ported from React to Dash fairly easily with the Dash plugin framework. The customer engineering team at Plotly can also be contracted to build these components for your company or organization.

Is the app always running on the same machine as the location of the files to upload? If so, then you could create a dcc.Dropdown that lists all of the available file paths and the callback could read the file from the disk.

However, if you are deploying the app and your users are uploading files from their machines, then you’ll need to use some type of upload component.

Hello Chris,

Thanks for the response.

I used the Upload component to upload a CSV file and display it in a Dash table. With a large CSV, it crashes in Chrome (the “Aw, Snap!” page) but runs without issue in Firefox. I added breakpoints and saw that it crashes as soon as it enters the Upload component’s callback. With a smaller file, both browsers handle it without a problem. My guess is that the whole file gets passed into the Upload callback (as a binary stream?) before I can do any operations or optimizations.
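That matches how dcc.Upload works: the browser base64-encodes the entire file into the contents property, so the whole payload sits in browser memory before the callback can do anything with it. A small sketch of the usual decode step (the payload here is simulated):

```python
import base64
import io

import pandas as pd

def parse_upload(contents):
    # dcc.Upload delivers the file as a data URL:
    #   "data:text/csv;base64,<encoded bytes>"
    # The entire encoded string is built in the browser and shipped to
    # the callback, which is why very large files can crash the tab
    # before your own code ever runs.
    _content_type, content_string = contents.split(",", 1)
    decoded = base64.b64decode(content_string)
    return pd.read_csv(io.StringIO(decoded.decode("utf-8")))

# Simulated payload standing in for a real dcc.Upload "contents" value.
raw = b"a,b\n1,2\n3,4\n"
contents = "data:text/csv;base64," + base64.b64encode(raw).decode()
df = parse_upload(contents)
print(df.shape)  # (2, 2)
```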

Thank you! I tried using the plugin framework before to port a component to Dash, but haven’t been able to get it working. I shall try again and see if I can make it work.

One more option would be to make a local copy of the file and then import it. But again, I do not know how to do this without the file path available.


Ah, I see. Yeah, it’s most likely a memory issue. There might be some ways we could improve the memory management in Dash to handle 100–200 MB files, but it’d probably require a lot of deep architectural work, likely only feasible through a corporate sponsorship.

Feel free to open a thread about this; I’d be happy to help you out there :slight_smile: The documentation for creating plugins could certainly be improved and elaborated, and a community thread might be a good place to work on this.

This one looks interesting because it has a “service” property, which could be hooked up to a custom Flask route (@app.server.route). The callback could then be fired from a property like n_files_uploaded, similar to how we map click events to stateful properties (n_clicks).


See Show And Tell -- Dash Resumable Upload for a new approach to uploading very large files :tada:
