Hi there,
I deployed a small application in my company which acts as a viewer for a special type of file. The file size is below 1 MB most of the time, but can be as large as 1 GB. There are only a few users of this app, say a maximum of 10 at the same time.
Currently I store the user files (each user can upload many of them) in a dash_database, which should be thread-safe.
However, packing the data into the database takes a lot of time, and I want to speed up the application.
Is there a recommended way to store the data server-side? I think client-side is not an option, as browser storage is limited.
Should I try something more advanced like Redis? Or would you try something else, maybe simply storing the data on the filesystem?
PS: The data is primarily numeric, plus a small number of strings.
Hi,
Welcome to the community! 
I don’t have any experience handling files of that size (1 GB) with Dash, and I imagine it might be tricky to handle the upload even with dash-uploader.
My quasi-educated guess is that the most efficient approach would be to store the files locally and eventually use a separate process to send them to an S3 bucket / blob-storage type of service. AFAIK Redis works in-memory, and it might require a lot of RAM on the server to store files that are not accessed all that often…
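To make the “store locally, ship to blob storage later” idea concrete, here is a minimal stdlib-only sketch. The `STORAGE_ROOT` path and the function names are made up for the example; in a real Dash callback you would first decode the base64 `contents` string that `dcc.Upload` (or dash-uploader) hands you.

```python
import uuid
from pathlib import Path

# Hypothetical storage root; adjust to your deployment.
STORAGE_ROOT = Path("/tmp/app_uploads")


def save_upload(user_id: str, filename: str, data: bytes) -> Path:
    """Store an uploaded file on disk under a per-user directory.

    A random prefix avoids collisions when a user uploads two
    files with the same name.
    """
    user_dir = STORAGE_ROOT / user_id
    user_dir.mkdir(parents=True, exist_ok=True)
    target = user_dir / f"{uuid.uuid4().hex}_{filename}"
    target.write_bytes(data)
    return target


def list_uploads(user_id: str) -> list[Path]:
    """Return the files a user has uploaded, newest first."""
    user_dir = STORAGE_ROOT / user_id
    if not user_dir.exists():
        return []
    return sorted(user_dir.iterdir(),
                  key=lambda p: p.stat().st_mtime,
                  reverse=True)
```

A separate worker (cron job, Celery task, etc.) could then walk `STORAGE_ROOT` and push older files to S3 without blocking the app.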
You might have other problems “viewing” this type of file in Dash, depending on how much data you have to send to the layout. This can also be a bottleneck for app performance. For that, it would make sense to use server-side caching from dash-extensions (see here) to cache just the open file (or files opened/uploaded recently).
Apologies if this doesn’t help your specific problem, and I hope someone with experience in such problems can add a better perspective.
Hi,
thanks for your explanation.
The uploader works pretty well out of the box, without any strange modifications.
I’ll try what’s possible with server-side caching!
Franz