I know this post isn’t super relevant to this community, but I’ve had luck with this forum in the past, so what the heck. I have made a web app in Python using Dash, and now I am in the deployment stage. My overall database consists of over 3 billion data points, but I have scaled it down significantly for testing purposes. I have managed to run the app on my local server, but when I try to deploy on Heroku, I get the “Memory quota exceeded” error.

My data is static and will remain so. It currently lives in 3 feather files. I just need to be able to query the data with some filter fast enough that Heroku isn’t overwhelmed.

I am quite new to the field, so I am looking for suggestions as far as which services would be best for what I am trying to do. Services I’ve looked into include Google BigQuery and Amazon S3, but my inexperience makes it hard to know exactly what I should be looking for. I am also open to other hosting ideas; it just seems as though Heroku is the most popular choice for Dash apps.
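
To make the “query with a filter” part concrete, here is a rough sketch of the kind of read I’m hoping for, using pyarrow’s dataset API so that only the matching rows and columns end up in memory. The file and column names are made up for illustration; my real files are obviously different:

```python
import pyarrow.dataset as ds

# Treat the three feather files as one logical dataset
# without loading any of them fully into memory.
dataset = ds.dataset(
    ["part1.feather", "part2.feather", "part3.feather"],  # hypothetical names
    format="feather",
)

# Only the rows matching the filter, and only the listed columns,
# get materialised -- which is what I need on a small Heroku dyno.
table = dataset.to_table(
    columns=["timestamp", "value"],          # hypothetical columns
    filter=ds.field("value") > 100,          # hypothetical filter
)
df = table.to_pandas()
```

Something along these lines works locally, so the question is really where the data should live (S3, BigQuery, something else?) so that a filtered read like this stays fast and within the memory limits of whatever host I end up using.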