So lately I have been building a dashboard on top of a database that has over 12.5 million rows.
I immediately ran into 2 problems. The first is loading time: waiting for the query to finish at app startup sometimes crashes the app itself!
The second problem (probably the same as the first, just "drilling down a bit") is the interactive part of the dashboard. Say I have 3 graphs that display different information from different tables in the database, all connected via foreign keys, and I want clicking any graph to drill down into the others.
I can't run a query for the drill-down every time someone clicks a graph; it would take too much time and too many resources to pull the data from the database.
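To make the drill-down idea concrete, here is a minimal pandas sketch of the filtering step, using made-up table and column names (`customers`, `orders`, `customer_id`, `region` are all assumptions, not the real schema). In Dash this function would live inside a callback wired to a graph's `clickData`, with the clicked point's label parsed into `clicked_region`.

```python
import pandas as pd

# Hypothetical stand-ins for the real tables: "orders" references
# "customers" through the customer_id foreign key.
customers = pd.DataFrame({"customer_id": [1, 2], "region": ["EU", "US"]})
orders = pd.DataFrame({
    "order_id": [10, 11, 12],
    "customer_id": [1, 1, 2],
    "amount": [100.0, 50.0, 75.0],
})

def drill_down(clicked_region: str) -> pd.DataFrame:
    """Return the orders belonging to the region the user clicked."""
    ids = customers.loc[customers["region"] == clicked_region, "customer_id"]
    return orders[orders["customer_id"].isin(ids)]
```

The point is that once both frames are already in memory, this filter is a fast in-process operation instead of a round trip to the database per click.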
For now, the solution I came up with is using a Redis cache to store the entire table of 12.5 million rows and doing all the filtering in the Python code via pandas instead of in the database. The database would only store the data; all filtering would happen in code. (I have not implemented this strategy yet.)
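The core of that strategy, stripped of the Redis plumbing, is "load once, filter many times". A minimal sketch, with a tiny frame standing in for the real 12.5M-row query (the `load_table` body and column names are placeholders; in the real app it would be a `pd.read_sql(...)` call, or a deserialize of bytes fetched from Redis):

```python
import pandas as pd

def load_table() -> pd.DataFrame:
    # Placeholder for the one expensive load at startup, e.g.
    # pd.read_sql(query, conn) or unpickling a frame stored in Redis.
    return pd.DataFrame({
        "category": ["a", "b", "a", "c"],
        "value": [1, 2, 3, 4],
    })

BIG_TABLE = load_table()  # one load, shared by every callback

def filter_rows(category: str) -> pd.DataFrame:
    """Every later interaction filters in pandas, never the database."""
    return BIG_TABLE[BIG_TABLE["category"] == category]
```

One caveat with this approach: all 12.5 million rows have to fit in RAM (and in Redis, if serialized there), so shrinking the frame first with categorical dtypes or by dropping unused columns matters.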
So, I wanted to hear how you guys handle big-data dashboards!
Any ideas on how I can filter all the graphs fast? How do you handle the big queries each time?
Is there a way to, well, cache the entire app so that when a customer goes to the website it won't take long to load?
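One possible angle on the load-time question (an assumption about the use case, not something from the post): if the landing-page graphs only show aggregates, those can be pre-computed once at startup so the first render never touches the raw 12.5M rows. A sketch with hypothetical names (`raw`, `day`, `sales` are invented):

```python
import pandas as pd

# Stand-in for the raw 12.5M-row table; in a real app the expensive
# query and groupby would run once at startup (or on a schedule),
# not on every page load.
raw = pd.DataFrame({
    "day": ["mon", "mon", "tue", "tue", "tue"],
    "sales": [10, 20, 5, 5, 30],
})

# Pre-computed summary the landing-page graphs read from.
DAILY_SALES = raw.groupby("day", as_index=False)["sales"].sum()

def initial_figure_data() -> pd.DataFrame:
    """What the dashboard serves on first load: tiny and pre-aggregated."""
    return DAILY_SALES
```

The first page view then only has to plot a handful of summary rows, which sidesteps the startup-query crash entirely.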
I would love to hear ideas for making the dashboard and the overall customer experience better, using Dash and other libraries together!