Hi All,
I am using Dash AG Grid to render a table with Excel-style filtering and sorting capabilities.
One quick question: how large a volume of data can AG Grid handle?
I have the data in an S3 bucket as a Parquet file, which I load using PyArrow. How do I pass it to AG Grid, and how does Dash cope with data this large (it can be as big as 20 million records)? I want to preserve the Excel-style filtering and sorting on the full dataset.
Currently I'm running into memory leaks.
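For reference, here is roughly the direction I'm considering: instead of sending all rows to the browser via `rowData`, use the infinite row model, where the grid requests one page at a time and the server applies filtering and sorting. This is only a sketch under my assumptions (dash-ag-grid's `rowModelType="infinite"` with the `getRowsRequest` / `getRowsResponse` props; in the real app the plain list below would be a PyArrow table, and `get_rows` would live inside a Dash callback). Is this the right approach?

```python
# Sketch of a server-side handler for AG Grid's infinite row model.
# In a real Dash app this logic would sit inside a callback wired to
# dash-ag-grid's getRowsRequest (input) / getRowsResponse (output) props,
# and the plain list of dicts would be a PyArrow table loaded from S3.
# All names here are illustrative.

def get_rows(request, rows):
    """Apply the grid's filter and sort models, then return one page."""
    out = rows
    # filterModel looks like: {column: {"type": "contains", "filter": value}}
    # (only a simple "contains" text filter is handled in this sketch)
    for col, spec in (request.get("filterModel") or {}).items():
        needle = str(spec.get("filter", "")).lower()
        out = [r for r in out if needle in str(r.get(col, "")).lower()]
    # sortModel looks like: [{"colId": column, "sort": "asc" | "desc"}];
    # applying sorts in reverse order makes earlier columns take priority
    for spec in reversed(request.get("sortModel") or []):
        out = sorted(out, key=lambda r: r.get(spec["colId"]),
                     reverse=spec["sort"] == "desc")
    # startRow/endRow define the page the grid is asking for
    start, end = request.get("startRow", 0), request.get("endRow", 100)
    return {"rowData": out[start:end], "rowCount": len(out)}

# Tiny usage example with fake data standing in for the Parquet file:
data = [{"id": i, "name": f"item{i}"} for i in range(1000)]
page = get_rows({"startRow": 0, "endRow": 50,
                 "sortModel": [{"colId": "id", "sort": "desc"}],
                 "filterModel": {}}, data)
# page["rowData"] holds the first 50 rows sorted by id descending
```

With this shape only one page ever crosses the wire, so browser memory stays flat regardless of the 20M rows on the server side.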
It would be super helpful if anyone could point me in the right direction. I need this fairly urgently, so a quick answer would be much appreciated!