Feature size limitations on Clustergram?

Does anyone know the practical limitations of a Plotly Clustergram?

Could it nicely handle, say, 1,000 features? 10,000 features?

What about each of those with 10 samples? 100 samples?

Thanks!

For all those concerned: 1000×100 is stretching it but seems to work OK on my machine (6-core i7, 32 GB of RAM). 10,000×100 crashed it. Of course, I have no idea how this really works at the hardware level.
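One plausible reason for the crash (an assumption on my part, not confirmed in the Clustergram docs): hierarchical clustering of the sort scipy provides builds a pairwise-distance matrix over all features, which grows quadratically. A quick back-of-envelope sketch:

```python
# Rough memory estimate for the condensed pairwise-distance matrix that
# scipy-style agglomerative clustering builds over the rows (features).
# Assumption: Clustergram clusters via pairwise distances; this is only
# the distance matrix, not the rendered heatmap traces, which also grow.
def condensed_distance_bytes(n_features: int, bytes_per_float: int = 8) -> int:
    """Bytes needed for n*(n-1)/2 pairwise distances stored as float64."""
    return n_features * (n_features - 1) // 2 * bytes_per_float

for n in (1_000, 10_000):
    mb = condensed_distance_bytes(n) / 1e6
    print(f"{n:>6} features -> ~{mb:,.0f} MB of pairwise distances")
```

By this estimate 1,000 features needs only a few MB of distances while 10,000 needs a few hundred MB, and intermediate copies during linkage and figure construction multiply that, which would line up with what I saw.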

This is worth noting as an issue. Here is the script I used to test:

import dash
import pandas as pd
import dash_bio as dashbio
from dash import html, dcc
from random import random

app = dash.Dash(__name__)

# 100 columns (samples) x 1,000 rows (features) of random values
my_dict = {
    i: [random() for j in range(1000)] for i in range(100)
}
df = pd.DataFrame.from_dict(my_dict)

clustergram = dashbio.Clustergram(
    data=df
)

app.layout = html.Div(
    dcc.Graph(figure=clustergram)
)

if __name__ == "__main__":
    app.run(debug=True)