I’m using plotly.js to visualize oceanographic data in the browser, and some of my target visualizations include 1 million or more data points. This is only possible (without terrible performance and/or crashes) with WebGL-based traces. It seems that plotly.js is moving away from these (at least as far as contour and heatmap go). I’m a little unclear on whether this represents a decision to move away from supporting large data sizes, a misconception about the value of WebGL-based visualizations, or something else that I’m missing.
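For context, here is a minimal sketch of the kind of WebGL-based trace I mean: a `scattergl` trace with ~1 million points. The data here is synthetic (a stand-in for real oceanographic measurements), and passing typed arrays is my understanding of how plotly.js avoids an extra copy:

```javascript
// Generate ~1 million synthetic points (stand-in for real oceanographic data).
const N = 1_000_000;
const x = new Float32Array(N);
const y = new Float32Array(N);
for (let i = 0; i < N; i++) {
  x[i] = i;
  y[i] = Math.sin(i / 5000) + Math.random() * 0.1;
}

// 'scattergl' is the WebGL-backed counterpart of 'scatter'.
// plotly.js accepts typed arrays for x/y, which helps at this scale.
const trace = { type: 'scattergl', mode: 'markers', x, y, marker: { size: 2 } };

// In the browser you would then render it with:
// Plotly.newPlot('myDiv', [trace]);
```

With the plain SVG `scatter` trace, the same data brings the tab to a crawl for me; `scattergl` stays interactive.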
If anyone in the community can provide some clarity it would be much appreciated.
Thanks for using plotly.js!
I don’t think so. In fact, here you can find an example that applies a Plotly.js heatmap to display oceanographic data after decoding GRIB2 data right in the browser: https://github.com/archmoj/opengrib2
Very cool! I’ll take a look and see if I can glean some useful knowledge.
Do you know off hand how many data points are being rendered here?
The wave model used in the example is from Environment Canada’s Global Deterministic Wave Prediction System, which has a 1441x721 grid (https://weather.gc.ca/grib/grib2_GDWPS_e.html).
It is also used to plot high-resolution data from the HRDPS Continental model’s 2576x1456 grid (https://weather.gc.ca/grib/grib2_HRDPS_HR_e.html).
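To put those grid sizes in perspective, here is a small sketch of feeding a decoded field of those dimensions into a heatmap trace. The flat, row-major array stands in for the output of a GRIB2 decoder (the placeholder values and layout are my assumption, not the repo’s actual decoding code):

```javascript
// Hypothetical decoded GRIB2 field: a flat array in row-major order,
// sized to the GDWPS grid (1441 columns x 721 rows ≈ 1.04M values).
const nx = 1441, ny = 721;
const flat = new Float32Array(nx * ny).map((_, i) => (i % nx) / nx); // placeholder values

// plotly.js heatmaps expect z as an array of rows, so reshape the flat array.
const z = [];
for (let j = 0; j < ny; j++) {
  z.push(Array.from(flat.subarray(j * nx, (j + 1) * nx)));
}

const trace = { type: 'heatmap', z };
// In the browser: Plotly.newPlot('myDiv', [trace]);
```

So the GDWPS grid alone is already over a million z-values, which is roughly the scale I was asking about.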
Interesting. I’ve had significant performance issues trying to render plots with much less data than that. I’ll take a look at your repo and see what I’m doing wrong.
Thanks for the help!