Heatmap of large 2D array using datashader


I’m trying to show a heatmap of a large 2D array (160x250000 entries). My idea was to use datashader for performance, but I’m having trouble getting it right. This should eventually be served as a Dash app, but I’m already having problems with plotly + datashader alone. There is probably something very basic I’m not understanding in this process. It would be great if someone could tell me what I’m doing wrong.
Below is a minimal example that reproduces the issue (I call offline.plot because I run it from Spyder). If I understand correctly, the code defines a 900x300 grid, and the value shown in each grid element is the mean of the original 2D array entries falling into that element. I would expect the figure to keep this fixed resolution as I zoom in, showing progressively more fine-grained detail; at some point, if I zoom in far enough, each grid element should contain just a single entry of the original array.
However, when I zoom in on the produced figure, the aggregation does not change.
Any help understanding this would be much appreciated.
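For reference, the fixed-resolution mean aggregation described above can be sketched in plain NumPy, independent of datashader. This is only a toy illustration of the binning idea (the helper name `mean_bin` and the tiny 4x8 array are made up for the example, and the shapes are assumed evenly divisible):

```python
import numpy as np

# Toy data: a 4x8 array, aggregated down to a 2x4 grid.
data = np.arange(32, dtype=float).reshape(4, 8)

def mean_bin(a, out_rows, out_cols):
    """Block-mean aggregation: each output cell is the mean of the
    input entries that fall into it (shapes assumed divisible)."""
    r, c = a.shape
    return a.reshape(out_rows, r // out_rows,
                     out_cols, c // out_cols).mean(axis=(1, 3))

agg = mean_bin(data, 2, 4)
print(agg.shape)  # (2, 4)
```

Each output cell here averages a 2x2 block of the input, which is essentially what `Canvas.raster(..., agg=rd.mean())` does at 900x300 over the full data range.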

import plotly.express as px
from plotly.offline import plot
import numpy as np
import datashader as ds
from datashader import reductions as rd
import xarray as xr

pw_s = np.random.randn(150, 25000)
pw_s[:, 10000:] += 3
pw_s = xr.DataArray(pw_s, coords=[('y', np.arange(150)),
                                  ('time', np.arange(25000)/2000)])
cvs = ds.Canvas(plot_width=900, plot_height=300,
                x_range=(0, 25000/2000),
                y_range=(0, 150))
agg = cvs.raster(pw_s, agg=rd.mean())

fig = px.imshow(agg)
plot(fig)  # open in the browser via offline.plot (run from Spyder)

Investigating this further, I think the problem is that the aggregation is a fixed xr.DataArray, so nothing changes when I zoom in. I guess I need to write a callback that handles the zoom events and re-aggregates, but I don’t know how to do that and couldn’t find any documentation about it. The solution does not seem trivial.
Any help would be much appreciated.
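To make that idea concrete, here is a minimal sketch of the re-aggregation step such a callback would perform. In a Dash app you would listen to the Graph component's `relayoutData` property (after a zoom it carries keys like `'xaxis.range[0]'` and `'xaxis.range[1]'`), extract the new ranges, rebuild the aggregation for just that window, and return a fresh `px.imshow` figure. The helper name `reaggregate` is hypothetical, and to keep the snippet self-contained it re-bins with plain NumPy instead of calling `ds.Canvas(..., x_range=..., y_range=...).raster(...)`, which would be the datashader equivalent:

```python
import numpy as np

# Full-resolution data, as in the question.
pw_s = np.random.randn(150, 25000)
pw_s[:, 10000:] += 3
x = np.arange(25000) / 2000   # "time" coordinate
y = np.arange(150)            # "y" coordinate

def reaggregate(x0, x1, y0, y1, width=900, height=300):
    """Re-bin only the zoomed window [x0, x1] x [y0, y1] at a fixed
    output resolution (mean reduction). In a Dash callback, x0..y1
    would come from relayoutData after a zoom event."""
    cols = slice(*np.searchsorted(x, [x0, x1]))
    rows = slice(*np.searchsorted(y, [y0, y1]))
    sub = pw_s[rows, cols]
    # Cap output resolution at the data resolution: once a bin would
    # hold less than one source cell, just show the cells themselves.
    w = min(width, sub.shape[1])
    h = min(height, sub.shape[0])
    # Map every source row/column to an output bin and average.
    ci = np.arange(sub.shape[1]) * w // sub.shape[1]
    ri = np.arange(sub.shape[0]) * h // sub.shape[0]
    out = np.zeros((h, w))
    cnt = np.zeros((h, w))
    np.add.at(out, (ri[:, None], ci[None, :]), sub)
    np.add.at(cnt, (ri[:, None], ci[None, :]), 1)
    return out / cnt

# Full view: same 900-wide (but only 150-tall) grid as the Canvas call.
full_view = reaggregate(0, 25000 / 2000, 0, 150)
```

Wiring this into Dash would then be a callback with `Input('graph', 'relayoutData')` that returns `px.imshow(reaggregate(...))`, falling back to the full ranges when the user double-clicks to autorange. This is only a sketch of the pattern, not a drop-in solution.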