Google Chrome "Aw, Snap! Out of memory" when showing this page

Hi everyone,
I am trying to load my sensor data on a plotly graph. The timestamp of data points is every 10 seconds getting 1 data point, so 60 seconds get 6 data points based on the timestamp. If I calculate 1 month of timestamp data it will be 43,200 data points.
When I select 1 sensor and load 1 month of data (43,200 data points), it works fine.
But if I try 3 sensors loading 1 month of data (129,600 data points), it does not work.

I am getting this error when I try to load that amount of data; the message appears after a few minutes of loading.
I tried to reduce the data to one point per minute with the pandas resample function, df.resample('1T').mean(). It does not work with the parquet file, but if I try with the CSV file the resample mean() aggregation does work.

Could you please help me load my data faster, or is there another function to reduce my data points to one per minute on a timestamp basis?

Thank you in advance.

Best Regards,

Hey @saddamcoder, how are you generating the chart? I can't reproduce this with a Scatter plot; I tried up to 4,320,000 data points across three traces.

import dash
from dash import html, dcc, Input, Output, State
import plotly.graph_objs as go
import numpy as np

app = dash.Dash(__name__, update_title=None)

app.layout = html.Div(
    [
        dcc.Graph(id='graph'),
        # number of data points per trace
        dcc.Input(id='points', type='number', value=43200,
                  style={'min-height': 40, 'width': '100%'}),
        # number of traces to draw
        dcc.Dropdown(id='drop',
                     options=[1, 2, 3],
                     placeholder='Select number of traces...'),
    ],
    style={'width': '30%'},
)


@app.callback(
    Output('graph', 'figure'),
    Input('drop', 'value'),
    State('points', 'value'),
    prevent_initial_call=True,
)
def show_image(traces, data_points):
    # create data
    x = np.arange(data_points)
    y = np.random.randint(1, 5, size=data_points)

    # create figure
    fig = go.Figure(layout={'height': 600, 'yaxis_range': [0, 15]})

    # add traces to figure
    for trace in range(traces):
        fig.add_scatter(x=x, y=y + trace * 5)

    return fig


if __name__ == "__main__":
    app.run_server(debug=True)
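
On the resample question: `resample()` only works when the frame has a `DatetimeIndex` (or you pass the timestamp column via `on=`). A CSV read with `parse_dates=` and `index_col=` gives you that index directly, while a timestamp column read back from parquet typically comes in as a plain column, which would explain why the same call fails there. A minimal sketch, assuming a `timestamp` column and a numeric `value` column (the parquet file name is a placeholder for yours):

```python
import pandas as pd
import numpy as np

# df = pd.read_parquet("sensor_data.parquet")  # placeholder file name

# Simulate 1 hour of 10-second sensor readings for the demo.
idx = pd.date_range("2024-01-01", periods=360, freq="10s")
df = pd.DataFrame({"timestamp": idx, "value": np.arange(360.0)})

# resample() needs a DatetimeIndex; a parquet file often stores the
# timestamp as an ordinary column, so promote it to the index first.
df["timestamp"] = pd.to_datetime(df["timestamp"])
df = df.set_index("timestamp")

# Downsample from one point every 10 s to one point per minute.
df_min = df.resample("1min").mean()

print(len(df))      # 360 raw points
print(len(df_min))  # 60 per-minute points
```

With that, one month of 10-second data shrinks by a factor of six before it ever reaches the browser, which should also ease the out-of-memory pressure.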