Dash DataTable Speed Issues

Hi there, I have created a rather simple Dash app (only for my local machine) that displays one DataTable and one scatter plot based on a dataset with about 2000 rows and 50 columns (only 3 columns are shown in the DataTable).

The page size of the DataTable is set to the number of rows of the dataset, and the table also implements sorting, filtering and row selection.

Both elements are controlled and updated via callbacks.

When loading the app and updating the elements via the callbacks, I notice considerable lag of about 5 seconds.

I am using R and a MacBook Pro.

Now my question to all of you is this: are DataTables with 2000 rows and 3 columns always this slow, or is the lag caused by my suboptimal programming?

I am curious to hear from you :slight_smile:
Thank you very much.

I think it should be possible to do it faster. How are you populating the table?

Hi @Emil, the following is my code. The columns are intentionally defined separately because they change based on the callback; this is only one example. Thank you :slight_smile:

columns <- list(
  list(id = 'Date', name = 'Date', type = 'text'),
  list(id = 'Counterparty', name = 'Counterparty', type = 'text'),
  list(id = 'Amount', name = 'Amount', type = 'numeric', format = list(specifier = ',.2f'))
)

dt <- dashDataTable(
  id = "datatable",
  fixed_rows = list(headers = TRUE, data = 0),
  style_table = list(
    maxHeight = '400px',
    overflowY = 'scroll'
  ),
  columns = columns,
  data = df_to_list(dt),
  style_cell = list(
    minWidth = '50px',
    textAlign = 'left'
  ),
  style_cell_conditional = lapply(c('Amount', 'Sum'), function(name) {
    list(
      'if' = list(column_id = name),
      textAlign = 'right'
    )
  }),
  style_data_conditional = list(
    list(
      'if' = list(column_id = 'Amount', filter_query = '{Amount} > 0'),
      backgroundColor = 'rgb(33, 176, 125)',
      color = 'white'
    ),
    list(
      'if' = list(column_id = 'Amount', filter_query = '{Amount} < 0'),
      backgroundColor = 'rgb(242, 2, 105)',
      color = 'white'
    ),
    list(
      'if' = list(column_id = 'Sum', filter_query = '{Sum} > 0'),
      backgroundColor = 'rgb(33, 176, 125)',
      color = 'white'
    ),
    list(
      'if' = list(column_id = 'Sum', filter_query = '{Sum} < 0'),
      backgroundColor = 'rgb(242, 2, 105)',
      color = 'white'
    )
  ),
  row_selectable = "multi",
  page_size = nrow(dt)
)

Could you add details on how the data flows, i.e. the callback structure and how ‘dt’ is loaded?


  1. How ‘dt’ is loaded: it is simply a dataframe in the RStudio global environment.

  2. Callback structure: this is a screenshot of the callback graph.

I hope this helps

So you load the “large” dataframe, i.e. the one with 50 columns, into the global scope, and then you select the 3 columns in a callback and return the resulting data to the table?

I ask because a common performance pitfall is to send the full data, i.e. the dataframe with 50 columns, to the client (in a hidden Div or Store element). But as I read your graph/comments, this is not the case for your app.
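For readers who do hit that pitfall, a minimal sketch of the fix: subset the dataframe on the server before returning it, so only the needed columns are serialized to the client. The names `app`, `df`, and the input id `filter-dropdown` are illustrative assumptions; `df_to_list` is the helper used in the code above.

```r
# Sketch of a callback that sends only the columns the table needs.
# 'app', 'df' and 'filter-dropdown' are placeholder names for illustration.
app$callback(
  output(id = "datatable", property = "data"),
  params = list(input(id = "filter-dropdown", property = "value")),
  function(selected) {
    # Subset the 50-column dataframe *before* serializing it,
    # so only the 3 displayed columns travel over the wire.
    df_small <- df[df$Counterparty %in% selected,
                   c("Date", "Counterparty", "Amount")]
    df_to_list(df_small)
  }
)
```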

Yes your interpretation is correct.

Okay, then that’s not the issue. Another option would be to use DashTabulator, but I don’t know if it is faster or slower (I haven’t tried it myself, but others on the forum seem fond of it). If you decide to give it a try, I would be interested to know if the performance is worse or better (or the same).

It seems DashTabulator is for Python, not R. But I also had the idea to try Python instead of R.
Thank you very much so far.

This issue is mentioned in the Python documentation, but not in the R documentation:

https://dash.plotly.com/datatable/height :

So I will play around a bit with the different options to see which works fastest.
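For anyone trying the same thing, the two main options from the linked height page translate to R roughly as follows. This is a sketch, not a verified benchmark; `columns` and `dt` refer to the code posted above, and the concrete numbers (50 rows per page, 400px height) are arbitrary choices.

```r
# Option 1: front-end pagination - render only one page of rows at a time,
# instead of page_size = nrow(dt), which renders all 2000 rows at once.
dashDataTable(
  id = "datatable",
  columns = columns,
  data = df_to_list(dt),
  page_size = 50
)

# Option 2: virtualization - keep all rows on one scrollable "page",
# but only render the rows currently visible in the viewport.
dashDataTable(
  id = "datatable",
  columns = columns,
  data = df_to_list(dt),
  virtualization = TRUE,
  fixed_rows = list(headers = TRUE),
  style_table = list(height = '400px', overflowY = 'auto')
)
```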

Just mentioning it in case other people using R have the same issue.