@app.callback output error message - dash.exceptions.InvalidCallbackReturnValue:

I’m outputting a pandas dataframe from an @app.callback. I keep getting the following error message:

dash.exceptions.InvalidCallbackReturnValue: The callback for `<Output `df_portfolio.children`>`
            returned a value having type `DataFrame`
            which is not JSON serializable.


            The value in question is either the only value returned,
            or is in the top level of the returned list,

            and has string representation
            `  shares share price  total cost
0      8      295.55     2364.40
1     95       15.65     1486.75
2     18       27.69      498.42
3     23       10.83      249.09
4      4       52.88      211.52
5    N/A        N/A       189.82`

            In general, Dash properties can only be
            dash components, strings, dictionaries, numbers, None,
            or lists of those.

The traceback is as follows:

Traceback (most recent call last):
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\dash\dash.py", line 997, in add_context
    jsonResponse = json.dumps(
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\json\__init__.py", line 234, in dumps
    return cls(
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\_plotly_utils\utils.py", line 45, in encode
    encoded_o = super(PlotlyJSONEncoder, self).encode(o)
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\json\encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\json\encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\_plotly_utils\utils.py", line 115, in default
    return _json.JSONEncoder.default(self, obj)
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\json\encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type DataFrame is not JSON serializable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\flask\app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\flask\app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\dash\dash.py", line 1031, in dispatch
    response.set_data(func(*args, outputs_list=outputs_list))
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\dash\dash.py", line 1001, in add_context
    _validate.fail_callback_output(output_value, output)
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\dash\_validate.py", line 251, in fail_callback_output
    _validate_value(val, index=i)
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\dash\_validate.py", line 241, in _validate_value
    _raise_invalid(
  File "C:\Users\pkopp\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\dash\_validate.py", line 182, in _raise_invalid
    raise exceptions.InvalidCallbackReturnValue(
dash.exceptions.InvalidCallbackReturnValue: The callback for `<Output `df_portfolio.children`>`
            returned a value having type `DataFrame`
            which is not JSON serializable.

Why can’t I output a pandas dataframe?

Any help would be appreciated.

My code is the following:

def sample_portfolio(app, df_portfolio):
   @app.callback(
      Output('df_portfolio', 'data'),
      [Input('df_portfolio', 'value'),
      Input('my-age', 'value'),
      Input('risk-tolerance-slider', 'value'),
      Input('my-amount', 'value')]
   )

   def portfolio(df_portfolio, age, risk_tolerance, account_bal):
  
      sec_list = ["VOO", "PRULX", "VTIAX", "PFORX", "FDHY", "SPY"]
      sec_val_list = []
      # While the user is mid-edit, age or account_bal arrives as None, which
      # would raise a TypeError in the math below; return placeholders until both are set.
      if age is None or account_bal is None:
         return [''] * len(sec_list)

      conn = http.client.HTTPSConnection("ftlightning.fasttrack.net")
      conn.request("GET", "/v1/auth/login?account=300724&pass=1243OEIL&appid=F075C6E1-759C-4009-9B47-5FE284F31F55")

      res = conn.getresponse()
      data = json.loads(res.read())

      appid = data['appid']
      token = data['token']

      headers = {'appid': appid,
                  'token': token}

      for sec in sec_list:
         conn.request("GET", "/v1/data/" + sec + "/divadjprices", headers=headers)
         res = conn.getresponse()
         data = res.read()
         dic = json.loads(data)
         sec_val_list.append(dic["prices"][-1])


      # Determine base portfolio from age
      if (age >= 18 and age <= 22):
         portfolio = reference.base_portfolio[0]
      elif (age >= 23 and age <= 27):
         portfolio = reference.base_portfolio[1]
      elif (age >= 28 and age <= 32):
         portfolio = reference.base_portfolio[2]
      elif (age >= 33 and age <= 37):
         portfolio = reference.base_portfolio[3]
      elif (age >= 38 and age <= 42):
         portfolio = reference.base_portfolio[4]
      elif (age >= 43 and age <= 47):
         portfolio = reference.base_portfolio[5]
      elif (age >= 48 and age <= 52):
         portfolio = reference.base_portfolio[6]
      elif (age >= 53 and age <= 57):
         portfolio = reference.base_portfolio[7]
      elif (age >= 58 and age <= 62):
         portfolio = reference.base_portfolio[8]
      else:
         portfolio = reference.base_portfolio[9]

      # initialize adj_portfolio
      adj_portfolio = [0] * len(sec_list)
    
      # Adjust for risk tolerance.
      if risk_tolerance < 5:
         for i in range(len(adj_portfolio) - 1):
            if i <= 2:
               adj_portfolio[i] = portfolio[i] + (portfolio[i] * (-reference.adj * (5 - risk_tolerance)))
            else:
               adj_portfolio[i] = portfolio[i] + (portfolio[i] * (reference.adj * (5 - risk_tolerance)))
      elif risk_tolerance > 5:
         for i in range(len(adj_portfolio) - 1):
            if i <= 1:
               adj_portfolio[i] = portfolio[i] + (portfolio[i] * (reference.adj * (risk_tolerance - 5)))
            else:
               adj_portfolio[i] = portfolio[i] + (portfolio[i] * (-reference.adj * (risk_tolerance - 5)))
      else:        # risk_tolerance = 5
         adj_portfolio = portfolio

      # Number of shares of each security.
      dom_stock  = int((account_bal * adj_portfolio[0]/100) / sec_val_list[0])
      gov_bond  = int((account_bal * adj_portfolio[1]/100) / sec_val_list[1])
      int_stock  = int((account_bal * adj_portfolio[2]/100) / sec_val_list[2])
      int_bond  = int((account_bal * adj_portfolio[3]/100) / sec_val_list[3])
      corp_bond  = int(account_bal * adj_portfolio[4]/100 / sec_val_list[4])
      cash = account_bal - ((dom_stock * sec_val_list[0]) + (gov_bond * sec_val_list[1]) + 
         (int_stock * sec_val_list[2]) + (int_bond * sec_val_list[3]) + (corp_bond * sec_val_list[4]))

      test = [[dom_stock, sec_val_list[0], dom_stock * sec_val_list[0]], 
            [gov_bond, sec_val_list[1], gov_bond * sec_val_list[1]], 
            [int_stock, sec_val_list[2], int_stock * sec_val_list[2]], 
            [int_bond, sec_val_list[3], int_bond * sec_val_list[3]],
            [corp_bond, sec_val_list[4], corp_bond * sec_val_list[4]],
            ['N/A', 'N/A ', cash]]


      portfolio = pd.DataFrame(test, columns = ['shares', 'share price', 'total cost'])
      return portfolio

Try `return portfolio.to_dict('records')` — that is, convert the DataFrame into a data structure (a list of dictionaries) that can be serialized as JSON.
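For reference, a minimal standalone sketch of the difference, using the standard-library `json` module to stand in for Dash's serializer (the sample values are taken from the error message above):

```python
import json
import pandas as pd

df = pd.DataFrame(
    [[8, 295.55, 2364.40], [95, 15.65, 1486.75]],
    columns=["shares", "share price", "total cost"],
)

# json.dumps(df) raises TypeError: Object of type DataFrame is not JSON
# serializable -- the same failure Dash reports from the callback.
# to_dict('records') yields a plain list of dicts, which serializes cleanly:
records = df.to_dict("records")
print(json.dumps(records))
```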

Thank you very much for working on a Sunday night.

@chriddyp you beautiful man thank you

Hi Chris,

I came across a similar error. I am using getData() to pull data from a database every 5 seconds, and I am converting the DataFrame to a dict in the app layout.

import dash
import dash_core_components as dcc
import dash_html_components as html
import dash_table
import pandas as pd
import numpy as np
import mysql.connector


app = dash.Dash(__name__)

def getData():
    conn = mysql.connector.connect(
            host=,  # your host, usually localhost
            user=,        # your username
            password=,        # your password
            database=,# name of the data base
    )    
    c = conn.cursor()
    
    df = pd.read_sql("SELECT DATE_TIME, MES_LenSp, MES_ExzAbs, MES_DEinl, MES_DKalt  FROM table WHERE DATE_TIME > '2021-03-23 09:00:00'", conn)
    #Batch data every 1000 
    bins = pd.Series(np.arange(0,160000,1000))
    
    labels = ["{0} - {1}".format(i, i + 999) for i in range(0, 160000, 1000)]
    
    df['Len_bins'] = pd.cut(df['MES_LenSp'], bins=bins, right=False)
    df['Len_bins'] = df['Len_bins'].astype(str)
    
    #Calculations
    
    
    df['Concentricity'] = 1-(df['MES_ExzAbs']/(2*df['MES_DKalt']))
    
    #Impedance calculation 
    dielectric = 1.91
    x = float(138) 
    log10 = 2.302585
    df['Impedance'] = ((x * np.log10(df['MES_DKalt']/df['MES_DEinl']))/(np.sqrt(dielectric)))
    
    #Std dev calculation 
    df['DiamCore_diff'] = df['MES_DEinl'].diff(periods=-1).abs()
    df['DiamCold_diff'] = df['MES_DKalt'].diff(periods=-1).abs()
    df['Impedance_diff'] = df['Impedance'].diff(periods=-1).abs()
    df.dropna(inplace=True)
    
    #calculate mean of the range and divide by 1.128 to get std deviation for every 1000ft 
    
    #Convert date to int64 for groupby 
    df['DATE_TIME'] = pd.to_datetime(df['DATE_TIME']).values.astype(np.int64)
    
    df_result = df.groupby(['Len_bins'], sort=False)[['DATE_TIME','DiamCore_diff',
                                          'DiamCold_diff','Impedance_diff',
                                          'Concentricity']].mean().reset_index()
    df_result.rename(columns={'Concentricity':'Concentricity_Mean'}, inplace=True)
    
    df_result['Concentricity_Mean'] = df_result['Concentricity_Mean'].round(4)
    #convert date/time from int64 to date
    df_result['DATE_TIME'] = pd.to_datetime(df_result['DATE_TIME'], unit='ns')
    
    df_result['DiamCore_STD'] = df_result['DiamCore_diff'] / 1.128
    df_result['DiamCold_STD'] = df_result['DiamCold_diff'] / 1.128
    df_result['Impedance_STD'] = df_result['Impedance_diff'] / 1.128
    
    #Round decimal places 
    df_result['DiamCore_STD'] = df_result['DiamCore_STD'].round(7)
    df_result['DiamCold_STD'] = df_result['DiamCold_STD'].round(7)
    df_result['Impedance_STD'] = df_result['Impedance_STD'].round(4)
    
    
    
    #Compare against limits and generate alarm log
    df_result['DiamCore_STD_Alarm'] = np.where(df_result['DiamCore_STD'] > 0.000012, 1, 0)
    df_result['DiamCold_STD_Alarm'] = np.where(df_result['DiamCold_STD'] > 0.000065, 1, 0)
    df_result['Impedance_STD_Alarm'] = np.where(df_result['Impedance_STD'] > 0.07, 1, 0)
    
    #Concentricity Alarm
    df_con = df[['Len_bins','Concentricity']].copy()
    df_con['Concentricity_v'] = np.where(df_con['Concentricity'] > 0.95, 1, 0)
    df_con = df_con.groupby('Len_bins')[['Concentricity_v']].sum().reset_index()
    df_con['Conc_Alarm'] = np.where(df_con['Concentricity_v'] > 500, 1, 0)
    df_con = df_con[['Len_bins','Conc_Alarm']]
    
    #Merge concentricity alarm with DiamCold, DiamCore, Impedance Alarm
    df_result = pd.merge(df_result, df_con, on='Len_bins', validate='one_to_many')
    
    df_result['Alarm'] = np.where((df_result['DiamCore_STD_Alarm'] + 
                                   df_result['DiamCold_STD_Alarm'] + 
                                   df_result['Impedance_STD_Alarm'] +
                                   df_result['Conc_Alarm']) >= 1, 1, 0)
   
    
    
    #Convert date/time to date
    df['DATE_TIME'] = pd.to_datetime(df['DATE_TIME'])
    
    
    
    #Drop rows where impedance is 0 to retain non zero rows only
    df_result = df_result[df_result['Impedance_diff'] > 0]

    
    df_summary = df_result[['DATE_TIME','Len_bins','DiamCore_STD','DiamCold_STD',
                            'Impedance_STD','Concentricity_Mean','Alarm']].copy()
    df_summary['DATE_TIME'] = pd.to_datetime(df_summary['DATE_TIME'], 
                                              format='%Y-%m-%d %H:%M:%S').dt.strftime('%Y-%m-%d %H:%M:%S')
    
    
    df_summary['Len_bins'] = df_summary['Len_bins'].astype(str)
    return df_summary


app.layout = html.Div([
      html.H4('Dashboard'),
      dcc.Interval('graph-update', interval = 5000, n_intervals = 0),
      dash_table.DataTable(
          id = 'table',
          data = getData().to_dict('records'),
          columns=[{'id': c, 'name': c} for c in getData().columns],
                        page_size = 15,
                        style_as_list_view=True,
                        style_header={
                                    'fontWeight': 'bold'
                                    },

                        
                        css=[{'selector': 'table', 'rule': 'table-layout: fixed'}],
                        style_cell={
                                    'width': '{}%'.format(len(getData().columns)),
                                    'whiteSpace': 'normal',
                                    'padding': '7px',
                                    'height': 'Auto',
                                    'fontSize': 24,
                                    'textAlign': 'center'
                                    },
                        
                        style_data_conditional=[{
                                                'if': {
                                                    'filter_query': '{Alarm} contains "1"',
                                                    'column_id': 'Alarm'
                                                    },  
                                                    'backgroundColor': '#FF4136',
                                                    'color': 'white'
                                                    },
                                                {
                                                'if': {
                                                    'filter_query': '{Alarm} contains "0"',
                                                    'column_id': 'Alarm'
                                                    },  
                                                    'backgroundColor': '#3D9970',
                                                    'color': 'white'
                                                    },
                                            
                                                ] 
         )])

@app.callback(
        dash.dependencies.Output('table','data'),
        [dash.dependencies.Input('graph-update', 'n_intervals')])
def updateTable(n):
     return getData()

if __name__ == '__main__':
     app.run_server(port = 8087,debug=False)

The error message:

Traceback (most recent call last):
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\_compat.py", line 39, in reraise
    raise value
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\dash.py", line 1078, in dispatch
    response.set_data(func(*args, outputs_list=outputs_list))
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\dash.py", line 1044, in add_context
    _validate.fail_callback_output(output_value, output)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\_validate.py", line 261, in fail_callback_output
    _validate_value(val, index=i)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\_validate.py", line 251, in _validate_value
    _raise_invalid(
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\_validate.py", line 190, in _raise_invalid
    raise exceptions.InvalidCallbackReturnValue(
dash.exceptions.InvalidCallbackReturnValue: The callback for `<Output `table.data`>`
returned a value having type `DataFrame`
which is not JSON serializable.

The value in question is either the only value returned,
or is in the top level of the returned list,

and has string representation

Does it matter whether I convert the DataFrame to a dict in the app layout as opposed to inside getData()? I'm doing it this way because it's easier for me to format the column headers.

Any help is appreciated. :slightly_smiling_face:

Hi @dasher007

Try changing your callback to:

return get_data().to_dict('records')

Hi @AnnMarieW, thanks for the suggestion. I changed the getData function to return a dict:

return getData().to_dict('records')

but that didn’t help; it still throws the same error.

FYI, the changes i made

def getData():
.
.
.
    return getData().to_dict('records')



app.layout = html.Div([
      html.H4('Dashboard'),
      dcc.Interval('graph-update', interval = 5000, n_intervals = 0),
      dash_table.DataTable(
          id = 'table',
          data = getData().to_dict('records'),
          columns=[{'id': c, 'name': c} for c in getData().columns],
                        page_size = 15,
                        style_as_list_view=True,
                        style_header={
                                    'fontWeight': 'bold'
                                    },

                        
                        css=[{'selector': 'table', 'rule': 'table-layout: fixed'}],
                        style_cell={
                                    'width': '{}%'.format(len(getData().columns)),
                                    'whiteSpace': 'normal',
                                    'padding': '7px',
                                    'height': 'Auto',
                                    'fontSize': 24,
                                    'textAlign': 'center'
                                    },
                        
                        style_data_conditional=[{
                                                'if': {
                                                    'filter_query': '{Alarm} contains "1"',
                                                    'column_id': 'Alarm'
                                                    },  
                                                    'backgroundColor': '#FF4136',
                                                    'color': 'white'
                                                    },
                                                {
                                                'if': {
                                                    'filter_query': '{Alarm} contains "0"',
                                                    'column_id': 'Alarm'
                                                    },  
                                                    'backgroundColor': '#3D9970',
                                                    'color': 'white'
                                                    },
                                            
                                                ] 
         )])

@app.callback(
        dash.dependencies.Output('table','data'),
        [dash.dependencies.Input('graph-update', 'n_intervals')])
def updateTable(n):
     return getData().to_dict('records')

if __name__ == '__main__':
     app.run_server(port = 8087,debug=False)

The error:

In general, Dash properties can only be
dash components, strings, dictionaries, numbers, None,
or lists of those.
127.0.0.1 - - [24/Mar/2021 08:24:36] "POST /_dash-update-component HTTP/1.1" 500 -
[2021-03-24 08:24:39,206] ERROR in app: Exception on /_dash-update-component [POST]
Traceback (most recent call last):
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\dash.py", line 1040, in add_context
    jsonResponse = json.dumps(
  File "C:\Users\savanna\Anaconda3\lib\json\__init__.py", line 234, in dumps
    return cls(
  File "C:\Users\savanna\Anaconda3\lib\site-packages\_plotly_utils\utils.py", line 59, in encode
    encoded_o = super(PlotlyJSONEncoder, self).encode(o)
  File "C:\Users\savanna\Anaconda3\lib\json\encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "C:\Users\savanna\Anaconda3\lib\json\encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\_plotly_utils\utils.py", line 134, in default
    return _json.JSONEncoder.default(self, obj)
  File "C:\Users\savanna\Anaconda3\lib\json\encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type DataFrame is not JSON serializable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\_compat.py", line 39, in reraise
    raise value
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "C:\Users\savanna\Anaconda3\lib\site-packages\flask\app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\dash.py", line 1078, in dispatch
    response.set_data(func(*args, outputs_list=outputs_list))
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\dash.py", line 1044, in add_context
    _validate.fail_callback_output(output_value, output)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\_validate.py", line 261, in fail_callback_output
    _validate_value(val, index=i)
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\_validate.py", line 251, in _validate_value
    _raise_invalid(
  File "C:\Users\savanna\Anaconda3\lib\site-packages\dash\_validate.py", line 190, in _raise_invalid
    raise exceptions.InvalidCallbackReturnValue(
dash.exceptions.InvalidCallbackReturnValue: The callback for `<Output `table.data`>`
returned a value having type `DataFrame`
which is not JSON serializable.

Another thing I noticed: when I print getData() I can see the data being updated, but the table in the app doesn’t update.

Any help is appreciated!


Hi guys, I am having the same error:

def findActivity(choice):
  if choice>=1 and choice<=100:
    return 'Recovery'
  elif choice >100 and choice <=200:
    return 'Preparative'
  elif choice >=180 and choice <=325:
    return 'Organizational'
  elif choice >=300 and choice <=450:
    return 'Maintenance'
  elif choice >450 and choice <=640:
    return 'Thresold'
  elif choice >=640 and choice <=900:
    return 'Fitness'
  elif choice >=810 and choice <=1200:
    return 'Game'
  else:
    return "Training"
def idFunction(year,ind, indexList, i, days, weekDay):
  if ind==0 and days[weekDay][ind] not in [1,2,3,4,5,6]:
    if i==1:
      m_id=date(year-1, 12, days[weekDay][ind])
    else:
      m_id=date(year, i-1, days[weekDay][ind])
  elif ind==indexList[-1] and days[weekDay][ind] not in [24,25,26,27,28,29, 30,31]:
    if i==12:
      m_id=date(year+1, 1, days[weekDay][ind])
    else:
      m_id=date(year, i+1, days[weekDay][ind])
  else:
    m_id=date(year, i, days[weekDay][ind])

  return m_id
def activity(d):
  year=int(d[0:4])
  month=int(d[5:7])
  day=int(d[8:])
  
  c = conn.cursor()
  c.execute('SELECT activity FROM planner WHERE date=%s and teamName =%s', (datetime.date(year,month,day),"UCF"))
  result = c.fetchone()#fetchall
  c.execute('SELECT first, last, sum(load) AS load FROM "UCF_Women_Soccer_DEV".ff90_training_load WHERE date=%s GROUP BY first, last', (datetime.date(year,month,day),))
  result2 = c.fetchall()
  if result:
    return result[0]#result[0]
  elif result2:
    load=[value for first, last, value in result2]
    mode=statistics.mode(load)
    return findActivity(mode)
  else:
    return "OFF"
my_calendar = html.Div([
    dbc.Row(
        dbc.Col(
            dcc.Dropdown(
                id="month",
                options=[{
                    "label": x,
                    "value": x
                } for x in calendar.month_name],
                value=calendar.month_name[date.today().month],
                clearable=False,
            ))),
    html.Br(),
])
years = html.Div([
    dbc.Row(
        dbc.Col(
            dcc.Dropdown(
                id="years",
                options=[{
                    "label": x,
                    "value": x
                } for x in range(2000,2100)],
                value=2021,
                clearable=False,
            ))),
    html.Br(),
])
table_header = [
    html.Thead(html.Tr([html.Th("KEY"), html.Th(dbc.Badge("RECOVERY", color="primary")), html.Th(dbc.Badge("PREPARATIVE", color="success")),  html.Th(dbc.Badge("ORGANIZATIONAL", color="warning")),  html.Th(dbc.Badge("MAINTENANCE", color="info")),  html.Th(dbc.Badge("THRESOLD", color="light")),  html.Th(dbc.Badge("FITNESS", color="dark")),  html.Th(dbc.Badge("GAME", color="danger")) ]))
]

row1 = html.Tr([html.Td("RPE"), html.Td("<3"),html.Td("3-4"), html.Td("4-5"), html.Td("5-6"), html.Td("7-8"), html.Td("8-9"), html.Td("9-10")])
row2 = html.Tr([html.Td("VOL"), html.Td("<35min"), html.Td("35-45min"), html.Td("45-60min"), html.Td("45-75min"), html.Td("75-80min"), html.Td("80-100min"), html.Td("90min")])
row3 = html.Tr([html.Td("TL"), html.Td("<100"), html.Td("105-200"), html.Td("180-325"), html.Td("300-450"), html.Td("490-640"), html.Td("640-900"), html.Td(">810")])
row4 = html.Tr([html.Td("Readiness"), html.Td("<65"), html.Td(">77.5"), html.Td("66-75"), html.Td("70-80"), html.Td("75-85"), html.Td("82.5-95"), html.Td("87.5")])
row5 = html.Tr([html.Td("Recovery"), html.Td("n/a"), html.Td("12hr"), html.Td("<24hr"), html.Td("24hr"), html.Td("48hr"), html.Td("72hr"), html.Td(">72hr")])
table_body = [html.Tbody([row1, row2, row3, row4, row5])]

table = dbc.Table(table_header + table_body, bordered=True,
    hover=True,
    responsive=True,
    striped=True,)
DATABASE_URL = 'postgres://panjfsdgbbpqjp:e82392fdcc0e809c9ee02a9459d9da3e0e414094bdbc8f88acb69d3f7686da5a@ec2-52-44-31-100.compute-1.amazonaws.com:5432/d2itj4h57p7v0h'
conn = psycopg2.connect(DATABASE_URL, sslmode='require')
options6=["Highest Training Load", "Lowest Training Load"]
colors = {
    'background': '#ffffed',
    'text': '#7FDBFF'
}#style={'backgroundColor': colors['background']},
def layout():
  return html.Div(children=[
                    #html.Div(my_calendar, id='monthChoice'),
                    dbc.Row(html.H1("Planner"), className="container_title",),
                    html.Br(),
                    html.Div(dbc.Row([dbc.Col(html.H4("Team average", style={
        'textAlign': 'left',
    })), dbc.Col(html.Div([dbc.Row(html.H4("Last 7 days")),dcc.Dropdown(
                                    id="LastSevenDays",
                                    options=[{
                                        "label": x,
                                        "value": x
                                    } for x in options6],
                                    value=options6[0],
                                    clearable=False,
                                ), html.Div(id="LastSevenDayOutput")], className="container_title"))])),
                    html.Br(),html.Br(),
                    html.Div([dbc.Row([dbc.Col(my_calendar), dbc.Col(years)], no_gutters=True),
                    dbc.Row([
                        dbc.Col(
                            [html.Div("Monday"),
                             ])
                    ], no_gutters=True)], style={
                        "width": "98%",
                        "margin-right": "0", 
                    }),html.Br(),html.Br(),html.Div(table),html.Br(),
                    dbc.Row([dbc.Col(html.Div(id="day-activity", style={'overflowY':'scroll', 'height':400, 'width':500})),dbc.Col(dcc.Graph(id='forecast', style={'display': 'none'}))]),
                    html.Div(id="output2-container", children=[], style={'display': 'none'}),
                
                    html.Div(id='dropdownChoice'),
                    html.Div(id='Output6', style={'display': 'none'}),
])
@app.callback(Output("LastSevenDayOutput", "children"), Input("LastSevenDays", "value"))
def lastSevenDaysOutput(choice):
  c = conn.cursor()
  if choice=="Highest Training Load":
    c.execute('select first || \' \'|| last as  player_name, sum(load) as Training_load \
                    from "UCF_Women_Soccer_DEV".ff90_training_load \
                    where date >\'' + "2021-02-28" + '\' and  date <= \'' +  "2021-03-7"+ '\' \
                    group by ROLLUP (player_name) order by Training_load DESC LIMIT 6')
  else:
    c.execute('select first || \' \'|| last as  player_name, sum(load) as Training_load \
                    from "UCF_Women_Soccer_DEV".ff90_training_load \
                    where date >\'' + "2021-02-28" + '\' and  date <= \'' +  "2021-03-7"+ '\' \
                    group by ROLLUP (player_name) order by Training_load LIMIT 5')
  result=c.fetchall()
  row=[]
  names=[]
  load=[]
  if result:
    for i in result:
      if i[0] ==None:
        continue 
      j=i[1]/7
      j=round(j,1)
      names.append(i[0])
      load.append(j)
      #row.append(dbc.Row([dbc.Col(i[0]), dbc.Col(j)]))
    list_of_tuples = list(zip(names, load))  
    df = pd.DataFrame(list_of_tuples,
                  columns = ['Name', '7 Days average'])
    return dash_table.DataTable(
    id='table',
    columns=[{"name": i, "id": i} for i in df.columns],
    data=df.to_dict('records'),
) 
  return row

@app.callback([Output("monday", "children") ], [Input("month", "value"), Input("years", "value")])
def displayCalendar(value, yearss):
    df = [
        "January", "February", "March", "April", "May", "June", "July",
        "August", "September", "October", "November", "December"
    ]
    data1 = [
        'Preparative', 'Organizational', 'Maintenance', 'Thresold', 'Fitness',
        'Game', 'Recovery'
    ]
    for i, month in enumerate(df):
        if month == value:
            break
    i = i + 1
    print(i)
    days = calendar.monthcalendar(yearss, i)
    if i==1:
      previousMonth=calendar.monthcalendar(yearss-1, 12)
    else:
      previousMonth=calendar.monthcalendar(yearss, i-1)
    if i==12:
      nextMonth=calendar.monthcalendar(yearss+1, 1)
    else:
      nextMonth=calendar.monthcalendar(yearss, i+1)
    a=0
    for j in days[0]:
      if j ==0:
        days[0][a]=previousMonth[-1][a]
      a+=1
    a=0
    for j in days[-1]:
      if j ==0:
        days[-1][a]=nextMonth[0][a]
      a+=1
    monday = []
    data1 = [
        'Preparative', 'Organizational', 'Maintenance', 'Thresold', 'Fitness',
        'Game', 'Recovery', 'OFF'
    ]
    j = days
    days = pd.DataFrame(days)
    days.columns = [
        'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday',
        'Sunday'
    ]
    for ind in days.index:
        m_id=str(idFunction(yearss,ind, days.index, i, days, 'Monday'))
        m_act=activity(m_id)
        if days['Monday'][ind] != 0:
            monday.append(
                dbc.Card(
                    dbc.CardBody([
                        html.Div(
                            dbc.Button(children=days['Monday'][ind],
                                       id={'type': 'date-button', 'index': m_id},
                                       color="link")),
                        dcc.Dropdown(
                            id={'type': 'dropdown2', 'index': m_id},
                            options=[{"label": x, "value": x} for x in data1],
                            value=m_act,
                            clearable=False,
                        )
                    ])))
    monday.append(dcc.Store(id='permanent-date'))
    monday.append(dcc.Store(id='permanent-object'))
    monday.append(dcc.ConfirmDialog(
        id='confirm',
        message='Doing this change might override training specification, are you sure you want to continue?',
    ))
    
    return monday

I have a multi-page Dash app, and I get the error when I navigate to the page containing this code.
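One thing worth double-checking in the snippet above (an observation, not a confirmed fix for the DataFrame error): the callback declares its output as a one-element list, `[Output("monday", "children")]`, but returns `monday`, a list of many components. When the outputs are declared as a list, Dash pairs each element of the returned list with one Output, so the children list needs one more level of wrapping (`return [monday]`). A plain-Python sketch of that pairing rule, with strings standing in for the components:

```python
# Hypothetical illustration of how a list of Outputs is paired with the
# returned value; "monday.children" stands for the single declared output.
declared_outputs = ["monday.children"]
monday = ["card1", "card2", "card3"]  # stand-ins for the dbc.Card components

# `return monday` offers 3 values to 1 output: a length mismatch.
assert len(monday) != len(declared_outputs)

# `return [monday]` makes the whole list the value for the single output.
paired = dict(zip(declared_outputs, [monday]))
print(paired)  # {'monday.children': ['card1', 'card2', 'card3']}
```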


I have the same issue: I am using asyncio and aioodbc, and I get the following error:

dash.exceptions.InvalidCallbackReturnValue: The callback for <Output my-output.children>
returned a value having type coroutine
which is not JSON serializable.

The value in question is either the only value returned,
or is in the top level of the returned list,

and has string representation
<coroutine object datasets_refresh at 0x000001DF45383940>

In general, Dash properties can only be
dash components, strings, dictionaries, numbers, None,
or lists of those.

Could you share your callback with us? All the errors reported here are due to a callback function returning an object that is not JSON serializable to a `DataTable` `data` prop. In your case, it seems that you are missing an assignment in your await statement, if the coroutine returns the data.

data should be a list of dictionaries in the same format as in pd.DataFrame.to_dict("records").
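For reference, the `"records"` orientation is just a list of plain dictionaries, one per row, which is exactly the shape JSON can represent. A minimal sketch (the column names are made up for illustration):

```python
import json

# What pd.DataFrame.to_dict("records") produces for a two-row frame:
records = [
    {"Name": "Ada", "7 Days average": 20.0},
    {"Name": "Grace", "7 Days average": 5.0},
]

# Plain dicts of strings and numbers serialize cleanly, which is why this
# shape works as a DataTable `data` prop while a raw DataFrame does not.
print(json.dumps(records))
```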

I have a somewhat similar problem.
I have this working for a lot of pandas dataframes, but now I have a dictionary ({‘Data Type’: {‘Id’: dtype(‘int64’), ‘SepalLengthCm’: dtype(‘float64’), ‘SepalWidthCm’: dtype(‘float64’), ‘PetalLengthCm’: dtype(‘float64’), ‘PetalWidthCm’: dtype(‘float64’), ‘Species’: dtype(‘int32’)}, ‘Missing Values%’: { … }}

I have converted this to a pandas dataframe both using pd.DataFrame() and pd.DataFrame.from_dict(), and that part works fine.

However, when I now try to convert this dataframe using .to_dict('records') (dash_table.DataTable(df.to_dict('records'))), I get:

dash.exceptions.InvalidCallbackReturnValue: The callback for `<Output `container-checks-button-pressed.children`>`
                returned a tree with one value having type `DataTable`
                which is not JSON serializable.


The value in question is located at
[0] Div 
[3] DataTable,

                and has string representation
                `DataTable(data=[{'Data Type': dtype('int64'), 'Missing Values%': 0.0, 'Unique Values%': 100, 'Minimum Value': 1.0, 'Maximum Value': 150.0, 'DQ Issue': 'Possible ID colum: drop before modeling process.,     Id has a correlation >= 0.8 with Species. Possible data leakage. Double check this variable.'}, {'Data Type': dtype('float64'), 'Missing Values%': 0.0, 'Unique Values%': 'NA', 'Minimum Value': 4.3, 'Maximum Value': 7.9, 'DQ Issue': 'No issue'}, {'Data Type': dtype('float64'), 'Missing Values%': 0.0, 'Unique Values%': 'NA', 'Minimum Value': 2.0, 'Maximum Value': 4.4, 'DQ Issue': 'has 4 outliers greater than upper bound (4.05) or lower than lower bound(2.05). Cap them or remove them.'}, {'Data Type': dtype('float64'), 'Missing Values%': 0.0, 'Unique Values%': 'NA', 'Minimum Value': 1.0, 'Maximum Value': 6.9, 'DQ Issue': "has a high correlation with ['Id', 'SepalLengthCm']. Consider dropping one of them.,     PetalLengthCm has a correlation >= 0.8 with Species. Possible data leakage. Double check this variable."}, {'Data Type': dtype('float64'), 'Missing Values%': 0.0, 'Unique Values%': 'NA', 'Minimum Value': 0.1, 'Maximum Value': 2.5, 'DQ Issue': "has a high correlation with ['Id', 'SepalLengthCm', 'PetalLengthCm']. Consider dropping one of them.,     PetalWidthCm has a correlation >= 0.8 with Species. Possible data leakage. Double check this variable."}, {'Data Type': dtype('int32'), 'Missing Values%': 0.0, 'Unique Values%': 2, 'Minimum Value': 0.0, 'Maximum Value': 2.0, 'DQ Issue': "has a high correlation with ['Id', 'PetalLengthCm', 'PetalWidthCm']. Consider dropping one of them."}])`

                In general, Dash properties can only be
                dash components, strings, dictionaries, numbers, None,
                or lists of those.

Any idea what's going wrong? Everything works up until the pandas dataframe, and I have converted other dataframes to DataTables with .to_dict('records') before, but somehow for this one it doesn't work.
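A guess based on the string representation in the error above: the cells in the 'Data Type' column hold numpy `dtype(...)` objects rather than strings, and such objects are not JSON serializable even when the surrounding structure is a clean list of dicts. A stdlib-only sketch of the same failure mode, using a hypothetical `Dtype` class as a stand-in for numpy's dtype:

```python
import json

class Dtype:
    """Hypothetical stand-in for a non-primitive cell value like numpy's dtype."""
    def __repr__(self):
        return "dtype('int64')"

records = [{"Data Type": Dtype(), "Missing Values%": 0.0}]

try:
    json.dumps(records)  # fails: Dtype is not a JSON-serializable type
    serializable = True
except TypeError:
    serializable = False

# Casting the offending values to str (e.g. df.astype(str) in pandas)
# leaves only plain strings and numbers, which Dash can serialize.
cleaned = [{k: str(v) if isinstance(v, Dtype) else v for k, v in r.items()}
           for r in records]
print(serializable, json.dumps(cleaned))
```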


That df has multiple indexes. Perhaps you need to flatten them.


Hi there, check your dataframe structure again. I was struggling with this situation and the problem was the headers (maybe 'DQ Issue' in your case). Try making the table less complex in order to debug the situation.


It was indeed solved by resetting the indexes so that only one level of indexing existed! Resetting the index and removing curly brackets often solves the problem, as time has taught me :slight_smile: