
413 Request Too Large (county-choropleth) after upgrading to Dash 1.11, Plotly 4.7

I was excited to upgrade to the latest Dash/Plotly, as we are heavy users of county choropleths and scatter plots. When I tried Dash 1.12.0 and Plotly 4.7.1 (for pattern-matching callbacks and figure creation), everything worked fine in local development on my Mac. But when I deployed to our cloud environment (gunicorn on Ubuntu behind nginx), the same versions of Dash and Plotly had two issues:

  1. my pattern-matched callbacks weren’t being called
  2. I kept getting 413 Request Entity Too Large on the large county-choropleth figure. (This worked fine with Dash 1.9.)

I resolved #1 by switching back to Dash 1.11.0.
#2 is still a problem.

Any reason to expect the figure data to be much larger now? The same data works with Dash 1.9 and Plotly 4.5.

Thanks!
DDR

Could you provide a link to the problematic GeoJSON file?

https://drive.google.com/drive/folders/1TtomqUtufBdKI1j5WKD55EHIYTWKODBK?usp=sharing

Two files in there:

us_states.geojson
us_counties.geojson

While the counties file is twice as large as the states file, the problem manifests before the app even shows county boundaries. The first page load is slow, but eventually the map with state boundaries loads. Trying to add traces to outline even one state, however, returns 413 Request Entity Too Large.

The figure code here is old and builds the figure from raw data and layout dicts. I was planning to migrate it to graph_objects to see if that helps.

Thanks for any help and for responding so quickly!

-DDR

The geojson files are not excessively large, so it should be possible to get this to work. Do you have access to the nginx config? If so, try increasing the allowed request body size, e.g. something like

client_max_body_size 25M;
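For context, here is a minimal sketch of where that directive might live in an nginx config; the server name, port, and upstream address are placeholders, not taken from this thread:

```nginx
server {
    listen 80;
    server_name example.com;  # placeholder

    # Allow request bodies up to 25 MB (the nginx default is 1 MB),
    # which covers large figure payloads posted back by Dash callbacks.
    client_max_body_size 25M;

    location / {
        proxy_pass http://127.0.0.1:8000;  # assumed gunicorn upstream
    }
}
```

After editing, reload nginx (e.g. `nginx -s reload`) for the change to take effect.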

To address the performance issue, an option could be to try Dash Leaflet. I have just written a GeoJSON component, and on my PC your geojson files render relatively fast. I haven't done a 1:1 comparison with the performance of the equivalent Plotly maps yet, so any comments would be appreciated. Here is an MWE:

import json

import dash
import dash_html_components as html
import dash_leaflet as dl

# Load the GeoJSON file and wrap it in a dash-leaflet GeoJSON component.
with open("assets/us_counties.geojson") as f:
    geojson = dl.GeoJSON(data=json.load(f))

# Create the app, centering the map on the continental US.
app = dash.Dash()
app.layout = html.Div(
    [dl.Map(children=[dl.TileLayer(), geojson], center=[39, -98], zoom=4)],
    style={"width": "100%", "height": "50vh", "margin": "auto", "display": "block"},
)

if __name__ == "__main__":
    app.run_server()

If you decide to give it a spin, you would need to install

pip install dash==1.12.0
pip install dash-leaflet==0.0.11

You can see an interactive demo here.

Thanks for the tips. I've reworked my Plotly choropleth code to use graph_objects API calls instead of data/layout dictionaries. This seems to give me better, more consistent performance, and the app now works in my Linux deployment environment (Dash 1.11.0, Plotly 4.7.1). Hopefully I can avoid tinkering with nginx. Thanks for the Leaflet tip! I've been seeing it mentioned around the web; if graph_objects doesn't do the trick I will definitely check it out.

-DDR

Are you inlining the GeoJSON in the figure itself or referring to it by URL? The latter is probably a good idea in a Dash app, to leverage browser caching.

Either way, this seems like a regression if it used to work in previous versions of Dash!

@nicolaskruchten your tip to refer to geojson by URL was a game changer. That was the oldest code in my app and I had always just left it alone as working. But moving the large geojson files into assets/ and passing URLs has made a huge difference!

The Plotly Figure Reference API is a large and mysterious place. I could read it 100x and discover new things every time.

-DDR


You and me both… and it’s almost my full-time job :wink:
