Ok, problem solved! As usual, the problem I encountered was due to my own oversight rather than something going wrong in Dash: I forgot to add a return statement to `lambda_handler`, so AWS never received a response, which resulted in the internal server error. Along the way I learned how compression works in a Dash application, so I will summarize the lessons here for future reference, in case anyone wants to use compression in their Dash apps running on AWS:
- Enable compression in your `Dash` instance: `app = dash.Dash(compress=True, ...)`
- If you are using an API resource for your AWS application, set `"*/*"` under `Properties -> BinaryMediaTypes` in your tool's `template.yaml` to tell your API that everything will be passed as binary data and that it should not block that data.
- Wherever you call `awsgi.response()`, create a list of all `Content-Type`s that will be communicated to and from the server, and pass that list to the `base64_content_types` parameter of `awsgi.response()`.
  a. You have to be explicit in your `Content-Type` list, as `awsgi` does not expand wildcard characters; i.e., you cannot just put `"*/*"` in the list, because `awsgi` does a literal comparison between the `Content-Type` of the communication and the entries in the `base64_content_types` list.
  b. If you see `UnicodeDecodeError`s in your CloudWatch logs coming from `awsgi.__init__`, this most likely indicates that you encountered a `Content-Type` that you haven't added to the list yet. If you can't figure out what the correct type is (the `Content-Type` that `awsgi` sees is not always visible in the incoming `event` data), do the following: build your tool locally, then in the build folder go to `awsgi/__init__.py` and add a print statement in the `use_binary_response()` function that prints the `Content-Type` to the AWS logs. Then deploy this adjusted version and test it out.
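Point (a) can be sanity-checked with a small plain-Python sketch of the literal matching behavior (illustrative only; this mimics the membership test, not `awsgi`'s actual source):

```python
# Hypothetical sketch of awsgi's literal Content-Type matching
# (illustrative only; not awsgi's real implementation).
def needs_base64(content_type, base64_content_types):
    # A literal membership test: no wildcard expansion is performed.
    return content_type in base64_content_types

# An explicit entry matches...
print(needs_base64("image/png", {"image/png", "application/json"}))  # True

# ...but a wildcard entry never matches a concrete Content-Type.
print(needs_base64("image/png", {"*/*"}))  # False
```

This is why every concrete type your app actually serves (HTML, JSON, images, fonts, compressed responses, ...) needs its own entry in the list.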
That’s it, the Dash code will take care of the compression/decompression by itself, just as you would expect.
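For completeness, here is a minimal sketch of the original bug (the missing return statement), with hypothetical handler names and a hard-coded response standing in for the real `awsgi.response(...)` call:

```python
def lambda_handler_broken(event, context):
    # Computes a response but never returns it: Lambda receives None,
    # which the API turns into an internal server error.
    response = {"statusCode": 200, "body": "ok"}

def lambda_handler_fixed(event, context):
    # The one-line fix: actually return the response.
    # (In the real app this would be
    #  `return awsgi.response(app.server, event, context, base64_content_types=...)`.)
    return {"statusCode": 200, "body": "ok"}

print(lambda_handler_broken({}, None))                 # None
print(lambda_handler_fixed({}, None)["statusCode"])    # 200
```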