Piping data from Dash to a function on another machine

I am working on a web server and have set up my Dash dashboard so that a user uploads a file, which is then stored in a hidden div (as described in https://plot.ly/dash/sharing-data-between-callbacks). The user then makes some column selections, which are also stored in a div, and the relevant data is visualised.
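For context, the decode-and-store step of that pattern can be sketched with only the standard library. This is a minimal illustration, assuming the upload is a CSV; `parse_upload` is a hypothetical helper name, and in the real app a Dash callback would receive `contents` from `dcc.Upload` and return the JSON string as the hidden div's `children`:

```python
import base64
import csv
import io
import json


def parse_upload(contents):
    """Decode a dcc.Upload `contents` string into JSON for a hidden div.

    `contents` arrives as "data:<mime>;base64,<payload>"; a callback
    would return the JSON string so later callbacks can read it back.
    """
    _header, payload = contents.split(",", 1)
    decoded = base64.b64decode(payload).decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(decoded)))
    return json.dumps(rows)


# Simulate what the browser sends for a tiny CSV upload:
raw = "a,b\n1,2\n3,4\n"
contents = "data:text/csv;base64," + base64.b64encode(raw.encode()).decode()
stored = parse_upload(contents)
print(json.loads(stored))  # [{'a': '1', 'b': '2'}, {'a': '3', 'b': '4'}]
```

In the real app you would likely read the JSON back into a pandas dataframe inside the downstream callbacks rather than working with raw dicts.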

I would like to have a button that pipes the data to an intensive function on another machine (a mule server) so as not to crash or slow down the web server. Ideally this data would live in a SQL database, but for now I am experimenting, so it is a simple pandas dataframe stored in the browser in a div.

Is there a way to pipe data to an external Python process? Flask, maybe? How might I go about this?

Before you start working on complex multi-machine architectures, I’d try just increasing the number of workers and threads on your app with e.g.

$ gunicorn --workers 6 --threads 2 app:server

There is a longer discussion about this in the Celery integration? thread, including the case of moving the CPU-intensive work to a separate process (Celery), which could also be used to run it on a different machine.
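Celery itself needs a message broker (e.g. Redis or RabbitMQ) between the machines, but the offload pattern it implements can be sketched with the standard library's `ProcessPoolExecutor`, which at least moves the heavy work out of the web server's process on a single machine. `heavy_function` here is a hypothetical stand-in for the intensive computation; with Celery it would instead be decorated as a task and executed by a worker on the mule server:

```python
from concurrent.futures import ProcessPoolExecutor


def heavy_function(records):
    # Hypothetical stand-in for the CPU-intensive work; with Celery this
    # body would be an @app.task and run on a remote worker instead.
    return sum(r["value"] for r in records)


if __name__ == "__main__":
    data = [{"value": i} for i in range(10)]
    with ProcessPoolExecutor(max_workers=2) as executor:
        # submit() returns immediately, so a Dash callback could respond
        # while the work runs in another process.
        future = executor.submit(heavy_function, data)
        print(future.result())  # 45
```

The Celery version has the same shape: you call `heavy_function.delay(data)` instead of `executor.submit(...)`, and the broker carries the arguments and result between machines.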

Thanks, I’ll take a look at the ‘Celery integration’ thread.

I am stuck with a Windows machine so it looks like I will be using Waitress.
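For reference, Waitress serves the underlying Flask instance much like gunicorn does. Assuming the same `app.py` module layout as the gunicorn command above (Dash app named `app`, so the Flask instance is `app.server`), the invocation looks roughly like:

```shell
# Install Waitress and point it at the Flask instance inside the Dash app;
# the module:attribute form mirrors the gunicorn command above.
pip install waitress
waitress-serve --listen=0.0.0.0:8050 app:server
```

Waitress uses threads rather than worker processes, so for truly CPU-bound work the Celery-style offload discussed above still applies.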
