
Keeping DB conn open using dcc.Interval

I’ve got a Dash app in production backed by a Snowflake (cloud-hosted SQL data warehouse) instance. Snowflake enforces a four-hour timeout on its connections, so I need to architect the app so the connection stays open. Let’s assume for the time being that I can’t change the four-hour timeout. My thought is there are two ways around this…

  1. Establish the connection (in app.py) on the client side, so that a new conn is established whenever a user hits the Dash application. The four hours shouldn’t present an issue in this scenario. Am I correct in understanding that establishing the connection object in app.py (and then using that connection object for subsequent queries) happens on the server side in Dash’s current architecture?

  2. I recently set up dcc.Interval() to run a dummy query every hour to keep the connection open, but reading the forums it looks like dcc.Interval() runs on the client side (and I understand why that would be, but it sounds like it doesn’t help me out here).

So given the above, is there a current methodology to keep my connection purely client-side, or to run some queries server-side that keep the connection alive? App layout and pseudo-code below.

repo

  • app.py
  • index.py
  • apps
    • app1.py
    • app2.py

app.py

import dash
import dash_bootstrap_components as dbc

import utils  # module holding the Snowflake connection helper

app = dash.Dash(__name__, external_stylesheets=[dbc.themes.BOOTSTRAP,'.assets/custom.css'])
conn = utils.conn_fcn()
server = app.server
app.config.suppress_callback_exceptions = True

index.py

def layout():
    return html.Div(
        id='first-div',
        children=[
            # RANDOM QUERY TO RUN AND KEEP THE CONN ALIVE
            dcc.Interval(
                id='interval-component',
                interval=60 * 60 * 1000,  # one hour, in milliseconds
                n_intervals=0,
            ),
            html.Div(id='i-dont-care', style={"display": "none"}),
            dcc.Location(id='url', refresh=False),
            html.Div(id='outermost-white-div', children=[row, body]),
        ],
    )

#this runs a dummy query every hour to keep the connection open
@app.callback(Output('i-dont-care', 'children'),
              [Input('interval-component', 'n_intervals')])
def run_dummy(n):
    print("i ran the timing query")
    df = pd.read_sql('''SELECT * FROM MY_TABLE LIMIT 1''',conn)
    return df.to_json()

app1.py

def return_charts():
    df = pd.read_sql('''SELECT * FROM DATA_TABLES''',conn)
    x = []
    for i in df['i']:
        x.append(build_chart(df,i))
    return x

def layout():
    body = html.Div(
            dbc.Row(html.Div(id='charts', children=return_charts()))
                             )
    return body
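One point worth flagging on approach 2: although dcc.Interval’s timer ticks in the browser, the callback it triggers is an ordinary Python function that runs in the server process, so the dummy query does execute server-side — which also means the callback itself can recover from a timed-out connection. A minimal sketch of that pattern, with a stub standing in for the real Snowflake connection (`_StubConn`, `make_conn`, and this `run_dummy` are illustrative, not the app’s actual code):

```python
class _StubConn:
    """Stand-in for a Snowflake connection so the sketch is self-contained."""
    def __init__(self):
        self.closed = False

    def query(self, sql):
        if self.closed:
            raise RuntimeError("connection timed out")
        return [("ok",)]

def make_conn():
    # stand-in for utils.conn_fcn()
    return _StubConn()

conn = make_conn()

def run_dummy(n):
    """Interval-callback body: ping the DB, reopening the conn on failure."""
    global conn
    try:
        return conn.query("SELECT 1")
    except Exception:       # in practice, narrow this to the driver's timeout error
        conn = make_conn()  # reopen and retry once
        return conn.query("SELECT 1")

conn.closed = True          # simulate Snowflake's 4-hour idle timeout
result = run_dummy(1)
```

With this shape the hourly tick both keeps the connection warm and heals it if the timeout has already hit.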

Couldn’t you just create a method or an object that creates a new connection, if the previous one has timed out? Something like,

conn_cache = {}

def get_conn(key):
    try:
        conn = conn_cache[key]
    except Exception:  # limit to the actual exceptions; key missing, timeout, etc.
        conn = utils.conn_fcn()
        conn_cache[key] = conn
    return conn

Thanks @Emil
I’ve implemented a method to this extent, I’ll see how it works with the timeout overnight tonight, and will let you know, appreciate the insight.

Hi Emil,

Thanks again for the help. Here’s how I’ve implemented this on my side; unfortunately, I’m still getting the conn timeout error… this may be more of a Snowflake issue than a Dash/Plotly one. I’m not sure if anyone out there has synced Dash up to this specific DB before, or how others have handled connections.

app.py

app = dash.Dash(__name__, external_stylesheets=[dbc.themes.BOOTSTRAP,'.assets/custom.css'])

conn_cache = {}
conn_cache['name'] = utils.conn_snow()

def get_conn():
    try:
        conn = conn_cache['name']
    except Exception:  # limit to the actual exceptions; key missing, timeout, etc.
        conn = utils.conn_snow()
        conn_cache['name'] = conn
    return conn 
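One likely reason this version still times out: `conn_cache['name']` is always present after startup, so the lookup never raises and the `except` branch never runs — the stale handle only fails later, inside `pd.read_sql`. A hedged sketch of probing the connection on checkout instead (the Snowflake connector does expose an `is_closed()` method; `_FakeConn` below is just a stand-in so the sketch runs on its own):

```python
class _FakeConn:
    """Stand-in for the real Snowflake connection object."""
    def __init__(self):
        self.dead = False

    def is_alive(self):
        # with snowflake-connector this would be `not conn.is_closed()`,
        # or a cheap `SELECT 1` round-trip
        return not self.dead

def conn_snow():
    # stand-in for utils.conn_snow()
    return _FakeConn()

conn_cache = {}

def get_conn():
    conn = conn_cache.get('name')
    if conn is None or not conn.is_alive():  # probe the handle, don't just look it up
        conn = conn_snow()
        conn_cache['name'] = conn
    return conn

first = get_conn()
first.dead = True       # simulate Snowflake closing the idle connection
second = get_conn()     # the probe fails, so a fresh connection is returned
```

The key difference from the try/except version above is that the staleness check happens on every checkout, not only when the dict key is missing.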

chart1.py

def build_chart(df, t):
    # function that builds an HTML table and returns it
    ...

def return_charts():
    df = pd.read_sql('''SELECT * FROM MY_TABLE''', get_conn())
    x = []
    for t in t2:  # t2 defined elsewhere
        x.append(build_chart(df, t))
    return x

def layout():
    body = html.Div(
            dbc.Row(html.Div(id='my-charts',
                             children=return_charts(),
                             style={"margin-left":"auto",
                                    "margin-right":"auto"}
                             )
                     )
                    )
    return body

I’m starting to think the solution will be something more along these lines (or going full Redis)…

Further, I think SQLAlchemy may have a solution to this. I’ll reply here if I think this solves my issue…

https://docs.sqlalchemy.org/en/13/core/pooling.html#pool-disconnects
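For reference, the pooling docs linked above describe two engine options aimed at exactly this problem: `pool_pre_ping` issues a lightweight liveness probe each time a connection is checked out and transparently replaces stale ones, and `pool_recycle` retires connections older than a given age. A sketch (using an in-memory SQLite URL purely so it runs standalone; the real app would use a `snowflake://` URL via the snowflake-sqlalchemy dialect):

```python
from sqlalchemy import create_engine, text

engine = create_engine(
    "sqlite:///:memory:",  # real app: a snowflake:// URL
    pool_pre_ping=True,    # probe each conn on checkout, replace if stale
    pool_recycle=3600,     # retire conns older than 1h, well under the 4h timeout
)

with engine.connect() as c:
    row = c.execute(text("SELECT 1")).fetchone()
```

Since `pd.read_sql` accepts an SQLAlchemy engine, `return_charts()` could pass `engine` in place of the raw connection and get reconnection handling for free.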