How to Dynamically Output to Div like LLM Streaming/Chatbot

I have a local LLM server running (Ollama). I can easily integrate the "non-streaming" option, where the server responds with a single JSON object after it runs the LLM, and I can trivially update the output div in a callback.

The problem is with streaming turned ON, trying to mimic the ChatGPT experience.

I created a call function (see below) that yields the strings (the function essentially returns a generator), but how do I update the output div with the stream of strings?

I've tried a dcc.Interval set to 1s as a trigger to refresh the output div, but 1) the div doesn't update, and 2) it gets stuck in a loop and doesn't stop when the LLM stops.

Thanks in advance for any advice.

**** Relevant Excerpt ****
def general_chain(question, model=defaultModel, botPrompt=botPromptDefault):
    payload = [
        {
            "role": "system",
            "content": f"{botPrompt}"
        },
        {
            "role": "user",
            "content": f"{question}",
        }
    ]
    # Stream the chat response from the Ollama server
    for part in client.chat(model, messages=payload, stream=True):
        # Print to CLI to see the stream
        print(part["message"]["content"], end="", flush=True)

        # Return strings as a generator
        yield part["message"]["content"]
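One pattern that can address both symptoms: consume the generator on a background thread that appends chunks into a shared buffer, and have the dcc.Interval callback poll the buffer and flip the interval's `disabled` prop once the stream finishes. Below is a minimal sketch of just the buffering half, with a stand-in generator (`fake_chain`) replacing `general_chain` so it runs without a live Ollama server; the `StreamBuffer` class and all names here are my own, not Dash or Ollama APIs.

```python
import threading

class StreamBuffer:
    """Accumulates streamed chunks; an Interval callback would poll read()."""
    def __init__(self):
        self._lock = threading.Lock()
        self._text = ""
        self.done = False

    def append(self, chunk):
        with self._lock:
            self._text += chunk

    def read(self):
        with self._lock:
            return self._text

def consume(generator, buffer):
    # Runs on a background thread so the Dash callback is never blocked
    for chunk in generator:
        buffer.append(chunk)
    buffer.done = True  # signals the Interval callback to stop refreshing

# Stand-in for general_chain(question) -- no LLM server needed here
def fake_chain():
    for word in ["Hello", " ", "world"]:
        yield word

buffer = StreamBuffer()
t = threading.Thread(target=consume, args=(fake_chain(), buffer), daemon=True)
t.start()
t.join()  # in the app, this thread keeps running while the Interval polls

print(buffer.read())  # -> Hello world
print(buffer.done)    # -> True
```

In the app, the Interval callback would return `buffer.read()` as the output div's children and `buffer.done` as the interval's `disabled` prop, which stops the refresh loop once the LLM finishes instead of looping forever.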

Hey @kerchan17 welcome to the forums.

You might be interested in this:


This is exactly what I need, I think. Thanks @AIMPED
