Dash Chat Component

:rocket: Exciting News for Dash Developers!

Hey everyone!

I’m thrilled to introduce Dash Chat – a brand-new chat component designed to seamlessly integrate into your Dash applications. Whether you’re building a support tool, a chatbot, or any app that needs a chat box, this is your go-to solution!

Check it out and let me know what you think. Your feedback is invaluable! :speech_balloon::sparkles:


Check GitHub for the source code
Check out the Python package on PyPI

10 Likes

Nice! Good to see a decent implementation for Dash; I’d be interested in adopting it if you could support text streaming! I have a somewhat messy but functional implementation here: GitHub - tieandrews/dashGPT: A high quality chat interface built entirely with Plotly Dash incorporating functionality for RAG, feedback and more.

I haven’t dabbled in setting up custom components, but I’m open to helping refine/test :slight_smile:

Thank you for the feedback! Text streaming will be an exciting feature to implement. One approach could be to have users set up an endpoint route that the component sends requests to, as highlighted in your use case.

Alternatively, I wonder if one could explore using the dcc.Interval component to achieve this without requiring users to create an API route in their Dash backend. This wouldn’t really be streaming, though, since it relies on periodic polling for updates rather than real-time pushes, so it may not be as smooth as a dedicated streaming solution.

Once I have some free dev time, I can take a look at this.

To get streaming working properly, you would need some kind of real-time update mechanism (the Interval component is not suitable), e.g. SSE or WebSockets. I have built a number of chat bots for clients using SSE. The only drawback is that you need to add a FastAPI backend (or similar) to stream the data to the client :slight_smile:


EDIT: Here is a small example

4 Likes

Hi all, I was able to create pretty nice streaming in a relatively simple way, using a background callback manager and diskcache. Easy enough for a data scientist to do.

No need for the interval or SSE or a websocket (which are pretty much out of my league in terms of coding).

That’s definitely the simplest approach. The main drawback is that it won’t update faster than ~1 Hz, and for streaming you typically want it (a lot) faster than that; compared to typical streaming, you’ll notice that the animation seems a bit laggy. But if ~1 Hz is sufficient for your use case, it’s a perfectly valid approach :blush:

2 Likes

@Emil’s WebSocket and SSE components can natively be used with the async Dash patch and Hypercorn as the ASGI server.

Hi @gbolly ! Thanks for the fantastic work. I recently started using Dash Plotly, and this is the only component I’ve found for building a chat application with it. Quick question: how can I display links in the chat (e.g., for sharing references)? Thanks!

@stinoco Are you asking about making links appear as clickable text? Currently, the component doesn’t support this out of the box, but I can open a PR in a couple of hours to add support for it. Should be pretty straightforward.

2 Likes

Yes! That’s exactly what I am looking for. Thanks again for the good work :slight_smile:

Hi, could anyone please help me? I am trying to use the dash-chat component in an app where I am building a dashboard, and the chat needs to sit beside the dashboard. I am using two dbc.Col components (width=8 and width=4), but I am unable to fit the chat inside the column with width=4. I tried using container_style to set the width to 100%, but it doesn’t change, and I get a horizontal scrollbar because the total width exceeds 12. How can I customize the dash-chat component to fit inside the column?

It seems the issue might be related to the dash-chat component itself. The component includes a default container class wrapping the chat box, which could be applying a fixed width.

To fix this, you can try creating a custom stylesheet to override the .container class. For example:

.container {  
  width: your-desired-width !important;  
}  

Keep in mind that this will affect any element in your app using the .container class, so use it cautiously. A more targeted approach would be:

.container:has(.chat-container) {  
  width: your-desired-width;  
}  

This will apply the style specifically to the container wrapping the chat box.

If neither of these solutions works, I can address this in a future update by removing the default container, which will make the chat box easier to style and integrate.
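A related note on where such an override lives: Dash automatically serves any CSS file placed in an assets/ folder next to the app script. A small sketch (the chat_override.css file name is arbitrary; writing it from Python here is only to show the expected layout):

```python
from pathlib import Path

# The override CSS discussed above, scoped to the chat's wrapper.
override_css = """\
.container:has(.chat-container) {
  width: 100%;
}
"""

assets_dir = Path("assets")
assets_dir.mkdir(exist_ok=True)
(assets_dir / "chat_override.css").write_text(override_css)
# dash.Dash(__name__) will pick up assets/chat_override.css on startup.
```

In practice you would just save the file there by hand; no extra stylesheet registration is needed.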

2 Likes

Thank you, @gbolly ! That worked perfectly and is exactly what I was looking for. I had been struggling with the container_style and fill_width parameters for a long time and tried many things, but nothing worked until now.

Hello @gbolly,

Really cool work! I really liked how easy it is to use this package to get a nice, working chat interface with only a few lines of code. Bravo!

After playing around, here is some feedback:

  • It would be nice to make the chat persistent with a persistent parameter like most Dash components.
  • It should throw an exception if the messages parameter is missing or if messages=None. Otherwise, we get JavaScript errors that are not explicit (respectively, “e is undefined” and “e is null”)
  • I think you should not style the width and height of the chat in the package, or maybe find another solution, because it doesn’t integrate well with parent containers. I had the same problem as @PV_Gunavardhan with an overflowing chat container not taking 100% width of my container. The same happens if you add padding outside the component: it doesn’t work anymore with full_height=True because 95vh is 95% of the window, not 95% of the parent container. For beginners, this will generate problems and confusion.
  • It could be nice to inherit the style from Dash Bootstrap or Dash Mantine components if they are used on the page. As a side note, the .container class is already used by Bootstrap (and I guess by other CSS frameworks), so I don’t think using it in this package is a good idea.
  • I think the chat bubble should have a flexible width like modern chat apps. I used this CSS to get a width that matches the text size:
    .chat-messages {
        display: flex;
        flex-direction: column;
    }
  • Just like others have said previously, a simple parameter to enable streaming text would be nice. If it’s too complex for now, maybe adding a fake effect would do the job: Once the LLM response is done, you could make the text appear letter by letter or word by word at some speed, client-side. I bet that 90% of the time, it will look like it’s streamed even if it’s not the case. At the end of the day, it’s all about user perception. :sweat_smile:
  • and the most important: consider adding support for markdown in the chat bubbles! For now, lists, links, and spacing do not display very well.

I created a short tutorial for Dash newcomers using your chat package:

I will share it on social platforms for Python developers so that they know it’s really easy to build this with Dash Plotly. :slight_smile:

Looking forward to seeing how this package improves!

5 Likes

Hi everyone!

I just made a fresh update on PyPI – Dash Chat 0.2.0!

What’s new in v0.2.0?

  • Support for text markdown formatting
  • Custom styling for chat bubbles – have more control over how chat bubbles are displayed.
  • Chat message persistence
    …plus some other cosmetic tweaks.

As usual, I value your feedback, so go check it out and let me know what you think!
cc @spriteware

3 Likes

I tried the new version. It works well, well done Gbolahan. :slight_smile:
There are a lot of things to do with this component.
I am thinking about the possibility to attach a document to a message, or to edit/delete a message. These kinds of functionalities are both useful for regular chat interfaces (human to human) or AI chats.

Hi, fantastic work! One issue I found using v0.2 is that the markdown doesn’t look great when the answer contains code. All lines of code are highlighted in grey, and rows overlap with each other. It would be great if it could work like display(Markdown()) in a Jupyter notebook. I personally use that a lot, and it prints answers beautifully. Thanks!

Below is how I usually print a ChatGPT response in a Jupyter notebook:

from IPython.display import display, Markdown
display(Markdown(message['content']))

Another question: how do I hide the first user prompt?

I usually provide some context data to GPT, and the data comes from a CSV (converted to markdown). If I put the context prompt as a default message, the chatbot will show it. Is there a way to hide it?
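One common pattern for this (a sketch only; the `visible_messages` helper is my own, not part of dash-chat): keep the full history, including the context prompt, for the LLM call, but filter what the chat component displays.

```python
def visible_messages(history):
    """Drop system messages and the first user turn (the hidden context data)."""
    shown, context_skipped = [], False
    for msg in history:
        if msg["role"] == "system":
            continue
        if msg["role"] == "user" and not context_skipped:
            context_skipped = True
            continue
        shown.append(msg)
    return shown

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "<csv converted to markdown>"},  # hidden context
    {"role": "user", "content": "What is the average price?"},
    {"role": "assistant", "content": "The average price is 42."},
]
# Pass `history` to the LLM, but only visible_messages(history) to the chat UI.
```

The LLM still sees the context on every call; the UI simply never renders it.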

Hi @odedloe87! Could you please share the code of your implementation? It looks good!

Sure! Here is a minimal example of how to stream a response.
Try asking it to write a story.

import datetime

import dash
import dash_bootstrap_components as dbc
import dash_mantine_components as dmc
import diskcache
from dash import dcc, html, Input, Output, State, DiskcacheManager
from dash_iconify import DashIconify
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY to be set in the environment

cache = diskcache.Cache("./cache")
background_callback_manager = DiskcacheManager(cache)

sys_prompt = f"You are a chatbot assistant. Today is {datetime.datetime.now().strftime('%Y-%m-%d')}"

app = dash.Dash(
    __name__,
    background_callback_manager=background_callback_manager,
    external_stylesheets=[dbc.themes.BOOTSTRAP],
)
app.title = "Streaming Simple Example App"

app.layout = dmc.Paper([
    html.Div(id='new_response_temp'),
    dmc.Paper(
        dmc.Grid([
            dmc.Col(
                dmc.Textarea(
                    id="text_input", placeholder="Ask me anything...", value='',
                    minRows=1, maxRows=8, autosize=True, radius='md', persistence=True,
                    rightSection=dmc.Center(dmc.ActionIcon(DashIconify(icon="bi:send"), id='submit')),
                    style={
                        "overflow": "hidden",
                        "width": "100%",
                        "boxSizing": "border-box",
                    },
                ),
                style={'width': '100%'}, span=12),
        ]),
        radius='lg',
        withBorder=True,
        p='xs',
        style={
            'backgroundColor': '#f5f7f7',
            "position": "fixed",
            "bottom": "10px",
            "width": "60%",
            "left": "50%",
            'transform': 'translateX(-50%)',
            "boxSizing": "border-box",
            "height": "auto",
        },
    )
])

@app.callback(
    # A callback needs at least one regular Output; clearing the input
    # after submission doubles as one here.
    Output("text_input", "value"),
    Input("submit", "n_clicks"),
    State("text_input", "value"),
    background=True,
    progress=[Output('new_response_temp', 'children')],
    prevent_initial_call=True,
)
def response(set_progress, n_clicks, text_input):
    if not n_clicks:
        return dash.no_update
    # stream=True makes the API return chunks as they are generated
    stream = client.chat.completions.create(
        model='gpt-4o',
        messages=[
            {"role": "system", "content": sys_prompt},
            {"role": "user", "content": text_input},
        ],
        temperature=0,
        stream=True,
    )
    chunks = []
    for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content is not None:
            chunks.append(chunk.choices[0].delta.content)
            # push the partial answer to the page as it arrives
            set_progress(dcc.Markdown(''.join(chunks), style={'fontSize': 15}))
    return ""  # clear the input once the full answer has streamed


if __name__ == "__main__":
    app.run_server(debug=False)
1 Like