I am trying to create a Dash app where a user can select a video or enter a video URL, then click a button to see the output of an OpenCV model run on that video.
I am clueless about how to show the output from OpenCV in my Dash app. From online posts and articles I learned that I need to use WebSockets or a Quart server to live-stream the output frames, but I have no experience working with WebSockets and couldn't figure out how to set them up.
Can anyone guide me on how to proceed from here?
I am sure there is a more elegant solution, but I have used a background thread that processes an uploaded video and stores the latest processed image in a queue. Although my version is heavily modified, I think this was the blog that inspired me.
My use case is processing relatively long videos and tracking objects. I don't need the user to see every frame (I am mostly interested in the data), but I do want the user to be able to monitor whether the tracker is following the right object. Every frame is processed, but I use an interval callback to send a frame back only every 5 s or so. The frame is stored in a one-item Queue.
Hopefully this provides some inspiration at least.
from queue import Queue

import cv2
import numpy as np
import plotly.express as px
# other imports

# dash app setup and layout
q_plot = Queue(maxsize=1)

# Do stuff to get the image, possibly in a separate thread
# (where q_plot is passed in as an argument).
# In the lines below, f_idx is the frame index of the video,
# incremented as it's being processed:
if f_idx % num_frames_to_skip == 0 and q_plot.empty():
    q_plot.put(img)  # img is the latest processed (BGR) frame

# Back in the Dash thread, e.g. in a dcc.Interval callback:
img = q_plot.get_nowait()
img = cv2.cvtColor(img.astype(np.uint8), cv2.COLOR_BGR2RGB)
# Use Plotly Express to create a figure and send it to a dcc.Graph in the callback:
fig = px.imshow(img, binary_string=True)
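To make the hand-off concrete, here is a minimal, self-contained sketch of the one-slot queue pattern using only the standard library. The `process_video` function, `num_frames`, and the integer stand-in for a frame are illustrative placeholders, not the actual Dash/OpenCV code:

```python
import threading
from queue import Queue, Empty, Full

# One-slot queue: holds only the most recent frame offered for display.
q_plot = Queue(maxsize=1)

def process_video(num_frames, num_frames_to_skip, q):
    """Worker thread: process every frame, but offer at most one frame
    for display every num_frames_to_skip frames, and only when the
    display slot is currently free."""
    for f_idx in range(num_frames):
        frame = f_idx  # stand-in for the processed image array
        if f_idx % num_frames_to_skip == 0 and q.empty():
            try:
                q.put_nowait(frame)
            except Full:
                pass  # slot filled between the check and the put; drop it

worker = threading.Thread(target=process_video, args=(100, 10, q_plot), daemon=True)
worker.start()
worker.join()  # in the real app the thread keeps running in the background

# The Dash interval callback would then do the equivalent of:
try:
    latest = q_plot.get_nowait()
except Empty:
    latest = None  # no new frame since the last tick; keep the old figure
```

In the real app the worker runs for the lifetime of the video, and the `get_nowait` call lives inside a `dcc.Interval` callback; when the queue is empty you can keep the previous figure (e.g. by raising `dash.exceptions.PreventUpdate`).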
Thanks @skiefer !
Seems like a nice approach, I’ll try this out at my end.