Video streaming in the Jupyter Notebook

Martin Renou
Towards Data Science
5 min read · Nov 19, 2018


ipywidgets plays an essential role in the Jupyter ecosystem: it brings interactivity between the user and the data.

Widgets are eventful Python objects that often have a visual representation in the Jupyter Notebook or JupyterLab: a button, a slider, a text input, a checkbox…

The user can access and set the value of the slider by writing Python code and, of course, interact with it using the mouse.

The Python object representing the widget is alive on the server side (back-end), inside of the Python kernel (the part responsible for executing the code in the Jupyter Notebook). The Python object contains all of the information about the state of the widget. In the case of the Slider widget, the Python object contains the minimum value, the maximum value and the current value of the slider. This Python object (on the back-end, server side) synchronizes with the JavaScript model of the widget (on the front-end, client side), which contains the same information about the widget. Every time the user displays the widget in the notebook, a new view is created, and that view is kept in sync with the JavaScript model. In the Slider example, you can see that the two views are synchronized.
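As a minimal sketch of this synchronization, here is how the slider state can be read, written, and observed from the kernel side:

```python
import ipywidgets as widgets

# The Python object holds the full state of the widget:
# minimum, maximum and current value.
slider = widgets.IntSlider(min=0, max=10, value=5, description='Slider:')

# Setting the value from Python updates the JavaScript model,
# and therefore every displayed view of the widget.
slider.value = 7

# Conversely, dragging the slider with the mouse updates the Python
# object; we can react to that with an observer callback.
def on_value_change(change):
    print('New value:', change['new'])

slider.observe(on_value_change, names='value')
```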

More than a library of interactive widgets, ipywidgets is a powerful framework upon which it is straightforward to create new custom widgets. Developers can quickly start their widget library with best practices of code structure and packaging using the widget-cookiecutter project.

A lot of different widget libraries have been created. You can try them online using mybinder, without installing anything:

Since ipywidgets 7.4, we have two new widgets: the Audio and Video widgets, which make it easy to do image/audio processing in the Jupyter Notebook and JupyterLab.

Like the Image widget, the new Audio and Video widgets synchronize the binary data between back-end and front-end. You can easily manipulate this data with your favorite library (OpenCV, scikit-image…) and update the widget value dynamically.
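A small sketch of this idea (the byte strings below are placeholders; in practice you would load a real file, e.g. with `widgets.Video.from_file('input.mp4')`, or produce encoded bytes with OpenCV or scikit-image):

```python
import ipywidgets as widgets

# The widget's `value` attribute holds the raw encoded bytes and is
# synchronized between back-end and front-end.
video = widgets.Video(value=b'\x00\x00\x00\x18ftypmp42', format='mp4')

# Assigning new bytes to `value` (e.g. frames re-encoded after
# processing) dynamically updates every displayed view of the widget.
video.value = b'new encoded bytes here'
```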

Edge detection using OpenCV on a Video widget

Those two widgets have been nice building blocks for creating the ipywebrtc library. ipywebrtc was created by Maarten Breddels (Maarten is the author of the awesome libraries vaex and ipyvolume). It uses the power of the WebRTC browser API to allow video streaming inside of the Jupyter Notebook.

The API of ipywebrtc is very simple: first, the user creates what we call a MediaStream widget. A MediaStream widget can be any of the following:

  • A WidgetStream widget, given ANY input widget
  • A VideoStream widget, given a Video widget as input
  • An ImageStream widget, given an Image widget as input
  • An AudioStream widget, given an Audio widget as input
  • A CameraStream widget, which creates a video/audio stream from the user’s webcam

Using the MediaStream widget, you can:

  • Record a movie, using the VideoRecorder widget
  • Take a snapshot, using the ImageRecorder widget
  • Record audio, using the AudioRecorder widget
  • Stream it to peers using the simple chat function

As with other widget libraries, you can try it live right now just by clicking on this link. You’ll be able to try all of those workflows.

Say you want to perform image processing on the fly using a camera linked to your computer, running face recognition, edge detection or any other fancy algorithm. It’s really easy to implement using ipywebrtc. All you need to do is create an instance of a CameraStream widget, create an ImageRecorder given the camera video stream as input, and implement a callback that processes the image (using scikit-image, for example).

Creation of an ImageRecorder taking snapshots of the CameraStream, and process images on the fly using scikit-image

Another nice feature of ipywebrtc is the ability to create a MediaStream widget from ANY widget. This means that you can easily record images and videos from your favorite widget library for 2-D or 3-D data visualization (here ipyvolume).

Create a WidgetStream with an ipyvolume widget as input and record a video using the VideoRecorder

Once you have played with those nice features of the library, you can download the videos/images that you created. Alternatively, you can share them directly using the chat function. This function takes a chat room name and a stream that you want to share (by default a CameraStream) as inputs, and allows you to turn your Jupyter Notebook into a conference room!

Chatroom created live with ipywebrtc during a presentation at PyParis

You can find the examples used to make those images on Github: https://github.com/QuantStack/quantstack-talks/tree/master/2018-11-14-PyParis-widgets/notebooks

About the Author

My name is Martin Renou, I am a Scientific Software Engineer at QuantStack. Before joining QuantStack, I studied at SUPAERO. I also worked at Logilab in Paris and Enthought in Cambridge. As an open source developer at QuantStack, I have worked on a variety of projects, from xsimd and xtensor in C++ to ipyleaflet and ipywebrtc in Python and JavaScript.
