Video streaming in the Jupyter Notebook
ipywidgets plays an essential part in the Jupyter ecosystem; it brings interactivity between user and data.
Widgets are eventful Python objects that often have a visual representation in the Jupyter Notebook or JupyterLab: a button, a slider, a text input, a checkbox…
The Python object representing the widget lives on the server side (back-end), inside the Python kernel (the part responsible for executing the code in the Jupyter Notebook). This Python object contains all of the information about the state of the widget. In the case of the Slider widget, the Python object contains the minimum value, the maximum value and the current value of the slider.

This Python object (on the back-end, server side) synchronizes with the Javascript model of the widget (on the front-end, client side), which contains the same information about the widget. Every time the user displays the widget in the notebook, a new view is created and kept in sync with the Javascript model. In the Slider example, you can see that the two views are synchronized.
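As a minimal sketch (assuming ipywidgets is installed), here is how the back-end state of a slider is created and updated from Python; displaying the same widget twice would create two synchronized views:

```python
import ipywidgets as widgets

# The Python object (back-end) holds the full state of the widget.
slider = widgets.IntSlider(min=0, max=10, value=5, description='Slider')

# Displaying the same object twice creates two views of one Javascript
# model, so dragging either slider moves the other:
# display(slider)
# display(slider)

# Updating the state from Python propagates to every view.
slider.value = 7
```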
More than a library of interactive widgets, ipywidgets is a powerful framework upon which it is straightforward to create new custom widgets. Developers can quickly start their widget library with best practices of code structure and packaging using the widget-cookiecutter project.
A lot of different widget libraries have been created. You can try them online using mybinder, without installing anything:
- bqplot: 2-D interactive data (binder link)
- ipyleaflet: interactive maps (binder link)
- pythreejs: interactive 3-D scenes (binder link)
- ipyvolume: 3-D interactive data visualization and multi-volume rendering (binder link)
- nglview: 3-D interactive molecular visualization (binder link)
- gmaps: data visualization on Google Maps
- itk-jupyter-widgets: interactive 2-D and 3-D data visualization (binder link)
- …
Since ipywidgets 7.4, there are two new widgets: the Audio and Video widgets, which make it easy to do audio and video processing in the Jupyter Notebook and JupyterLab.
Like the Image widget, the new Audio and Video widgets synchronize the binary data between back-end and front-end. You can easily manipulate this data with your favorite library (OpenCV, scikit-image…) and update the widget value dynamically.
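A hedged sketch of that pattern, using NumPy and Pillow (an assumption; any imaging library that produces PNG bytes works) to build and update the value of an Image widget:

```python
import io

import numpy as np
from PIL import Image as PILImage  # assumption: Pillow for PNG encoding
import ipywidgets as widgets

def to_png_bytes(array):
    """Encode a NumPy array as PNG bytes, the format the widget expects."""
    buffer = io.BytesIO()
    PILImage.fromarray(array).save(buffer, format='PNG')
    return buffer.getvalue()

# A synthetic grayscale gradient stands in for real image data.
frame = np.tile(np.arange(256, dtype=np.uint8), (64, 1))
image = widgets.Image(value=to_png_bytes(frame), format='png')

# Process the data (here a simple inversion) and push it back:
# the front-end view updates as soon as the value changes.
image.value = to_png_bytes(255 - frame)
```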
Those two widgets have been nice building blocks for creating the ipywebrtc library. ipywebrtc was created by Maarten Breddels (Maarten is the author of the awesome vaex and ipyvolume libraries). It uses the power of the WebRTC browser API to allow video streaming inside the Jupyter Notebook.
The API of ipywebrtc is very simple: first, you create what we call a MediaStream widget. A MediaStream widget can be:
- A WidgetStream widget, given ANY input widget
- A VideoStream widget, given a Video widget as input
- An ImageStream widget, given an Image widget as input
- An AudioStream widget, given an Audio widget as input
- A CameraStream widget, which creates a video/audio stream from the user’s webcam
Using the MediaStream widget, you can:
- Record a movie, using the VideoRecorder widget
- Take a snapshot, using the ImageRecorder widget
- Record audio, using the AudioRecorder widget
- Stream it to peers using the simple chat function
As with other widget libraries, you can try it live right now by clicking on this link, and experiment with all of those workflows.
Say you want to perform image processing on the fly, using a camera connected to your computer, and run face recognition, edge detection, or any other fancy algorithm. This is easy to implement using ipywebrtc: all you need to do is create an instance of a CameraStream widget, create an ImageRecorder given the camera video stream as input, and implement a callback that processes the image (using scikit-image, for example).
Another nice feature of ipywebrtc is the ability to create a MediaStream widget from ANY widget. This means that you can easily record images and videos from your favorite widget library for 2-D or 3-D data visualization (here ipyvolume).
Once you have played with those nice features of the library, you can download the videos and images that you created. Alternatively, you can share them directly using the chat function. This function takes as inputs a chat-room name and the stream that you want to share (a CameraStream by default), and allows you to turn your Jupyter Notebook into a conference room!
You can find the examples used to make those images on GitHub: https://github.com/QuantStack/quantstack-talks/tree/master/2018-11-14-PyParis-widgets/notebooks
About the Author
My name is Martin Renou, and I am a Scientific Software Engineer at QuantStack. Before joining QuantStack, I studied at SUPAERO, and I also worked at Logilab in Paris and Enthought in Cambridge. As an open source developer at QuantStack, I work on a variety of projects, from xsimd and xtensor in C++ to ipyleaflet and ipywebrtc in Python and Javascript.