Hermes – Wildfire Detection using NVIDIA Jetson and Ryze Tello

An NVIDIA Deepstream 5.0 and Ryze Tello powered Computer Vision Application to detect wildfires using YOLO.

Image by Author | 10 points if you notice the drone on my shoulder!

Wildfires have been ever-increasing, devouring our planet and rendering it worse by the day. In 2019 and 2020 alone, there were enough wildfires to throw Earth’s ecology out of balance. The Amazon wildfires, Californian wildfires, Arctic wildfires, and Australian bushfires are just some of the incidents.

Photo by Arny Mogensen on Unsplash

With early detection and mitigation, it is possible to reduce the damage caused by wildfires. To better enable our front-line workers, here is Hermes, an AI-powered Computer Vision application that helps in early detection of Wildfires using reconnaissance drones.

This is a technical blog, which will help you get started with the NVIDIA Jetson platform for Intelligent Video Analytics.


Introduction

The Hermes application consists of two parts: an Intelligent Video Analytics pipeline powered by Deepstream running on an NVIDIA Jetson Xavier NX, and a reconnaissance drone, for which I have used a Ryze Tello.

Image by Author | Ryze Tello and Jetson Xavier NX

Deepstream SDK by NVIDIA is aimed at helping developers build better Intelligent Video Analytics pipelines. You can read more about it here. The DJI Ryze Tello is a drone for enthusiasts and tinkerers with an open-source SDK.

This project is a proof-of-concept, trying to show that surveillance and mapping of wildfires can be done with a drone and an onboard Jetson platform. Scroll to the end to see the system in action! 😄


Deepstream Setup

This post assumes you have a fully functional Jetson device. If not, you can refer to the documentation here.

1. Install System Dependencies

sudo apt install \
libssl1.0.0 \
libgstreamer1.0-0 \
gstreamer1.0-tools \
gstreamer1.0-plugins-good \
gstreamer1.0-plugins-bad \
gstreamer1.0-plugins-ugly \
gstreamer1.0-libav \
libgstrtspserver-1.0-0 \
libjansson4=2.11-1

2. Install Deepstream

Download the DeepStream 5.0.1 Jetson Debian package deepstream-5.0_5.0.1-1_arm64.deb, to the Jetson device from here. Then enter the command:

sudo apt-get install ./deepstream-5.0_5.0.1-1_arm64.deb

Ryze Tello Setup

1. Installing pip packages

First, we need to install the Python dependencies. Make sure you have a working installation of Python 3.7/3.8.

sudo apt install python3-dev python3-pip

The dependencies needed are the following:

djitellopy==1.5
evdev==1.3.0
imutils==0.5.3
numpy==1.19.4
opencv-python==4.4.0.46
pycairo==1.20.0
pygame==2.0.1
PyGObject==3.38.0
pynput==1.7.2
python-xlib==0.29
redis==3.5.3
six==1.15.0

You can either install them with the pip command or use the requirements.txt file. Whatever floats your boat. 😃

# For individual packages
pip3 install <packagename>
# For requirements.txt
pip3 install -r requirements.txt

2. Redis

Redis is used for its queueing mechanism, which we will use to create an RTSP stream from the Tello camera feed.
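To make the queueing idea concrete, here is a minimal sketch of how a producer could push frames onto a bounded Redis list for the RTSP process to pop. The key name and payload format are my own assumptions for illustration, not necessarily what tello-control.py uses.

```python
import pickle

FRAME_QUEUE = "hermes:frames"  # hypothetical key name

def serialize_frame(frame_bytes, seq):
    # Bundle the raw frame with a sequence number so the consumer can detect drops.
    return pickle.dumps({"seq": seq, "frame": frame_bytes})

def deserialize_frame(payload):
    return pickle.loads(payload)

def push_frame(r, frame_bytes, seq):
    # LPUSH + LTRIM keeps the queue bounded so a slow consumer never exhausts memory.
    r.lpush(FRAME_QUEUE, serialize_frame(frame_bytes, seq))
    r.ltrim(FRAME_QUEUE, 0, 99)

def pop_frame(r, timeout=1):
    # BRPOP blocks until a frame arrives or the timeout expires; it returns
    # a (key, value) tuple, or None on timeout.
    item = r.brpop(FRAME_QUEUE, timeout=timeout)
    return deserialize_frame(item[1]) if item else None
```

The `r` argument is a redis-py client (`redis.Redis()`); bounding the list with LTRIM is a design choice so the stream stays near real time instead of backing up.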

Install the Redis Server:

sudo apt install redis-server

3. Connect Tello

First, connect the Jetson Device to the WiFi network of Tello.

Image by Author

Next, run the following code to verify connectivity

# Importing the Tello Drone Library
from djitellopy import Tello
pkg = Tello()
pkg.connect()

On a successful connection, your output will look something like this

Send command: command
Response: b'ok'

If you get the following output, you may want to check your connection with the drone

Send command: command
Timeout exceed on command command
Command command was unsuccessful. Message: False
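If the connection is flaky, a small retry wrapper can help before giving up. This is a hedged sketch of my own, not part of the Hermes codebase; the retry counts are arbitrary, and the djitellopy usage in the comment assumes the Jetson has already joined the Tello's Wi-Fi.

```python
import time

def connect_with_retry(connect_fn, retries=3, delay=2.0):
    """Call connect_fn until it stops raising, or give up after `retries` tries."""
    for attempt in range(1, retries + 1):
        try:
            connect_fn()
            return True
        except Exception:
            if attempt < retries:
                time.sleep(delay)  # give the Wi-Fi link a moment before retrying
    return False

# Hypothetical usage:
# from djitellopy import Tello
# tello = Tello()
# if connect_with_retry(tello.connect):
#     print("connected")
```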

Running the Application

The application can be run on sample input videos or the drone stream.

1. Clone the repository

This is a straightforward step; however, if you are new to git or git-lfs, I recommend glancing through the steps.

First, install git and git-lfs

sudo apt install git git-lfs

Next, clone the repository

# Using HTTPS
git clone https://github.com/aj-ames/Hermes-Deepstream.git
# Using SSH
git clone git@github.com:aj-ames/Hermes-Deepstream.git

Finally, enable LFS and pull the YOLO weights:

git lfs install
git lfs pull

2. Run with different input sources

The computer vision part of the solution can be run on one or many input sources of multiple types, all powered using NVIDIA Deepstream.

First, build the application by running the following command:

make clean && make -j$(nproc)

This will generate a binary called hermes-app. Building is a one-time step; you need to repeat it only when you make source-code changes.

Next, create a file called inputsources.txt and paste the path of videos or RTSP URL.

file:///home/astr1x/Videos/Wildfire1.mp4
rtsp://admin:admin%[email protected]:554/stream
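For reference, such a sources file is easy to parse: one URI per line, with blank lines skipped. The helper below is a hypothetical illustration of that format (the actual hermes-app parser is part of the C sources), including optional #-comments as an assumption of mine.

```python
def read_input_sources(path="inputsources.txt"):
    # Return the list of source URIs, one per non-empty, non-comment line.
    sources = []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#"):
                sources.append(line)
    return sources
```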

Now, run the application by running the following command:

./hermes-app

3. Run with the drone

We utilize the live stream of the Tello’s camera for real-time detection of wildfires.

Since the Tello streams over UDP and the Deepstream Hermes app accepts RTSP as input, we need an intermediate UDP->RTSP converter. Also, we need to control Tello’s movement.
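One common way to build such a converter is a GStreamer RTSP server fed by an appsrc element. The launch string below is only an illustrative sketch of that approach; the caps (the Tello's default 960x720 stream) and the encoder settings are my assumptions, not necessarily what tello-control.py uses.

```python
def rtsp_mount_pipeline(width=960, height=720, fps=30):
    # gst-launch-style description for a GstRtspServer media factory:
    # raw frames pushed into `appsrc` are H.264-encoded and RTP-payloaded.
    # GstRtspServer requires the payloader to be named pay0.
    return (
        "( appsrc name=src is-live=true block=true format=time "
        f"caps=video/x-raw,format=BGR,width={width},height={height},framerate={fps}/1 "
        "! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast "
        "! rtph264pay name=pay0 pt=96 )"
    )
```

A script would hand this string to a `GstRtspServer.RTSPMediaFactory` and push decoded UDP frames into the `appsrc`; `tune=zerolatency` keeps the end-to-end delay low for live control.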

Run the following command to start the Tello control script:

python3 tello-control.py

This script will start the Tello stream on the following URL:

rtsp://127.0.0.1:6969/hermes

To control the drone with your keyboard, first press the Left Shift key. The following is a list of keys and what they do –

  • Left Shift -> Toggle Keyboard controls
  • Right Shift -> Take off the drone
  • Space -> Land drone
  • Up arrow -> Increase Altitude
  • Down arrow -> Decrease Altitude
  • Left arrow -> Pan left
  • Right arrow -> Pan right
  • w -> Move forward
  • a -> Move left
  • s -> Move backward
  • d -> Move right
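The bindings above can be modelled as a simple lookup plus a toggle. This sketch is my own; the real script uses pynput listeners, and the key names and action labels here are illustrative only.

```python
# Maps key names to drone actions, mirroring the table above.
KEY_ACTIONS = {
    "shift_r": "takeoff",
    "space": "land",
    "up": "altitude_up",
    "down": "altitude_down",
    "left": "pan_left",
    "right": "pan_right",
    "w": "forward",
    "a": "left",
    "s": "backward",
    "d": "right",
}

def action_for(key, controls_enabled):
    # Left Shift always toggles; other keys only act while controls are enabled,
    # so a stray keypress cannot move the drone.
    if key == "shift_l":
        return "toggle"
    return KEY_ACTIONS.get(key) if controls_enabled else None
```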

Finally, add the URL in inputsources.txt and start ./hermes-app.


The final takeaway

My effort and ideas have been directed towards leveraging the latest technology to better our environment. With the advancement in Computer Vision, a new generation of AI-enabled devices and robots will help make our planet a better place, if we choose to! 🙂

You can find the entire source on my GitHub here.

Here’s a video demonstration –

