Low Latency C&C and Video Streaming with the Nvidia Jetson Nano: Video Streaming

Video Streaming

Before starting a video stream, first get the information on your video camera’s capabilities. You can list the cameras attached to the Jetson Nano and check their capabilities using v4l2-ctl.
$ v4l2-ctl --list-devices
vi-output, imx219 6-0010 (platform:54080000.vi:0):
        /dev/video0
$ v4l2-ctl -d /dev/video0 --list-formats-ext
        Index       : 0
        Type        : Video Capture
        Pixel Format: 'RG10'
        Name        : 10-bit Bayer RGRG/GBGB
                Size: Discrete 3264x2464
                        Interval: Discrete 0.048s (21.000 fps)
                Size: Discrete 3264x1848
                        Interval: Discrete 0.036s (28.000 fps)
                Size: Discrete 1920x1080
                        Interval: Discrete 0.033s (30.000 fps)
                Size: Discrete 1280x720
                        Interval: Discrete 0.017s (60.000 fps)

In our case, one camera is attached, and it is exposed to the user as /dev/video0. Our tests will be conducted at 1920x1080 and 30 fps. Fast H.264 encoding can be accomplished using the hardware-accelerated omxh264enc plugin. You can see the details and options of the omxh264enc plugin using

$ gst-inspect-1.0 omxh264enc

The output is very long and is not shown here. The equivalent H.265 encoder plugin is omxh265enc. Nvidia's Accelerated GStreamer User Guide, available online, details some of the capabilities of the Jetson Nano when used with GStreamer.
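As a quick sanity check before setting up a network stream, the encoder can be exercised in a stand-alone pipeline. The sketch below records about ten seconds of camera output to a file; the bitrate and iframeinterval property names come from NVIDIA's gst-omx build of omxh264enc, so confirm them on your system with gst-inspect-1.0 before relying on them.

```shell
# Sketch: record 300 frames (~10 s at 30 fps) of 1080p camera output to an
# H.264 MP4 file. Property names are from NVIDIA's gst-omx build of
# omxh264enc; verify them with gst-inspect-1.0 omxh264enc.
# The -e flag sends EOS on Ctrl-C so the MP4 container is finalized cleanly.
gst-launch-1.0 -e nvarguscamerasrc num-buffers=300 ! \
    'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1, format=NV12' ! \
    omxh264enc bitrate=8000000 iframeinterval=15 ! \
    h264parse ! qtmux ! filesink location=test.mp4
```

Playing test.mp4 back on the desktop confirms that the camera and hardware encoder are working before any network elements are involved.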


RTSP Streaming

RTSP streaming can be started using

$ ./gst-rtsp-server/examples/test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1, format=NV12 ! omxh264enc iframeinterval=15 ! h264parse ! rtph264pay name=pay0 pt=96"

Note that the command points to the gst-rtsp-server directory which was cloned earlier. To encode with H.265, change omxh264enc to omxh265enc, h264parse to h265parse, and rtph264pay to rtph265pay.
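Putting those substitutions together, the H.265 variant of the server command looks like this:

```shell
# Same RTSP server pipeline as above, with the three H.265 substitutions
# (omxh265enc, h265parse, rtph265pay) applied
./gst-rtsp-server/examples/test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1, format=NV12 ! omxh265enc iframeinterval=15 ! h265parse ! rtph265pay name=pay0 pt=96"
```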

The stream can be picked up on the receiving PC using

$ gst-launch-1.0 -v rtspsrc location=rtspt://<IP Address>:8554/test ! application/x-rtp, payload=96 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false

where <IP Address> is the IP address of the Jetson Nano. Again, to decode H.265, change rtph264depay to rtph265depay and avdec_h264 to avdec_h265. The command above uses TCP as the transport protocol (the rtspt:// scheme forces TCP). To use UDP, change "location=rtspt://" to "location=rtsp://".
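When minimizing latency matters more than jitter tolerance, note that rtspsrc inserts a receive-side jitter buffer of 2000 ms by default, controlled by its latency property. As a sketch, the same receiver pipeline can set this property to zero, trading jitter protection for lower delay:

```shell
# Sketch: receiver pipeline with rtspsrc's jitter buffer reduced to 0 ms
# (default is 2000 ms). <IP Address> is the Jetson Nano's address, as above.
gst-launch-1.0 -v rtspsrc location=rtspt://<IP Address>:8554/test latency=0 ! \
    application/x-rtp, payload=96 ! rtph264depay ! avdec_h264 ! \
    videoconvert ! autovideosink sync=false
```

On a lossy link an intermediate value (for example a few hundred milliseconds) may be a better compromise than zero.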


With the above settings, the glass-to-glass latency was typically 110 ms, with only about 10 ms of that attributable to transport through the Smart Radios.

The difference between TCP and UDP was around 3 ms.
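As an aside, one common way to measure glass-to-glass latency (our suggestion here, not part of the setup above) is to point the camera at a display showing a fast-running on-screen timer, then photograph the source display and the receiver's output side by side; the difference between the two visible timestamps is the glass-to-glass latency. A suitable timer display can be generated with GStreamer's timeoverlay element:

```shell
# Sketch: show a running timer (with sub-second resolution) on the
# sender-side monitor. Point the camera at this window, photograph it next
# to the receiver's output, and compare the two visible timestamps.
gst-launch-1.0 videotestsrc pattern=black ! timeoverlay font-desc="Sans 48" ! autovideosink
```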

Note that Doodle Labs Mesh Rider uses specialized radio parameters to optimize video transmission over the wireless medium in high-interference areas. For video transmission within the Smart Radio's private network, we recommend using TCP together with this video queue.

A screenshot of a typical output is shown below. We can see that the latency added by the Smart Radio network amounted to around 10 ms.


Figure 5 – Glass to glass latency