Stream RGB Data Over Wi-Fi* Using Intel® Aero Compute Board and Intel® RealSense™ Technology

Download Source Code 



This article shows you how to send a video stream from the Intel® Aero Compute Board that has an Intel® RealSense™ camera (R200) attached to it. This video stream will be broadcast over the compute board’s Wi-Fi* network to a machine that is connected to the Wi-Fi network. The video stream will be displayed in the QGroundControl* internal video window.


Target Audience

This article and the code sample within it are geared toward software engineers and drone developers who want to start learning about the Intel Aero Compute Board. Some information taken from other documents is also included.


General Information

The example given in this article assumes that you are working on an Ubuntu* 16.04 machine. Though you can work with GStreamer and LibRealSense on a Windows* platform, this article’s source code was written on top of Ubuntu 16.04; details specific to Windows are therefore out of the scope of this document.

Although I will be referencing the Intel RealSense camera (R200) in this article, this example does NOT use the LibRealSense library or take advantage of the camera’s depth capabilities. Future articles will address that type of functionality. This is a simple app to get someone up and running with streaming to QGroundControl.

Note that the source code is running on the Intel Aero Compute Board, not a client computer. It sends a video stream out to a specified IP address. The client computer must be attached to the Intel Aero Compute Board network.


What’s Needed for the Sample Application

We assume that you do not have an Intel® Aero Ready to Fly Drone and will be working with the board itself.


What Is the Intel® Aero Platform for UAVs?

The Intel® Aero Platform for UAVs is a set of Intel® technologies that allow you to create applications that enable various drone functionalities. At its core are the Intel Aero Compute Board and the Intel® Aero Flight Controller. The combination of these two hardware devices allows for powerful drone applications. The flight controller handles all aspects of drone flight, while the Intel Aero Compute Board handles real-time computation. The two can work in isolation from one another or communicate via the MAVLink* protocol.


Video streaming: When connected to a camera, the Intel Aero Compute Board can handle all the computation of connecting to the camera, pulling the stream of data, and doing something with it, such as streaming that data back to a ground control station via the built-in Wi-Fi capabilities. All this computation is handled independently of the Intel Aero Flight Controller.

Collision avoidance: The Intel Aero Compute Board is connected to a camera, this time the Intel RealSense camera (R200). The application can pull depth data from the camera, crunch that data, and make tactical maneuvers based on the environment around the drone. These maneuvers can be calculated on the compute board, and then, using MAVLink, an altered course can be sent to the flight controller.

This article discusses video streaming; collision avoidance is out of the scope of this article.


The Intel® Aero Compute Board

Operating System

The Intel Aero Compute Board uses a customized version of Yocto* Linux*. Plans are being considered to provide Ubuntu in the future. Keeping the Intel Aero Compute Board up to date with the latest image of Yocto is out of the scope of this document. For more information on this process, please see the Intel-Aero / meta-intel-aero wiki.

Connector information

  1. Power and console UART
  2. USB 3.0 OTG
  3. Interface for Intel RealSense camera (R200)
  4. 4-lane MIPI* interface for high-resolution camera
  5. 1-lane MIPI interface for VGA camera
  6. 80-pin flexible I/O supporting third-party flight controllers and accessories (I2C, UART, GPIOs)
  7. MicroSD memory card slot
  8. Intel® Dual Band Wireless-AC
  9. M.2 interface for PCIe* solid state drive
  10. Micro HDMI* port
  R. RESERVED for future use



Intel® Aero Vision Accessory Kit

The Intel® Aero Vision Accessory Kit contains three cameras: an Intel RealSense camera (R200), an 8-megapixel (MP) RGB camera, and a VGA camera that uses global shutter technology. With the Intel RealSense camera (R200) you can do depth detection for use cases such as collision avoidance and creating point cloud data. With the 8-MP camera, you can collect and stream much higher-quality RGB data than what the Intel RealSense camera (R200) is capable of streaming. With the VGA camera and its global shutter, one use case a developer could implement is optical flow.

More detailed information about each camera:

Intel® RealSense camera (R200) - Product Details | Datasheet

8-MP RGB camera - Product Details

VGA camera - Product Details


Intel® RealSense™ Technology

With Intel RealSense technology using the Intel RealSense camera (R200), a user can stream depth, RGB, and IR data. The Intel Aero Platform for UAVs uses the open source library LibRealSense. This open source library is, in effect, a driver for the Intel RealSense camera (R200), allowing you to easily get streaming data from the camera. The library comes with several easy-to-understand tutorials for getting streaming up and running. For more information on using LibRealSense, visit the LibRealSense GitHub* site.



GStreamer

In order to develop against GStreamer on your Ubuntu computer, you must install the proper libraries. An in-depth look into the workings of GStreamer is beyond the scope of this article. For more information, see the GStreamer documentation. We recommend starting with the “Application Development Manual." To get all the necessary GStreamer libraries, install the following on your Ubuntu machine.

  • sudo apt-get update
  • sudo apt-get install ubuntu-restricted-extras
  • sudo apt-get install gstreamer1.0-libav
  • sudo apt-get install libgstreamer-plugins-base1.0-dev

As a bit of tribal knowledge, I have two different machines I’ve been developing on, and these two Ubuntu instances have installed GStreamer in different locations: on one machine, GStreamer headers and libraries are installed in /usr/include and /usr/lib, and on the other, they are installed in /usr/lib/x86_64-linux-gnu. You will see evidence of this in how I have included libraries and header files in my Eclipse project, which will appear as having duplicates. In hindsight, I could have just transferred the source code between two different project solutions.


Setting up Eclipse* Neon

As mentioned, you can use whatever IDE you like. I gravitated toward the C++ version of Eclipse Neon.

I assume that you know how to create an Eclipse C++ application and will just show you how I set up my include files and what libraries I chose.


Header Files


At this point, you should be ready to compile the following source code.

The Source Code

The following listing is the complete sample application.

// AeroStreamRGBSimple
// Demonstrates how to capture RGB data from the RealSense camera and send
// it through a GStreamer pipeline. The end of the pipeline uses a UDP
// element to stream to wifi
// Built on Ubuntu 16.04 and Eclipse Neon.
//     * GStreamer
// Example
//   ./AeroStream <client IP address>

#include <gst/gst.h>
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

int main( int argc, char *argv[ ] )
{
    // App requires a valid IP address to where QGroundControl is running.
    if( argc < 2 )
    {
        printf( "Inform address as first parameter.\n" );
        exit( EXIT_FAILURE );
    }

    char        str_pipeline[ 250 ];  // Holds the pipeline description; must be large enough for the full string
    GstElement *pipeline = NULL;      // The pipe for all the elements
    GError     *error    = NULL;      // Holds error message if generated
    GMainLoop  *loop     = NULL;      // Main loop keeps the app running

    // Init GStreamer
    gst_init( &argc, &argv );

    // Construct the pipeline description string. Note that gst_parse_launch()
    // takes only the description itself, without the gst-launch-1.0 prefix.
    snprintf( str_pipeline, sizeof( str_pipeline ),
              "v4l2src device=/dev/video13 do-timestamp=true ! "
              "video/x-raw, format=YUY2, width=640, height=480, framerate=30/1 ! "
              "autovideoconvert ! vaapih264enc ! rtph264pay ! udpsink host=%s port=5600",
              argv[ 1 ] );

    // Parses the string to dynamically create the necessary elements behind the scenes
    pipeline = gst_parse_launch( str_pipeline, &error );
    if( !pipeline )
    {
        g_print( "Parse error: %s\n", error->message );
        g_clear_error( &error );
        return 1;
    }

    // Set the GStreamer pipeline's state to playing
    gst_element_set_state( pipeline, GST_STATE_PLAYING );

    // Create the app loop thread. This prevents the app from falling through to the end and exiting
    loop = g_main_loop_new( NULL, FALSE );
    g_main_loop_run( loop );

    // Clean up once the app ends executing
    gst_element_set_state( pipeline, GST_STATE_NULL );
    gst_object_unref( pipeline );
    g_main_loop_unref( loop );

    return 0;
}

My Workflow

A little about the workflow I used: I developed on my Ubuntu machine using Eclipse Neon. I then compiled the application to ensure there were no compilation errors. Next I transferred the files over to the Intel Aero Compute Board using shell scripts. Finally, I compiled the application on the Intel Aero Compute Board and ran it for testing.


Initial Thoughts

Again I want to mention that the information in this article does not teach GStreamer; rather, it highlights a real working sample application. This article only scratches the surface of how you can construct streams in GStreamer.

We start off ensuring that an IP address has been supplied as an input parameter. In a real-world sample it may be desirable to parse the input string to ensure it’s in the form of a real IP address. For the sake of simplicity, here we used the IP address of a client computer running QGroundControl. This client computer MUST be attached to the Intel Aero Compute Board Wi-Fi network for this to work.

Next we declare some variables. The pipeline will be populated by all the GstElements needed to run our sample code. The GMainLoop is not a GStreamer construct; rather, it’s part of GLib, from the GNOME project. It runs in its own thread and is used to keep the application alive and from falling through to the end of the code.

The gst_parse_launch command will parse out the GStreamer string. Behind the scenes it analyzes all the elements in the string and constructs the GstElements along with other aspects of GStreamer. After checking to see that there are no errors, we set the pipeline’s state to playing.

Remember: This code runs on the Intel Aero Compute Board. You are sending this data FROM the Intel Aero Compute Board to a client machine somewhere on the same network.


Now I want to point out a couple of critical aspects of the GStreamer string.

  • v4l2src device=/dev/video13
    This tells the GStreamer pipeline which device to connect to. On the Intel Aero Compute Board, the RGB stream of the Intel RealSense camera (R200) is exposed as /dev/video13.
  • udpsink host=%s port=5600
    This tells GStreamer to use UDP and send the video stream via Wi-Fi to a particular IP address and port. Remember, the IP address is coming in via a command line parameter. You can include the port number on the command line as well if you want.

We create a new GMainLoop and get the loop running. While this loop is running, the application stays alive and GStreamer does its work: pulling data from the camera, processing it, and sending it out over Wi-Fi.

At the end, we do some simple cleanup.

NOTE: While I work on my Ubuntu machine, I must still compile on the Intel Aero Compute Board.


Intel Aero Compute Board Setup

At this point, you have a project set up in your IDE. You’ve compiled the source code. The next step is to get the board connected.

The following images show you how to connect the board.

Now you can power up the board. Once it’s fully powered up, it will automatically start a Wi-Fi access point. The next step walks you through setting up connectivity on Ubuntu.

Connecting Wirelessly to the Intel Aero Compute Board

Once you have powered up the Intel Aero Compute Board, you can connect to it via Wi-Fi. In Ubuntu 16.04, you will see a network named CR_AP-xxxxx (the exact suffix varies per board). This is the network connection you will be connecting to.

The default password is 1234567890



If you do not see this network connection, and provided you have hooked up a keyboard and monitor to your Intel Aero Compute Board, run the following command on the board:

sh-4.3# lspci

This shows you a list of PCI devices. Check for the following device:

         01:00.0 Network controller: Intel Corporation Wireless 8260 (rev 3a)

If you do not see this connection, do a “warm” boot.

sh-4.3# reboot

Wait for the Intel Aero Compute Board to reboot. You should now see the network controller if you run lspci a second time. Attempt once again to connect via the wireless network settings in Ubuntu.

At times, I have seen an error message in Ubuntu saying:

         Out of range

If you get this error, try the following:

  • Make sure there are no other active network connections; if there are, disconnect from them.
  • Reboot Ubuntu.

More on the Intel Aero Compute Board Wi-Fi can be found at the Intel Aero Meta Wiki.


Useful Bash Shell Scripts

Now that you’ve got the code compiled on Ubuntu, it’s time to move it over to the Intel Aero Compute Board. Remember that even though you might compile on your Ubuntu machine, you will still need to compile on the Intel Aero Compute Board as well. What I found was that if I skip this step, Yocto gives me an error saying that AeroStream is not a program.

To help expedite productivity, I’ve created a couple of small shell scripts. They aren’t necessary or required; I just got tired of typing the same things over and over.

migrateAero: First, it should be obvious that you must have a Wi-Fi connection to the Intel Aero Compute Board for this script to run.

This script runs from your Ubuntu machine. I keep it at the root of my Eclipse working folder. After I’ve made changes to the AeroStream project, I run this to migrate files over to the Intel Aero Compute Board. Technically, I don’t need to push the ‘makeAero’ script every time. But because I never know when I might change it, I always copy it over.

# Clean up these files or they won't get compiled on the Aero board. At least this is what I've found to be the case
rm AeroStream/Debug/src/AeroStream*
rm AeroStream/Debug/AeroStream

# Now push the entire AeroStream Eclipse Neon project to the Aero board. This will create the folder /home/AeroStream on the Aero board. 
scp -r AeroStream root@

# makeAero script essentially runs a make and executes AeroStream
scp makeAero root@

makeAero: Runs on the Intel Aero Compute Board itself. It gets migrated with the project and ends up at the root of /home. All it’s doing is navigating into the debug directory and running the make file, and then launching AeroStream.

#Created a shortcut script because I'm tired of typing this in every time I need to migrate

cd /home/AeroStream/Debug
make
./AeroStream "$1"   # assumes the client IP address is passed as the script's first argument

Instead of pushing the entire project over, you might create your own make file(s) and push just the source code; however, this approach worked for me.

Also, you don’t even need to create a project on Ubuntu using Eclipse. If you feel confident enough, you can develop right on the board itself.


How to Configure QGroundControl

There is one last step to complete: configuring QGroundControl. Downloading and installing QGroundControl is out of the scope of this document. However, I need to show you how to set up QGroundControl to receive the GStreamer stream from the Intel Aero Compute Board Wi-Fi.

Note that QGroundControl also uses GStreamer for its video streaming capabilities, which is how the connection is actually made: GStreamer can send a stream over Wi-Fi from one location while another GStreamer instance listens for it at the receiving end. This is exactly what QGroundControl does.

NOTE: Make sure you are using the SAME port that you have configured in your GStreamer pipeline.


Step 1

When you launch QGroundControl, it opens into flight path mode. You need to click the QGroundControl icon to get to the configuration area.


Step 2

Click the Comm Links button. This displays the Comm Links configuration page.

Click Add.

Step 3

This displays the Create New Link Configuration page.

  1. Give the configuration a name. Any name is OK.
  2. For Type, select UDP.
  3. Set the listening port number. This port number must match the port used in the GStreamer pipeline.
  4. Click OK.

Step 4

You will now see the new comm link in QGroundControl.

Launch the Application

NOTE: QGroundControl MUST be running first. It has to be put into listening mode. If you launch your streaming server application first, the connection will not be made. This is just an artifact of GStreamer.

  1. Launch QGroundControl.
  2. Launch AeroStream from the Intel Aero Compute Board. If everything has gone according to plan, you will see your video stream show up in QGroundControl.

Intel Aero Compute Board and GitHub*

Visit the Intel Aero Compute Board GitHub for various software code bases to keep your Intel Aero Compute Board up to date.


Other Resources



This article helped get you up and running with streaming capabilities with the Intel Aero Compute Board. I gave you an overview of the board itself and showed you how to connect to it. I also showed you which libraries are needed, how I set up Eclipse for my own project, and how to get Wi-Fi up, transfer files, and set up QGroundControl. At this point you are ready to explore other capabilities of the board and streaming.

