
Intel® Aero Compute Board and Intel® RealSense™ Technology for Wi-Fi* Streaming of RGB and Depth Data


Download Source Code 

Contents

Introduction
Target Audience
General Information
What’s Needed for this Sample Application?
What is the Intel® Aero Platform?
    Two examples:
The Intel® Aero Compute Board
    Operating system
    Connector information
Intel® Aero Vision Accessory Kit
Intel® RealSense™ Technology
GStreamer
Setting Up Eclipse Neon*
    Header files
    Libraries
The Source Code
    My Workflow
    Some Initial Thoughts
    gint main(gint argc, gchar *argv[])
    static void print_realsense_error( rs_error *err )
    static gboolean init_realsense( rs_device **camera )
    static gboolean init_streams( rs_device *camera )
    static void cb_need_data ( GstAppSrc *appsrc, guint unused_size, gpointer user_data )
Intel® Aero Compute Board Setup
Connecting Wirelessly to the Intel Aero Compute Board
    Troubleshooting
Useful Bash Shell Scripts
    migrateAero
    makeAero
How to Configure QGroundControl
    Step 1
    Step 2
    Step 3
    Step 4
Launch the App
Intel® Aero Compute Board and GitHub*
Other Resources
Summary

Introduction

This article shows you how to send a video stream containing RGB and depth data from an Intel® Aero Compute Board with an Intel® RealSense™ R200 camera attached. The stream is broadcast over the compute board’s Wi-Fi* network to a machine connected to that network, where it is displayed in the QGroundControl* internal video window.

Target Audience

This article and the code sample within it are geared toward software engineers and drone developers who want to start learning about the Intel Aero Compute Board. Some information taken from other documents is also included.

General Information

The example given in this article assumes that you are working on an Ubuntu* 16.04 machine. Though you can work with GStreamer and LibRealSense on a Windows* platform, this article’s source code was written on top of Ubuntu 16.04; details for Windows are therefore out of the scope of this document.

Although I will be referencing the Intel RealSense R200 camera in this article, this example uses the LibRealSense library only to pull RGB and depth frames; it does not take full advantage of the camera’s depth capabilities, such as point clouds or collision avoidance. Future articles will address that type of functionality. This is a simple app to get someone up and running with streaming to QGroundControl.

Note that the source code is running on the Intel Aero Compute Board, not a client computer. It sends a video stream out to a specified IP address. The client computer must be attached to the Intel Aero Compute Board network.

What’s Needed for this Sample Application?

Assuming you do not have an Intel® Ready-to-Fly Drone and will be working with the board itself, the following items will be needed:

What is the Intel® Aero Platform?

The Intel® Aero Platform for UAVs is a set of Intel® technologies that allow you to create applications that enable various drone functionalities. At its core is the Intel Aero Compute Board and the Intel® Aero Flight Controller. The combination of these two hardware devices allows for powerful drone applications. The flight controller handles all aspects of drone flight, while the Intel Aero Compute Board handles real-time computation. The two can work in isolation from one another or communicate via the MAVLink* protocol.

Two examples:

Video streaming: When connected to a camera, the Intel Aero Compute Board can handle all the computations of connecting to the camera and then pulling that stream of data and doing something with it. Perhaps it’s streaming that data back to a ground control station via the built-in Wi-Fi capabilities. All this computation is handled independently of the Aero flight controller.

Collision avoidance: The Intel Aero Compute Board is connected to a camera, this time the Intel RealSense camera (R200). The application can pull depth data from the camera, crunch that data, and make tactical maneuvers based on the environment around the drone. These maneuvers can be calculated on the compute board, and then, using MAVLink, an altered course can be sent to the flight controller.

This article discusses video streaming; collision avoidance is out of the scope of this article.

The Intel® Aero Compute Board

Operating system

The Intel Aero Compute Board uses a customized version of Yocto* Linux*. Plans are being considered to provide Ubuntu in the future. Keeping the Intel Aero Compute Board up to date with the latest image of Yocto is out of the scope of this document. For more information on this process, please see the Intel-Aero / meta-intel-aero wiki.

Connector information

1. Power and console UART
2. USB 3.0 OTG
3. Interface for Intel® RealSense™ camera (R200)
4. 4-lane MIPI interface for high-resolution camera
5. 1-lane MIPI interface for VGA camera
6. 80-pin flexible I/O supporting third-party flight controllers and accessories (I2C, UART, GPIOs)
7. MicroSD memory card slot
8. Intel® Dual Band Wireless-AC
9. M.2 interface for PCIe SSD
10. Micro HDMI port
R. RESERVED for future use

Intel® Aero Vision Accessory Kit

The Intel® Aero Vision Accessory Kit contains three cameras: an Intel RealSense camera (R200), an 8-megapixel (MP) RGB camera, and a VGA camera that uses global shutter technology. With the Intel RealSense camera (R200) you can do depth detection for use cases such as collision avoidance and creating point cloud data. With the 8-MP camera, you can collect and stream much higher-quality RGB data than the Intel RealSense camera (R200) is capable of streaming. With the VGA camera and its global shutter, a developer could implement use cases such as optical flow.

More detailed information about each camera can be found here:

Intel RealSense camera (R200)

8-MP RGB camera

VGA camera

Intel® RealSense™ Technology

With Intel RealSense technology and the Intel RealSense camera (R200), a user can stream depth, RGB, and IR data. The Intel Aero Platform for UAVs uses the open source library LibRealSense. This library acts essentially as a driver for the Intel RealSense camera (R200), allowing you to easily get streaming data from the camera. The library comes with several easy-to-understand tutorials for getting streaming up and running. For more information on using LibRealSense, visit the LibRealSense GitHub* site.
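
To give a feel for the API before diving into the full sample, here is a minimal sketch of my own (not one of the LibRealSense tutorials) that creates a context and reports how many cameras it can see. It assumes the LibRealSense 1.x C API used throughout this article:

#include <librealsense/rs.h>
#include <stdio.h>

int main( void )
{
    rs_error *e = NULL;

    // The context is the entry point into LibRealSense.
    rs_context *ctx = rs_create_context( RS_API_VERSION, &e );
    if( e )
    {
        printf( "rs_error: %s\n", rs_get_error_message( e ) );
        return 1;
    }

    // Report how many RealSense cameras are attached.
    printf( "Devices found: %d\n", rs_get_device_count( ctx, &e ) );

    rs_delete_context( ctx, &e );
    return 0;
}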

GStreamer

In order to develop against GStreamer on your Ubuntu computer, you must install the proper libraries. An in-depth look into the workings of GStreamer is beyond the scope of this article. For more information, see the GStreamer documentation. We recommend starting with the “Application Development Manual.” To get all the necessary GStreamer libraries, install the following on your Ubuntu machine:

  • sudo apt-get update
  • sudo apt-get install ubuntu-restricted-extras
  • sudo apt-get install gstreamer1.0-libav
  • sudo apt-get install libgstreamer-plugins-base1.0-dev
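
To sanity-check the installation before writing any code, you can run a test pipeline from the command line. Note that gst-launch-1.0 lives in the gstreamer1.0-tools package, which you may need to install separately:

  • sudo apt-get install gstreamer1.0-tools
  • gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink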

As a bit of tribal knowledge, I have been developing on two different machines, and these two Ubuntu instances installed GStreamer in different locations: on one machine, the GStreamer headers and libraries are under /usr/include and /usr/lib, and on the other they are under /usr/lib/x86_64-linux-gnu. You will see evidence of this in how I included libraries and header files in my Eclipse project, which makes some entries appear to be duplicates. In hindsight, I could have just transferred the source code between two different project solutions.
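
Rather than hunting for these locations by hand, you can ask pkg-config where the GStreamer headers and libraries landed on your particular machine:

  • pkg-config --cflags gstreamer-1.0 gstreamer-app-1.0
  • pkg-config --libs gstreamer-1.0 gstreamer-app-1.0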

Setting Up Eclipse Neon*

As mentioned, you can use whatever IDE you like. I gravitated toward the C++ version of Eclipse Neon.

I assume that you know how to create an Eclipse C++ application and will just show you how I set up my include files and what libraries I chose.

Header files

Libraries
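
If you want to double-check the dependencies outside Eclipse, the whole sample can be built with a single compiler invocation. This is a sketch that assumes LibRealSense is installed as librealsense and that the source file is named AeroStream.cpp; adjust the names for your own project:

g++ AeroStream.cpp -o AeroStream $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-app-1.0) -lrealsense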

Unless I’ve missed something, you should now be ready to compile the following source code.

The Source Code

When looking at this source code, you may find that the spacing is off. This is because I copied and pasted this code directly out of my IDE. I didn’t change the spacing for this document so that it wouldn’t mess up the formatting for you if you copy this into your own IDE.

//=============================================================================
// AeroStreamDepth
// Demonstrates how to capture RGB and depth data from the RealSense camera,
// manipulate the data to create an RGB depth image, then put each individual
// frame into the GStreamer pipeline
//
// Unlike other pipelines where the source is a physical camera, the source
// to this pipeline is an appsrc element. This element gets its data
// frame-by-frame from a continuous pull from the R200 camera.
//
// Built on Ubuntu 16.04 and Eclipse Neon.
//
//	SOFTWARE DEPENDENCIES
//	* LibRealSense
//	* GStreamer
//
// Example
//   ./AeroStream 192.168.1.1
//=============================================================================

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <librealsense/rs.h>
#include <stdio.h>
#include <stdlib.h>
#include <memory.h>
#include <stdint.h>
#include <time.h>

const int WIDTH 	= 640;
const int HEIGHT 	= 480;
const int SIZE 		= WIDTH * HEIGHT * 3;

// camera always returns this for 1 / get_depth_scale()
const uint16_t ONE_METER = 999;


// Holds the RGB data coming from R200
struct rgb
{
    unsigned char r;
    unsigned char g;
    unsigned char b;
};


// Function descriptions with the definitions
static void 	print_realsense_error( rs_error *err );
static gboolean init_realsense( rs_device **camera );
static gboolean init_streams( rs_device *camera );
static void 	cb_need_data ( GstAppSrc *appsrc, guint unused_size, gpointer user_data );


//=======================================================================================
// The main entry into the application. DUH!
// arg[] will contain the IP address of the machine running QGroundControl
//=======================================================================================
gint main(gint argc, gchar *argv[])
{
    // App requires a valid IP address to where QGroundControl is running.
    if( argc < 2 )
    {
        printf( "Inform address as first parameter.\n" );
        exit( EXIT_FAILURE );
    }

    char 		str_pipeline[ 200 ];	// Holds the pipeline
    rs_device 	*camera		= NULL;		// The R200 camera
    GMainLoop 	*loop		= NULL;		// Main app loop keeps app alive
    GstElement	*pipeline	= NULL;		// GStreamer's pipeline for data flow
    GstElement	*appsrc		= NULL;		// Used to inject buffers into a pipeline
    GstCaps		*app_caps	= NULL;		// Define the capabilities of the appsrc element
    GError 		*error 		= NULL;		// Holds error message if generated

    GstAppSrcCallbacks cbs = { 0 };		// Callback functions/signals for appsrc; zero-init leaves unused callbacks NULL

    // Initialize GStreamer
    gst_init( &argc, &argv );

    // Create the main application loop.
    loop = g_main_loop_new( NULL, FALSE );

    // Builds the following pipeline.
    snprintf( str_pipeline, sizeof( str_pipeline ),
              "appsrc name=mysource ! videoconvert ! "
              "video/x-raw,width=640,height=480,format=NV12 ! "
              "vaapih264enc ! h264parse ! rtph264pay ! "
              "udpsink host=%s port=5600", argv[ 1 ] );

    // Instruct GStreamer to construct the pipeline and get the beginning element appsrc.
    pipeline   	= gst_parse_launch( str_pipeline, &error );
	if( !pipeline )
	{
		g_print( "Parse error: %s\n", error->message );
		return 1;
	}


    appsrc		= gst_bin_get_by_name( GST_BIN( pipeline ), "mysource" );


    // Create a caps (capabilities) struct that gets fed into the appsrc structure.
    app_caps = gst_caps_new_simple( "video/x-raw", "format", G_TYPE_STRING, "RGB","width", G_TYPE_INT, WIDTH, "height", G_TYPE_INT, HEIGHT, NULL );

    // This is going to specify the capabilities of the appsrc.
    gst_app_src_set_caps( GST_APP_SRC( appsrc ), app_caps );

    // Don't need it anymore; unref it so the memory can be freed.
    gst_caps_unref( app_caps );


    // Set a few properties on the appsrc Element
    g_object_set( G_OBJECT( appsrc ), "is-live", TRUE, "format", GST_FORMAT_TIME, NULL );


    // GStreamer is all setup so initialize the camera and start the streaming.
    init_realsense( &camera );
    init_streams( camera );


    // play
    gst_element_set_state( pipeline, GST_STATE_PLAYING );

    // connect signals

    cbs.need_data 	= cb_need_data;

    // Apply the callbacks to the appsrc element / connect the signals.
    // In other words, cb_need_data will constantly be called
    // to pull data from the R200. Why? Because it needs data. =)
    gst_app_src_set_callbacks( GST_APP_SRC_CAST( appsrc ), &cbs, camera, NULL );


    // Run the main loop to keep the app alive rather than falling through and exiting
    g_main_loop_run( loop );

    // clean up
    gst_element_set_state( pipeline, GST_STATE_NULL );
    gst_object_unref( GST_OBJECT ( appsrc ) );
    gst_object_unref( GST_OBJECT ( pipeline ) );
    // Note: app_caps was already unreffed above; unreffing it again would be a double free.
    g_main_loop_unref( loop );

    return 0;
}



//=============================================================================
// Print RealSense error messages.
//=============================================================================
static void print_realsense_error( rs_error *err )
{
    printf( "rs_error was raised when calling %s( %s ):\n", rs_get_failed_function( err ), rs_get_failed_args( err ) );
    printf("    %s\n", rs_get_error_message( err ) );
}




//=============================================================================
// Initializes the R200 camera.
// Returns true if successful
//=============================================================================
static gboolean init_realsense( rs_device **camera )
{
	gboolean ret 	= true;
    rs_error *e 	= 0;

    // Create the context, ensure no errors
    rs_context *ctx = rs_create_context( RS_API_VERSION, &e );
    if( e )
    {
    	print_realsense_error( e );
        ret = false;
    }
    if( ret )
    {
    	// Create the actual camera, check for no errors
		*camera = rs_get_device( ctx, 0, &e );
		if( !*camera || e )	// check the device pointer itself, not the out-parameter
		{
	    	print_realsense_error( e );
			ret = false;
		}
    }

    return ret;
}



//=============================================================================
// Initializes the RealSense RGB and Depth streams and starts the camera
// Returns true if successful
//=============================================================================
static gboolean init_streams( rs_device *camera )
{
	gboolean ret 	= true;
    rs_error *e 	= 0;

    /* Configure all streams to run at VGA resolution at 60 frames per second */
	rs_enable_stream( camera, RS_STREAM_DEPTH, WIDTH, HEIGHT, RS_FORMAT_Z16, 60, &e );
	if( e )
	{
    	print_realsense_error( e );
    	ret = false;
	}
	if( ret )
	{
		rs_enable_stream( camera, RS_STREAM_COLOR, WIDTH, HEIGHT, RS_FORMAT_RGB8, 60, &e );
		if( e )
		{
			print_realsense_error( e );
			ret = false;
		}
	}
	if( ret && camera )
	{
		rs_start_device( camera, &e );
		if( e )
		{
			print_realsense_error( e );
			ret = false;
		}
	}
    return ret;
}



//=======================================================================================
// Frame by frame, try to create our own depth image by taking the RGB data
// and modifying the red channel's intensity value.
//=======================================================================================
static void cb_need_data ( GstAppSrc *appsrc, guint unused_size, gpointer user_data )
{
    GstFlowReturn ret;

    // Get the camera and wait for it to process the frame
    rs_device *camera = ( rs_device* ) user_data;
    rs_wait_for_frames( camera, NULL);

    // Pull the depth and RGB data from the LibRealSense streams
    uint16_t 	*depth 	= ( uint16_t* )   rs_get_frame_data( camera, RS_STREAM_DEPTH, NULL );
    struct rgb 	*rgb	= ( struct rgb* ) rs_get_frame_data( camera, RS_STREAM_COLOR, NULL );

    // Wrap the RGB frame in a GstBuffer (zero-copy) to send down the GStreamer pipeline
    GstBuffer *buffer = gst_buffer_new_wrapped_full( ( GstMemoryFlags )0, rgb, SIZE, 0, SIZE, NULL, NULL );

    // Merge the depth data onto the RGB data. Basically, we are trying to create
    // our own depth image here by varying the red intensity
    for( int i = 0, end = WIDTH * HEIGHT; i < end; ++i )
    {
        rgb[ i ].r = depth[ i ] * 60 /  ONE_METER ;
        rgb[ i ].g /= 4;
        rgb[ i ].b /= 4;
    }
    // Push the buffer back out to GStreamer so it can send it down and out over Wi-Fi.
    // Note: the "push-buffer" signal does not take ownership of the buffer
    // (unlike gst_app_src_push_buffer), so release our reference afterward.
    g_signal_emit_by_name( appsrc, "push-buffer", buffer, &ret );
    gst_buffer_unref( buffer );
}

My Workflow

I want to say a word about my workflow. I developed on my Ubuntu machine using Eclipse Neon and compiled the app there to ensure there were no compilation errors. I then transferred the files over to Aero using shell scripts, compiled the application on Aero, and ran it for testing.

Some Initial Thoughts

Again, I want to mention that this document is not intended to teach GStreamer; rather, it highlights a real, working sample application. It only scratches the surface of how you can construct streams in GStreamer.

If you read my other article on streaming RGB data from Aero, you will notice that this example is a lot more complex and begins to show off more capabilities of GStreamer as well as making use of the Intel LibRealSense library.

In the simple RGB example, GStreamer connected to the Intel RealSense camera (R200) automatically using the v4l2src source GstElement; this example, however, supplies the pipeline in a completely different manner. This source code will show you how you can inject R200 RGB and depth data into the GStreamer pipeline by using the LibRealSense library.

Let’s dig in!

gint main(gint argc, gchar *argv[])

We start off by defining a few constants that describe the size of the video frames we are targeting, followed by a structure used to hold the RGB data coming back from the R200 and a few function declarations.

The code first ensures that an IP address has been supplied as an input parameter. In a real-world sample, it may be desirable to parse the input string to ensure it’s in the form of a real IP address; for the sake of simplicity, I’m not worrying about that here. This is the IP address of a client computer running QGroundControl. This client computer MUST be attached to the Aero’s Wi-Fi network for this to work.

Next, we declare some variables. The pipeline will be populated by all the GstElements needed to run our sample code. The GMainLoop is not a GStreamer construct; rather, it comes from GLib, part of the GNOME project. Running it keeps the application alive and prevents it from falling through to the end of the code. I won’t go into detail here; I think the comments in the code explain what the variables are.

GStreamer must be initialized; that’s what gst_init does. It takes the same input parameters as main and initializes GStreamer. We also create a GMainLoop, which we start running later, after all the other setup.

Now, we build our GStreamer pipeline command string. The gst_parse_launch command parses the GStreamer string; behind the scenes, it analyzes all the elements in the string and constructs the GstElements along with other aspects of the pipeline. After checking that there are no errors, we move on to configuring the pipeline; its state is set to playing a bit later, once the camera is running.

Remember… This code runs on Aero. You are sending this data FROM Aero to a client machine somewhere on the same network.

I want to bring attention to a couple of critical aspects of the GStreamer string:

  • appsrc name=mysource
    This tells the GStreamer pipeline how we are getting the data. If you read my other article, Intel® Aero Compute Board and Intel® RealSense™ Technology for Wi-Fi Streaming of RGB Data, you will see that the source there was specified by v4l2src device=/dev/video13. We are not doing that this time. Recall that we want to inject the stream ourselves using the functionality of the LibRealSense library, which can give us depth data. So essentially, we are specifying the source of the stream as a runtime-generated appsrc element.
  • udpsink host=%s port=5600
    This tells GStreamer to use UDP and send the video stream via Wi-Fi to a particular IP address and port (a stand-in receiver pipeline is sketched just after this list). Remember, the IP address comes in via a command-line parameter. There’s no reason not to take the port number on the command line as well if you wish.
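
If you want to eyeball the stream without QGroundControl, you can launch a stand-in receiver on the client machine with gst-launch-1.0. This is a sketch that assumes the default RTP payload type of 96 emitted by rtph264pay:

gst-launch-1.0 udpsrc port=5600 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink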

After the pipeline has been parsed and created, we move on to setting up the appsrc element. We didn’t manually construct this ourselves; rather, we let GStreamer construct it for us behind the scenes and we need to go get it. This is done by the call to gst_bin_get_by_name(…). Once we access the real structure we can start modifying it.

We need to specify the capabilities of the appsrc element: the video format, width, and height that we require. After we have created the GstCaps structure, we assign it to the appsrc element.

Each GstElement also has properties that can be set; this is done using g_object_set(…). This function is not part of GStreamer; rather, it is part of GLib/GObject from the GNOME project, on which GStreamer relies heavily. So, let’s set the is-live and format properties.

Ok, now we are going to initialize the Intel RealSense camera (R200) and start the RGB and depth streams. I’ll talk about those in their respective functions later. Notice that I’m doing this BEFORE telling the pipeline to start playing. If I don’t, I get unexpected results; in my case, AeroStream would just crash. At this point Intel RealSense is streaming, so we can go ahead and start the pipeline.

You should notice something here. Pay attention to the order of things. You may wonder why I didn’t take care of all the appsrc setup BEFORE starting RealSense and the pipeline. The simple answer is: it crashes if I do. RealSense had to be running BEFORE setting the pipeline to playing, and the pipeline had to be set to playing BEFORE setting the callbacks on appsrc. I’m not an expert on GStreamer and I don’t know how the guts of the pipelines run behind the scenes. However, I do know that the order is important; when things were not in this order, AeroStream would crash. Your mileage may vary!
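
Condensed from the full listing above, the order that worked for me is:

// 1. Start the camera and its streams first...
init_realsense( &camera );
init_streams( camera );

// 2. ...then set the pipeline to playing...
gst_element_set_state( pipeline, GST_STATE_PLAYING );

// 3. ...and only then attach the callbacks to appsrc.
cbs.need_data = cb_need_data;
gst_app_src_set_callbacks( GST_APP_SRC_CAST( appsrc ), &cbs, camera, NULL );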

A word about the callbacks. The appsrc element uses callbacks for event notifications. In this case, cb_need_data is fired when the appsrc pipeline is empty and we need to pull more data from Intel RealSense. The GstAppSrcCallbacks structure contains three callback functions we can set; we are only setting the one. After setting the callback, we must attach the callbacks to the appsrc element by calling gst_app_src_set_callbacks.

Now we need to get the loop running so we don’t fall out of the app. While this loop runs, GStreamer does its processing: pulling data from the camera, processing it, and sending it out over Wi-Fi.

When the app ends, we do some simple clean up.  

NOTE: I think it’s important to mention that while I write and compile the source code on my Ubuntu machine, it still has to be compiled on Aero itself.

static void print_realsense_error( rs_error *err )

This is a convenience function to display some LibRealSense error messages.

static gboolean init_realsense( rs_device **camera )

This function initializes the Intel RealSense camera (R200). The library uses the concept of a context, which we create first, then check for errors; if an error appears, we print it out.

If we have a good context, we attempt to create the camera device by calling the rs_get_device function and again check for errors. If everything is good, the variable ret will still contain a value of true.

static gboolean init_streams( rs_device *camera )

This function accepts an rs_device, which I’m calling a camera, because that’s what it really is. The first thing we try to do is get the depth stream running by calling rs_enable_stream with depth-type parameters. Again, a little error checking.

If we are still good to go, we attempt to generate an RGB stream using the same function, but this time specifying color stream type parameters and again, doing some error checking.

If we are STILL good to go, we start the streaming via rs_start_device and…you guessed it, check for an error. If everything is good, the variable ret will still contain a value of true.

static void cb_need_data ( GstAppSrc *appsrc, guint unused_size, gpointer user_data )

Ok… We are at the last function to discuss, and it’s an interesting one: the meat-and-potatoes function.

GstFlowReturn is actually an enum. It receives a value that we could have done something with, but didn’t in this case. It gets filled in by the g_signal_emit_by_name call.
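
If you did want to react to that value, a minimal check might look like this (my addition; it is not in the sample above):

if( ret != GST_FLOW_OK )
{
    // The pipeline is no longer accepting buffers; log the reason.
    g_printerr( "push-buffer failed: %s\n", gst_flow_get_name( ret ) );
}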

We need to typecast user_data into an rs_device so that we can use it to work with LibRealSense. The first thing that happens is a call to rs_wait_for_frames; in other words, we block until LibRealSense has the next frame ready.

After that is done, we get both the depth and RGB data. We pass the RGB data into gst_buffer_new_wrapped_full, which wraps the frame’s memory in a GstBuffer without copying it.

Next, we jump into a for loop that iterates over every pixel of the image to generate a composite image: one that carries both RGB and depth information at the same time, using the red channel to represent depth. Based on the depth of each pixel, a calculation scales the red channel’s value, while the green and blue channels are dimmed so the red stands out.
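
Plugging hypothetical depths into that loop makes the mapping concrete. Note that because the result lands in an 8-bit channel, it wraps around for pixels farther than roughly 4.25 meters:

// depth[ i ] is roughly 999 (ONE_METER) per meter of distance, so:
//   1 m:  999 * 60 / 999 =  60   (faint red)
//   2 m: 1998 * 60 / 999 = 120
//   4 m: 3996 * 60 / 999 = 240   (bright red)
// Beyond ~4.25 m the result exceeds 255 and wraps in the unsigned char channel.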

Once that is done, the newly created image is then pushed out of appsrc onto the application’s GStreamer pipeline to be processed and sent out via Wi-Fi.

Intel® Aero Compute Board Setup

At this point, if you’re following along in order, you have a project set up in your IDE. You’ve compiled the source code. The next step is to get your board connected.

The following images show you how to connect your board.

Now you can power up your board. Once the board is fully powered up, it will automatically start a Wi-Fi access point. The next step helps walk you through setting up connectivity on Ubuntu.

Connecting Wirelessly to the Intel Aero Compute Board

Once your board has been powered up, you can now connect to it via Wi-Fi. In Ubuntu 16.04, you will see a derivative of CR_AP-xxxxx. This is the network connection you will be connecting to.

The password is 1234567890.
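
From an Ubuntu terminal, the connection can also be made with nmcli; substitute your board’s actual CR_AP-xxxxx network name:

nmcli device wifi connect CR_AP-xxxxx password 1234567890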

Troubleshooting

If you do not see this network connection, and provided you have hooked up a keyboard and monitor to the Intel Aero Compute Board, run the following command on the board:

sh-4.3# lspci

This shows you a list of PCI devices. Check for the following device:

01:00.0 Network controller: Intel Corporation Wireless 8260 (rev 3a)

If you do not see this connection, do a “warm” boot:

sh-4.3# reboot

Wait for the Intel Aero Compute Board to reboot. You should now see the network controller if you run lspci a second time. Attempt once again to connect via the wireless network settings in Ubuntu.

At times, I have seen an error message in Ubuntu saying:

Out of range

If you get this error, try the following:

  • Make sure there are no other active network connections; if there are, disconnect from them.
  • Reboot Ubuntu.

More on the Intel Aero Compute Board Wi-Fi functionality can be found at the Intel Aero Meta Wiki.

Useful Bash Shell Scripts

Now that you’ve got the code compiled on Ubuntu, it’s time to move it over to the Intel Aero Compute Board. Remember that even though you might compile on your Ubuntu machine, you will still need to compile on the Intel Aero Compute Board as well. What I found was that if I skip this step, Yocto gives me an error saying that AeroStream is not a program.

To help expedite productivity, I’ve created a couple of small shell scripts. They aren’t necessary or required; I just got tired of typing the same things over and over.

migrateAero
First, it should be obvious that you must have a Wi-Fi connection to the Intel Aero Compute Board for this script to run.

This script runs from your Ubuntu machine. I keep it at the root of my Eclipse working folder. After I’ve made changes to the AeroStream project, I run it to migrate the files over to the Intel Aero Compute Board. Technically, I don’t need to push the makeAero script every time, but because I never know when I might change it, I always copy it over.

#!/bin/bash

# Clean up these files, or the project won't get recompiled on the Aero board. At least this is what I've found to be the case.
rm AeroStream/Debug/src/AeroStream*
rm AeroStream/Debug/AeroStream

# Now push the entire AeroStream Eclipse Neon project to the Aero board. This will create the folder /home/AeroStream on the Aero board.
scp -r AeroStream root@192.168.1.1:/home

# makeAero script essentially runs a make and executes AeroStream
scp makeAero root@192.168.1.1:/home

makeAero
Runs on the Intel Aero Compute Board itself. It gets migrated with the project and ends up at the root of /home. All it does is navigate into the Debug directory, run the make file, and then launch AeroStream.

#!/bin/bash
#Created a shortcut script because I'm tired of typing this in every time I need to migrate

cd /home/AeroStream/Debug
make
./AeroStream
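
You could take this one step further and trigger the remote build from Ubuntu in a single shot; this sketch assumes the same 192.168.1.1 address used in migrateAero:

ssh root@192.168.1.1 'sh /home/makeAero'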

Instead of pushing the entire project over, you might create your own make file(s) and push only the source code; however, this approach worked for me.

Also, you don’t even need to create a project on Ubuntu using Eclipse. If you feel confident enough, you can develop right on the board itself.

How to Configure QGroundControl

There is one last step to complete: configuring QGroundControl. Downloading and installing QGroundControl is out of the scope of this document. However, I need to show you how to set up QGroundControl to receive the GStreamer stream from the Intel Aero Compute Board over Wi-Fi.

Note that QGroundControl also uses GStreamer for its video streaming capabilities; this is how the connection is actually made. GStreamer can send a stream over Wi-Fi from one location while another instance listens for it at the receiving end, and this is how QGroundControl accomplishes it.

NOTE: Make sure you are using the SAME port that you have configured in your GStreamer pipeline.

Step 1

When you launch QGroundControl, it opens into flight path mode. You need to click the QGroundControl icon to get to the configuration area.

Step 2

Click the Comm Links button. This displays the Comm Links configuration page.

Click Add.

Step 3

This displays the Create New Link Configuration page.

  1. Give the configuration a name. Any name is OK.
  2. For Type, select UDP.
  3. Select the Listening Port number. This port number must match the port that is being used from the GStreamer pipeline.
  4. Click OK.

Step 4

You will now see the new Comm Link in QGroundControl.

Launch the App

NOTE: QGroundControl MUST be running first; it has to be put into listening mode. If you launch your streaming server application first, the connection will not be made. This is just an artifact of how GStreamer’s UDP streaming works.

  1. Launch QGroundControl.
  2. Launch AeroStream from the Intel Aero Compute Board. If everything has gone according to plan, you will see your video stream show up in QGroundControl.

Intel® Aero Compute Board and GitHub*

Visit the Intel Aero GitHub for various software code bases to keep your Aero up to date:

https://GitHub.com/intel-aero

https://GitHub.com/intel-aero/meta-intel-aero/wiki

Other Resources

Additional resources:

http://www.intel.com/content/www/us/en/technology-innovation/aerial-technology-overview.html

/en-us/realsense/aero

http://qgroundcontrol.com/

http://click.intel.com/

Summary

This article helped get you up and running with streaming capabilities on the Intel Aero Compute Board. I gave you an overview of the board itself and showed you how to connect to it. I also showed you which libraries are needed, how I set up Eclipse for my own project, and how to get Wi-Fi up, transfer files, and configure QGroundControl. At this point you are ready to explore other capabilities of the board and streaming.

