Exploring Intel® RealSense™ SDK and the Oculus Rift* DK2

Introduction

This article explores some of the technical details and challenges developers will encounter when creating software experiences that integrate the Intel® RealSense™ SDK for Windows* and the Oculus Rift* head-mounted display (HMD). We begin with an overview of the Oculus Rift Development Kit 2 (DK2), then look at some of the issues you’ll encounter when developing apps that combine multiple infrared (IR) camera-based sensors (Figure 1). This article also provides a walkthrough describing how to integrate the Intel RealSense SDK and Oculus Rift in a Unity* 5 project.

Figure 1. Intel® RealSense™ SDK and Oculus Rift* demo apps running concurrently.

Scope and Requirements

The information presented in this article is limited to applications of the front-facing Intel® RealSense™ camera (F200). To follow along with the walkthrough you’ll need an Oculus Rift DK2, an Intel RealSense camera (F200), and a development system that meets the following requirements:

  • 4th generation Intel® Core™ processor (or later) (for Intel RealSense SDK support)
  • 150 MB free hard disk space
  • 4 GB RAM
  • One USB 3.0 port for the F200 peripheral camera and two USB 2.0 ports for the Oculus Rift DK2
  • Windows* 8.1
  • Dedicated graphics card, NVIDIA GTX* 600 series or AMD Radeon* HD 7000 series (or better), with DVI*-D or HDMI* graphics output

The software components required to support the Oculus Rift* DK2 and Intel® RealSense™ camera (F200) are as follows:

  • Intel RealSense SDK (v6.0.21.6598 or greater)
  • Intel RealSense Depth Camera Manager F200 (v1.4.27.41944 or greater)
  • Oculus SDK for Windows* (v0.7.0.0-beta or greater)
  • Oculus Runtime for Windows (v0.7.0.0-beta or greater)
  • Oculus Utilities for Unity 5 (v0.1.2.0-beta or greater)

For the Unity walkthrough presented in this article, you can use the free personal version (v5.2.2 or greater) available from the Unity website.

The Oculus Rift* DK2

The Oculus Rift DK2 is a hardware and software toolkit that allows developers to create interesting virtual reality (VR) games and user experiences. In addition to the HMD unit, the kit also includes a low-latency positional tracking camera for following the user’s head movements.

The positional tracking camera is essentially a standard webcam with an IR filter mounted to the lens. The headset itself has several hidden IR LEDs distributed in an array that the positional tracking camera uses to determine the location of the user’s head in 3D space. Interestingly, these hidden IR sources reveal themselves to the Intel RealSense camera when the IR camera stream is viewed in the Raw Streams SDK sample (Figure 2).

Figure 2. Oculus Rift* IR LEDs visible to Intel® RealSense™ camera.

The Oculus Rift HMD includes a gyroscope, accelerometer, and magnetometer that, when combined via sensor fusion, determine the orientation of the user’s head and report this information as roll, pitch, and yaw rotations. The positional tracking camera provides additional information on the position of the user’s head (i.e., the x-, y-, and z-spatial coordinates).
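To see how these two data streams surface in code, the following minimal sketch reads the fused orientation and the camera-supplied position through Unity’s built-in UnityEngine.VR API (available in Unity 5.1+ with VR support enabled). The script name is hypothetical:

    using UnityEngine;
    using UnityEngine.VR;

    // Hypothetical sketch: logs the HMD pose once per frame.
    public class HeadPoseLogger : MonoBehaviour
    {
        void Update()
        {
            // Orientation from the headset's sensor fusion
            // (gyroscope + accelerometer + magnetometer).
            Quaternion rotation = InputTracking.GetLocalRotation(VRNode.Head);
            Vector3 euler = rotation.eulerAngles; // pitch (x), yaw (y), roll (z)

            // Position from the DK2 positional tracking camera; this value
            // stops updating if the tracking camera is unplugged.
            Vector3 position = InputTracking.GetLocalPosition(VRNode.Head);

            Debug.Log(string.Format("pitch={0:F1} yaw={1:F1} roll={2:F1} position={3}",
                euler.x, euler.y, euler.z, position));
        }
    }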

One way to better understand what the DK2 positional tracking camera adds to the VR experience is to run the demo scene that can be launched from the Oculus Configuration Utility (Figure 3).

Figure 3. Oculus Configuration Utility.

When you view the demo scene (Figure 4) with the positional tracking camera plugged into the USB port, you’ll see that objects on the virtual desk appear to be closer or farther away as you move your head toward or away from the camera.

Figure 4. Oculus Rift* demo scene.

If you then run the demo scene with the positional tracking camera’s USB connector unplugged, you’ll see that the orientation data from the Oculus Rift headset’s sensors still provides roll, pitch, and yaw rotations, but there is no longer a sense of depth as you move your head along the z-axis.

Interference with the User-Facing Camera

The immediate concern for an Intel RealSense SDK developer interested in creating VR apps that integrate the forward-facing Intel RealSense camera (F200) is the potential for interference between the 3D depth camera and the Oculus Rift’s positional tracking camera. Figure 5 shows a Lenovo ThinkPad* Yoga 15 Ultrabook™ 2 in 1 with an integrated Intel RealSense camera F200 and an Oculus Rift positional tracking camera mounted alongside it on the screen bezel.

Figure 5. Forward-facing cameras.

The Intel RealSense F200 camera uses coded light technology to project IR patterns onto the user and receives the nonvisible reflected images through an IR camera. The Oculus Rift system accomplishes part of its head tracking by emitting IR light from the headset’s array of LEDs toward its own passive IR camera, which is optically filtered to receive only light in the IR spectral region. The effect of the Oculus Rift headset’s IR LEDs on the Intel RealSense camera can be seen as varying interference patterns in the depth stream, as shown in Figure 6.

Figure 6. IR interferences to the depth data.

Offsetting the DK2’s positional tracking camera by some angle relative to the Intel® RealSense™ camera might reduce the effects of IR interference somewhat; however, because the Oculus headset itself is a source of IR emissions, it will still have some effect on a forward-facing Intel RealSense camera whenever the DK2’s positional tracking system is in use.

To better understand the effects of this interference, we ran the SDK Hands Viewer sample app with the DK2’s positional tracking camera connected to the USB port and the Oculus Rift VR Demo app running. As shown in Figure 7, there is a marked decrease in frame rate. Although this could be attributed to several factors (for instance, the under-spec’d NVIDIA GeForce* 840M graphics used for this particular test), it is interesting to observe that hand tracking and gesture recognition on the Intel RealSense camera still perform reasonably well.

Figure 7. Gesture recognition in the presence of Oculus Rift’s IR LEDs.

Unity 5 Walkthrough

We previously mentioned the potential interference between the Intel RealSense camera and the DK2’s positional tracking camera system, but what happens when you combine these technologies in a real project? In the following sections we’ll do a quick walkthrough that demonstrates the creation of a basic Intel RealSense camera-enabled Unity 5 project, and then enable it for a simple VR experience.

Create a New Unity Project

  • Start a new Unity project by double-clicking the Unity desktop icon. Select New, and then choose a name and location for the project.
  • If the Unity editor is already open, create a new project by selecting File, New Project from the menu bar, and then choose a name and location for the project.

Import the RSSDK Unity Toolkit

  • Import the RSSDK Unity Toolkit by selecting Assets, Import Package, Custom Package… from the menu bar.
  • In the Import Package screen, navigate to the SDK folder containing the Unity Toolkit package. This location may vary depending on where the SDK was installed. In this example, the toolkit is located under C:\Program Files (x86)\Intel\RSSDK\framework\Unity.
  • Select UnityToolkit, and then click Open.

    (Note: There are two Unity package files in this folder: UnityToolkit and UnityCSharp. Importing UnityCSharp adds only the managed and unmanaged DLLs required to support the Intel RealSense SDK in a Unity application; a minimal sketch of driving the SDK through those DLLs directly appears after these steps. Alternatively, importing UnityToolkit adds the required DLLs to the project along with many other assets that help streamline the development of an Intel RealSense SDK-enabled project.)

  • The Importing Package screen will appear with all of the plugins, actions and prefabs selected. Leave all of the checkboxes selected and click Import.
  • In the Project screen, notice that a number of assets have been added to the project, organized under the following folders:
    • Plugins. Contains libpxccpp2c.dll, the unmanaged C++ P/Invoke DLL.
    • Plugins.Managed. Contains libpxcclr.unity.dll, the managed C# interface DLL.
    • RSUnityToolkit. Contains the Actions, Internals, Prefabs, and Samples folders.
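If you choose to import only UnityCSharp, you program against the managed interface directly. The sketch below illustrates that approach using the Intel RealSense SDK’s C# pipeline object (PXCMSenseManager), mirroring the pattern used in the SDK’s C# samples; treat it as an outline rather than a drop-in replacement for the toolkit:

    using UnityEngine;

    // Sketch: drives the Intel RealSense SDK pipeline through the managed
    // interface (libpxcclr.unity.dll) instead of the toolkit's action scripts.
    public class RawHandPipeline : MonoBehaviour
    {
        private PXCMSenseManager sm;

        void Start()
        {
            sm = PXCMSenseManager.CreateInstance();
            sm.EnableHand();   // enable the hand-tracking module
            sm.Init();         // initialize the pipeline
        }

        void Update()
        {
            if (sm == null) return;

            // Block until all enabled modules have processed the frame.
            if (sm.AcquireFrame(true) >= pxcmStatus.PXCM_STATUS_NO_ERROR)
            {
                // Hand data would be queried here via sm.QueryHand().
                sm.ReleaseFrame();
            }
        }

        void OnDisable()
        {
            if (sm != null)
            {
                sm.Dispose();
                sm = null;
            }
        }
    }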

Add a Game Object

  • The project will initially contain a Main Camera and a Directional Light. In the Hierarchy screen, click Create, and then select 3D Object, Cube. This adds a primitive Cube game object to the Scene.
  • In the Project folder under Assets, expand the RSUnityToolkit folder, and then select Actions.
  • The Actions folder contains a number of scripts that can be applied to game objects. Click the TrackingAction script, then drag and drop it on the Cube game object in the Hierarchy screen.
  • Select Cube in the Hierarchy screen, and you’ll see that Tracking Action (Script) is shown in the Inspector screen.

Configure Hand Tracking

  • By default, the Tracking Action is set to HandTracking and the Virtual World Box Dimensions are set to 100 for X, Y and Z. If you play the game at this point you will see the 3D depth camera’s LED turn on, indicating the camera has been activated.
  • If you raise your hand in front of the camera you will likely see the Cube game object shoot off the screen. This is because the Virtual World Box Dimensions are set too high. Change the Virtual World Box Dimensions to 10 for X, Y and Z.
  • In the Scene view notice that the Virtual World Box, outlined in red, is now more constrained around the game object (see Figure 8).
  • Run the game again. The Cube game object should now track your hand within a more confined virtual space.
  • You might notice that the movement of the Cube game object is somewhat jittery. This can be smoothed by increasing the Smoothing Factor to 10 (a script-based version of these settings is shown after Figure 8).

Figure 8. Unity* Editor – Tracking action settings.
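If you prefer setting these values from code rather than in the Inspector, a small startup script can do it. This is a sketch: the public field names (VirtualWorldBoxDimensions, SmoothingFactor) are assumed from the Inspector labels in Figure 8 and may differ in your toolkit version:

    using UnityEngine;

    // Sketch: applies the walkthrough's Inspector settings from code.
    // Assumes the toolkit's TrackingAction script is already attached to
    // this game object and that its field names match the Inspector labels.
    public class ConfigureCubeTracking : MonoBehaviour
    {
        void Start()
        {
            var tracking = GetComponent<TrackingAction>();

            // Constrain the virtual world box so the cube stays on screen.
            tracking.VirtualWorldBoxDimensions = new Vector3(10f, 10f, 10f);

            // Damp the jitter in the tracked movement.
            tracking.SmoothingFactor = 10f;
        }
    }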

Enabling VR in the Unity Project

As stated on the Oculus Rift DK2 website, in Unity 5.1+ developers can substitute a stereoscopic VR camera, with built-in orientation and positional tracking, for their main camera by selecting the Virtual Reality Supported check box in Player Settings. Follow these steps to enable VR in the project.

  • Select Edit – Project Settings from the menu bar, and then click Player.
  • In the Inspector window, select the Virtual Reality Supported check box.

Click the Play button and put on the Oculus Rift headset; you’ll immediately see that the main camera transforms are now overridden by the Oculus Rift headset’s orientation tracking. The cube still tracks your hand via the Intel RealSense SDK Toolkit for Unity’s TrackingAction script, but now you can view the scene with the Unity camera tracking the HMD’s movements. A script-based check that VR is active is shown after Figure 9.

Figure 9. Unity virtual reality enabled.
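To confirm from script that built-in VR support has taken over, you can query Unity’s UnityEngine.VR settings. A minimal sketch:

    using UnityEngine;
    using UnityEngine.VR;

    // Sketch: logs whether built-in VR support is enabled and whether
    // a head-mounted display is detected.
    public class VRStatusLogger : MonoBehaviour
    {
        void Start()
        {
            Debug.Log("VR enabled: " + VRSettings.enabled);
            Debug.Log("HMD present: " + VRDevice.isPresent);
        }
    }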

Import the Oculus Utilities Package for Unity

The Utilities package is an optional supplement that includes prefabs, scenes, and scripts to support VR app development. The following steps demonstrate how to import this package into your project.  

  • Start by selecting Assets, Import Package, Custom Package… from the menu bar.
  • In the Import Package screen, navigate to the folder containing the OculusUtilities Unity package file. Select OculusUtilities, and then click Open.
  • The Importing Package screen appears with all of the Oculus assets selected. Leave all of the check boxes selected and click Import.

Note: In this example, we will be adding the OVRCameraRig prefab from the OculusUtilities package to the scene. Therefore, you do not need to select the Virtual Reality Supported check box in the Inspector window when using the OculusUtilities package.

  • Drag the OVRCameraRig prefab into the scene.
  • Turn off the main camera in the scene to ensure that OVRCameraRig is the only one being used.
  • Click Play and put on the Oculus Rift headset; you’ll see that OVRCameraRig is now tracking the HMD’s movements, with cube transforms still being controlled by hand movements.
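To make the camera switch explicit and repeatable, a small startup script can disable the original camera and activate the rig. This is a sketch; assign both fields in the Inspector, noting that the object names are simply the defaults used in this walkthrough:

    using UnityEngine;

    // Sketch: disables the scene's original camera so the OVRCameraRig
    // prefab's cameras are the only ones rendering.
    public class UseOVRCameraRig : MonoBehaviour
    {
        public GameObject mainCamera;    // the scene's original Main Camera
        public GameObject ovrCameraRig;  // the OVRCameraRig prefab instance

        void Awake()
        {
            if (mainCamera != null)
                mainCamera.SetActive(false);

            if (ovrCameraRig != null)
                ovrCameraRig.SetActive(true);
        }
    }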

Note: For complete coverage on how to use the Oculus Utilities package, visit the online documentation at http://static.oculus.com/documentation/pdfs/game-engines/latest/unity.pdf.

Add VR to an Intel® RealSense™ SDK Toolkit for Unity* Sample

With built-in support for VR in Unity 5.1+, it is easy to enable an Intel RealSense SDK Toolkit for Unity sample like Sample 1 – Translation to work with VR. Try it by following these steps:

  • In the Project folder, expand RSUnityToolkit – Samples – Scenes and then double-click Sample 1 - Translation (be sure to save your previous work first if you want to keep it).
  • Select Edit – Project Settings from the menu bar, and then click Player.
  • Make sure the Virtual Reality Supported check box is selected in the Inspector window.
  • Click Play and give it a try!

Conclusion

In this article we examined potential sensor interference issues that you may encounter when using the forward-facing Intel RealSense camera in a project that also uses the Oculus Rift’s positional tracking camera. We also explored some basic scenarios in which Intel RealSense camera hand tracking was used in simple VR experiences. Give it a try and see what interesting experiences you can create!

About Intel® RealSense™ Technology

To get started and learn more about the Intel RealSense SDK for Windows, go to https://software.intel.com/en-us/intel-realsense-sdk

About the Author

Bryan Brown is a software applications engineer in the Developer Relations Division at Intel. 

