By Lynn Thompson
In this third article in a three-part series, I focus on using Intel® RealSense™ technology to interface with traditional graphical user interface (GUI) widgets. Where the first two articles concentrated on the three-dimensional (3‑D) elements in a typical Unity* 3D scene, this article shows how to use the Intel RealSense SDK to augment the supporting elements in a game or simulation.
To develop a Unity 3D scene that uses Intel RealSense technology’s Hand Tracking and Activate/Deactivate Actions to drive GUI widgets native to Unity Editor version 4.6, I configure a typical GUI Slider Widget with a value of 0 to 1, indicating the relative slider position from left to right. I then tie this value to a Hand Tracking Gesture that is constrained on the z- and y-axes. The motion of the Hand Tracking Gesture on the x-axis is relayed to the GUI Slider Widget, scaled, and inserted into the widget’s Value variable. I also configure a typical Toggle GUI Widget that the user can select or clear. I configure Activate/Deactivate Actions for the FingersSpread and ThumbsDown gestures, respectively. When these gestures are detected, the appropriate state is relayed to the Toggle GUI Widget, which selects or clears the toggle box.
Creating the Example
I begin this example by creating a new project in Unity Editor 4.6. In this new version of the Unity Editor, I configure a basic UI by adding both a Canvas and an EventSystem from the GameObject > UI menu. Using the same menu, I add a slider GUI Widget named Slider01. The default state for the Canvas is Screen Space – Overlay. This state renders the canvas to the screen without consideration for camera position. Running the scene at this point results in a screen that has a slider bar and a button that a user can manipulate with a mouse.
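For reference, the Canvas render mode the editor sets here can also be read or changed from script; a minimal sketch, assuming the Canvas keeps its default object name:

Canvas canvas = GameObject.Find("Canvas").GetComponent<Canvas>();
canvas.renderMode = RenderMode.ScreenSpaceOverlay; //the default: drawn over everything, independent of the camera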
Add an Asset and Script to Relay Intel® RealSense™ Technology Input
Next, I create an empty asset named RealSenseUIRelay by using the GameObject menu. Using the Add Component button in the Inspector panel, I add a C# script to this asset named Relay. I add the code in Figure 1 to the script to verify that I can hard-code the value of the slider.
. .
using UnityEngine.UI;
. .
private GameObject slider01;
private Slider sliderScript01;
. . .
void Start(){
. .
    slider01 = GameObject.Find("Slider01");
    sliderScript01 = (Slider)slider01.GetComponent("Slider");
. .
    sliderScript01.value = 0.0f;
Figure 1: Relay.cs
This method allows for hard-coding of the slider value.
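Note that the string-based GetComponent call in Figure 1 can also be written with the generic overload, which avoids the cast and catches a misspelled type at compile time; a one-line alternative using the same Unity 4.6 API:

sliderScript01 = slider01.GetComponent<Slider>();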
Configure Intel® RealSense™ Software
To use Intel RealSense technology in the project, I click Assets > Import Package > Custom Package in Unity Editor and use the resulting File dialog box to import the Intel RealSense Unity Toolkit from C:\Program Files (x86)\Intel\RSSDK\framework\Unity\RSUnityToolkit.unitypackage. Following import of the toolkit, I restart Unity Editor, which is required for the Intel RealSense Unity Toolkit menu to appear.
To create a means of manipulating both the slider and the toggle box with the right hand, I create an empty asset named RightHandRS. I then add a Tracking Action to the asset by clicking Intel RealSense Unity Toolkit > Add Action > Tracking. Selecting RightHandRS in the Hierarchy pane on the left side of the Unity Editor shows the Tracking Action in the Inspector pane on the right side. I open the Tracking Action in the Inspector and change Hand Tracking > Which Hand to ACCESS_ORDER_RIGHT_HANDS to keep the left hand from having any inadvertent effect on the scene. I make the same change for Hand Detected > Which Hand for the same reason.
Because the Slider GUI Widget I want to control moves only on one axis, I want to limit the motion of the hand tracking asset. I do this by highlighting RightHandRS in the Hierarchy pane and selecting the Y and Z boxes for Rotation and Position on the Constraints tab under the Tracking Action in the Inspector.
Using the Add Component tab in the Inspector pane, I add a Halo effect to RightHandRS, setting the color to red. This allows me to view the empty asset during runtime. Obviously, RightHandRS must be in view of the scene’s Main Camera.
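Because the halo is useful only while the asset is on screen, a quick visibility check can help during debugging. A minimal sketch, assuming it runs in a script with a reference to the RightHandRS GameObject; viewport coordinates between 0 and 1 (with positive z) mean the point is in front of the camera and visible:

Vector3 vp = Camera.main.WorldToViewportPoint(rightHandRS.transform.position);
bool onScreen = vp.z > 0f && vp.x >= 0f && vp.x <= 1f && vp.y >= 0f && vp.y <= 1f;
if (!onScreen) Debug.Log("RightHandRS has left the Main Camera's view");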
To link the hand tracking functionality I just configured for RightHandRS to the Slider GUI Widget, I again modify the Relay.cs script, which is a component of the RealSenseUIRelay asset (see Figure 2).
. .
using RSUnityToolkit;
. .
private GameObject rightHandRS;
private TrackingAction rightHandTracking;
private Vector3 rightHandInitPos;
private Vector3 rightHandRunPos;
private float rightHandHorzScale;
. .
void Start(){
. .
    rightHandRS = GameObject.Find("RightHandRS");
    rightHandTracking = (TrackingAction)rightHandRS.GetComponent("TrackingAction");
    rightHandHorzScale = rightHandTracking.VirtualWorldBoxDimensions.x;
    rightHandInitPos = rightHandRS.transform.position;
    rightHandRunPos = new Vector3(0.0f, 0.0f, 0.0f);
. .
void Update(){
    rightHandRunPos = rightHandRS.transform.position;
    sliderScript01.value = 0.5f + ((rightHandRunPos.x - rightHandInitPos.x) / rightHandHorzScale);
Figure 2: Relay.cs (Modified)
Running the scene at this point creates a red halo that moves with the user’s right hand, but only on the horizontal axis. The button on the slider moves through its full range as the red Hand Tracking Halo moves through the full range of its x-axis. The Real World Box Dimensions setting in the Inspector pane defines the full range of the hand tracking x-axis.
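The scaling in Figure 2 maps a hand offset of half the virtual box width in either direction to the ends of the slider’s 0-to-1 range. A hypothetical helper that makes the mapping explicit, with a Mathf.Clamp01 guard for when the hand leaves the box:

float ToSliderValue(float runX, float initX, float horzScale)
{
    //an offset of -horzScale/2 maps to 0.0, 0 maps to 0.5, and +horzScale/2 maps to 1.0
    return Mathf.Clamp01(0.5f + (runX - initX) / horzScale);
}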
Add a Toggle
As an additional example of using Intel RealSense technology to control Unity 3D UI widgets, I add a toggle box to the scene. I add the toggle box the same way as the slider, by clicking GameObject > UI > Toggle in the Unity Editor, and change its default name to Toggle01. Next, I add an empty asset named ToggleState01 and give it a green Halo effect.
To implement a Unity 3D UI interface with the Intel RealSense SDK in this toggle example, I add Activate and Deactivate Actions to the RightHandRS asset. For both of these actions, using the Inspector pane, I change the GameObjects Size from 0 to 1 and configure the resulting Element0 to be the ToggleState01 asset. For both actions, I also change Gesture Detected > Which Hand to ACCESS_ORDER_RIGHT_HANDS. For the Activate Action, I change Gesture Detected > Gesture to FingersSpread; for the Deactivate Action, I change it to ThumbsDown.
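As far as the Relay.cs script is concerned, the effect of this configuration is that the toolkit activates the ToggleState01 GameObject when FingersSpread is detected and deactivates it on ThumbsDown; conceptually, something like the following (an assumed illustration of the actions’ behavior, not the toolkit’s source):

// FingersSpread detected: the Activate Action enables its target GameObjects
toggleState01.SetActive(true);
// ThumbsDown detected: the Deactivate Action disables them
toggleState01.SetActive(false);

The script then only needs to mirror the GameObject’s active state into the toggle widget.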
To link these actions to the toggle button GUI Widget, I make the modifications in Figure 3 to the Relay.cs script.
. . .
private GameObject toggle01;
private Toggle toggleScript01;
private GameObject toggleState01;
. .
void Start(){
. .
    toggle01 = GameObject.Find("Toggle01");
    toggleScript01 = (Toggle)toggle01.GetComponent("Toggle");
    toggleState01 = GameObject.Find("ToggleState01");
. . .
void Update(){
. .
    toggleScript01.isOn = toggleState01.activeSelf;
. .
Figure 3: Partial C# Code for Access to the Toggle Button Widget
Running the scene at this point displays a green halo and toggle button in addition to the original red halo and slider bar. The slider bar and red halo continue to perform as they did before the addition of the toggle box. When the user makes a ThumbsDown Gesture, the green halo disappears and the toggle box is cleared. When the user makes a FingersSpread Gesture, the green halo reappears and the box is again selected.
Observations
Using hand tracking in combination with the Activate/Deactivate Actions works well. While making a continuous left-right motion to drag the slider button back and forth, I was able to make the ThumbsDown and FingersSpread Gestures with the same hand to clear and select the toggle box. I detected no delay between the motion of the red halo and the slider button, nor between the green halo state and the toggle box selection. I had only one issue in creating this example: I had originally configured the Toggle Activate Gesture to be a ThumbsUp Gesture but was never able to get it to work reliably. The ThumbsDown Gesture for toggle deactivation always worked reliably, as does the FingersSpread Gesture now used for toggle activation.
The Relay.cs C# script used in this example functioned well as a bridge between the Intel RealSense SDK and the Unity Editor 4.6 UI widgets. Continuing with this method, an Intel RealSense technology developer can use gestures to manipulate the layout of the widgets in addition to their floating-point values or discrete states. The Unity Editor 4.6 UI Canvas can be manipulated in a similar fashion: a gesture indicating the need to pause the simulation for configuration through the GUI might put the Canvas in Overlay mode, and when the user is done, another gesture can resume the simulation and deactivate the Canvas.
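A hypothetical sketch of that pause idea, reusing the pattern from this example: an assumed empty asset named PauseState01 is wired to Activate/Deactivate Actions exactly as ToggleState01 was, and the Relay script mirrors its state into the Canvas and the time scale.

private GameObject pauseState01; // assumed asset driven by pause/resume gestures
private Canvas canvas;           // the scene's Canvas component

void Update () {
    bool paused = pauseState01.activeSelf;
    canvas.enabled = paused;           //show the GUI only while paused
    Time.timeScale = paused ? 0f : 1f; //freeze the simulation during configuration
}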
This example was not difficult to implement and used only one C# script in addition to the Intel RealSense Unity Toolkit. This script (Relay.cs) is provided in its entirety in Figure 4.
using UnityEngine;
using UnityEngine.UI;
using RSUnityToolkit;
using System.Collections;

public class Relay : MonoBehaviour {

    private GameObject slider01;      //provide access to the slider UI widget
    private Slider sliderScript01;    //provide access to the script in the slider UI widget
    private GameObject toggle01;      //provide access to the toggle UI widget
    private Toggle toggleScript01;    //provide access to the script in the toggle UI widget
    private GameObject toggleState01; //provide access to the Unity asset holding the RealSense
                                      //activate or deactivate state
    private GameObject rightHandRS;   //provide access to the Unity asset holding
                                      //all of the configured RealSense Actions
    private TrackingAction rightHandTracking; //provide access to the Tracking Action component
                                              //in the RightHandRS asset
    private Vector3 rightHandInitPos; //the initial position of the RightHandRS asset
    private Vector3 rightHandRunPos;  //the position of the RightHandRS asset as it
                                      //is moved through the scene by the RealSense
                                      //Tracking Action
    private float rightHandHorzScale; //the horizontal dimension of the Virtual
                                      //World containing the Tracking Action

    // Use this for initialization
    void Start () {
        slider01 = GameObject.Find ("Slider01");                   //Find the UI slider widget
        sliderScript01 = (Slider)slider01.GetComponent ("Slider"); //Get the script in the slider widget
        toggle01 = GameObject.Find ("Toggle01");                   //Find the UI toggle widget
        toggleScript01 = (Toggle)toggle01.GetComponent ("Toggle"); //Get the script in the toggle widget
        toggleState01 = GameObject.Find ("ToggleState01");
        //Find the asset holding the state dictated by the RealSense Activate and Deactivate Gestures
        sliderScript01.value = 0.0f; //Initialize the slider value
        rightHandRS = GameObject.Find ("RightHandRS");
        //Find the RightHandRS asset holding the configured RealSense Actions
        rightHandTracking = (TrackingAction)rightHandRS.GetComponent ("TrackingAction");
        //Get the RealSense TrackingAction from the parent Unity asset
        rightHandHorzScale = rightHandTracking.VirtualWorldBoxDimensions.x;
        //Set the horizontal scale to the width of the RealSense Tracking Action Virtual World
        rightHandInitPos = rightHandRS.transform.position;
        //Get the initial position of the Hand Tracking asset for scaling purposes
        rightHandRunPos = new Vector3 (0.0f, 0.0f, 0.0f);
        //Initialize the vector that will hold the runtime position of the Hand Tracking asset
    }

    // Update is called once per frame
    void Update () {
        rightHandRunPos = rightHandRS.transform.position;
        //Set the vector holding the runtime position of the Hand Tracking asset
        sliderScript01.value = 0.5f + 2 * ((rightHandRunPos.x - rightHandInitPos.x) / rightHandHorzScale);
        //Set the slider value (slider button position) by scaling and biasing the position
        //of the Hand Tracking asset with the initial position and the Hand Tracking Virtual World
        //horizontal dimension
        toggleScript01.isOn = toggleState01.activeSelf;
        //Check and uncheck the toggle box by setting it to the active state of the Unity
        //asset that is tied to the RealSense Activate and Deactivate Actions.
    }
}
Figure 4: Relay.cs (Complete)
Conclusion
Configuring input from Intel RealSense technology to drive Unity Editor 4.6 UI widgets was not difficult at all, and the example performed well. The Intel RealSense Unity Toolkit is a viable means of using hand gestures to manipulate traditional GUI Widgets.
About the Author
Lynn Thompson is an IT professional with more than 20 years of experience in business and industrial computing environments. His earliest experience was using CAD to modify and create control system drawings during a control system upgrade at a power utility. During this time, Lynn received his B.S. degree in Electrical Engineering from the University of Nebraska, Lincoln. He went on to work as a systems administrator at an IT integrator during the dot-com boom, focusing primarily on operating system, database, and application administration on a wide variety of platforms. After the dot-com bust, he worked on a range of projects as an IT consultant for companies in the garment, oil and gas, and defense industries. Now, Lynn has come full circle and works as an engineer at a power utility. He has since earned a Master of Engineering degree with a concentration in Engineering Management, also from the University of Nebraska, Lincoln.