Channel: Intel Developer Zone Articles

The Path from Intel® Perceptual Computing SDK to Intel® RealSense™ SDK in Unity


Code-Monkeys engaged with Intel’s 3D sensing initiative back in 2012 when we first saw the (rather raw) technology at the Intel® Developer Forum that year. Back then, it was called “Intel Perceptual Computing,” and it used a Creative* camera. Since then, the initiative has been upgraded—a lot—and renamed Intel® RealSense™ technology.

This article is intended for developers who started with Intel Perceptual Computing, particularly those who use Unity, and want to migrate or upgrade to Intel RealSense technology.

The Short Story: Start Over, but in a Good Way

The changes that have been made are huge and occur at every level. The hardware profile is totally different; the function calls are different. In fact, just about everything, all the way down to the root conceptual framework, has changed. With all that water under the bridge, the short story is that there is no real "migration" path between the two SDKs. But don't take that to mean all of your time learning Intel Perceptual Computing was wasted, as you'll see below.

Here's what's involved in migrating your code:

  1. Completely remove the Intel Perceptual Computing SDK. Completely means completely. The two SDKs do not play well together and having both installed on the same rig can cause intermittent and maddening conflicts.
  2. Say goodbye to your Creative IR Camera. It was good while it lasted, but the Creative hardware is not compatible with the Intel RealSense SDK.
  3. Install the new Intel RealSense SDK. You can find it here: https://software.intel.com/en-us/realsense/
  4. Go back to the studs of your Intel Perceptual Computing project and get ready to rework all of the interface and control pieces.

Now, while you'll need to remap controls to actions, the good news is this: the new SDK is quite a bit easier to use, and in most cases the real challenge in implementing this kind of interface isn't in the code syntax. It's in the way you think about your interface from the ground up. If you thought through all the intricacies of a Natural UI previously, it's highly likely everything will carry over between SDKs.

Intel Perceptual Computing SDK in Hindsight

We spent a lot of time working with the Intel Perceptual Computing SDK, and it’s worth going back over what worked and what didn’t. After all, we learned a lot of lessons on this platform. Overall, the power of the Intel Perceptual Computing SDK was in its transparency. We had full and easy access to all of the data that was coming from the Creative camera, and that was a LOT of data.

One of the first challenges we faced while using this SDK was how to filter out what we didn’t need, and while that was a genuine speed bump, the SDK allowed us to make the tactical decisions about what was important, how to focus on the data we needed, and how to interpret what we got. It was raw and choppy and there was a ton of it, but we had options, which is to say we had power.

But this empowering reality also made Intel Perceptual Computing overwhelming at first. There was limited documentation, no best practices, and almost everything had to be processed by custom scripts. The result was a need to trim back our initial scope to accommodate the steep learning curve.

Overall, Intel Perceptual Computing reminded me of UNIX*. The people who know it rave about the available power, the options, and the limitless ways to display their mad command line skills. But for the less committed user, it can seem needlessly difficult for 95% of a user’s daily work.

Comparing Intel® Perceptual Computing to Intel RealSense™ Technology

The first thing an Intel Perceptual Computing veteran will notice about the Intel RealSense SDK is how much more WYSIWYG it is. Intel RealSense technology, specifically the Unity plug-in, includes a variety of panels and modifiers that make drag-and-drop functionality a genuine reality. Event-based logic means you can rely on the SDK’s built-in functionality to trigger actions that previously had to be sensed, listened for, and linked by hand.

An excellent example of this is the little-publicized Emotion Tracking functionality. Right out of the box, with a trivial amount of code, the system can look for and "detect" a variety of emotions and sentiments defined by certain facial expressions. A clever developer should be able to seamlessly integrate these user-generated events with context-sensitive actions, and this entire package can be installed in just a few minutes.
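As a sketch of that event-based style, consider the snippet below. EmotionTrigger is a hypothetical stand-in, not an actual Intel RealSense SDK class; the point is the shape of the pattern: game code subscribes a handler to an event rather than polling and interpreting raw frame data by hand.

```csharp
using System;

// Illustrative sketch of event-based detection. EmotionTrigger is a
// hypothetical stand-in for the SDK's trigger components, not a real
// Intel RealSense SDK class.
public class EmotionTrigger
{
    public event Action<string> EmotionDetected;

    // In a real scene the SDK would raise this internally as camera
    // frames are processed; here we fire it by hand to show the flow.
    public void SimulateDetection(string emotion)
    {
        EmotionDetected?.Invoke(emotion);
    }
}

public static class EmotionDemo
{
    public static string LastReaction = "";

    public static void Run()
    {
        var trigger = new EmotionTrigger();
        // Link the user-generated event to a context-sensitive action.
        trigger.EmotionDetected += emotion => LastReaction = "react:" + emotion;
        trigger.SimulateDetection("smile");
    }
}
```

The same subscribe-a-handler shape applies to hand, face, and gesture triggers; the work shifts from parsing data streams to deciding which events map to which actions.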

Of course, a more "user-friendly" system comes at the cost of granular control. Developers have a lot less access to raw data in the Intel RealSense SDK, and customizing processing algorithms is no longer a simple matter.

In the end, though, the Intel RealSense SDK is a major improvement over Intel Perceptual Computing at basically every level. And while the nerdcore coder in us misses the unfettered data stream, the deadline-oriented coder is grateful for the improved level of accessibility and productivity.

A Real-world Example: Making a Mouse with the Intel RealSense SDK

In the following example we use Unity 4.5 and NGUI with the Gold version of the Intel RealSense SDK to demonstrate how easy it is to make a “mouse” object. The process follows this series of steps:

  1. Create a 3D UI in NGUI with an anchor and panel.
  2. Add a GameObject to the panel called RSMouseLocation.
  3. Add a Tracking Action script from the Intel RealSense SDK.
  4. Add an Icon for visual feedback.
  5. Set up a UI Camera.
  6. Add a Right Hand Grab Trigger from the Intel RealSense SDK.
  7. Create an RSMouseManager to listen for the Grab Trigger Event.
using UnityEngine;
using System.Collections;
using System.Collections.Generic;

public class RSMouseManager : MonoBehaviour {

	public Camera myCamera;
	public bool handClosed;
	public float swipeDistance = 100.0f;
	public Vector3 currentMousePosition;

	public UISprite myHandIconSprite;
	public string openHandSpriteName;
	public string closeHandSpriteName;
	public List<string> levelsToShowHandIcon;
	public bool showHandIconOnPause;

	private Vector3 inputStartLocation;
	private bool foundSwipe;

	void OnLevelWasLoaded(int level)
	{
		ResolveHandIconVisibility(true);
	}

	void OnEnable()
	{
		PauseManager.timeSwitch += ResolveHandIconVisibility;
	}

	void OnDisable()
	{
		PauseManager.timeSwitch -= ResolveHandIconVisibility;
	}

	void ResolveHandIconVisibility(bool unpaused)
	{
		if(!unpaused && showHandIconOnPause)
		{
			myHandIconSprite.gameObject.SetActive(true);
		}
		else
		{
			if(levelsToShowHandIcon.Contains(Application.loadedLevelName))
				myHandIconSprite.gameObject.SetActive(true);
			else
				myHandIconSprite.gameObject.SetActive(false);
		}
	}

	// Use this for initialization
	void Start () {
		handClosed = false;
	}

	// Update is called once per frame
	void Update () {
		currentMousePosition = myCamera.WorldToScreenPoint(this.transform.position);
		OperateRSMouseInput();
	}

	public void OpenHand(){
		if(handClosed){
			//Debug.Log ("Hand open. ");
			if (!foundSwipe) {
				GameInput.instance.AcceptExternalClick(inputStartLocation);
			}
			handClosed = false;
			foundSwipe = false;
			if(myHandIconSprite != null && openHandSpriteName != "")
				myHandIconSprite.spriteName = openHandSpriteName;
		}
	}

	public void ClosedHand(){
		if(!handClosed){
			//Debug.Log ("Hand closed.");
			//Vector3 screenPoint = myCamera.WorldToScreenPoint(this.transform.position);
			//GameInput.instance.AcceptExternalClick(screenPoint);
			handClosed = true;
			inputStartLocation = currentMousePosition;
			if(myHandIconSprite != null && closeHandSpriteName != "")
				myHandIconSprite.spriteName = closeHandSpriteName;
		}
	}

	void OperateRSMouseInput() {
		if (handClosed) {
			// Determine if a swipe is occurring
			if (Mathf.Abs(inputStartLocation.x - currentMousePosition.x) > swipeDistance && !foundSwipe) {
				foundSwipe = true;
				if (inputStartLocation.x > currentMousePosition.x) {
					GameInput.instance.AcceptExternalSwipeLeft();
				} else {
					GameInput.instance.AcceptExternalSwipeRight();
				}
			}
			if (Mathf.Abs(inputStartLocation.y - currentMousePosition.y) > swipeDistance && !foundSwipe) {
				foundSwipe = true;
				if (inputStartLocation.y > currentMousePosition.y) {
					GameInput.instance.AcceptExternalSwipeDown();
				} else {
					GameInput.instance.AcceptExternalSwipeUp();
				}
			}
		}
	}
}
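The swipe detection in OperateRSMouseInput above boils down to a threshold comparison between where the hand closed and where it is now, checking horizontal movement first. A minimal standalone sketch of that decision (SwipeLogic and ClassifySwipe are illustrative helper names, not part of the SDK or the script above) looks like this:

```csharp
using System;

// Standalone sketch of the swipe decision used in OperateRSMouseInput.
// Given the screen point where the hand closed and the current point,
// returns the swipe direction once the distance travelled exceeds the
// threshold, or "none" if the hand has not moved far enough yet.
// Horizontal movement is checked first, matching the script above.
public static class SwipeLogic
{
    public static string ClassifySwipe(float startX, float startY,
                                       float currentX, float currentY,
                                       float swipeDistance)
    {
        if (Math.Abs(startX - currentX) > swipeDistance)
            return startX > currentX ? "left" : "right";
        if (Math.Abs(startY - currentY) > swipeDistance)
            return startY > currentY ? "down" : "up";
        return "none";
    }
}
```

With the default swipeDistance of 100 pixels, a hand that closed at x=400 and has moved to x=250 registers a left swipe, since it travelled 150 pixels toward the left edge of the screen.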

Given that, here’s what it looks like in the Unity panel view:

[Screenshot: Unity panel view]

For reference, here is how some common Intel® Perceptual Computing API calls map to their Intel® RealSense™ SDK equivalents:

  QueryVoiceRecognized → Query Voice Recognized
  pp.QueryGeoNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_LEFT, out leftHand) → Query Geo Node
  pp.QueryGeoNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_RIGHT, out rightHand) → Query Geo Node
  pp.QueryFaceLocationData(faceId, out ddata) → Query Face Location
  pp.QueryGesture(PXCMGesture.GeoNode.Label.LABEL_ANY, out gdata) → Query Gesture

Conclusion

While there is no 1:1 path to migrate an application from Intel Perceptual Computing to Intel RealSense technology, developers who know the former will be encouraged to see how far the latter has come in just a single year. And development is continuing at a healthy pace. With laptops, 2 in 1s, and All-in-Ones with integrated Intel RealSense 3D cameras soon to appear on the market, it's a great time to give the technology a try and write applications and games now, while the app store space is wide open to clever, early-adopting developers.

About the Author

Chris Skaggs is a 15-year veteran of the web and mobile software industry. The founder and CEO of both Code-Monkeys and Soma Games LLC, Chris has delivered software applications to some of the country's most discerning clients, including Intel, Four Seasons, Comcast, MGM, and Aruba Networks. In addition to corporate customers, Code-Monkeys and Soma Games have programmed many casual and mid-core games for iPhone, iPad, Android, and Mac/PC platforms. A Black Belt in Intel's Software Developer Network, Chris also writes and speaks on topics surrounding the rapidly changing mobile application environment at venues like GDC Next, CGDC, Casual Connect, TechStart, Serious Play, and AppUp Elements.

