
Tutorial: Using Intel® RealSense™ Technology in the Unreal Engine* 3 - Part 2

Download PDF 854 KB

Part 1

Setting Up Visual Studio 2010 for the Example Game

The steps below set your map file as the default map for the Example game by modifying its .ini files.

  1. Go to <UE3 Source>\ExampleGame\Config.

  2. Open DefaultEngine.ini and change it as shown below.

    [URL]

    MapExt=umap

    Map=test.umap

    LocalMap=BLrealsense_Map.umap

    TransitionMap=BLrealsense_Map.umap

    EXEName=ExampleGame.exe

    DebugEXEName=DEBUG-ExampleGame.exe

    GameName=Example Game

    GameNameShort=EG

  3. Open ExampleEngine.ini and change it as listed below.

    [URL]

    Protocol=unreal

    Name=Player

    Map=test.umap

    LocalMap=BLrealsense_Map.umap

    LocalOptions=

    TransitionMap=BLrealsense_Map.umap

    MapExt=umap

    EXEName=ExampleGame.exe

    DebugEXEName=DEBUG-ExampleGame.exe

    SaveExt=usa

    Port=7777

    PeerPort=7778

    GameName=Example Game

    GameNameShort=EG

  4. Open the UE3 solution file, UE3.sln, located in <UE3 source>\Development\Src, in Visual Studio.

    Figure 1: Microsoft Visual Studio* 2010.

  5. Build and run as in the previous steps. You will see the Unreal initial window and your game.

Using the Coordinate System in Unreal Engine

Before linking with the Intel® RealSense™ SDK, it is important to understand the coordinate system in Unreal.

Position is tracked along the X, Y, and Z axes (refer to the “Origin” and “RotOrigin” classes in the UE3 source code), and rotation is expressed either as Euler angles (Pitch-Yaw-Roll) or as a quaternion (refer to https://en.wikipedia.org/wiki/Quaternion for more detail).


Figure 2: Coordinate system.

A quaternion has one scalar component and three vector components.

To convert Euler angles (X-Y-Z) to a quaternion:
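
One common closed form of this conversion, given here as a sketch using the widely used yaw-pitch-roll convention (roll φ about X, pitch θ about Y, yaw ψ about Z; the exact signs depend on the rotation order and handedness), is:

q_w = cos(φ/2)·cos(θ/2)·cos(ψ/2) + sin(φ/2)·sin(θ/2)·sin(ψ/2)
q_x = sin(φ/2)·cos(θ/2)·cos(ψ/2) − cos(φ/2)·sin(θ/2)·sin(ψ/2)
q_y = cos(φ/2)·sin(θ/2)·cos(ψ/2) + sin(φ/2)·cos(θ/2)·sin(ψ/2)
q_z = cos(φ/2)·cos(θ/2)·sin(ψ/2) − sin(φ/2)·sin(θ/2)·cos(ψ/2)

In the engine, this conversion is handled by FQuat::MakeFromEuler(), which is used later in UpdateSkelPose().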

Autoexpand Setup for a Debugger in Visual Studio 2010 (Optional)

By default, the bone structure array and the position and rotation arrays are not displayed in a readable form in the Visual Studio debugger. To see readable debugging values, follow the steps below.

  1. Find your Autoexp.dat file.
     

    For Visual Studio 2010 on Windows 7 64-bit, it is located at C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Packages\Debugger.

  2. Find the debugging script and open it.
     

    <UE3 source>\Development\External\Visual Studio Debugging\AUTOEXP.DAT_addons.txt

  3. Copy each [AutoExpand] and [Visualizer] section into your Autoexp.dat.
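
For reference, entries in the [AutoExpand] section use the pattern TypeName=<MemberName,format>. The entry below is purely illustrative (it is not taken from the UE3 addons file) and shows how a UE3 FVector could be displayed:

	[AutoExpand]
	FVector=X=<X> Y=<Y> Z=<Z>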

Intel® RealSense™ SDK Enabling on Unreal Engine 3

This section describes the Intel RealSense SDK-related changes in Unreal Engine 3 after installing the Intel RealSense SDK and the Depth Camera Manager. The face landmark and head-pose tracking APIs in the Intel RealSense SDK are used to manipulate the facial expression and head movement of the example character. Head-pose tracking is intuitive, since the roll, yaw, and pitch values can be used in Unreal Engine 3 as is, but face landmark tracking is more complicated.


Figure 3: Roll-Yaw-Pitch.

The Intel RealSense SDK provides 76 trackable landmark points for the face. Each expression, such as a blink or an open mouth, corresponds to a value range between the relevant points. For example, when the eye is closed, the distance between point 12 and point 16 is close to 0; when the eye is open, the distance is greater than 0 and varies from person to person.

Based on this, the implementation relies on a relative mapping between the user's minimum/maximum values and the character's. For blinking, for example, the user's measured eyelid distances for open and closed eyes are used to calculate how far apart the game character's eyelids should be.


Figure 4: Face landmarks and numbers of the Intel® RealSense™ SDK.

<UE3> is the folder where UE3 is installed. The following four files need to be modified.

  • <UE3>\Development\Src\UnrealBuildTool\Configuration\UE3BuildConfiguration.cs
  • <UE3>\Development\Src\UnrealBuildTool\Configuration\UE3BuildWin32.cs
  • <UE3>\Development\Src\Engine\Inc\UnSkeletalMesh.h
  • <UE3>\Development\Src\Engine\Src\UnSkeletalComponent.cpp

UE3BuildConfiguration.cs (Optional)

public static bool bRealSense = true;

RealSense-related code is enclosed in “#if USE_REALSENSE” blocks. This flag determines whether UE3BuildWin32.cs defines “USE_REALSENSE”. If you set it to “false”, the RealSense-related code is not compiled. This step is optional.

UE3BuildWin32.cs

if (UE3BuildConfiguration.bRealSense)
{
      SetupRealSenseEnvironment();
}

void SetupRealSenseEnvironment()
{
      GlobalCPPEnvironment.Definitions.Add("USE_REALSENSE=1");
      String platform = (Platform == UnrealTargetPlatform.Win64 ? "x64" : "Win32");

      GlobalCPPEnvironment.SystemIncludePaths.Add("$(RSSDK_DIR)/include");
      FinalLinkEnvironment.LibraryPaths.Add("$(RSSDK_DIR)/lib/" + platform);

      if (Configuration == UnrealTargetConfiguration.Debug) {
           FinalLinkEnvironment.AdditionalLibraries.Add("libpxc_d.lib");
      } else {
           FinalLinkEnvironment.AdditionalLibraries.Add("libpxc.lib");
      }
}

This defines “USE_REALSENSE”, which is used to enable or disable the Intel RealSense SDK-related code in the source files (optional).

Because Unreal Engine 3 builds from a makefile-style solution, the Intel RealSense SDK header and library paths must be added to the project's include and library paths here.

UnSkeletalMesh.h

#if USE_REALSENSE
	PXCFaceData* faceOutput;
	PXCFaceConfiguration *faceConfig;
	PXCSenseManager *senseManager;

	void InitRealSense();
	void ReleaseRealSense();
#endif

These are the declarations of the Intel RealSense SDK class pointers and functions. The bone-manipulation code is in UpdateSkelPose() in UnSkeletalComponent.cpp.

UnSkeletalComponent.cpp

#if USE_REALSENSE
	#include "pxcfacedata.h"
	#include "pxcfacemodule.h"
	#include "pxcfaceconfiguration.h"
	#include "pxcsensemanager.h"

	FLOAT rsEyeMin = 6;
	FLOAT rsEyeMax = 25;

	FLOAT rsMouthMin = 5;
	FLOAT rsMouthMax = 50;

	FLOAT rsMouthWMin = 40;
	FLOAT rsMouthWMax = 70;

	FLOAT chMouthMin = -105;
	FLOAT chMouthMax = -75;
……
#endif

This includes the Intel RealSense SDK header files and defines the minimum/maximum values for the user and the game character. Values prefixed with “rs” belong to the user, and values prefixed with “ch” belong to the game character (adjust these to match the user and the character's appearance). For blinking, for example, these values define how far apart the game character's eyelids should be when the user's eyes are open and closed.
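
The ratios used later in UpdateSkelPose() (mouthRatio, innerEyeRatio, and so on) are not shown in the excerpts; they are presumably derived from these ranges. A minimal sketch of such a derivation and the resulting mapping follows; chEyeInnerMax and chEyeInnerMin are assumed names that follow the same “ch” convention and are not part of the excerpt above.

	// Sketch (assumption): scale the user's measured range onto the character's bone range.
	// ratio = (character range) / (user range)
	FLOAT innerEyeRatio = (chEyeInnerMax - chEyeInnerMin) / (rsEyeMax - rsEyeMin);

	// A measured value is then remapped linearly, as in UpdateSkelPose():
	//   eyeOpen == rsEyeMin (eye closed)  ->  lEyeInner == chEyeInnerMin
	//   eyeOpen == rsEyeMax (eye open)    ->  lEyeInner == chEyeInnerMax
	FLOAT lEyeInner = chEyeInnerMin + (eyeOpen - rsEyeMin) * innerEyeRatio;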

void USkeletalMeshComponent::Attach()
{
……
#if USE_REALSENSE
	senseManager = NULL;
	InitRealSense();
#endif

The Attach() function calls the InitRealSense() function to initialize the Intel RealSense SDK’s relevant classes and configure the camera. 

#if USE_REALSENSE
void USkeletalMeshComponent::InitRealSense() {
	if (senseManager != NULL) return;

	faceOutput = NULL;

	senseManager = PXCSenseManager::CreateInstance();
	if (senseManager == NULL)
	{
		// error found
	}

	PXCSession *session = senseManager->QuerySession();
	PXCCaptureManager* captureManager = senseManager->QueryCaptureManager();

The InitRealSense() function configures which camera will be used and creates the face-tracking class instances.
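
A minimal sketch of how such a face-module setup typically continues with the Intel RealSense SDK is shown below; it uses the standard PXCSenseManager and PXCFaceModule calls, and the example's actual configuration may differ.

	// Sketch (assumption): typical face-module configuration, not the article's exact code.
	senseManager->EnableFace();                            // enable the face module
	PXCFaceModule* faceModule = senseManager->QueryFace();
	faceConfig = faceModule->CreateActiveConfiguration();
	faceConfig->SetTrackingMode(PXCFaceConfiguration::FACE_MODE_COLOR_PLUS_DEPTH);
	faceConfig->landmarks.isEnabled = true;                // 76 landmark points
	faceConfig->pose.isEnabled = true;                     // head roll/yaw/pitch
	faceConfig->ApplyChanges();
	senseManager->Init();                                  // start the pipeline
	faceOutput = faceModule->CreateOutput();               // read in UpdateSkelPose()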

void USkeletalMeshComponent::UpdateSkelPose( FLOAT DeltaTime, UBOOL bTickFaceFX )
{
……
#if USE_REALSENSE
if (senseManager->AcquireFrame(false) >= PXC_STATUS_NO_ERROR) {
	faceOutput->Update();
	int totalNumFaces = faceOutput->QueryNumberOfDetectedFaces();
	if (totalNumFaces > 0) {

The UpdateSkelPose() function is used for head pose and face landmark tracking.
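
The yaw, pitch, and roll values and the points array used below are assumed to be read from the first detected face inside the totalNumFaces check; a minimal sketch using the standard PXCFaceData calls (the example's actual code may differ):

	// Sketch (assumption): query pose angles and landmark points for the first face.
	PXCFaceData::Face* face = faceOutput->QueryFaceByIndex(0);

	// Head pose angles in degrees
	PXCFaceData::PoseEulerAngles angles;
	face->QueryPose()->QueryPoseAngles(&angles);
	FLOAT yaw = angles.yaw, pitch = angles.pitch, roll = angles.roll;

	// Landmark points; points[n].image holds the pixel coordinates used below
	PXCFaceData::LandmarksData* landmarksData = face->QueryLandmarks();
	PXCFaceData::LandmarkPoint* points = new PXCFaceData::LandmarkPoint[landmarksData->QueryNumPoints()];
	landmarksData->QueryPoints(points);    // remember to delete[] points afterward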

// Head
FVector v(yaw, roll, pitch);

LocalAtoms(6).SetRotation(FQuat::MakeFromEuler(v));
LocalAtoms(6).NormalizeRotation();

Head-pose tracking is intuitive because roll, yaw, and pitch values from the Intel RealSense SDK can be used as is.


Figure 5: Face landmarks and numbers that are used for eyes and mouth expression.

To express blinking, landmark points 12, 16, 20, and 24 are used; points 47, 51, 33, and 39 are used for the mouth expression (the detailed implementation depends on the developer's preference).

// Mouth
FLOAT mouthOpen = points[51].image.y - points[47].image.y;
mouth = chMouthMax - (mouthOpen - rsMouthMin) * mouthRatio;

mouthOpen = points[47].image.x - points[33].image.x;
rMouthWOpen = chMouthWMin + (mouthOpen - rsMouthWMin) * mouthWRatio;

mouthOpen = points[39].image.x - points[47].image.x;
lMouthWOpen = chMouthWMin + (mouthOpen - rsMouthWMin) * mouthWRatio;

cMouth = chMouthCMax - (mouthOpen - rsMouthWMin) * mouthCRatio;

// Left Eye
FLOAT eyeOpen = points[24].image.y - points[20].image.y;
lEyeInner = chEyeInnerMin + (eyeOpen - rsEyeMin) * innerEyeRatio;
lEyeOuter = chEyeOuterMin + (eyeOpen - rsEyeMin) * outerEyeRatio;
lEyeUpper = chEyeUpperMin + (eyeOpen - rsEyeMin) * upperEyeRatio;
// Right Eye
eyeOpen = points[16].image.y - points[12].image.y;
rEyeInner = chEyeInnerMin + (eyeOpen - rsEyeMin) * innerEyeRatio;
rEyeOuter = chEyeOuterMin + (eyeOpen - rsEyeMin) * outerEyeRatio;
rEyeUpper = chEyeUpperMin + (eyeOpen - rsEyeMin) * upperEyeRatio;
rEyeLower = chEyeLowerMin + (eyeOpen - rsEyeMin) * lowerEyeRatio;

BN_Lips_Corner_R, BN_Lips_Corner_L, and BN_Jaw_Dum are used for the mouth expression, and BN_Blink_UpAdd, BN_Blink_Lower, BN_Blink_Inner, and BN_Blink_Outer are used to express eye blinking. (Refer to the “Facial Bone Structure in Example Characters” section for each bone number.)

// Mouth
FVector m(90, 0, mouth);
LocalAtoms(59).SetRotation(FQuat::MakeFromEuler(m));

LocalAtoms(57).SetTranslation(FVector(mouthWXZ[2], rMouthWOpen, mouthWXZ[3])); // Right side
LocalAtoms(58).SetTranslation(FVector(mouthWXZ[4], lMouthWOpen * -1, mouthWXZ[5])); // Left side

// Left Eye
LocalAtoms(40).SetTranslation(FVector(eyeXY[0], eyeXY[1], lEyeUpper)); // Upper
LocalAtoms(41).SetTranslation(FVector(eyeXY[2], eyeXY[3], lEyeLower)); // Lower
LocalAtoms(42).SetTranslation(FVector(eyeXY[4], eyeXY[5], lEyeInner)); // Inner
LocalAtoms(43).SetTranslation(FVector(eyeXY[6], eyeXY[7], lEyeOuter)); // Outer

// Right Eye
LocalAtoms(47).SetTranslation(FVector(eyeXY[8], eyeXY[9], rEyeLower)); // Lower
LocalAtoms(48).SetTranslation(FVector(eyeXY[10], eyeXY[11], rEyeOuter)); // Outer
LocalAtoms(49).SetTranslation(FVector(eyeXY[12], eyeXY[13], rEyeInner)); // Inner
LocalAtoms(50).SetTranslation(FVector(eyeXY[14], eyeXY[15], rEyeUpper)); // Upper

void USkeletalMeshComponent::ReleaseRealSense() {
	if (faceOutput)
		faceOutput->Release();

	faceConfig->Release();
	senseManager->Close();
	senseManager->Release();
}

ReleaseRealSense() closes and releases all of the Intel RealSense SDK-related class instances.

Facial Bone Structure in Example Characters

In the example, the face is designed with 58 bones. In Figure 6, each box represents a bone, and the bone names are listed in the figure.


Figure 6: Names of bones.

Conclusion

To create an avatar in UE3 that copies the user's facial movements and expressions with the Intel RealSense SDK, and so enrich the gaming experience, modifying the UE3 source code is the only option, and developers must know which source files to change. We hope this document helps you build avatars in UE3 with the Intel RealSense SDK.

About the Authors

Chunghyun Kim is an application engineer in the Intel Software and Services Group. He focuses on game and graphic optimization on Intel® architecture.

Peter Hong is an application engineer at the Intel Software and Services Group. He focuses on enabling the Intel RealSense SDK for face, hand tracking, 3D scanning, and more.

For More Information

Epic Unreal Engine
https://www.unrealengine.com

Intel RealSense SDK
http://software.intel.com/realsense

Part 1

