
Virtual Archers Put Gesture to the Test with Longbow*


By John Tyrrell

Developer Jason Allen reworked his archery mini-game Longbow*, initially released in 2013 for mobile devices, for Intel® RealSense™ technology-enabled laptops, PCs, and tablets. The game, slated for release in August 2015, is a relatively simple archery simulation set in a series of medieval-style environments where players must take distance, wind speed, and wind direction into account to hit the bull’s eye (Figure 1).

Figure 1: Players fire at traditional-looking targets in a rustic setting.

Intrigued by the 3D input possibilities of the Intel® RealSense™ SDK and Intel® RealSense™ 3D Camera (F200), Allen saw the archery-based gameplay as a perfect opportunity to use hand and gesture tracking for the action of aiming and firing arrows.

 

Optimizations and Challenges

 

Gesture Controls

Longbow for Intel RealSense technology is played using the hand-tracking capabilities of the Intel RealSense 3D camera, with the player’s forward bow-holding hand used for aiming and the rear hand used to fire. To aim arrows, the game detects the first hand raised to the camera and records its initial 3D position, which then becomes the center point. The player then aims by moving the forward hand, with the game tracking its distance and direction relative to the center point. The natural choice, mimicking the action of a real archer, is to hold the aiming hand in a fist, but the game does not actually require this.

The following code tracks the user’s hand movements. It first queries the hand’s mass center using the Intel RealSense SDK member function QueryMassCenterImage from the PXC[M]HandData interface. If no initial orientation has been recorded, it assumes the user has just raised a hand to the camera and calibrates a new starting point. Otherwise, it measures how far the user has moved the hand from that initial point and interprets the movement in the game (in Longbow’s case, rotating the camera).

// Query the center of mass of the detected hand in image coordinates.
PXCMPointF32 imageLocation = data.QueryMassCenterImage();

// Normalize the image coordinates to the range -1..1.
Vector3 normalizedImageLocation = new Vector3(
    imageLocation.x / input.colorWidth * 2f - 1f,
    imageLocation.y / input.colorHeight * 2f - 1f,
    0f);

// No initial orientation yet: the hand was just raised, so calibrate a new center point.
if (initialOrientation == Vector3.zero)
{
    if (!calibrating)
    {
        calibrationElapsed = 0f;
        moveDelta = Vector3.zero;
        calibrating = true;
    }
    else if (calibrationElapsed > .3f)
    {
        // After a short settling period, record the current hand position as the center point.
        initialOrientation = new Vector3(-normalizedImageLocation.y, normalizedImageLocation.x, 0f);
        moveDelta = Vector3.zero;
        calibrating = false;
    }
}

if (initialOrientation == Vector3.zero) return;

// Measure how far the hand has moved from the center point and scale by sensitivity.
moveDelta = (initialOrientation -
    new Vector3(-normalizedImageLocation.y, normalizedImageLocation.x, 0f)) * sensitivity;

The next code snippet measures the user’s hand depth in the same way. It first checks whether a reference point already exists and, if not, calibrates one. It then measures the delta (this time in depth) from that reference point to determine how far the player has pulled the hand back.

// No depth reference yet: record the current hand depth as the starting point.
if (initialDepth == 0f)
{
    initialDepth = data.QueryMassCenterWorld().z;
}

// How far the hand has moved in depth (toward or away from the camera) since calibration.
myDepth = data.QueryMassCenterWorld().z - initialDepth;

The player uses the other hand to pull back the arrow while the game uses depth tracking to measure its distance from the forward aiming hand (Figure 2). When the rear firing hand reaches a predetermined distance from the aiming hand, the arrow automatically fires.
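The release check itself can be kept very small. The following is a minimal sketch of how such a depth-based release might look, assuming per-hand depth deltas are tracked as in the snippets above; the names releaseDistance, aimHandDepth, drawHandDepth, and FireArrow are illustrative and not taken from Longbow’s source.

using UnityEngine;

// Sketch of a depth-based arrow release (illustrative names, not Longbow's actual code).
public class ArrowRelease : MonoBehaviour
{
    public float releaseDistance = 0.15f; // pull-back distance (meters) that triggers the shot
    public float aimHandDepth;            // depth delta of the forward (aiming) hand
    public float drawHandDepth;           // depth delta of the rear (drawing) hand
    bool arrowNocked = true;

    void Update()
    {
        // Fire automatically once the drawing hand is far enough behind the aiming hand.
        if (arrowNocked && Mathf.Abs(drawHandDepth - aimHandDepth) > releaseDistance)
        {
            FireArrow();
            arrowNocked = false;
        }
    }

    void FireArrow()
    {
        // Hypothetical helper: spawn the arrow, apply velocity based on draw distance, play effects.
    }
}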

Figure 2: Players use their hands to mimic the gestures of aiming and drawing a bow.

Allen originally tested a different and more realistic firing gesture whereby the action of opening the rear hand would release the arrow. However, the responsiveness of this gesture proved inconsistent and hence frustrating for players. As a result, the hand-opening gesture was abandoned during development in favor of the simpler depth-based mechanism.

Despite players occasionally misfiring arrows as they became accustomed to the maximum distance they could pull back without firing, the depth-based system proved to be much more reliable and accurate, resulting in a more enjoyable experience.

Interpolating Data

When using human physical motion as an input device―either through the use of a 3D camera or an accelerometer in a handheld device―a common issue is jittery on-screen movements. This is caused by the constant minute movements of the hand and the sensitivity of the input device―in this case the Intel RealSense camera―and the sheer volume of precise data it generates.

With Longbow, Allen used the Unity 3D function Lerp (linear interpolation) to smooth the input data and deliver fluid on-screen movement. He first identified the optimum number of times per second the game needed to pull hand-detection data from the camera to avoid detectable lag for the user; this turned out to be 5 to 10 times per second, considerably lower than the game’s frame rate of 30 frames per second. Linear interpolation is then applied to the data, averaging the samples and estimating where the hand will be, which results in a smooth and accurate on-screen rendering of the player’s movements. Allen smoothed the camera’s rotation based on the moveDelta value calculated earlier. The smoothness value determines how heavily the input is smoothed: too much produces laggy movement, and too little lets the movement jitter by tiny amounts.

// Smoothly rotate the camera toward the target orientation derived from moveDelta.
transform.rotation = Quaternion.Lerp(transform.rotation,
    Quaternion.Euler(moveDelta + new Vector3(0f, yOffset, 0f)),
    Time.deltaTime * smoothness);

Allen also discovered that pulling data from the Intel RealSense camera as infrequently as possible and applying interpolation reduces the load on the processor, which helps the game maintain a steady frame rate and run more smoothly. This is particularly helpful when running the game on less powerful devices and ultimately improves the overall user experience.
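A minimal sketch of this pattern is shown below: the camera is sampled only a few times per second, and Lerp fills in the frames between samples. The QueryHandDelta helper, pollInterval, and smoothness values here are illustrative assumptions, not Longbow’s actual code.

using UnityEngine;

// Sketch: poll the camera well below the frame rate and interpolate between samples.
public class SmoothedHandInput : MonoBehaviour
{
    public float pollInterval = 0.15f;  // roughly 6-7 samples per second, versus 30 fps rendering
    public float smoothness = 8f;       // higher = snappier, lower = smoother but laggier
    Vector3 latestDelta;                // most recent raw sample from the camera
    Vector3 smoothedDelta;              // interpolated value applied every frame
    float pollTimer;

    void Update()
    {
        pollTimer += Time.deltaTime;
        if (pollTimer >= pollInterval)
        {
            pollTimer = 0f;
            latestDelta = QueryHandDelta(); // hypothetical wrapper around the PXCMHandData query
        }

        // Interpolate toward the latest sample so movement stays smooth even though
        // new camera data only arrives a few times per second.
        smoothedDelta = Vector3.Lerp(smoothedDelta, latestDelta, Time.deltaTime * smoothness);
        transform.rotation = Quaternion.Euler(smoothedDelta);
    }

    Vector3 QueryHandDelta()
    {
        // Placeholder: in the real game this would return the moveDelta from the hand-tracking code.
        return latestDelta;
    }
}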

 

Optimizing the UX

 

The biggest issue Allen had during development was adapting the game’s user experience for the Intel RealSense camera. He initially explored applying gesture controls to the game’s entire user interface, from the menu selections right through to gameplay, to make the game accessible without the need for touch or a mouse and keyboard. Using gestures to stop, start, navigate the menus, and make selections worked on a functional and technical level, but Allen found that the approach fell significantly short of delivering an enjoyable user experience.
 

The first problem was the complexity of teaching users which actions to use and where to use them. Players were required to memorize a set of three specific hand gestures to navigate the menu and start the game, and Allen found that they would frequently confuse gestures, resulting in unwanted outcomes. Additionally, particularly while the original closed-to-open fist firing gesture was still in the game, players would sometimes trigger an unwanted action such as pausing the game, adding to their frustration.

No Offense

Another interesting challenge that Allen faced while implementing the initial gesture-controlled interface was making sure that the gestures recognized by the Intel RealSense SDK were appropriate for an international audience. For example, the “two-finger pinch” or OK symbol, made by bringing together the tips of the thumb and forefinger, has a potentially offensive meaning in Brazilian culture. The inability to use certain commonly recognized gestures, and the resulting need to create original ones, made it even more complex to design a gesture control scheme that users could memorize.

Heavy Hands

One unexpected issue that Allen found with the gesture controls was the physical discomfort players experienced from having to hold their hands in front of the camera throughout the game. This led to aching arms, which significantly reduced the fun factor. To address this issue, Allen modified the game to allow players to drop their hands between rounds, instructing the Intel RealSense camera to go through the process of detecting the hands again at the start of each new round.
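A sketch of how such a per-round reset might look is shown below, assuming the calibration fields from the earlier snippets (initialOrientation, initialDepth, calibrating, moveDelta); the StartNewRound hook itself is a hypothetical name.

// Hypothetical per-round reset: clearing the reference points forces the calibration
// branch in the tracking code above to run again when the player raises a hand.
void StartNewRound()
{
    initialOrientation = Vector3.zero;
    initialDepth = 0f;
    calibrating = false;
    moveDelta = Vector3.zero;
}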

Keeping With Tradition

Overall, the game’s initial gesture-only interface proved unintuitive to players and added a layer of complexity to the navigation. In the end, both Allen and Intel agreed that the menu interface would work better with touch and traditional mouse and keyboard controls. Because Longbow is played in close proximity to the camera and screen, these traditional controls are easy and accessible for the player and deliver a significantly more intuitive and comfortable user experience.

 

Testing and Analysis

 

As an independent developer, Allen had no testing pool and conducted the local testing alone using only his own computer. Fortunately for Allen, working with the Intel RealSense SDK meant he was able to count on Intel’s support at each stage of development. He used the Intel RealSense SDK documentation provided during the early phases, relying more heavily on the support of Intel engineers as the project took shape. Throughout their collaboration, Intel provided valuable feedback on the implementation of the gesture controls, including for the interface and the actions of drawing and firing arrows.

The main problems that arose through testing were the arrow-release mechanism and the user interface as described previously. The initial firing mechanism involved opening the fist to release the arrow, and testing showed that many users were unable to consistently fire this way. This led directly to the implementation of the modified firing mechanism based on drawing distance, whereby the arrow is fired when the drawing hand reaches a certain distance away from the forward aiming hand. Testing also led to the return to traditional mouse, keyboard, and touch controls for the game’s main navigation.

 

Intel RealSense SDK: Looking Forward

 

Following his Intel-inspired discovery of the Windows* Store, Allen now develops games for web and Windows devices in addition to his core work for the mobile market. His keen interest in developing for emerging platforms is what led to his involvement with Intel and his work in bringing Longbow to the Intel RealSense SDK platform.

Developing for the Intel RealSense SDK opened Allen’s mind to a world of new possibilities, the first being head tracking and simulations, either in a game or in an actual simulator where, for example, the user is being taught a new skill. The ability to look around a virtual world without having to wear head gear is a capability that Allen has already experimented with in his previously released game Flight Theory*.

Allen believes that Intel RealSense technology is a new frontier offering exciting new user experiences that will be available to growing numbers of consumers once the hardware begins its commercial rollout.

 

What’s Next for Longbow

 

Longbow was initially developed for mobile platforms, and the Windows version currently uses the same art assets (Figure 3). Allen intended to upgrade the graphics when he began developing the Intel RealSense SDK-enabled version of the game, but unexpected UX challenges sidelined the task, although a visual update is still high on the list of priorities.

Figure 3: Allen borrowed from the past to add more fun and a frisson of danger to Longbow*.

Now that Allen has the Intel RealSense SDK Gold release, he might revisit the original finger-tracking gesture control for firing arrows, using the release finger movement rather than the pullback distance-sensitive release mechanism.

 

About the Developer

 

Driftwood Mobile is the studio of independent developer Jason Allen, based in Tasmania, Australia. Allen founded the studio in 2008 to develop games for the blind and visually impaired, having noted how few games were adapted to that audience. Around the same time, the mobile gaming and app market was beginning to explode, a shift that Allen has successfully capitalized on with the release of five separate mobile titles to date. Collectively, the games have accumulated over 40 million downloads over the last three years, with bowling game Galaxy Bowling* being the most successful, both in terms of user numbers (currently approximately one million active users) and revenue.

Allen is currently exploring how to make Galaxy Bowling (Figure 4) accessible to the blind and visually impaired, with vital support from the community. According to Allen, the core challenge in adapting a game for visually impaired players is distilling the large amount of information simultaneously displayed on screen into comprehensible audio directions, which need to be delivered in a linear sequence so the player can process them. Allen aims to take the experience beyond the coded bleeps of early games, using more realistic sound effects to direct the player, and his experiments so far have proved surprisingly successful in delivering a fun experience.

Figure 4: Galaxy Bowling* for iOS* and Android* devices is Allen’s most successful title to date.

 

Additional Resources

 

Driftwood Mobile developer website

Intel® Developer Zone for RealSense™ Technology

Intel RealSense SDK

Intel® RealSense™ Developer Kit

Intel RealSense Technology Tutorials

