
Revving Up Space Astro Blaster* with Intel® RealSense™ Technology


Game developers are continually pushing the limits of technology, demanding faster processing and higher-quality graphics for their games. Alongside this push, laptops and tablets configured with the Intel® RealSense™ 3D F200 camera are transforming the way users interact with games and devices. Using Intel RealSense technology, developers are now integrating complex gesture-based controls into their games, including hand tracking, facial analysis, and even voice commands.

One such game, Space Astro Blaster*, developed by CompSens, was originally showcased at the 2014 Consumer Electronics Show in Las Vegas, Nevada. Built with a single battleship and a single scene, the PC game was a proof of concept that integrated the Intel RealSense SDK and allowed players to control the game without a keyboard or mouse. After a successful demo of the gesture controls, CompSens developed Space Astro Blaster (Figure 1) into a full title, with dozens of combinations of ships, enemies, and levels, taking full advantage of the Intel RealSense SDK to create an immersive gesture-controlled experience previously unavailable on any platform.


Figure 1: Game trailer of the 3D space shooting game Space Astro Blaster*,
which was developed using the Intel RealSense SDK.

Space Astro Blaster is a third-person space shooting game in which the player controls a spaceship against a progressively more difficult onslaught of enemies, ranging from passive asteroids to large and powerful enemy cruisers. Using hand and head movements, players move the ship in 3D space to avoid obstacles and aim its substantial weaponry, firing on enemy ships while progressing through the environments.

All gameplay functions can be performed without touching a keyboard or mouse. With the integrated gesture-recognition technology, the player can move the ship in any direction (up, down, side to side, and even in or out of the screen) by simply holding a hand in the air and moving it slightly in that direction. This gives the player six axes of control without the need for a controller. Finger gestures are also recognized: closing the hand continuously fires the primary cannon, while raising and parting the index and middle fingers into a “V” (V-sign) fires a larger missile (Figure 2).


Figure 2: Raising and parting two fingers in a V-sign fires large missiles.

Using the gesture-recognition feature supported by the Intel RealSense SDK, the game detects the V-sign and triggers the missile-launching event. For the basic firing gesture, the game uses algorithms (based on the joint-tracking output data from the Intel RealSense SDK) to evaluate the bending of fingers to determine in real time whether the hand pose is closed or open.
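The exact event wiring is CompSens's own, but the general pattern can be sketched. The following minimal C++ sketch, written against the PXC* interfaces of the Intel RealSense SDK C++ API, polls the hand module each frame for a fired "v_sign" gesture and calls a hypothetical FireMissile() game event. It assumes the gesture was enabled earlier through the hand module's configuration (for example, PXCHandConfiguration::EnableGesture), and the gesture name, setup, and callback are illustrative rather than taken from the game's source.

    // Sketch only: poll the hand module each frame and launch a missile when a
    // "v_sign" gesture is reported. PXC* types follow the Intel RealSense SDK
    // C++ API; FireMissile() is a hypothetical game-side event.
    #include <cwchar>
    #include "pxcsensemanager.h"
    #include "pxchanddata.h"

    void FireMissile();  // hypothetical game-side event

    // handData is the output object created once via PXCHandModule::CreateOutput().
    void PollVSign(PXCSenseManager *sm, PXCHandData *handData)
    {
        if (sm->AcquireFrame(true) < PXC_STATUS_NO_ERROR) return;
        handData->Update();  // refresh hand-tracking output for this frame

        // Walk the gestures fired on this frame and look for the V-sign.
        for (int i = 0; i < (int)handData->QueryFiredGesturesNumber(); ++i) {
            PXCHandData::GestureData gesture = {};
            if (handData->QueryFiredGestureData(i, gesture) >= PXC_STATUS_NO_ERROR &&
                wcscmp(gesture.name, L"v_sign") == 0) {
                FireMissile();  // trigger the missile-launching event
            }
        }
        sm->ReleaseFrame();  // let the pipeline process the next frame
    }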

Optimizing with Intel RealSense Technology

By utilizing the Intel RealSense SDK and an Intel RealSense 3D camera, developers like CompSens can integrate gesture control in a wide range of applications and scenarios. Space Astro Blaster specifically takes advantage of hand tracking, finger tracking, and basic facial analysis to integrate the unique control scheme found in the game.

Hand Tracking

Hand and finger tracking can take advantage of 22 tracking points, enabling recognition of static poses as well as dynamic gestures such as waves. CompSens used the algorithms integrated with the Intel RealSense SDK to find and track the hand in a specific pose (Figure 3) to enable firing of the weapons.

Open hand = free six-axis movement with no firing.
V-sign = launch a missile.
Closed hand = fire the primary cannon while moving freely.

Figure 3: Gesture controls

Waves and basic positions are used to move the ship on its six axes of control—up, down, left, right, forward, and back—and allow the player to move the battleship to complete the mission (Figure 4). Combining positional tracking with hand-and-finger tracking makes for an intuitive, controller-free gameplay style.

 
Figure 4: Waves and basic positions are used to move the battleship up, down, left, right, forward, and back.
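As an illustration of how hand position can drive all six directions, the following sketch (not CompSens's code) reads the tracked hand's center of mass in world coordinates through the SDK's PXCHandData interface and converts its offset from a calibrated rest position into a ship velocity. The rest position, dead zone, speed scale, and MoveShip() callback are assumptions made for the example.

    // Sketch only: convert the tracked hand's world-space center of mass into a
    // ship velocity on six axes (left/right, up/down, in/out of the screen).
    // Assumes handData->Update() has already been called for the current frame.
    #include <cmath>
    #include "pxchanddata.h"

    struct Vec3 { float x, y, z; };
    void MoveShip(const Vec3 &velocity);           // hypothetical game callback
    static Vec3 restPoint = { 0.0f, 0.0f, 0.4f };  // calibrated "hand at rest" position (meters)

    void SteerShip(PXCHandData *handData)
    {
        PXCHandData::IHand *hand = 0;
        if (handData->QueryHandData(PXCHandData::ACCESS_ORDER_NEAR_TO_FAR, 0, hand)
            < PXC_STATUS_NO_ERROR || !hand) return;

        PXCPoint3DF32 c = hand->QueryMassCenterWorld();  // hand center in meters

        const float kDeadZone = 0.02f;  // ignore small jitters (2 cm)
        const float kSpeed    = 10.0f;  // world units per meter of hand offset
        Vec3 v = { c.x - restPoint.x, c.y - restPoint.y, c.z - restPoint.z };
        if (std::fabs(v.x) < kDeadZone) v.x = 0.0f;
        if (std::fabs(v.y) < kDeadZone) v.y = 0.0f;
        if (std::fabs(v.z) < kDeadZone) v.z = 0.0f;

        v.x *= kSpeed; v.y *= kSpeed; v.z *= kSpeed;
        MoveShip(v);  // positive/negative components cover all six directions
    }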

Facial Analysis

Basic facial analysis is used for head tracking during specific segments of the game that test the player’s ability to dodge incoming obstacles with the battleship. By simply shifting their body to the left or right, players move the ship in the same direction (Figure 5), bringing a more visceral feel to the gameplay style.

 
Figure 5: Players can navigate obstacles by moving their head to the left or right.
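A minimal sketch of head-driven dodging, again using the PXC* interfaces rather than CompSens's actual code, might read the detected face's bounding rectangle and compare its center with the middle of the frame. It assumes face detection was enabled in the face configuration; kImageWidth, the threshold, and DodgeShip() are illustrative assumptions.

    // Sketch only: use the detected face's horizontal position to dodge left or
    // right. Depending on the camera's mirror mode, the sign may need flipping.
    #include "pxcfacedata.h"

    void DodgeShip(float direction);     // hypothetical: -1 = left, +1 = right
    static const int kImageWidth = 640;  // color-stream width used for tracking

    void TrackHeadDodge(PXCFaceData *faceData)
    {
        faceData->Update();                                   // refresh face output
        PXCFaceData::Face *face = faceData->QueryFaceByIndex(0);
        if (!face) return;

        PXCFaceData::DetectionData *detection = face->QueryDetection();
        PXCRectI32 rect;
        if (!detection || !detection->QueryBoundingRect(&rect)) return;

        // Offset of the face center from the middle of the frame, normalized to -1..1.
        float centerX = rect.x + rect.w * 0.5f;
        float offset  = (centerX - kImageWidth * 0.5f) / (kImageWidth * 0.5f);

        const float kThreshold = 0.15f;   // ignore small head movements
        if (offset < -kThreshold)      DodgeShip(-1.0f);
        else if (offset > kThreshold)  DodgeShip(+1.0f);
    }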

The Intel RealSense SDK helped simplify this development process for CompSens by providing gesture and position detection right out of the box. The SDK handles the tracking for these natural interface methods, letting game developers focus on wiring gestures into gameplay and the user interface rather than on the image-recognition and processing work needed to reliably measure and report gesture output.

Decisions and Challenges

When creating any game, developers need to understand the capabilities of the platforms they are targeting and which recognition modules are required. CompSens noted that in order to maintain fast performance and low latency in gesture recognition, modules that are not actively in use during a given part of the game are disabled. For example, running the facial recognition module during standard gameplay, which focuses on hand recognition, made the software more complicated and raised performance concerns on lower-performance platforms. To address this concern, CompSens enabled the facial recognition module only during the obstacle-avoidance levels; the hand recognition module was disabled during that activity to keep a healthy balance of features and performance.
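One way to express this kind of per-mode module management, assuming the PauseHand() and PauseFace() controls exposed by PXCSenseManager in the Intel RealSense SDK C++ API, is sketched below; the game-mode enumeration is an assumption for the example.

    // Sketch only: keep only the module the current game mode needs running, so
    // the unused one does not cost CPU time.
    #include "pxcsensemanager.h"

    enum GameMode { MODE_COMBAT, MODE_OBSTACLE_DODGE };

    void ConfigureModulesForMode(PXCSenseManager *sm, GameMode mode)
    {
        if (mode == MODE_COMBAT) {
            sm->PauseFace(true);   // facial analysis not needed while fighting
            sm->PauseHand(false);  // hand/finger tracking drives flight and firing
        } else {                   // MODE_OBSTACLE_DODGE
            sm->PauseHand(true);   // hand tracking idle during dodge sequences
            sm->PauseFace(false);  // head tracking steers the ship
        }
    }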

While the Intel RealSense SDK provides a wealth of recognition modules and algorithms, it also gives developers the flexibility to go beyond them. CompSens, for example, built a custom gesture-recognition path to improve performance for a specific finger gesture: the basic firing gesture, “close hand”. Initially, the gesture-recognition module in the Intel RealSense SDK was used; however, it sometimes failed to recognize the gesture and caused the basic cannon to fire intermittently, especially when the user’s hand was moving fast. CompSens therefore switched to its own algorithm for this gesture, evaluating the bending of the fingers from the SDK’s hand-tracking joint data in every frame to determine in real time whether the hand pose is closed or open. Because the Intel RealSense SDK is modular, this change was simple to integrate.
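CompSens has not published the algorithm itself, but the idea of deciding open versus closed from per-frame joint data can be sketched as follows: measure the distance from each fingertip joint to the hand center and treat the pose as closed when most fingertips sit near the palm. The joint selection, threshold, and FirePrimaryCannon() hook are illustrative assumptions, not the game's actual values.

    // Sketch only: per-frame open/closed decision from hand-joint positions.
    // The pose is treated as "closed" when most fingertips are near the hand center.
    #include <cmath>
    #include "pxchanddata.h"

    void FirePrimaryCannon();  // hypothetical game-side event

    static float Dist(const PXCPoint3DF32 &a, const PXCPoint3DF32 &b)
    {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    bool IsHandClosed(PXCHandData::IHand *hand)
    {
        PXCHandData::JointData center;
        if (hand->QueryTrackedJoint(PXCHandData::JOINT_CENTER, center)
            < PXC_STATUS_NO_ERROR) return false;

        const PXCHandData::JointType tips[] = {
            PXCHandData::JOINT_INDEX_TIP, PXCHandData::JOINT_MIDDLE_TIP,
            PXCHandData::JOINT_RING_TIP,  PXCHandData::JOINT_PINKY_TIP
        };
        const float kClosedDistance = 0.06f;  // fingertip-to-center distance (meters)

        int folded = 0;
        for (int i = 0; i < 4; ++i) {
            PXCHandData::JointData tip;
            if (hand->QueryTrackedJoint(tips[i], tip) >= PXC_STATUS_NO_ERROR &&
                Dist(tip.positionWorld, center.positionWorld) < kClosedDistance)
                ++folded;
        }
        return folded >= 3;  // most fingers folded toward the palm = closed hand
    }

    void UpdateFiring(PXCHandData::IHand *hand)
    {
        if (IsHandClosed(hand)) FirePrimaryCannon();  // fires continuously while closed
    }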

When the Intel RealSense SDK implementation runs on a mobile device, disabling and powering down the Intel RealSense 3D camera can save compute cycles and reduce power consumption. Disabling the camera in menus or during stage loads, where it is not needed, lets the platform conserve power, extending battery life for a longer game session or other productivity tasks.
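A simple way to apply this, sketched below under the assumption that the application owns a PXCSenseManager instance, is to close and release the pipeline when a menu is entered and rebuild it when gameplay resumes; the function names are illustrative.

    // Sketch only: stop streaming while the player is in a menu or a stage is
    // loading, then rebuild the pipeline when gameplay resumes.
    #include "pxcsensemanager.h"

    void EnterMenu(PXCSenseManager *&sm)
    {
        if (sm) { sm->Close(); sm->Release(); sm = 0; }  // stop streams, free the pipeline
    }

    PXCSenseManager *ResumeGameplay()
    {
        PXCSenseManager *sm = PXCSenseManager::CreateInstance();
        if (!sm) return 0;
        sm->EnableHand();  // re-register the modules the gameplay needs
        sm->EnableFace();
        if (sm->Init() < PXC_STATUS_NO_ERROR) { sm->Release(); return 0; }
        return sm;
    }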

Testing

With a new interface methodology—and the potential problems stemming from teaching players how to interact with the game—developers might assume that the normal testing process would require alterations; however, according to CompSens, that wasn’t the case with their app.

In keeping with normal software development practice, the Space Astro Blaster team put the game in front of players, watched how they reacted, and listened to what was reported after the play testing. The application changes made as a result of that user testing were, of course, slightly different from what traditional input methods would have required, but the testing and validation process was nearly identical to previous projects. Before long, the team found a hand-fatigue issue caused by the new input method: the player’s hand had to be held up continuously to control the spaceship’s flight. To address the problem, they designed two control modes that alternate throughout the game: hand-and-gesture control for fighting alien warships, and head-motion control for steering the battleship around asteroids. Alternating between the two keeps the player’s hand from tiring.

Intel Tools and Support

With the world of natural gesture recognition being relatively new, Intel built the Intel RealSense SDK and its developer support program to improve the integration experience for developers like CompSens. During the development of Space Astro Blaster, CompSens received hands-on support from Intel, working directly with Intel on Intel RealSense SDK start-up issues, algorithm-integration questions, and the custom recognition path built specifically for Space Astro Blaster.

Online documentation is also a critical feature of any software package for developers. A quick look at the Intel RealSense SDK Support page shows a collection of reference documentation, release notes, and training guides in the form of videos, presentations, and webinars. Intel includes reference design guides to help point developers in the right direction for integrating many of these new user interface methods.

Sample code is also available on the Intel RealSense SDK Support Training page, listed under the Tutorials section. If the sample code solves the problem the developer is addressing, it can simply be dropped in as-is; otherwise, it can be modified and extended with the help of the in-line documentation provided with each sample.

Intel also helped CompSens with the development of Space Astro Blaster by showcasing the early stages of the game at trade shows such as the Consumer Electronics Show in Las Vegas and the Intel Developer Forum in San Francisco. These opportunities allowed the media and OEMs to interact with the game and learn how Intel RealSense technology can change the way players interact with computers, while also spreading the word about the space-based shooter.

Looking Forward

There are still more opportunities for CompSens to take advantage of the Intel RealSense SDK and platform in the game. The team wants to use the 3D face modeling feature included with the Intel RealSense SDK so that players can create in-game 3D models of their faces. These models could be used as the icon for specific spaceships or even placed in the cockpit of the battleships as a player’s avatar, making Space Astro Blaster an even more immersive experience.

As computers and software continue to become more complex, users will need a more natural way to interact with machines. The ability to use gestures, voice, and motion will become not only a welcome change, but a necessary piece of the computing ecosystem.

About CompSens

CompSens, based in Beijing, China, develops both virtual reality and augmented reality software solutions. With a focus on natural-interaction software for professional and consumer spaces, it was a natural progression for the team to look into game development as a way to learn about the new methods of user interaction that the Intel RealSense SDK provides. CompSens is planning further integration of the Intel RealSense SDK into the Space Astro Blaster game and also plans to offer lifelike user-interface integration into other upcoming software projects.

Additional Resources

Intel RealSense Developer Zone

Intel RealSense 3D Camera

Case Study: Space Between* Plumbs Ocean Depths with Intel RealSense Technology

