
Seduce Case Study


By John Gaudiosi

Downloads


Seduce Case Study [PDF 831.93KB]

Introduction


With Moore’s Law pushing technology forward at a record pace, the challenge for software developers today is to keep up with new input devices such as touch screens, perceptual computing, and eye tracking. One innovative developer created a future-proof system that will allow game makers and app programmers to stay ahead of the curve as new technology is introduced across multiple platforms.

When Eskil Steenberg of Quel Solaar, an independent development and research studio, was contacted by the organizers of the Intel Ultimate Coder Challenge: Going Perceptual contest, he decided to enter because he loves using the Ultrabook™ device and wanted to do something very “techy and opaque.” Steenberg admits that he does not normally enter competitions and didn’t expect to win this one; however, he is glad he entered this particular contest, where his development toolkit, Seduce, won in the technical merit category.

The Intel Ultimate Coder Challenge: Going Perceptual contest was created to encourage innovation around perceptual computing. By engaging with early innovators and thought leaders, contestants like Steenberg shared their progress throughout the contest, communicating weekly about their experiences with the Intel® Perceptual Computing SDK, the challenges they encountered, and their solutions. They also shared which features of the Intel Perceptual Computing SDK each contestant leveraged and the new algorithms they developed. Collectively, the contestants improved the resulting apps across the board.

The Seduce App: Future Proof


Steenberg and Intel had discussed collaborating on Seduce before the competition. As a game developer and programmer who currently sits on the OpenGL* Architecture Review Board, Steenberg wanted to create an app that would allow for seamless interactivity in today’s cross-platform technology world.

“The PC is an incredibly open platform, and you can connect a wide variety of hardware, displays, and input devices,” said Steenberg. “Many people think of it as a desktop device with a mouse and keyboard.” Steenberg’s goal was to build applications that can accommodate the hardware available today as well as future hardware. “When a new input device enters the market, you usually try to redesign the application because there are certain things you didn’t think about. I wanted to fix that problem.”

Future-proofing technology is something that everyone tries to do. The key to longevity of a program or app in the marketplace is to design it so that programmers and developers won’t have to constantly rewrite code. Steenberg accomplished this goal by focusing on a few products on the horizon.

“I looked at very large displays built for multiple users, where the user cannot be expected to reach the entire display in a touch interface,” said Steenberg. “This impacts things such as the traditional start button, for example. When an entire wall makes up the display, users cannot touch the start button. I also examined the number of mouse pointers available for multi-touch, something that most interfaces and software today don’t handle well. For example, a mouse doesn’t work on Xbox* and a controller doesn’t work on Microsoft Windows*. In addition, I looked at resolution independence and scalability. Computers assume that an element with a certain amount of pixels will cover a certain portion of the screen. Independence in resolution input and graphics would allow low-resolution displays to work with a high-resolution mouse and super high-resolution displays to work with touch buttons for coarse precision.”


Seduce demo pop-up menu

Steenberg noted the possibilities and limitations of these technologies and coded the software to stay within those bounds. He said complications sometimes arose because he had to solve problems he wouldn’t otherwise have tackled. He made a future projection, listed its requirements, and then made architectural adjustments. “I created an interface that generically described input devices so users can connect and configure them to any input device,” said Steenberg. “The technology challenge was not the interface, but rather the ability of plug-ins to take over the rendering pipeline, which was much more complicated. In the end, it was a lot of trial and error. The Microsoft interface for OpenGL and context system that exists is robust and good; however, it’s an obscure piece of technology that only a few people on the OpenGL review board, operating system developers, and I care about. There are very few specific uses for creating multiple contexts of OpenGL and making them work together. I searched for people who could give me sample code or a description of how this is done, and I spent a few late nights on the Internet to learn how it works.”

To display an interface, one shouldn’t assume that pixel resolution corresponds in any way to the size of the interface. A normal image on a PC screen is a collection of pixels, so when zoomed in, the image gets blocky. Steenberg set out to create interface graphics that stay crisp regardless of size. Interfaces today are designed around a general idea of the display resolution and use bitmap graphics. He took a different approach, storing the images as triangles and curves described mathematically. Regardless of how small or large a triangle gets, you can re-compute which pixels fall inside or outside it. An interface built from triangles, or polygons, scales with resolution: it becomes crisper and shows more detail when zoomed in. The entire interface is scalable, and each individual element can be adjusted to suit different environments and displays, such as using a touch display while wearing gloves.
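The core idea is easiest to see with a toy example. The following sketch (in C; the struct and function names are hypothetical and not part of Seduce) stores an interface element in a unit coordinate space and only converts it to pixels at draw time, so the same description serves a laptop panel or a wall-sized display:

```c
/* Minimal sketch of resolution-independent UI placement (hypothetical
 * names, not the actual Seduce API). Elements live in a unit coordinate
 * space and are converted to pixels only at draw time. */
#include <stdio.h>

typedef struct {
    float x, y;          /* position in unit space, 0.0 to 1.0 */
    float width, height; /* size in unit space */
    const char *label;
} UIElement;

/* Convert a unit-space element to pixel coordinates for one display. */
static void element_to_pixels(const UIElement *e,
                              int screen_w, int screen_h,
                              int *px, int *py, int *pw, int *ph)
{
    *px = (int)(e->x * screen_w);
    *py = (int)(e->y * screen_h);
    *pw = (int)(e->width * screen_w);
    *ph = (int)(e->height * screen_h);
}

int main(void)
{
    UIElement button = { 0.05f, 0.9f, 0.2f, 0.08f, "Start" };
    int px, py, pw, ph;

    /* The same element maps cleanly to a laptop panel... */
    element_to_pixels(&button, 1920, 1080, &px, &py, &pw, &ph);
    printf("%s on 1920x1080: %d,%d %dx%d\n", button.label, px, py, pw, ph);

    /* ...or to a wall-sized display, without touching the description. */
    element_to_pixels(&button, 7680, 4320, &px, &py, &pw, &ph);
    printf("%s on 7680x4320: %d,%d %dx%d\n", button.label, px, py, pw, ph);
    return 0;
}
```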


Seduce demo icon list

The interface should take into account the users’ viewing angles within a 3D space so it can easily support stereoscopic displays, head tracking, head-mounted displays, and augmented-reality applications. Steenberg created the Betray library, which opens up the future-proof technology that Seduce unleashes across all platforms, allowing any type of input device—from mouse to keyboard to gestures to touch screens—to seamlessly work on an app with the ease of an API.

“Betray is a library of all the inputs from the hardware,” said Steenberg. “I wanted to (hide) where the input originated because a pointer can come from many different devices, including a mouse, track pad, touch screen, or a Wii* remote. The device can have any number of buttons. I wanted to enter something very generic.”

The Betray library doesn’t use much space and actually comprises two different APIs. The first is what the application uses to ask for hardware capabilities, input, buttons, the display, and the things required to send out sounds; this covers 90 percent of how the app will be used. Betray itself is currently very small and has deliberately limited features, because Steenberg’s goal was a secondary API through which plug-ins provide input rather than read it. Betray thus allows users to write a plug-in for any type of input.

“You can install an SDK and then write a plug-in that explains to Betray what this hardware does. Betray then passes this information to the normal API for use,” said Steenberg. “The app doesn’t need to understand how the plug-in works; it simply requests the information so that developers can support hardware that they don’t have or understand.”
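A rough sense of how such a two-sided design could be organized is sketched below in C. The names are hypothetical and this is not the real Betray API: one side lets a plug-in register and update generic pointers, and the other side lets the application read them without knowing which device they came from.

```c
/* Hedged sketch of a plug-in-style input layer in the spirit of Betray
 * (all names here are hypothetical). A plug-in registers generic pointers
 * with the core library; the application only queries the generic side. */
#include <stdio.h>

#define MAX_POINTERS 16

typedef struct {
    float x, y;     /* normalized position, -1.0 to 1.0 */
    int   pressed;  /* primary button state */
} Pointer;

static Pointer pointers[MAX_POINTERS];
static int     pointer_count = 0;

/* Plug-in side: a device driver reports a pointer it has discovered. */
int input_register_pointer(void)
{
    if (pointer_count >= MAX_POINTERS)
        return -1;
    return pointer_count++;
}

/* Plug-in side: push the latest state for a registered pointer. */
void input_set_pointer(int id, float x, float y, int pressed)
{
    pointers[id].x = x;
    pointers[id].y = y;
    pointers[id].pressed = pressed;
}

/* Application side: read generic pointer state, device-agnostic. */
int input_pointer_count(void) { return pointer_count; }
Pointer input_get_pointer(int id) { return pointers[id]; }

int main(void)
{
    /* A hypothetical touch-screen plug-in reports two contacts... */
    int a = input_register_pointer();
    int b = input_register_pointer();
    input_set_pointer(a, -0.25f, 0.5f, 1);
    input_set_pointer(b,  0.40f, 0.1f, 0);

    /* ...and the application handles them without knowing the source. */
    for (int i = 0; i < input_pointer_count(); i++) {
        Pointer p = input_get_pointer(i);
        printf("pointer %d: %.2f,%.2f pressed=%d\n", i, p.x, p.y, p.pressed);
    }
    return 0;
}
```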


Betray relinquish test application

That means users can buy a plug-in that finds their buttons or pointers, changes the display or maximizes it in certain ways, implements sounds, or adds an entire 3D sound system.

“An interface, buttons, sliders, and other input devices are needed to support future hardware,” said Steenberg. “The end stage is an OpenGL standards interface that is fully 3D. Everything is transferrable and scalable, so there’s no pixel size. The interface handles everything from smartphones to a full wall-size display.”

Challenges Addressed During Development


Project and Technology

Steenberg needed to create an intuitive, future-proof library to seamlessly allow input from multiple devices. His first objective was to create the “Imagine” sub-library to handle directory management, unifying how volumes and directories are listed on Unix* and Windows*. A traditional Windows PC exposes multiple storage volumes; a volume is not a directory in itself but the place where you pick your physical hard drive, so Windows requires two separate pieces of information: the available volumes and their content. A Unix system has a single root from which everything branches off, and anything can be placed anywhere in the tree. This difference complicates code that must support both. In Steenberg’s solution, when the user requests the root directory, the library returns a synthetic directory listing of the machine’s volumes, and the app automatically descends into those drives. A recursive file search therefore covers all disks, and the code looks the same whether it runs on a Unix, Linux*, or Mac* system. He also created the application settings API (previously found in Seduce), dynamic loading of libraries, sharing of function pointers (needed for the plug-in system), threads and mutexes, and execution.
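The following sketch (hypothetical names, not the actual Imagine API) shows one way such a unified root listing can work: on Windows the root is synthesized from the available drive letters, while on Unix-like systems the single root is returned directly.

```c
/* Minimal sketch of a unified root-directory listing (hypothetical names).
 * Either way the caller sees one list of entries it can descend into. */
#include <stdio.h>
#include <string.h>
#ifdef _WIN32
#include <windows.h>
#endif

#define MAX_ENTRIES 64
#define NAME_LEN    260

typedef struct {
    char name[NAME_LEN];
    int  is_directory;
} DirEntry;

/* List the virtual root of the machine. */
int list_root(DirEntry *out, int max)
{
    int count = 0;
#ifdef _WIN32
    /* Windows: expose each available drive as a fake directory entry. */
    DWORD mask = GetLogicalDrives();
    for (char letter = 'A'; letter <= 'Z' && count < max; letter++) {
        if (mask & (1u << (letter - 'A'))) {
            snprintf(out[count].name, NAME_LEN, "%c:\\", letter);
            out[count].is_directory = 1;
            count++;
        }
    }
#else
    /* Unix, Linux, or Mac: the single root is the only entry needed. */
    if (count < max) {
        strcpy(out[count].name, "/");
        out[count].is_directory = 1;
        count++;
    }
#endif
    return count;
}

int main(void)
{
    DirEntry entries[MAX_ENTRIES];
    int n = list_root(entries, MAX_ENTRIES);
    for (int i = 0; i < n; i++)
        printf("%s\n", entries[i].name);
    return 0;
}
```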

Next, Steenberg implemented the out-of-the-box functionality of the Betray library, using code from older projects. This allowed for opening a window with an OpenGL/OpenGL ES context (with FSAA), mouse and keyboard support, reading cut and paste, opening file requesters, directory search, execute, quad-buffer stereoscopic output, threads, multi-touch (supported on Windows 7 and later, but not in builds for older systems), full screen, timers, and mouse warp. Steenberg is a pure C programmer, so he sought help from a C++ programmer friend, Pontus Nyman, to write a C++ wrapper for the API’s functionality. Steenberg also ran into a face-recognition algorithm that didn’t use a depth map; he worked around the issue with his own code.
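The plug-in system described above also depends on the dynamic loading of libraries and the sharing of function pointers. As a hedged illustration, not Betray's actual mechanism, the sketch below wraps the platform-specific loaders behind a common interface and pulls an assumed plugin_init entry point out of a shared library:

```c
/* Sketch of cross-platform dynamic loading for a plug-in system
 * (hypothetical wrapper names, not the real Betray code). */
#ifdef _WIN32
#include <windows.h>
typedef HMODULE LibHandle;
#define lib_open(path)       LoadLibraryA(path)
#define lib_symbol(lib, sym) ((void *)GetProcAddress(lib, sym))
#define lib_close(lib)       FreeLibrary(lib)
#else
#include <dlfcn.h>
typedef void *LibHandle;
#define lib_open(path)       dlopen(path, RTLD_NOW)
#define lib_symbol(lib, sym) dlsym(lib, sym)
#define lib_close(lib)       dlclose(lib)
#endif

#include <stdio.h>

/* The entry point every plug-in is expected to export (an assumption). */
typedef int (*PluginInitFunc)(void);

int load_plugin(const char *path)
{
    LibHandle lib = lib_open(path);
    if (lib == NULL) {
        fprintf(stderr, "could not load %s\n", path);
        return -1;
    }
    PluginInitFunc init = (PluginInitFunc)lib_symbol(lib, "plugin_init");
    if (init == NULL) {
        fprintf(stderr, "%s has no plugin_init\n", path);
        lib_close(lib);
        return -1;
    }
    return init(); /* let the plug-in register its devices with the host */
}
```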

The project undertaken during the challenge was to enhance and simplify a platform library with features such as multi-touch, tilt sensors, head tracking, and stereoscopics. Several different types of applications exist, including Adri Om, a data visualization tool, and Dark Side of the Moon, a real-time strategy game currently using the platform library, which will be modified to showcase the possibilities with these technologies. Steenberg identified an interface toolkit with sliders, buttons, menus, and other elements. The toolkit was designed for software development where the application can run on diverse hardware setups such as tablets, TVs, PCs, laptops, convertibles, head-mounted displays, or large scale multi-user walls. It includes wands, head tracking, multi-touch, and stereoscopics.


Seduce tracks head and hand movements for new interactivity.

According to Steenberg, working with the Intel Perceptual Computing SDK (in beta at the time of the contest) presented some challenges; however, he was able to use the Creative* Interactive Gesture camera and bypass the SDK and API to get the head tracking at 60 frames per second and the head detection to a quarter pixel (down from five pixels). He wrote four separate algorithms to find, track, and stabilize the head position, down to sub-pixel accuracy. He used no smoothing or prediction to avoid adding any latency. The result was a much more predictable, precise, and stable head tracker.

“Once you find a head you’re tracking, you want to hold onto it,” said Steenberg. “Therefore the ‘head finder’ is only needed in the first frame or if the algorithm concludes that the head is lost. To make this quick, I pick a pixel, read out its depth value, and then check whether the pixels a head-size away—to the left, right, and above—are all at least 200 mm farther away. I do this on every hundredth pixel. When I get a lot of positives, I choose the one closest to the camera.”
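A literal transcription of that heuristic might look something like the sketch below. The 200 mm margin and the every-hundredth-pixel step come from his description; the assumed head width in pixels, the names, and the depth-map layout are illustrative assumptions, not Steenberg's actual code:

```c
/* Sketch of the head-finder heuristic: every hundredth pixel is tested,
 * and a candidate counts as a head if the pixels roughly one head-width
 * to its left, right, and above are all at least 200 mm farther from the
 * camera. The candidate closest to the camera wins. */
#define STEP            100   /* test every hundredth pixel            */
#define HEAD_RADIUS_PX   40   /* assumed head width in pixels          */
#define DEPTH_MARGIN_MM 200   /* surroundings must be this much deeper */

/* depth[] holds one depth value in millimetres per pixel, row-major. */
static int deeper(const unsigned short *depth, int w, int h,
                  int x, int y, unsigned short ref)
{
    if (x < 0 || x >= w || y < 0 || y >= h)
        return 1; /* off-image counts as background */
    return depth[y * w + x] >= ref + DEPTH_MARGIN_MM;
}

/* Returns the pixel index of the best head candidate, or -1 if none. */
long find_head(const unsigned short *depth, int w, int h)
{
    long best = -1;
    unsigned short best_depth = 0xFFFF;

    for (long i = 0; i < (long)w * h; i += STEP) {
        int x = (int)(i % w), y = (int)(i / w);
        unsigned short d = depth[i];
        if (d == 0)
            continue; /* no depth reading at this pixel */
        if (deeper(depth, w, h, x - HEAD_RADIUS_PX, y, d) &&
            deeper(depth, w, h, x + HEAD_RADIUS_PX, y, d) &&
            deeper(depth, w, h, x, y - HEAD_RADIUS_PX, d) &&
            d < best_depth) {
            best = i;
            best_depth = d;
        }
    }
    return best;
}
```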

One issue Steenberg faced was caused by nature itself. Heads are round, so if you choose a pixel on the side of a head, the top of the head is much lower there, and the vertical scan falls off the edge where the head rounds off. Secondly, a head’s edge has hair, which diffuses the infrared (IR) pulse. To resolve this, Steenberg sent many vertical and horizontal rays toward the head—accounting for distance—and then averaged them.

“Now I have a fairly accurate idea of the head’s location; however, I was tracking the edges of the head hair and not the skull surface,” said Steenberg. “I drew a box around the head and did an average position of the pixels inside the box. I weighed the pixels by how far they protruded from the face and the brightness of the IR reflection.”
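The refinement step he describes, a box average weighted by how far each pixel protrudes toward the camera and by its IR brightness, could be sketched as follows (again with hypothetical names and data layout, not his actual code):

```c
/* Sketch of the refinement step: average the pixel positions inside a
 * box around the head, weighting each pixel by its protrusion toward
 * the camera and its IR brightness, to get a sub-pixel head centre. */
typedef struct { float x, y, depth; } HeadPosition;

HeadPosition refine_head(const unsigned short *depth, /* mm per pixel   */
                         const unsigned char  *ir,    /* IR brightness  */
                         int w, int box_x, int box_y,
                         int box_w, int box_h,
                         unsigned short back_depth)   /* background, mm */
{
    double sum_x = 0, sum_y = 0, sum_d = 0, sum_w = 0;

    for (int y = box_y; y < box_y + box_h; y++) {
        for (int x = box_x; x < box_x + box_w; x++) {
            long i = (long)y * w + x;
            unsigned short d = depth[i];
            if (d == 0 || d >= back_depth)
                continue;                        /* skip background      */
            double protrusion = back_depth - d;  /* toward the camera    */
            double weight = protrusion * (ir[i] / 255.0);
            sum_x += x * weight;
            sum_y += y * weight;
            sum_d += d * weight;
            sum_w += weight;
        }
    }

    HeadPosition p = { 0, 0, 0 };
    if (sum_w > 0) {
        p.x = (float)(sum_x / sum_w);   /* sub-pixel head centre */
        p.y = (float)(sum_y / sum_w);
        p.depth = (float)(sum_d / sum_w);
    }
    return p;
}
```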

While working with this technology, Steenberg delved into his vast programming knowledge to think to the future and create one adaptable app. Steenberg quickly supplemented the API, and he believes that with higher-resolution gesture cameras, head tracking will come of age on new devices. “Head tracking has reached a point where it’s very useful,” said Steenberg. “If we get double or quadruple resolution on these gesture cameras we’ll have very good head tracking, and that’s exciting for a lot of uses. I’m excited about the ability to gather situational awareness data for computers. For example, you would get a 3D scan of an item by photographing it with a gesture camera that has depth capabilities, getting measurements in a 3D model, and then manipulating the model to get immediate feedback.”

“The user interface should disappear and connect users directly to the machine,” said Steenberg. “When you drive a car you don’t think about how to turn left, you just make the turn. If you have doubts that the wheel will turn the car, the interface becomes a disaster; for an interface to disappear you must trust it 100 percent, and if it fails once it becomes worthless. This creates an incredibly high bar for tracking and voice recognition to reach.” But Steenberg believes advances in web cameras will create new opportunities for eye tracking and intelligent communication between PCs and users. He believes the key to the future is thinking beyond present-day use.


Seduce demo “vanishing point”

“When developing a game or tool, I think about what I want to do now and what I want to do in the future. I want projects that spawn not just a new application but new libraries and new technologies that align with the future. Preferably, my products will leave users with a lot of options and flexibility.”

For Steenberg, the Intel Ultimate Coder Challenge is about building technology for tomorrow: making something that supports the Intel Perceptual Computing SDK and the new generation of Ultrabook devices, and that anticipates everything developers will do in the future to support these and emerging technologies.

Lessons Learned


Today’s consumers and business workers are accessing everything—from the Internet to productivity tools—from a variety of connected devices. Developers seek multiple interfaces for each unique device. Steenberg built a future-proof library to complement Seduce, an app that adapts to any type of user interface. Every project of this nature involves challenges. When the Intel Perceptual Computing SDK and built-in camera presented limitations, Steenberg used his experience to think outside the box and create his own solution. He used his programming skills to create an app that is built for today’s evolving landscape and will adapt as camera technology improves and portable computing advances.

About Eskil


Eskil Steenberg, an avid programmer, game designer, and participating developer of OpenGL, has worked on experimental programming projects such as Verse, a network protocol for computer graphics and connected micro-applications that can synchronize two different graphical applications in real time. He recently worked on Love, a massive procedural action-adventure research project that focused on what video games should be, and he’s currently developing a new strategy game called Dark Side of the Moon. Steenberg believes it’s important to always make room for more innovation.


Eskil Steenberg

Resources


Eskil Steenberg supplemented his skills with resources such as Component Source. Along with the other contestants, Steenberg also utilized the Intel forums and Intel hardware support.

Intel does not make any representations or warranties whatsoever regarding quality, reliability, functionality, or compatibility of third-party vendors and their devices. For optimization information, see software.Intel.com/en-us/articles/optimization-notice/. All products, dates, and plans are based on current expectations and subject to change without notice. Intel, the Intel logo, Intel Core, Intel AppUp, Intel Atom, the Intel Inside logo, and Ultrabook, are trademarks of Intel Corporation in the U.S. and/or other countries. *Other names and brands may be claimed as the property of others. Copyright © 2013. Intel Corporation. All rights reserved.

