The Intel® RealSense™ SDK has been discontinued. No ongoing support or updates will be available.
By Edward J. Correia
Intel is aiming to revolutionize the way users interface with traditional PCs, and Jacob Pennock is among the movement's primary champions. Back in 2013, Pennock won the Intel Perceptual Computing Challenge with Head of the Order*, a game that cast a spell on perceptual computing. Originally built with the Intel® Perceptual Computing SDK and Creative Senz3D* camera, the game has since evolved with the implementation of the new Intel® RealSense™ SDK and the Intel® RealSense™ 3D (front facing) camera. Pennock's experiences—and those of his coworkers—are plotting a course through the new APIs and creating a navigational aid that other developers can use to steer their own perceptual apps.
Armed with the new Intel RealSense SDK and a new company, Livid Interactive, Pennock and his team set out to transform the user experience and enhance Head of the Order (Figure 1) by implementing the improved gesture controls and 3D hand-tracking points of the Intel RealSense SDK, capabilities that were not possible with the previous Intel Perceptual Computing SDK.
From Perceptual Computing to the Intel® RealSense™ SDK
Gesture Control Improvements
The Head of the Order team was particularly interested in the new hand- and finger-tracking capabilities of the Intel RealSense SDK. These capabilities provide 3D positioning of each joint of the user's hand, with 22 points of hand and joint tracking (Figure 2) for greater precision. Control of the hands is everything to this game; hands are used to craft and cast off spells and to combine multiple spells to form more powerful ones.
Figure 2: The hand can be tracked in 3D using 22 landmark data points.
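The 22 tracked points break down naturally as one wrist point, one palm-center point, and four joints per finger. The SDK's actual API is C++/C#; the Python sketch below is only an illustration of how an application might model a per-frame snapshot of that skeleton, and all names (`joint_names`, `HandFrame`, the joint labels) are hypothetical, not SDK identifiers.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# Hypothetical layout modeled on the 22-point skeleton described in the
# article: wrist + palm center + 5 fingers x 4 segments = 22 joints.
FINGERS = ("thumb", "index", "middle", "ring", "pinky")
SEGMENTS = ("base", "jt1", "jt2", "tip")

def joint_names():
    """Return the 22 joint labels for one hand."""
    names = ["wrist", "center"]
    for finger in FINGERS:
        for segment in SEGMENTS:
            names.append(f"{finger}_{segment}")
    return names

@dataclass
class HandFrame:
    """One frame of hand data: joint label -> (x, y, z) position in meters."""
    joints: Dict[str, Tuple[float, float, float]]

    def fingertip(self, finger: str) -> Tuple[float, float, float]:
        return self.joints[f"{finger}_tip"]
```

With per-joint 3D positions in hand, game logic can compute things like fingertip distances or palm orientation instead of working from a flat 2D silhouette.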
3D Hand Tracking
With the original SDK, a user's hands could only be represented as flat, 2D images superimposed on the screen (Figure 3, left). To achieve even that, Pennock had to write a custom hand-rendering system that resampled the low-resolution 2D hand images and then added them to the game's rendering stack at multiple depths.
According to Pennock, the implementation of the fine-grained hand tracking in the new Intel RealSense SDK allowed the gameplay experience to become far more life-like and engaging, with much better 3D positioning. Hands are now seen as 3D models (Figures 3 and 4—right) that interact within the game space.
Figure 3: Original 2D spell crafting (left) and the improved 3D hand rendering (right).
This capability enhances the immersive nature of the game and allows Head of the Order to run on virtual-reality headsets. The Intel RealSense SDK also greatly improves the position tracking of all the finger joints, providing much more depth and accuracy when it comes to casting spells and navigating the virtual game space.
Figure 4: Original 2D hand spell casting off (left) and the improved 3D hand rendering (right).
Switching from the Intel Perceptual Computing SDK to the Intel RealSense SDK wasn't an entirely smooth ride; it took time for the functionality to ramp up in the new SDK. But by the time the Intel RealSense SDK Gold R2 release was ready, Pennock and his team had replicated and extended what they had achieved with the predecessor SDK.
Head of the Order is controlled entirely with hand gestures, and spells are created by drawing simple shapes in the air. Over time, players learn how to master the art of combining gestures to craft the most powerful spells. The learning curve for gesture-based input in general can be steep, particularly for players accustomed to traditional mouse and keyboard interfaces or game hand controllers. One of the biggest challenges Pennock and his team faced was communicating to players what the game wanted them to do.
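The article doesn't describe how Head of the Order matches air-drawn shapes, but one very simple approach to classifying a stroke is to compare the total path length against the straight-line distance between its endpoints. The sketch below is an assumption-laden toy, not the game's algorithm, with thresholds chosen purely for illustration:

```python
import math

def classify_stroke(points):
    """Rough shape classifier for a drawn stroke (illustrative only).

    A stroke whose endpoints nearly meet despite a long path reads as a
    'circle'; a stroke whose path length nearly equals the endpoint
    distance reads as a 'line'. Everything else is 'unknown'.
    """
    path = sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))
    if path < 1e-9:
        return "none"
    chord = math.dist(points[0], points[-1])
    ratio = chord / path  # 0 for a closed loop, ~1 for a straight line
    if ratio < 0.2:
        return "circle"
    if ratio > 0.9:
        return "line"
    return "unknown"
```

A production recognizer would resample and normalize strokes before matching against templates, but even this crude heuristic shows how a stream of tracked hand positions becomes a discrete "spell ingredient."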
The Intel RealSense 3D camera's gesture recognition picks up a wide range of user movements, and Pennock and his team noticed that if a player makes random or unrecognizable movements, or is too far from or too close to the camera, the system can't interpret what the player is attempting to do. Players accustomed to traditional interfaces can quickly become frustrated in this situation. "Even if it's working perfectly well," said Pennock, "they may be interacting in an unexpected way so it appears that the system isn't functioning. This can be difficult to amend on the development end."
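One practical mitigation for the too-far/too-close problem is to check the hand's depth against the camera's working range each frame and prompt the player before tracking silently degrades. The sketch below is a hypothetical helper; the range constants are placeholders, since the actual working range varies by camera model:

```python
# Placeholder working range in meters; the real camera's range differs by model.
NEAR_M = 0.2
FAR_M = 0.6

def range_feedback(hand_depth_m):
    """Map a measured hand depth to a user-facing prompt.

    hand_depth_m: distance from camera to hand in meters, or None
    when no hand is detected this frame.
    """
    if hand_depth_m is None:
        return "no hand detected"
    if hand_depth_m < NEAR_M:
        return "move back"
    if hand_depth_m > FAR_M:
        return "move closer"
    return "ok"
```

Surfacing this kind of feedback on screen turns an invisible tracking failure into an actionable instruction, which is exactly the communication problem the team describes.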
To resolve this issue, Pennock and his team created a 5-minute guided tutorial with narration and video examples to demonstrate proper input techniques (Figure 5) and step the players through gameplay scenarios. This idea came about during the first contest, as early testers had trouble realizing that three steps were required to create a spell.
Figure 5: The tutorial demonstrates proper input technique.
Because visual cues also play a key role during gameplay, Pennock addressed this issue by factoring gesture speed into the visualization. Now, a trail is drawn on the screen only when the hand movement speed is within acceptable limits. "Other than that, when we're actually tracking for a particular gesture, your hands glow," he said.
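The speed-gating idea described above can be sketched as a filter over timestamped hand positions: compute the instantaneous speed between samples and emit a trail point only when that speed falls inside an acceptable band. This is an illustrative reconstruction, not Livid's code, and the speed thresholds are invented:

```python
import math

def trail_points(samples, min_speed=0.05, max_speed=1.5):
    """Filter hand-position samples into trail points by movement speed.

    samples: list of (t_seconds, (x, y)) tuples in time order.
    Points moving too slowly (jitter) or too fast (flailing) draw nothing;
    only deliberate strokes inside the speed band leave a trail.
    """
    out = []
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue  # skip out-of-order or duplicate timestamps
        speed = math.dist(p0, p1) / dt
        if min_speed <= speed <= max_speed:
            out.append(p1)
    return out
```

Gating the visual feedback this way doubles as implicit training: the player only sees a trail when they move the way the recognizer expects.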
For users who want to play an “easy” game, Head of the Order offers characters that perform very simple gestures.
Advice for Developers
With gesture-based apps, perhaps more than for those using traditional input types, Pennock stresses the importance of letting new users try the app and observing how they interact with it. He also emphasizes testing with outsiders rather than with people involved in development, because it's easy to make something work correctly when you already know how to use it. The advantage of having new users test the app is seeing what doesn't feel right to them and then considering ways to address the problem.
Not surprisingly, tests with younger audiences are generally more successful. "For the kids who grew up with motion controls such as Microsoft Kinect*, it's intuitive to them and they don't usually have a big issue with gesture control," said Pennock, adding that testing the system on the more mature crowds at trade shows is when most problems arise.
Performance Over Implementation
Pennock and his team acknowledge that performance can be an issue: the enormous data streams coming from the cameras create latency. This is particularly true if using more than one tracking module at a time or opting for a large number of tracking points.
"The Gold R2 release was a great update," Pennock said. In that release, the level of noise in tracking finger joints is reduced, and the smoothing functions are better.
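The SDK's actual smoothing internals aren't documented in this article, but a common way to tame noisy joint positions is an exponential moving average, which trades a little lag for a lot of stability. This is a generic technique sketched under that assumption, not the SDK's implementation:

```python
def ema_smooth(positions, alpha=0.3):
    """Exponentially smooth a stream of 3D joint positions.

    positions: iterable of (x, y, z) tuples in frame order.
    alpha: blend weight for the newest sample; lower values smooth
    harder but lag farther behind the true hand position.
    """
    smoothed = []
    state = None
    for p in positions:
        if state is None:
            state = p  # seed with the first observation
        else:
            state = tuple(alpha * c + (1 - alpha) * s
                          for c, s in zip(p, state))
        smoothed.append(state)
    return smoothed
```

Running one such filter per joint (22 per hand) is cheap, which matters given the latency concerns raised above.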
The target systems on which Head of the Order can actually be played are continuing to emerge. The software is designed only for systems equipped with a natural user interface such as that provided by the Intel RealSense 3D camera, and Intel offers versions of the camera tailored to the intended application. Today, this includes tablets, conventional and two-in-one laptops, and all-in-one PCs equipped with Intel RealSense technology. What's more, an increasing number of technology companies such as Acer, Asus, Dell, Fujitsu, Hewlett-Packard, Lenovo, and NEC currently offer or have announced systems that feature Intel RealSense technology.
The Intel RealSense SDK and the technologies it's tied to provide facial detection and tracking, emotion detection, depth-sensing photography, 3D scanning, background removal, and the tracking of 22 joints in each hand for accurate touch-free gesture recognition. In the future, Pennock believes that Intel RealSense technology could find applications in automotive control, robotics, and home automation systems. On the industrial side, he said, "I see a wealth of opportunity for measurement devices."
About the Developer
The original version of Head of the Order was built and submitted to the Intel Perceptual Computing contest under Pennock's development company—Unicorn Forest Games. That company has since combined forces with Helios Interactive, where Pennock was employed as a developer. The result was a new entity: San Francisco-based Livid Interactive.
According to Michael Schaiman, managing partner at Helios, his company was asked to develop concepts for the Intel® Experience, a set of "hands-on experience zones" being set up at 50 Best Buy flagship stores across the United States. The zones are designed to showcase cutting-edge Intel® technologies for people of all ages and technical abilities. Schaiman was tasked with demonstrating Intel RealSense technology. "One concept we came up with was to build a special version of Head of the Order that consumers could play with," said Schaiman. "They really loved that idea." The Head of the Order demo is set to hit the stores sometime in June 2015, with the game to follow later in the summer.
Throughout the development process, the Head of the Order team held monthly "innovator calls" with engineers at Intel. These calls allowed Livid developers to stay abreast of Intel RealSense SDK features, sample code, and documentation that were about to be released and to have a structure for providing feedback on what had come before.
Get more information on Livid Interactive at the company's website.