At IDF 2012 in San Francisco, we got to witness first-hand gesture recognition implemented inside a next-generation Ultrabook™. This ain’t your mama’s gesture recognition technology, either. With the full support of a Perceptual Computing SDK and amazingly innovative features, expect some pretty futuristic apps to come out of the creative minds of savvy Ultrabook and PC developers.
What is gesture recognition?
Gesture recognition is a technology that allows computers to interpret hand movements and facial expressions as input controls. Whether it’s the way your face moves to form one of a vast multitude of expressions or the way your hands move to grasp a control, gesture recognition tech attempts to interpret these movements, usually through incredibly complex facial mapping and human behavior identification. Here's a demo of gesture recognition at IDF:
The goal of gesture recognition is to make it possible for the user to interact with their computer intuitively without the “middleman” of controls (keyboard, mouse, touchpad, etc.), and to make the computer as “human” as possible. So far, we’ve seen developments in the field using highly sophisticated cameras that capture our movements and connect them to whatever action they might correspond to; e.g., a smile might tell a radio app that we are enjoying the current song, or a snap of our fingers might tell a word processing program to save and close.
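At the application level, this kind of mapping can be as simple as a lookup table from recognizer output to app commands. Here's a minimal sketch, assuming a recognizer that emits string labels; the gesture names and handler functions are hypothetical, not from any real SDK:

```python
# Hypothetical gesture-to-action dispatch. The gesture labels ("smile",
# "finger_snap") and handlers are illustrative assumptions, not a real API.

def like_current_song():
    """Tell the radio app the user enjoys the current song."""
    return "liked"

def save_and_close():
    """Tell the word processor to save and close the document."""
    return "saved"

# Dispatch table: recognizer output -> application action
GESTURE_ACTIONS = {
    "smile": like_current_song,
    "finger_snap": save_and_close,
}

def handle_gesture(gesture_name):
    """Run the action bound to a recognized gesture; ignore unknown ones."""
    action = GESTURE_ACTIONS.get(gesture_name)
    return action() if action else None
```

Unrecognized gestures simply fall through, which keeps the app robust when the recognizer reports movements it wasn't trained to act on.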
Gesture recognition is part of what is called “perceptual computing”, a style of personal computing experience that gives the devices we use every day the ability to interpret what we are doing via human-like senses. Instead of a mouse, we use our voice. Instead of a keyboard, we use gesture recognition. The jury is still out on whether or not these input devices will be made obsolete by this technology, but it’s certainly useful as an additional input control.
How is gesture recognition supported in the Ultrabook?
At IDF, we learned that gesture recognition will be supported in the next generation of Ultrabooks in two very exciting ways. First, in October 2012, a Perceptual Computing SDK will be launched to encourage and support developers in creating apps that take advantage of this technology. This SDK will give developers direct access to information they can use to build apps integrating 2D and 3D object tracking, voice recognition, hand gesture recognition, finger tracking, and facial analysis.
This SDK will utilize a camera and microphones designed specifically for voice and gesture recognition by SoftKinetic and Creative. This USB-powered depth-sensing camera uses sophisticated technology, including an RGB lens (720p), an HD webcam, a dual-array microphone, and infrared, in order to translate gestures (hand poses, a thumbs-up, finger tracking, etc.) into actionable commands on the display. Initially, this camera will most likely be mounted on top of a laptop, but eventually it’s safe to assume that these cameras will become part of the form factors of many PCs, including Ultrabooks. The camera will be available to developers for $150 when purchased along with the SDK (which is free). Both large, sweeping gestures and small, finger-focused movements are recognized. As you can see in the demo above, gesture recognition can easily control educational software or game playing. As the technology grows more sophisticated, we might eventually see computers that are able to ascertain what we want with just a small movement of our fingers, without us having to completely spell out what we’re trying to accomplish.
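An app built on this kind of SDK typically polls a pipeline for per-frame tracking data and reacts to whatever the camera and microphones picked up. The sketch below is purely illustrative: the `PerceptualPipeline` class and its `next_frame` method are stand-ins we invented for this example, not the actual SDK interface.

```python
# Hypothetical sketch of an app consuming frames from a perceptual computing
# pipeline. PerceptualPipeline and next_frame are illustrative assumptions,
# NOT the real Intel Perceptual Computing SDK API.

class PerceptualPipeline:
    """Stand-in for an SDK pipeline that yields per-frame tracking data."""
    def __init__(self, frames):
        self._frames = iter(frames)

    def next_frame(self):
        """Return the next frame's data as a dict, or None when done."""
        return next(self._frames, None)

def run(pipeline):
    """Poll the pipeline and collect gesture and voice events per frame."""
    events = []
    while True:
        frame = pipeline.next_frame()
        if frame is None:
            break
        # Each frame may carry hand-gesture, finger, voice, or face data.
        if "gesture" in frame:
            events.append(("gesture", frame["gesture"]))
        if "speech" in frame:
            events.append(("voice", frame["speech"]))
    return events
```

Feeding the loop a couple of fake frames, e.g. `run(PerceptualPipeline([{"gesture": "thumbs_up"}, {"speech": "save"}]))`, would yield both a gesture event and a voice event, mirroring how the SDK exposes multiple perceptual channels side by side.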
The second way that Intel is supporting innovation in the field of gesture recognition is with a very exciting contest that will be giving out a total of $1 million in awards to developers who are particularly creative with this SDK. The field is wide open since this technology is relatively new on the market, and it will be interesting to see what developers come up with. This contest will go live later this year.
Gesture recognition app possibilities
This being a relatively new field, the possibilities with gesture recognition are virtually endless. One of the biggest potential areas of innovation is for elderly or special needs consumers. In a meeting with John Bergquist and Chris Skaggs of Soma Games, we discussed the promise of gesture recognition for children with autism who might require extra training with facial expressions and socially appropriate emotional responses. This could be a huge aid to parents and educators who are looking for something to help their children pick up on social cues and interaction.
In addition to special needs consumers, elderly people or people dealing with debilitating diseases that affect their nervous or muscular systems could greatly benefit from a sophisticated gesture recognition app. Imagine if someone with Parkinson’s were able to type even though their fingers weren’t necessarily up to the task, or if someone who had lost the ability to speak could communicate via hand gestures. The potential here could quite literally be life-changing.
What do you think about gesture recognition? Is it a fad, or is it here to stay? What kind of apps can you imagine with this technology? Let us know in the comments.