If you’ve ever seen one of the Iron Man movies, you’ve most likely noticed the futuristic computing technology that Tony Stark, aka Iron Man, uses not only to invent his amazing armored suit, but also to manage his everyday life: phone calls, programming, interacting with the rest of the Avengers, and so on. It’s the movies, so we know a little suspension of disbelief is expected here, especially when it comes to technology. However, with the advent of Intel’s support for perceptual computing, this kind of Hollywood tech interaction isn’t that far off.
What is perceptual computing?
Imagine controlling your computer merely by using your voice or a wave of your hand, rather than a mouse, a keyboard, or even a touchscreen. Perceptual computing, which debuted at IDF 2012, is all about natural human interaction with machines in addition to those tried-and-true control mechanisms: facial recognition, voice commands, gesture swiping, and more.
This technology has been around for gamers (think Microsoft’s Kinect technology here) for a while, but bringing it to the desktop user is a whole ‘nother ballgame. Future Ultrabooks and PCs will actually be able to recognize perceived intentions of users and complete tasks accordingly in a seamless, intuitive fashion.
Intel has long been a proponent of personal computing, and while perceptual computing is certainly a different direction, it continues that history of supporting innovation. Responsive computing tailored to each user’s unique needs is really what perceptual computing is all about.
How will this work on an Ultrabook?
At IDF 2012 San Francisco, we learned that the next generation of Ultrabooks will use fourth generation Intel Core processors, built on the Haswell microarchitecture. These will enable not only more robust battery life and increased processing power, but also enhanced media and graphics capabilities.
New processors are certainly something to cheer about, since Ultrabooks will be even more powerful. Just as important, however, faster hardware opens the door for the Ultrabook to be even more innovative and interactive – i.e., perceptual computing.
Perceptual computing and developers
In order to fully support developers building perceptual computing applications for Ultrabooks and PCs, Intel plans to release its Intel Perceptual Computing Software Development Kit in October 2012. To go with the kit, Intel also made available the Creative Interactive Gesture Camera. The SDK will enable developers to fully integrate innovative facial and voice recognition, gesture controls, and augmented reality features into next generation Ultrabooks and PCs. It is absolutely free and can be found at intel.com/software/perceptual.
What kinds of innovations are supported by this SDK? There are several:
- Object tracking: 2D/3D object tracking gives developers the ability to put together real-time images from the Creative Interactive Gesture Camera along with close-range tracking and graphic images to create a whole new user experience.
- Speech recognition: Speech-to-text transcription, complete sentence dictation, voice commands, and directions for specific processes; e.g., “search Google for closest pizza restaurant”.
- Gesture control: With close range tracking, this SDK offers developers a chance to create a virtual reality for their users. Recognizing common hand poses and gestures, developers can program interactive controls.
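To make the gesture-control idea concrete, here’s a minimal sketch of how an application might map recognized hand poses to actions. This is illustrative Python, not the SDK’s actual API (the real kit targets C/C++); the gesture labels and dispatcher pattern are hypothetical.

```python
# Hypothetical sketch: routing recognized gestures to application actions.
# Gesture names and the dispatcher pattern are invented for illustration;
# they are not the Intel Perceptual Computing SDK's actual API.

class GestureDispatcher:
    def __init__(self):
        self._handlers = {}

    def on(self, gesture, handler):
        """Register a callback for a named gesture."""
        self._handlers.setdefault(gesture, []).append(handler)

    def dispatch(self, gesture):
        """Invoke all handlers for a gesture; return True if any handled it."""
        handlers = self._handlers.get(gesture, [])
        for handler in handlers:
            handler()
        return bool(handlers)

if __name__ == "__main__":
    actions = []
    dispatcher = GestureDispatcher()
    dispatcher.on("swipe_left", lambda: actions.append("previous page"))
    dispatcher.on("thumbs_up", lambda: actions.append("confirm"))
    # In a real app these events would come from the camera pipeline.
    for gesture in ["swipe_left", "thumbs_up", "wave"]:
        dispatcher.dispatch(gesture)
    print(actions)
```

The point of the indirection is that the recognition layer (camera, tracking) stays decoupled from application logic: the app only registers handlers for the gestures it cares about.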
Why should developers use this SDK to develop perceptual computing applications? There are a few obvious benefits. First, it’s free, and you can’t beat free, right? Second, since a lot of the work is already done via the SDK and standard APIs, developers can write their apps and get them in front of end users more quickly. Third, the Ultrabook market is big and getting bigger – this is the next wave of personal computing. Industry support is there for these apps and this technology.
Apps built using the Intel Perceptual Computing SDK can take advantage of the Ultrabook’s many sensors, including touch, accelerometer, GPS, and NFC. With this SDK, savvy developers can also create programs that include facial analysis, voice recognition, and 2D/3D object tracking. These are all advanced, intuitive features that will only enhance user-to-machine collaboration. For example, instead of just building a standard online text translation, developers could use this SDK to build a real-time, voice-activated translation service. Another idea might be letting users track items on the screen through a series of simple, intuitive gestures, combining gesture and facial recognition. See the demo below for an idea of what this might look like in an Ultrabook environment:
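A voice-driven app still needs to turn a transcription like the pizza example above into a structured action. The sketch below shows one simple way to do that in Python; the command grammar and `parse_command` helper are invented for illustration, and a real app would receive the transcript from the SDK’s speech module.

```python
import re

# Hypothetical sketch: turning a transcribed voice command into a
# structured action. The "search <engine> for <query>" grammar is
# invented for illustration, not part of the SDK.

def parse_command(transcript):
    """Parse commands of the form 'search <engine> for <query>'."""
    match = re.match(r"search (\w+) for (.+)", transcript.strip(), re.IGNORECASE)
    if match:
        return {
            "action": "search",
            "engine": match.group(1).lower(),
            "query": match.group(2),
        }
    return {"action": "unknown", "raw": transcript}

cmd = parse_command("search Google for closest pizza restaurant")
print(cmd["engine"], "->", cmd["query"])
```

Real dictation is messier than a regular expression can handle, of course, but the split between transcription (the SDK’s job) and command interpretation (the app’s job) is the part worth noticing.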
YouTube video courtesy of tech2vids
Tony Stark, eat your heart out
Perceptual computing and the Intel Perceptual Computing SDK for developers are truly a step into futuristic computing, especially when paired with the next generation of Ultrabooks. We’re not too far away from that Tony Stark/Iron Man ideal of virtual reality computing, where our gestures govern our computer processes and our voices carry tasks through to completion. Sure, he might be a billionaire superhero who saves the world in his spare time, but I think even Tony would be intrigued by what developers can do with these new tools.