ARPedia: “You’ve Got a T-Rex?”

Note: If you're interested in Intel® RealSense™ technology, then you'll want to check out the 2014 Intel® RealSense™ App Challenge, a contest intended to encourage innovative apps that take advantage of everything that perceptual computing has to offer. Using the 2014 Intel® Software Development Kit (SDK) and the brand-new 3D gesture camera, developers will be able to show off their ideas, spark future imagination, and maybe even take home a few prizes from the $1 million worth of cash and promotions offered. Interested? Check out the official 2014 Intel® RealSense™ App Challenge page to get started!

For anyone who’s seen the iconic film “Jurassic Park”, the quote in the title is instantly familiar. While some of us might be partial to the more docile Triceratops, the Tyrannosaurus rex is instantly recognizable in most people’s imaginations. Zhongqian Su and an intrepid group of fellow graduate students combined their love of dinosaurs with perceptual computing technology, with some pretty amazing results: they took the top prize in the Creative User Experience category of the Intel® Perceptual Computing Challenge Phase 2, announced at CES 2014. Their app, titled ARPedia, gives users the ability to unlock a world of dinosaur knowledge with just a gesture. Watch the video below to get an idea of how this works:

The application showcases the rich content of an encyclopedia by means of augmented reality and the Intel Perceptual Computing (PerC) SDK. 3D models and animations are mixed with live video of the real world, giving users a brand-new way to interact with a virtual world. Users can interact with virtual objects through body movements, hand gestures, voice, and touch. This experience of digital exploration turns an encyclopedia into a doorway to an appealing, unknown world. The app demo uses augmented reality to show how a dinosaur grows, and leverages PerC SDK functions to let the user interact with the 3D objects through natural touch and hand gestures.

Gesture recognition is technology that allows computers to treat hand movements and facial expressions as input. Whether it's identifying the way your face moves to portray one of a vast multitude of expressions, or the way your hand closes to grasp a control, gesture recognition attempts to interpret these movements, usually through incredibly complex facial mapping and human-behavior modeling. It's amazing to watch how far this technology has come (as demonstrated in the ARPedia demo above).
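To make that concrete, here is a minimal sketch of how an application might poll for hand gestures using the Perceptual Computing SDK's C++ UtilPipeline helper. The class, method, and label names follow the 2013 SDK samples as best we can reconstruct them; treat the exact identifiers as assumptions that may vary between SDK releases.

```cpp
// Minimal sketch: polling for hand-gesture events with the Intel Perceptual
// Computing SDK's UtilPipeline helper (2013 SDK naming assumed).
#include "util_pipeline.h"
#include <cstdio>

int main() {
    UtilPipeline pipeline;
    pipeline.EnableGesture();           // turn on the gesture module
    if (!pipeline.Init()) return 1;     // open the depth camera

    while (pipeline.AcquireFrame(true)) {   // block until a frame is ready
        PXCGesture *gesture = pipeline.QueryGesture();
        PXCGesture::Gesture data;
        // Ask for the most recent gesture detected on the primary hand.
        if (gesture->QueryGestureData(0,
                PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY,
                0, &data) == PXC_STATUS_NO_ERROR) {
            if (data.label == PXCGesture::Gesture::LABEL_POSE_THUMB_UP)
                printf("Thumbs up detected\n");
        }
        pipeline.ReleaseFrame();
    }
    pipeline.Close();
    return 0;
}
```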

The team behind this intriguing app used a variety of tools to develop it over a couple of months: Maya* 3D to create the 3D models, Unity* 3D to render the scenes and drive the application logic, and the Intel Perceptual Computing SDK Unity 3D plug-in (included in the SDK) to pull it all together into one cohesive dinosaur extravaganza. Using just your hand, you can pick up the little dinosaur; then, depending on your gestures and where you place him, you can get information, make him move, and have him interact with other dinosaurs in the app.
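ARPedia's own interaction code lives in Unity via the SDK plug-in, which we haven't seen. Purely as an illustration, here is a hedged native-C++ analogue of the "pick up the dinosaur" mechanic: it tracks the primary hand's world position and a hand-openness value (both exposed through the 2013 SDK's GeoNode data, as best we recall the API) to grab and carry a virtual object. This is a sketch under those assumptions, not the team's actual code.

```cpp
// Hedged sketch of a grab-and-carry interaction driven by hand tracking.
// Field and method names assume the 2013 Perceptual Computing SDK headers.
#include "util_pipeline.h"

struct VirtualObject { float x, y, z; bool held; };

void UpdateDinosaur(UtilPipeline &pipeline, VirtualObject &dino) {
    PXCGesture *gesture = pipeline.QueryGesture();
    PXCGesture::GeoNode hand;
    if (gesture->QueryNodeData(0,
            PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY,
            &hand) < PXC_STATUS_NO_ERROR) {
        return;  // no hand in view this frame
    }
    // openness runs roughly 0 (closed fist) to 100 (open palm) in the 2013
    // SDK; treat a nearly closed hand as a grab and an open palm as a release.
    if (hand.openness < 20)      dino.held = true;
    else if (hand.openness > 80) dino.held = false;

    if (dino.held) {
        // positionWorld is the hand's position in camera space (meters);
        // the virtual object simply follows it while held.
        dino.x = hand.positionWorld.x;
        dino.y = hand.positionWorld.y;
        dino.z = hand.positionWorld.z;
    }
}
```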

The Intel® Perceptual Computing Challenge is an ongoing contest intended to encourage innovative apps that take advantage of the potential of perceptual computing. Using the free Perceptual Computing SDK and the Creative* Interactive Gesture Camera Kit, developers are able to show off their ideas, spark future imagination, and maybe even take home a few prizes from the $1 million worth of cash and promotions offered.

One thousand developers from eleven different countries entered the 2013 Challenge. Apps were submitted in four distinct categories: Productivity, Perceptual Gaming, Creative User Interface, and Multimodal. Over one hundred prototypes were received for judging across these categories, and many of them can be viewed at the Intel Perceptual Computing Showcase.

Intel has a deep interest in perceptual computing; in fact, a large investment in the field was announced at Computex 2013. Intel Capital will create a $100 million Intel Capital Experiences and Perceptual Computing Fund over the next two to three years to support developments in this rapidly moving field:

"Devices with human-like senses – the ability to see, hear and feel much like people do – has long been a subject of science fiction but is now within reach given recent innovations in compute power and camera technology," said Arvind Sodhani, president of Intel Capital and Intel executive vice president. "This new fund will invest in start-ups and companies enabling these experiences, helping them with the business development support, global business network and technology expertise needed to scale for worldwide use."

This opens up some exciting possibilities for developers. These technologies are changing the way we develop apps; for example, a user could open a YouTube video simply by using a hand gesture, or post a new Facebook status with the blink of an eye and voice recognition. With the Perceptual Computing SDK and the Creative* Interactive Gesture Camera Developer Kit, whose RGB and 3D depth-capture capabilities bundle this tech together, developers have a unique opportunity to usher in a whole new era of human/computer interaction.

Through Intel’s ongoing support of this emerging tech, developers have a wonderful opportunity to grow and shape the field. The Perceptual Computing SDK is free to download and lets developers jump right in, putting them on the cutting edge with apps that integrate facial analysis, speech recognition, hand and finger tracking, and 2D/3D object tracking. Developers are also encouraged to purchase the Creative Interactive Gesture Camera Developer Kit, a depth-sensing camera designed for use with the SDK for better object tracking.
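To give a sense of how those modalities combine in code, here is a hedged sketch of an event-driven pipeline that enables several SDK modules at once. The Enable/On method names mirror the 2013 SDK's UtilPipeline samples as we remember them; consider the exact callbacks and fields assumptions to verify against the SDK headers.

```cpp
// Hedged sketch: one pipeline with gesture, face, and voice modules enabled,
// receiving events via UtilPipeline's virtual callbacks (2013 SDK naming
// assumed; exact names may differ between releases).
#include "util_pipeline.h"
#include <cstdio>

class MultiModalPipeline : public UtilPipeline {
public:
    MultiModalPipeline() {
        EnableGesture();          // hand/finger tracking and pose labels
        EnableFaceLocation();     // face detection
        EnableVoiceRecognition(); // spoken command recognition
    }
    // Called when the gesture module recognizes a pose or swipe.
    virtual void PXCAPI OnGesture(PXCGesture::Gesture *g) {
        printf("Gesture label: %d\n", (int)g->label);
    }
    // Called when the voice module recognizes a command from its grammar.
    virtual void PXCAPI OnRecognized(PXCVoiceRecognition::Recognition *r) {
        printf("Voice command id: %d\n", (int)r->label);
    }
};

int main() {
    MultiModalPipeline pipeline;
    pipeline.LoopFrames();  // capture and dispatch events until the stream ends
    return 0;
}
```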

The way we interact with our computers is changing. From multiple cords to none, keyboards to voice control, and mouse to touch screen, it’s clear that input controls are evolving to a more natural, user-friendly interface. This ongoing move towards a more integrated human/computer interface is what perceptual computing is all about. Expect to see much more technology like the dynamic ARPedia showcased in this article, with even more human/computer interactivity.
