What can you do with Intel® RealSense™ technology?

Note: If you're interested in Intel® RealSense™, you'll want to check out the 2014 Intel® RealSense™ App Challenge, a contest intended to encourage innovative apps that take advantage of everything perceptual computing has to offer. Using the 2014 Intel® Software Development Kit (SDK) and the brand-new 3D gesture camera, developers can show off their ideas, spark future imagination, and maybe even take home a few prizes from the $1 million worth of cash and promotions on offer. Interested? Check out the official 2014 Intel® RealSense™ App Challenge page to get started!

Two videos recently shared on the Intel YouTube channel focus on Intel RealSense technology, formerly known as Perceptual Computing. Both were shot at GDC and highlight what's new for 2014 in RealSense technology, as well as the growing excitement about the different use cases RealSense is uniquely suited for.

In the first video, Meghana Rao gives a brief demo of the capabilities in the current SDK and talks about what is coming in the future. She shows that the 3D depth camera (found here) tracks fingertips, and that fingertip tracking lets her create music, and drive its tempo, just by moving her fingers. Watch below:

You might already be aware that RealSense supports facial and speech recognition; for 2014, 3D cameras will be built into various form factors, making it easier than ever to add RealSense technology to applications. A new version of the SDK will also be released, adding enhancements and modalities such as 3D scanning and augmented reality.

In this short video demo, Shannon Gerritzen of Hidden Path talks about how she can see Intel® RealSense™ being used to make games more interactive. She's looking forward to what her team can do with RealSense and is impressed with the technology: its small components make it easy to implement. RealSense makes immersive gameplay more of a reality, with lots of different possibilities for gaming. Watch below:

RealSense technology makes it possible for our digital worlds to interact with our physical, organic worlds in meaningful ways. Many of the projects that developers are creating step across boundaries that just a few years ago would have been impossible to imagine.

This makes for some exciting developer possibilities. These technologies are changing the way we develop apps; for example, a user could open a YouTube video simply by using a hand gesture, or post a new Facebook status with a blink of an eye and voice recognition. With the release of the Perceptual Computing SDK and the Creative* Interactive Gesture Camera Developer Kit, whose RGB and 3D depth capture bundle this technology together, developers have a unique opportunity to usher in a whole new era of PC interaction.
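
To make that concrete, here is a minimal sketch of a gesture-triggered action in C++, built on the UtilPipeline helper class that ships with the Perceptual Computing SDK samples. The class, callback, and label names follow the 2013 SDK samples; treat the exact identifiers as assumptions that can differ between SDK releases, and the URL and action here are purely illustrative.

    // Hypothetical sketch: open a video in the browser on a thumbs-up pose.
    // Assumes the UtilPipeline helper class shipped with the Perceptual
    // Computing SDK samples; exact names may differ between SDK releases.
    #include <windows.h>        // ShellExecute (Windows-only convenience)
    #include "util_pipeline.h"  // UtilPipeline sample helper from the SDK

    class GestureLauncher : public UtilPipeline {
    public:
        GestureLauncher() { EnableGesture(); }  // turn on the gesture module

        // Invoked by the pipeline whenever a gesture/pose event fires.
        virtual void PXCAPI OnGesture(PXCGesture::Gesture *data) {
            if (data->label == PXCGesture::Gesture::LABEL_POSE_THUMB_UP) {
                // Thumbs-up seen: open a video in the default browser.
                ShellExecute(0, L"open", L"https://www.youtube.com", 0, 0, SW_SHOW);
            }
        }
    };

    int main() {
        GestureLauncher pipeline;
        pipeline.LoopFrames();  // capture loop; fires OnGesture as events arrive
        return 0;
    }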

The common thread running through RealSense and perceptual computing innovation is the Perceptual Computing SDK. Free to download, this platform lets developers jump right into perceptual computing development and stay on the cutting edge, integrating facial analysis, speech recognition, hand and finger tracking, and 2D/3D object tracking into their apps. What kinds of innovations does this SDK support? There are several:

Object Tracking: 2D/3D object tracking gives developers the ability to combine real-time images from the Creative* Interactive Gesture Camera with close-range tracking and graphic images to create a whole new user experience.

Speech Recognition: Speech-to-text transcription, complete-sentence dictation, voice commands, and directions for specific processes, e.g., "search Google for the closest sushi restaurant" (see the first sketch after this list).

Gesture Control: With close-range tracking, the SDK offers developers the chance to create a virtual reality for their users. By recognizing common hand poses and gestures, developers can program interactive controls (the thumbs-up sketch above shows the basic pattern).

Hand and Finger Tracking: Close-range tracking is the overall term for the sub-category of perceptual computing interactivity that includes recognition and tracking of hand poses, such as a thumbs-up, hand and finger tracking, and hand gestures. This usage is made possible by the Creative* Interactive Gesture Camera's 3D capability (see the second sketch after this list).

Facial Analysis: Face tracking can be used as a perceptual computing component in games and other interactive applications. The Intel Perceptual Computing SDK supports facial recognition, facial tracking, and attribute detection, such as smiles.

Augmented Reality: 2D/3D object tracking provides the user with an augmented reality experience in which real-time input from the Creative* Interactive Gesture Camera (RGB video, depth map, and audio) is combined with other graphics sources or video.

Background Subtraction: Using information from the depth cloud, we've developed technology that allows developers to separate objects and people in the foreground from the background. It works in real time, so users can eliminate irrelevant or cumbersome background information from video to immerse themselves in video chat, online collaboration, and more (the per-pixel idea is sketched last after this list).
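
For the speech recognition item above, a command-mode sketch might look like this. SetVoiceCommands and the OnRecognized callback follow the pattern of the SDK's voice samples, but the exact names, and the three commands, are assumptions made for illustration.

    // Hypothetical sketch: react to a small fixed vocabulary of voice commands.
    // Assumes UtilPipeline's voice-recognition support from the SDK samples;
    // exact names may differ between SDK releases.
    #include <vector>
    #include <string>
    #include <cstdio>
    #include "util_pipeline.h"

    class VoiceControl : public UtilPipeline {
    public:
        VoiceControl() {
            EnableVoiceRecognition();
            // Command mode: recognition is constrained to this short list.
            std::vector<std::wstring> cmds;
            cmds.push_back(L"play");
            cmds.push_back(L"pause");
            cmds.push_back(L"search");
            SetVoiceCommands(cmds);
        }

        // Called when the recognizer matches a command; data->label is the
        // index of the matched command in the list registered above.
        virtual void PXCAPI OnRecognized(PXCVoiceRecognition::Recognition *data) {
            wprintf(L"command #%d recognized\n", (int)data->label);
        }
    };

    int main() {
        VoiceControl pipeline;
        pipeline.LoopFrames();  // run until the stream is closed
        return 0;
    }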
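
For hand and finger tracking, a polling loop can query the geometric node of the primary hand each frame. This follows the GeoNode pattern from the SDK's gesture samples; the labels and signatures are again assumptions that may vary by SDK version.

    // Hypothetical sketch: print the 3D position of the primary hand per frame.
    #include <cstdio>
    #include "util_pipeline.h"

    int main() {
        UtilPipeline pipeline;
        pipeline.EnableGesture();
        if (!pipeline.Init()) return 1;         // open the camera and modules

        while (pipeline.AcquireFrame(true)) {   // block until a frame is ready
            PXCGesture *gesture = pipeline.QueryGesture();
            PXCGesture::GeoNode node;
            // Ask the tracker for the primary hand's geometric node.
            if (gesture->QueryNodeData(0,
                    PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY,
                    &node) == PXC_STATUS_NO_ERROR) {
                printf("hand at (%.3f, %.3f, %.3f) m\n",
                       node.positionWorld.x,
                       node.positionWorld.y,
                       node.positionWorld.z);
            }
            pipeline.ReleaseFrame();
        }
        return 0;
    }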
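
Finally, background subtraction reduces to a simple per-pixel test: use each pixel's depth to decide whether it belongs to the foreground. The sketch below is framework-agnostic C++ and assumes you already have a depth map aligned (mapped) to the color image at the same resolution; the buffer formats are illustrative assumptions.

    // Hypothetical sketch of depth-based background removal: keep color pixels
    // closer than a cutoff, blank out everything farther away. Assumes the
    // depth map is already aligned to the color image at the same resolution.
    #include <cstdint>
    #include <cstddef>

    // color: RGBA, 4 bytes per pixel; depth: millimeters, 0 = no reading.
    void subtract_background(uint8_t *color, const uint16_t *depth,
                             size_t width, size_t height, uint16_t cutoff_mm) {
        for (size_t i = 0; i < width * height; ++i) {
            // Treat missing depth readings (0) as background as well.
            if (depth[i] == 0 || depth[i] > cutoff_mm) {
                color[4 * i + 0] = 0;   // R
                color[4 * i + 1] = 0;   // G
                color[4 * i + 2] = 0;   // B
                color[4 * i + 3] = 0;   // A: fully transparent
            }
        }
    }

A production pipeline would also smooth the resulting mask and fill depth holes before compositing, but the core test is exactly this per-pixel depth threshold.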

Developers are encouraged to purchase the Creative* Interactive Gesture Camera Developer Kit, a depth-sensing camera designed for use with the SDK for better object tracking.

Need a bit of inspiration? If you're still a little unsure about what you can create with perceptual computing technology, check out the Intel Perceptual Computing Challenge Showcase, which spotlights the wide array of apps that made it to the finals in the most recent Challenge. Winners include:

Kagura (Grand Prize Winner): “This application creates music and graphics by detecting the user's motion. Users don't play musical instruments; they just move their bodies to play music and produce graphics. It creates new ways of performing music with visual effects. You can also turn your voice into a musical instrument, change the tempo of the music, and choose where to put the sounds on the screen. The base system (device I/O, image processing, sound processing) is written in C++, and the content (graphics, sounds, motion, effects, and interactions) is written in Lua. The app uses the default audio device at a sampling frequency of 22050 Hz, and OpenGL is used to draw all graphics.”

ARPedia (First Prize Winner): “The application exhibits the rich contents of an encyclopedia by means of augmented reality technology and the Intel PerC SDK. 3D models and animations are mixed with video from the real world, giving users a brand-new way of interacting with a virtual world. Users can interact with virtual objects through body movements, hand gestures, voice, and touch. This experience of digital exploration through an encyclopedia takes children into an appealing unknown world. The app demo uses augmented reality to show how a dinosaur grows, and leverages Intel PerC SDK functions to let the user interact with the 3D objects through the more natural methods of touch and hand gestures.”

Head of the Order (First Prize Winner): “This is a gestural spell-casting game. Players face off tournament-style and use gestures to create magic spells, which they can cast at each other or use to defend themselves. To cast a spell, the player raises a single finger and draws a stroke-based gesture that corresponds to a particular spell. Beyond single-stroke gestures, the game features a deep level of gestural interaction: after creating spells, players can juggle them from hand to hand, toss them, or combine them in various ways. Players can also perform specific sequences of multi-stroke gestures to cast more elaborate spells. We want to give players the feeling that they are learning how to do magic. The game highlights the possible depth of gestural interaction and its effect on emerging gameplay mechanics, while providing as much enjoyment for the player as possible.”

If you’re a developer interested in creating something with RealSense, check out the RealSense information page for developers.

