Intel® RealSense™ Technology

(previously Intel Perceptual Computing Technology)

Now available: Be the first in line to build a more immersive experience.    Download the Intel® RealSense™ SDK for Windows* (version 2014, Gold) and reserve your Developer Kit.


Creating applications that use your common senses 

The Opportunity for Developers

The personal computer is decades old, but in all that time how personal has it really become? Do you interact with your computer the same way you interact with friends or family? Does your computer understand your communication style? Is it intuitive to work and play on a computer, or is it cumbersome? At Intel, we thought human-computer interaction could use some work, which is why we’ve spent the past few years figuring out how to make computers feel, well, personal. In order to create an interface that’s truly natural, we had to first enable computers to experience the world the way we do. The combination of hardware and software that we introduced is called Intel® RealSense™ technology.

Intel RealSense technology understands four important modes of communication: your hands, your face, speech, and the environment around you. The Intel RealSense 3D camera, which can understand and interpret depth much like the human eye, makes this possible. If you’re excited to see what Intel RealSense technology can do, great! You should head over to our technology site. If you’re a software developer, well, we’ve got some exciting news: the Intel RealSense SDK makes it easy to use our unique hardware and incredible software to create your own applications. Even more exciting news: the Intel RealSense™ 3D camera will be integrated into Ultrabooks, Notebooks, 2-in-1s, and all-in-ones in 2014.


Develop Experiences Your Customers Will Love

At Intel, we’re excited about bringing integrated 3D cameras to market, but it’s even more exciting for software developers. In 2012 and 2013, when Intel RealSense technology was called perceptual computing, consumers had to purchase a peripheral 3D camera in order to experience applications powered by Intel’s perceptual computing technology. With integrated 3D cameras shipping on systems from Dell, HP, Lenovo, Asus, Fujitsu, NEC, and Acer, applications built on the Intel RealSense SDK will work on a broad range of systems, which represents a tremendous opportunity for developers.

We know software professionals are among the most creative individuals in the tech industry, so we can’t wait for you to discover usages we haven’t. So, what capabilities will the Intel RealSense SDK support?

Global Categories

The Hands

Manipulate interface elements in mid-air! The SDK tracks 22 joints in each hand for accurate touch-free gesture recognition.

The Face

Identify 78 points on the face, infer emotions and sentiments, and track the position of a user’s head. Build applications that can respond to user movements and sentiments.

The Voice

Enable your apps with powerful speech recognition for intuitive, hands-free interfaces.

The Environment

Understand scenes more accurately using depth data for incredibly precise augmented reality. Remove the background for an image to create a screen-less green screen.


A World of Creative Applications

One of our goals as we evolve perceptual computing into Intel RealSense technology is making sure that implementing the technology is easy for developers. You don’t have to be an expert in machine learning or hand-tracking algorithms to add gesture control to your application. That said, we also provide full access to the raw depth data, leaving developers free to innovate in any way they like. With this creative firepower at your fingertips, you’re free to apply ingenuity to your interfaces and delight your customers. Intel is focusing on applications in five main categories: immersive collaboration, gaming and play, natural interaction, learning and edutainment, and 3D scanning. Let’s delve into each category and explore the technology that powers them.

Immersive collaboration means you can create applications that let you meet virtually anywhere. Imagine you need to videoconference with Bill for a business meeting. Using background removal, you can call Bill from what he thinks is your 110-degree, un-air-conditioned office in Los Angeles when in reality you are sitting on the lanai of a nice golf course condo in Maui. Background removal, essentially a screen-less green screen, is an easily deployable feature powered by the Intel RealSense SDK.
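The idea behind depth-based background removal is simple: keep only the pixels whose depth falls within a near range and blank out the rest. Here is a minimal sketch in plain Python with toy data; the real SDK does this per frame on the camera’s depth and color streams, and the function and pixel values below are illustrative, not part of the SDK API:

```python
def remove_background(color, depth, max_depth_mm=1200):
    """Keep color pixels closer than max_depth_mm; blank the rest.

    color: 2D list of pixel values; depth: 2D list of depths in mm.
    """
    blank = 0  # stand-in for a transparent or chroma-key pixel
    return [
        [c if d <= max_depth_mm else blank
         for c, d in zip(color_row, depth_row)]
        for color_row, depth_row in zip(color, depth)
    ]

# Toy 2x3 frame: subject at ~600 mm, background wall at ~3000 mm.
color = [[10, 20, 30], [40, 50, 60]]
depth = [[600, 3000, 700], [3000, 650, 3000]]
print(remove_background(color, depth))  # background pixels become 0
```

Everything beyond the depth cutoff disappears, which is exactly the “screen-less green screen” effect: the caller can then composite the surviving pixels over any backdrop.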

Games and play are a natural fit for Intel RealSense technology: users want to bring all their common senses to bear when engaging in and controlling a gaming activity. The Intel RealSense technology game and play development menu includes both old and new capabilities. Speech recognition can be used to control the action in tactical battle games like There Came an Echo, soon to be released by Iridium Studios. Gesture control, facial analysis, and head, hand, and finger tracking can all be used to augment or replace traditional input methods, bringing unparalleled precision to gaming experiences, as in Head of the Order, another Intel® Perceptual Computing Challenge winner.

There Came an Echo Uses Speech to Control Tactical War Games

There Came an Echo is a voice-controlled real-time strategy game starring Wil Wheaton of Star Trek*, Eureka*, and The Big Bang Theory* fame. It is currently being developed for Intel® RealSense™ technology enabled systems and should be released in mid-2014 on Steam*. Designed from the ground up for voice-based commands using Intel® RealSense™ technology, Echo lets you, as the field commander of a small tactical squad, use voice commands to accomplish tactical objectives and even interact with other members of the attack squad. Check out this video of Iridium Studios CEO and Chief Architect Jason Wishnov as he talks about the development process of There Came an Echo.

Interactive Storytelling lets us add a new dimension to normal storytelling by merging the virtual world with the real world. Using augmented reality techniques, we can personalize, dramatize and add emotional content to applications by creating a 3D virtual world and making story elements “user aware.”

A child’s story book application might feature a child playing with a doll. By adding augmented reality capabilities to this application we can make the virtual doll smile at the child, have it mimic her facial expressions or even say “Hi”, when it recognizes her. A little girl playing with this application will get more out of being part of this immersive experience as she uses all her senses to interact with the media content.

By adding capabilities to sense not only the child but her environment, we can render the media content in a photo-realistic manner creating a local environment in which the child can interact. The virtual doll can jump onto the real table and walk as the child interacts with it using gestures.

Experience Interactive Storytelling with a Tour of Uncle Jeffrey’s Funky Farm

Interactive storytelling and augmented reality are great to talk about, but take a look for yourself with a quick visit to Uncle Jeffrey’s Funky Farm, an Intel CES 2014 demo project.

At the Farm a child can participate in a magical farm experience and see some of the almost endless possibilities of augmented reality. Snow falls, rainbows appear and sheep bleat as Uncle Jeffrey takes you on a tour of the farm. Anyone experiencing Uncle Jeffrey’s Farm will get more out of being part of this immersive experience because they are now using all their senses to interact with the media content.


Natural Interaction

Natural Interaction applications literally put application control at your customers’ fingertips. By using Intel RealSense technology capabilities like hand and finger tracking, facial landmark detection, and speech recognition, you can accurately track hands and fingers and use simple gestures or precise 3D manipulations to control your applications.

Hand and Finger Tracking: This capability works from 0.2 to 1.2 m and provides 3D positions of fingertip, palm, and forearm point locations that can be used to control an application. In 2013 the SDK supported only a limited number of finger and hand landmark points, but in 2014 the Beta SDK supports 22 points of hand and joint tracking for greater accuracy and resolution. This lets you enable new application usages with more granular GUI control, plus better fingertip tracking and hand orientation for more sophisticated interactions.

In 2013 (left) the SDK supported only fingertips, palm center, grasp point, and forearm. In 2014 (right) the SDK tracks 22 points.
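That working range matters in practice: a hand outside 0.2–1.2 m won’t be tracked reliably, so applications typically gate their hand logic on it. A minimal sketch in plain Python; the joint representation and helper below are illustrative assumptions, not the SDK’s actual API:

```python
RANGE_M = (0.2, 1.2)  # hand-tracking working range, in meters

def hand_in_range(joints, lo=RANGE_M[0], hi=RANGE_M[1]):
    """joints: 22 (x, y, z) tuples in meters, camera-relative.
    Returns True if the hand's average depth sits in the range."""
    depths = [z for _, _, z in joints]
    avg_z = sum(depths) / len(depths)
    return lo <= avg_z <= hi

# 22 illustrative joints clustered around 0.5 m from the camera.
hand = [(0.0, 0.0, 0.5 + i * 0.001) for i in range(22)]
print(hand_in_range(hand))  # a hand at half a meter is trackable
```

An application might show a “move your hand closer” hint whenever this check fails, rather than silently dropping gesture input.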

Gesture Tracking: Identifies 8 static poses and 6 dynamic gestures, letting mid-air hand movements act as a natural way to interact with software. Static pose examples include a thumbs-up or peace sign, while dynamic gesture examples include a wave or a circular motion.
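A dynamic gesture such as a wave can be recognized from the trajectory alone: the palm’s horizontal position reverses direction several times. A simplified illustration in plain Python; the thresholds and sample trajectories are made up for the example, and a real recognizer works on noisy, timestamped tracking data:

```python
def is_wave(xs, min_reversals=2, min_amplitude=0.05):
    """Detect a wave: the palm's x position reverses direction
    at least min_reversals times with enough side-to-side travel."""
    if max(xs) - min(xs) < min_amplitude:
        return False  # hand barely moved
    reversals = 0
    prev_dir = 0
    for a, b in zip(xs, xs[1:]):
        d = (b > a) - (b < a)  # +1 moving right, -1 moving left
        if d and prev_dir and d != prev_dir:
            reversals += 1
        if d:
            prev_dir = d
    return reversals >= min_reversals

print(is_wave([0.0, 0.1, 0.2, 0.1, 0.0, 0.1, 0.2]))  # side-to-side: True
print(is_wave([0.0, 0.05, 0.1, 0.15, 0.2]))          # a one-way swipe: False
```

The same direction-change idea distinguishes a wave from a swipe, which is why SDKs expose them as separate dynamic gestures.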

Facial Analysis: With the Intel Perceptual Computing SDK, facial analysis was limited to frontal, 2D detection and 7 landmark points (including eyes, mouth, and nose), allowing you to compare the current face to a reference library of previously stored faces. 3D facial analysis for 2014 has improved: it now supports depth, with 78 landmark points for increased accuracy, true 3D face detection, and the roll, pitch, and yaw of the face. This allows you to identify the presence of faces in the camera’s range and, on a single face, locate the facial features. It also allows you to track and recognize head pose, and to recognize expressions and emotions such as anger, surprise, and frustration based on these facial landmarks.

In 2013 (left) the SDK supported only 7 points and 2D detection.
In 2014 (right) the SDK supports 78 landmark points, plus depth for true 3D face detection as well as roll, pitch, and yaw.
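To see how landmark points translate into head pose, consider roll: it can be estimated from just two landmarks, as the angle of the line connecting the eyes. A simplified sketch of the math (illustrative, in 2D image coordinates; this is not the SDK’s implementation, which works on the full 3D landmark set):

```python
import math

def head_roll_degrees(left_eye, right_eye):
    """Estimate head roll from the line between the two eye
    landmarks: 0 degrees when the eyes are level."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

print(head_roll_degrees((100, 120), (180, 120)))          # level head: 0.0
print(round(head_roll_degrees((100, 120), (180, 200))))   # tilted head: 45
```

Pitch and yaw need depth to estimate well, which is why the move from 2D detection to true 3D landmarks is what unlocks full head-pose tracking.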

Speech Recognition: Create applications that let users interact with and control applications naturally using speech. Besides command and control, this also includes dictation and text-to-speech in 9 languages. Speech recognition is a very powerful capability, allowing hands-free control of both applications and games, as we saw above with There Came an Echo.
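On the application side, command and control usually reduces to mapping recognized text onto actions. A minimal sketch in plain Python; the command vocabulary and action names here are hypothetical, and the recognized string would come from the speech engine:

```python
# Hypothetical command vocabulary for a tactical game.
COMMANDS = {
    "move": "MOVE_SQUAD",
    "hold": "HOLD_POSITION",
    "fire": "OPEN_FIRE",
}

def dispatch(utterance):
    """Map a recognized utterance to an in-game action by
    scanning it for known command words."""
    for word in utterance.lower().split():
        if word in COMMANDS:
            return COMMANDS[word]
    return None  # nothing recognized; ignore the utterance

print(dispatch("Alpha team, hold your position"))  # HOLD_POSITION
```

Real command-and-control grammars are richer than a keyword scan, of course, but constraining recognition to a small vocabulary like this is what makes voice control robust enough for gameplay.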


Capture and Share 

With the Intel RealSense SDK, we’re bringing the ability to scan, modify, print, and share small objects. Here’s how it works: because Intel RealSense technology is powered by a 3D camera, you can rotate an object in front of your computing device to build a 3D mesh. Overlay the mesh with color, and you’ve got a fully printable, shareable digital replica. Want to build some army men for your kids to play with? Great. Want to upload a 3D model of the teapot you’re selling on an internet marketplace? Totally possible. Additionally, we’re abstracting the technology so that developers can incorporate 3D scanning into their own applications.
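The “rotate an object to build a mesh” step rests on a simple idea: points captured at each rotation angle are transformed back into a common object frame before being meshed. A 2D toy illustration of that registration step in plain Python; real scanning registers full, noisy 3D point clouds, so this only shows the geometry:

```python
import math

def merge_scan(views):
    """Merge points captured at several turntable angles into one
    cloud by rotating each view back to a common object frame.

    views: list of (angle_degrees, [(x, z), ...]) in the camera frame.
    """
    cloud = []
    for angle, points in views:
        a = math.radians(-angle)  # undo the object's rotation
        for x, z in points:
            cloud.append((round(x * math.cos(a) - z * math.sin(a), 6),
                          round(x * math.sin(a) + z * math.cos(a), 6)))
    return cloud

# The same surface point seen head-on and again after a 90-degree
# turn should land in the same place in the merged cloud.
front = (0, [(1.0, 0.0)])
turned = (90, [(0.0, 1.0)])
print(merge_scan([front, turned]))
```

Once all views agree in one frame, the merged cloud can be triangulated into the printable mesh described above.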


Developer Programs – The Rest of the Opportunity

So far we’ve talked about two parts of the Intel RealSense Technology opportunity for developers in 2014: an expanding market and a robust, feature-rich SDK. There is a third, equally important part of the opportunity equation: the programs Intel has created to reward developers, ensure their development process is successful, and showcase and sell their products. Key to all of this will be a new Intel® RealSense™ Technology Developer Resource Center, a one-stop shop where developers can learn about the technology, download the SDK, order an Intel RealSense Camera Developer Kit, participate in developer contests like the App Challenge and in education programs, and see a full showcase of both demos and fully functional applications.

The Intel® RealSense™ App Challenge 2014 is a follow-on to the highly successful 2013 Intel Perceptual Computing Challenge. In the Ideation Phase, developers were given an overview of the new Intel RealSense 3D Camera and SDK before submitting application proposals for demos in 5 categories: Gaming and Play, Learning and Edutainment, Interact Naturally, Immersive Collaboration/Creation, and Open Innovation. Hundreds of finalists have been selected from this phase to proceed to the App Development Phase, where they will receive an Intel RealSense 3D Camera Development Kit and get approximately 10 weeks to complete their application demo. Finalists compete for their share of $1,000,000 USD in cash and marketing opportunities!


Hackathons, Virtual AE Sessions, Webinars, and Tutorials 

In addition to the Challenge, in 2014 Intel will again hold a series of events to train developers on the SDK and help them be successful:

  • Hackathons: Intel® RealSense™ Technology Hackathons are in-person events scheduled worldwide where developers get a chance for hands-on experience with the SDK and camera, get questions answered, share ideas and have fun. 
  • Virtual AE Sessions: These are a cross between a Hackathon and a webinar. Held online, Virtual Application Engineer (AE) sessions are a chance to meet as part of a group online and get your development questions answered by an Intel AE.
  • Webinars: In 2014 Intel will again hold a series of Intel® RealSense™ technology webinars to help train developers on the technology and how to build great applications using it. Webinars range from overviews of the SDK to highly technical sessions on coding.
  • Tutorials and Documentation: For those of you who aren’t able to participate in a Hackathon, or attend a Virtual AE Session or Webinar, in 2014 we’ll have a robust set of self-help tutorials and in-depth documentation at the Resource Center.


Showcasing Applications

Once a game or application is complete, the Intel RealSense Technology Application Showcase will be the place where you can show off your work and where customers can see it. In addition to applications the Showcase will also be the place to go to see technology demos and videos about the technology.


Getting Started with Intel® RealSense™ Technology in 2014

So, now that you know the scope of the Intel RealSense technology opportunity in 2014, your next question is probably: how do I get started? It’s easy. Go to the Intel® RealSense™ Technology Developer Resource Center and sign up for the new 2014 SDK and the equally new Intel® RealSense™ 3D Camera Development Kit when they become available.


For more complete information about compiler optimizations, see our Optimization Notice.