The Role of Sensors in Redefining Human-Computer Interaction


Intel’s Ultrabooks, Atom and Core-based tablets, and handheld devices incorporate a wide range of hardware sensors. These sensors enable high-definition image processing, audio processing, motion detection, environmental sensing, geographical and proximity location detection, and more, creating multi-channel, multi-dimensional connections between the user and the device. Combined with high computing performance and low energy consumption, these sensors provide the foundation for innovative changes in the interfaces between people and their devices, and in the interactions between people and the world through those devices.

Perceptual Computing

For decades, we have relied on mice, keyboards, and touch screens as the main ways to “command” computers. These input methods are not intuitive, and they require direct contact with the machine.

In his recent keynote speech at the Intel Developer Forum (IDF) 2012, Intel executive vice president Dadi Perlmutter presented perceptual computing as the next wave of technology that will redefine human-computer interaction.

Perceptual computing is a technology that uses voice commands, facial recognition, and gesture controls to interact with a computer. The computer and its applications “perceive” the user’s intentions based on the sensor data they collect.

Here are some perceptual computing use cases:

·         You use voice commands to ask your computer to start the eReader app and open your favorite title in your digital library, then use hand gestures to turn the pages without touching the screen.

·         In a 3D car-racing video game, the app detects the angle and gestures of the player’s hands on an imaginary steering wheel and translates them into the vehicle’s movements.

Intel formally announced the Perceptual Computing SDK 2013 Beta during the IDF.
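The eReader use case above boils down to mapping perceptual input events to application commands. Below is a minimal sketch of such a dispatcher in Python; the event names, the `Reader` class, and the dispatch loop are illustrative assumptions for this article, not part of the Perceptual Computing SDK.

```python
# Hypothetical sketch: routing gesture events to eReader commands.
# Event names and the Reader class are made up for illustration.

class Reader:
    def __init__(self, pages):
        self.pages = pages
        self.page = 0  # current page index

    def next_page(self):
        self.page = min(self.page + 1, self.pages - 1)

    def prev_page(self):
        self.page = max(self.page - 1, 0)

def dispatch(reader, events):
    # Map perceptual input events (gestures) to application commands.
    handlers = {
        "gesture:swipe_left": reader.next_page,
        "gesture:swipe_right": reader.prev_page,
    }
    for event in events:
        handler = handlers.get(event)
        if handler:
            handler()  # ignore events the app does not handle

reader = Reader(pages=10)
dispatch(reader, ["gesture:swipe_left", "gesture:swipe_left", "gesture:swipe_right"])
print(reader.page)  # 1
```

In a real application, the event stream would come from the SDK’s gesture and voice recognition pipeline rather than a hard-coded list, but the application-side pattern of translating perceived intent into commands is the same.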

Augmented Reality  

If perceptual computing is about the interactions between people and computers, augmented reality is about the interfaces between people and the world through computers. The ideas and applications of augmented reality have been around for decades. It became a hot topic again after Google announced its Google Glass project earlier this year.

Augmented reality (AR) is a real-time view of the physical world whose elements are augmented by computer-generated information. It is an extension of virtual reality. AR associates real-world objects with sensory inputs such as images, video, sound, and location. On mobile devices, the implementation of an augmented reality application depends heavily on the device’s sensors, such as the video camera, orientation sensor, compass, GPS, and proximity sensor.
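As a rough illustration of how an AR app combines those sensors, the sketch below uses a GPS fix and a compass heading to decide whether a point of interest falls inside the camera’s horizontal field of view. The function names, the 60-degree field of view, and the coordinates are all assumptions made for this example.

```python
import math

# Illustrative sketch: should a point of interest (POI) be drawn as an
# overlay? Combine the device's GPS position with its compass heading.

def bearing_deg(lat1, lon1, lat2, lon2):
    # Initial great-circle bearing from point 1 to point 2, in degrees.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def in_view(device_lat, device_lon, heading, poi_lat, poi_lon, fov=60.0):
    # The POI is visible when its bearing is within half the camera's
    # horizontal field of view of the compass heading (assumed 60 degrees).
    b = bearing_deg(device_lat, device_lon, poi_lat, poi_lon)
    diff = (b - heading + 180) % 360 - 180  # signed angle difference
    return abs(diff) <= fov / 2

# Device at the origin facing due north; POI slightly north-north-east.
print(in_view(0.0, 0.0, 0.0, 0.001, 0.0005))  # True
```

A production AR app would also use the orientation sensor for pitch and roll, and the proximity or depth data to place the overlay at the right screen position, but the bearing test above is the core of deciding what belongs on screen.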

Here are some augmented reality use cases:

·         A vehicle navigation app overlays road names, landmarks, and turn-by-turn instructions on the phone’s live camera view.

·         You point your phone’s camera at a restaurant on the street, and the screen shows today’s menu and customer reviews.



This article discusses perceptual computing and augmented reality at a high level. The major elements that provide the foundation for redefining human-computer interaction include:

·         Advanced silicon technology that supports high computing performance and low energy consumption.

·         Sensors that provide accurate real-time data.




Mauricio Alegretti:

@Miao - totally agreed. So far, everyone who has used touch screens in our tests doesn't want to go back to mice anymore! :)

Miao W. (Intel):

Mauricio, I agree with you. Strong evidence is that Intel recently announced that touch-enabled Ultrabooks will soon be available in the market. Early users of these laptops with touch screens, running the next release of the Windows platform, gave almost unanimously positive feedback.

Mauricio Alegretti:

Miao, don't you think that the recent trend toward touch screens (both in hardware and in software, as seen in how the Windows 8 start screen operates MUCH better with touch) will eventually kill the mouse - or at least the trackpad/touchpad on classic notebooks?

The only scenario where a user might prefer a mouse over touch is when he wants precision, and maybe he could get that with pen input (directly on the screen or with a digitizer)...

