The Shift from Explicit to Implicit Computing

By Robert P Duffy, Published: 04/13/2016, Last Updated: 04/13/2016


In one of my earlier posts, “Developers Need to Consider Relative Input For Intel® RealSense™ Technology,” I mentioned that a shift is happening in computing. Computer control with Intel RealSense technology works best as relative control rather than absolute control. Another way to look at this is as a shift from explicit computing to implicit computing.

What is Explicit Computing?

Explicit computing is what we have been used to from the beginning of the PC era in the late 70s to today. If we want a computer, video game, or smartphone to do something, we must directly command the device to do so. To move a vehicle, we move a joystick. To give a computer commands, we type letters on a keyboard. To select your next date, you swipe right rather than left. Each time the action is explicit and the result is predictable.

Everyday Implicit Computing

Now, think about your significant other. How do you interact with him or her? To get eye contact and a smile, what do you do? While we might fancy the idea of a keyboard or touch screen to run the “smile at me” app on our life partner, that is not how it works. Reality is more like getting in proximity to your loved one and, if he or she isn’t looking at you, beginning with a greeting like “hi honey.” Then, as that person turns around, you might offer a compliment. The likely result is a smile in your direction. Beware: not every greeting and compliment will return a smile. There is variability, which could land you at a friend’s house. However, if you know this person, it shouldn’t be too difficult to figure out the right combination of implicit input that results in a specific reaction like a smile.

Whether it’s your Nest* thermostat, Amazon Echo*, or the back-up camera on your SUV, we are all experiencing implicit computing. With Nest, your habits and patterns for occupying your residence are now programming your thermostat. When backing up our vehicle, we don’t intend to set off the back-up camera warning; it happens when the car’s movement and proximity to other objects cross a threshold deemed dangerous.
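The back-up camera example can be sketched in a few lines. This is a minimal, hypothetical illustration of implicit input, assuming a gear-state signal and a distance reading from a proximity sensor; the names and the 1.5-meter threshold are invented for the example, not taken from any real vehicle system.

```python
# Hedged sketch: an implicit warning that fires on its own when the
# car's state and sensed proximity cross a danger threshold. The driver
# never explicitly asks for this alert; the environment triggers it.

DANGER_DISTANCE_M = 1.5  # assumed proximity threshold, in meters

def backup_warning(gear: str, distance_m: float) -> bool:
    """Return True when the implicit warning should sound."""
    return gear == "reverse" and distance_m < DANGER_DISTANCE_M

# The car, not the driver, decides when to warn:
print(backup_warning("reverse", 0.8))  # obstacle close while reversing -> True
print(backup_warning("drive", 0.8))    # same obstacle, not reversing -> False
```

The point of the sketch is that no button press appears anywhere: the inputs are sensed state, not commands.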

We See this in Movies

Implicit computing is foreseen in the movie Minority Report. You may remember the scene where the main character walks across a room and finds the walls covered with advertisements and communications tailored to him. He didn’t do anything explicit other than occupy space with a face that the environmental computer recognized. This is food for thought for developers: a whole new world of implicit computing is around the corner. And as we see in this scene from Minority Report, computing didn’t happen based on direct control from a device. Computing happened within the environment.

The Environment as the Device

As we discover new computing experiences outside the PC, the tablet, and the phone, interpretive, implied computing is what comes next. This is where the environment becomes the device. With the environment as the device, the best experience won’t be to explicitly launch an app or swipe right or left. Instead, the environment will sense your presence, calculate your intent, and then cause a specific outcome to occur. Some thoughts on this come from Intel Software Innovator Peter O’Hanlon in a recent Vice Motherboard article on making museums smart. As Peter’s project shows, implicit computing, where the environment is the device, is not only possible, it is happening. Technology is already available that opens or shuts locked doors based on the presence of the right person, or turns lights on or off when humans, not animals, enter or leave a room. Soon your car will automatically start, or properly parallel park, without your direct control. These are all things you expect to happen when they should happen, but without an explicit command; very much like receiving a warm smile from the one you love.
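The sense-presence, calculate-intent, cause-outcome loop described above can be sketched as a simple rule. This is a hypothetical illustration only: the authorized-person set and the lock actions are made up, and a real deployment would sit on top of actual recognition and IoT APIs rather than a dictionary lookup.

```python
# Hedged sketch of the environment-as-device loop: a recognized person
# near a door is the only input; no app launch or swipe is involved.

AUTHORIZED = {"alice", "bob"}  # assumed set of recognized residents

def on_presence(person: str, door_locked: bool) -> str:
    """Map an implicit input (who is present) to an outcome."""
    if person in AUTHORIZED and door_locked:
        return "unlock"  # right person present: open the door
    if person not in AUTHORIZED and not door_locked:
        return "lock"    # unrecognized person: secure the door
    return "no-op"       # nothing to change

print(on_presence("alice", True))     # resident at a locked door -> unlock
print(on_presence("mallory", False))  # stranger at an open door -> lock
```

The explicit-computing version of this would be a keypad or an app button; here the "command" is simply being there.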

Get Started Now

Unlike previous evolutions in computing, from mainframes to PCs to mobile, you as a developer don’t have to wait for an ecosystem to deliver into the mainstream before you can start development. All the building blocks for implicit computing, and for the environment as the device, are here today. Three technologies are at the heart of this new wave of computing:

  • Internet of Things: Learn more about using microcontrollers to make the environment and modern electronics smart by visiting our IoT Developer Zone
  • Perceptual Technology and Sensors: Learn more about technology that can see people, things and movement much like how people see things at the RealSense Developer Zone
  • Cloud Computing: Learn how big data and complex calculations and operations can be processed by systems in the cloud by going to Modern Code Developer Zone
