Touch Mechanics: Haptic Technology and Perceptual Computing

Our sense of touch is vitally important; it’s involved in virtually every physical interaction we have. It matters just as much in human-computer interaction, where touch-enabled input controls and interactive experiences are making computing ever more tactile. As more and more devices adopt touch-enabled features, controls, and form inputs to satisfy consumers’ growing appetite for tactile input, we’re seeing new advances from a wide variety of fields.

Haptic technology

One of these core fields is haptics, the study of interactive touch technology. The field looks at both tactile and kinesthetic touch interactions: tactile covers contact location, pressure, shear, slip, vibration, and temperature, while kinesthetic covers position, orientation, and force. Touch helps us understand our physical interactions with the world, so can we improve interfaces and machines by making better use of it?
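One way to picture that tactile/kinesthetic split is as two bundles of signals a haptic device might sense or render. The Python sketch below is just an illustrative grouping with assumed field names and units, not a standard haptics API.

```python
# Illustrative grouping of the tactile and kinesthetic signals listed above.
# Field names and units are assumptions for the sake of the example.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TactileSample:
    contact_location: Tuple[float, float]  # position on the skin/fingertip (mm)
    pressure: float                        # normal pressure (kPa)
    shear: Tuple[float, float]             # lateral force components (N)
    slip: float                            # relative sliding speed (mm/s)
    vibration: float                       # high-frequency acceleration (m/s^2)
    temperature: float                     # degrees Celsius

@dataclass
class KinestheticSample:
    position: Tuple[float, float, float]     # limb or end-effector position (m)
    orientation: Tuple[float, float, float]  # roll, pitch, yaw (rad)
    force: Tuple[float, float, float]        # applied force vector (N)
```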

In a TED talk, assistant professor and mechanical engineer Katherine Kuchenbecker gives a brief overview of the field of haptic technology. How do we capture how objects actually feel in real life and recreate those experiences realistically on a computer? Her department created hand-held tools that measure pressure, direction, shaking, movement, and more. They take the data generated by tool interactions (scraping a tool across a piece of cloth, for example) and build a mathematical model of those complex relationships and interactions. The computer then uses that model to play back vibrations that mimic what just happened on the cloth. Kuchenbecker dubs this process “haptography”: taking virtual photographs of a touch interaction and recreating it virtually. Watch the entire TED talk below:
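For a feel of what that modeling step might look like in code, here is a minimal Python sketch of the record-model-replay loop: fit a simple autoregressive (AR) model to recorded tool accelerations, then drive it with noise to resynthesize a texture-like vibration for an actuator. The AR approach, the model order, and the synthetic “recorded” signal are illustrative assumptions, not the exact pipeline from the talk.

```python
# Minimal sketch of the "haptography" idea: record tool vibrations while
# stroking a texture, fit a simple autoregressive (AR) model to them, then
# resynthesize a similar vibration for playback on an actuator.
import numpy as np

def fit_ar_model(accel: np.ndarray, order: int = 16) -> np.ndarray:
    """Fit AR coefficients a[1..p] so that x[n] ~= sum_k a[k] * x[n-k]."""
    # Build the lagged design matrix (most recent sample first) and solve by least squares.
    rows = [accel[i - order:i][::-1] for i in range(order, len(accel))]
    X = np.array(rows)
    y = accel[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def synthesize_vibration(coeffs: np.ndarray, n_samples: int, noise_std: float = 1.0) -> np.ndarray:
    """Drive the fitted AR filter with white noise to produce a texture-like signal."""
    order = len(coeffs)
    out = np.zeros(n_samples + order)
    noise = np.random.normal(0.0, noise_std, n_samples)
    for n in range(order, n_samples + order):
        out[n] = coeffs @ out[n - order:n][::-1] + noise[n - order]
    return out[order:]

# Example: pretend `recorded` came from an accelerometer on a hand-held tool
# scraped across cloth (here it is synthetic data, purely for illustration).
recorded = np.sin(np.linspace(0, 200 * np.pi, 5000)) + 0.3 * np.random.randn(5000)
model = fit_ar_model(recorded)
playback = synthesize_vibration(model, n_samples=2000)  # signal to send to a vibration actuator
```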

Implications for haptic technology

This could have interesting implications in a wide variety of fields. In the TED talk, the example given was dentistry. Dental students need to figure out which teeth have cavities; x-rays can help, but dentists also use exploring tools to feel teeth for telltale signs of cavities. These judgments are hard for dental students to learn, so Kuchenbecker’s team put accelerometers and other haptic technology on a dental explorer, then mimicked that same exploration with a “touch track”. As a result, dental students can practice finding cavities and making judgments without working on actual patients.

What about sports training, or stroke victims who are relearning how to perform specific movements? Usually a physical trainer has to be called in, but what if you could use computers to augment the process and maybe even make it more fun? Kuchenbecker’s research team rigged up a practice system hooked to a Kinect, with haptic arm bands giving touch cues on the arms to guide subjects as they moved. If they deviated from a set pattern, the sensors let them know (much as a physical trainer or coach would) that they were off track. There’s more about the technology behind this intriguing process here:

“Among microelectromechanical systems are inertial sensors, which were first developed for automotive and military applications. They are tiny accelerometers and angular rate gyros that can be combined to form a complete inertial measurement unit. An IMU detects the three-dimensional motion of a body in space by sensing the acceleration of one point on the body as well as the angular velocity of the body. Incorporated in our IMU design are a microcontroller for analog-to-digital conversion and a low-power radio frequency transmitter for wireless data transfer to a computer. The resulting design is as small as a postage stamp and has a mass of merely three grams; a rechargeable lithium-ion battery adds just 1.5 grams to the unit.” – “A Sporting Chance”, University of Michigan School of Mechanical Engineering
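The IMU supplies the motion data; the guidance part is then a matter of comparing that motion against a reference movement and cueing the right armband. Here is a minimal, hypothetical Python sketch of that comparison loop, assuming a tracked 3D position from the Kinect or an IMU-based estimate. The 4 cm threshold, the band layout, and the vibrate_band() driver are placeholders, not the team’s actual system.

```python
# Hedged sketch of the motion-guidance idea: compare a tracked arm position
# against a reference trajectory and buzz the armband on the side the learner
# has drifted toward, much like a coach's corrective tap.
import numpy as np

BAND_DIRECTIONS = {           # unit vectors pointing from the arm toward each (assumed) band
    "front": np.array([0.0, 0.0, 1.0]),
    "back":  np.array([0.0, 0.0, -1.0]),
    "left":  np.array([-1.0, 0.0, 0.0]),
    "right": np.array([1.0, 0.0, 0.0]),
}

def vibrate_band(name: str, strength: float) -> None:
    # Placeholder for the real actuator driver (e.g. a wireless motor command).
    print(f"buzz {name} band at {strength:.2f}")

def guidance_cue(measured: np.ndarray, reference: np.ndarray, threshold: float = 0.04) -> None:
    """Fire a cue when the arm strays more than `threshold` metres from the reference pose."""
    error = measured - reference              # metres, in the tracker's frame
    distance = np.linalg.norm(error)
    if distance < threshold:
        return                                # on track: stay silent
    direction = error / distance
    # Pick the band best aligned with the error direction.
    name = max(BAND_DIRECTIONS, key=lambda k: BAND_DIRECTIONS[k] @ direction)
    strength = min(1.0, (distance - threshold) / threshold)
    vibrate_band(name, strength)

# Example frame: the wrist is 7 cm to the right of where the reference motion says it should be.
guidance_cue(np.array([0.32, 1.10, 0.50]), np.array([0.25, 1.10, 0.50]))
```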

What this could mean for developers

Obviously this has some pretty amazing implications for developers, especially in the realm of games. How about a first-person shooter (FPS) scenario in which you actually feel what’s happening on the screen in front of you:

“In the case of his Tactile Gaming Vest, that means simulating the injuries that lie in wait for a computerized avatar wandering the alien-infested corridors of Half-Life 2. One of his ideas is to make the first-person-shooter genre a little more immersive. So when your attackers target you from behind, you feel a thwack-thwack-thwack against your kidneys. If they come at you straight on, you feel the gunfire in your ribs.” – University of Pennsylvania, “Touching the Virtual Frontier”
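As a rough illustration of how a developer might wire game events to such a vest, here is a short Python sketch that maps the direction of an incoming hit to a motor zone on the torso. The zone layout and the fire_motor() call are hypothetical stand-ins for the vest’s real driver, which isn’t described in the article.

```python
# Minimal sketch: route an in-game hit, given its direction relative to the
# player's facing, to the nearest vibration-motor zone on a tactile vest.
MOTOR_ZONES = {0: "chest", 90: "left_ribs", 180: "back", 270: "right_ribs"}  # assumed layout, degrees

def fire_motor(zone: str, intensity: float) -> None:
    # Placeholder for the real motor driver / serial command.
    print(f"fire {zone} motors at intensity {intensity:.2f}")

def on_player_hit(attack_angle_deg: float, damage: float, max_damage: float = 100.0) -> None:
    """Route a hit arriving from `attack_angle_deg` (0 = straight ahead) to a motor zone."""
    angle = attack_angle_deg % 360
    # Choose the zone whose angle is closest, accounting for wrap-around at 360 degrees.
    zone_angle = min(MOTOR_ZONES, key=lambda a: min(abs(angle - a), 360 - abs(angle - a)))
    intensity = min(1.0, damage / max_damage)
    fire_motor(MOTOR_ZONES[zone_angle], intensity)

# Example: shot from behind for 35 damage -> a thwack against the back motors.
on_player_hit(attack_angle_deg=175.0, damage=35.0)
```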

Watch the video demo of gaming in the fourth dimension below:

It’s no secret that people enjoy touch technology. A recent study overseen by Daria Loi, UX Innovation Manager at Intel, looked at how subjects in different countries and walks of life would interact with touch-enabled Ultrabooks (see The Human Touch: Building Ultrabook™ Applications in a Post-PC Age). The results were encouraging as far as adoption goes: an overwhelming majority of the subjects surveyed were “delighted” with touch, found it very easy to work with, and were prepared to pay more for the touch experience. You can watch Daria talk about this study in the video below:

Is this perceptual computing?

The fields of haptic technology and perceptual computing seem to be intimately connected. Two recent Intel contests touch on this very subject. The first is the Intel® Perceptual Computing Challenge, an ongoing contest meant to encourage innovative apps that take advantage of everything perceptual computing has to offer, built with the Perceptual Computing SDK and the Creative* Interactive Gesture Camera Kit. Phase 1 of this contest recently ended with some amazing entries; they can all be seen at the Intel Perceptual Computing Showcase.

There’s also the Ultimate Coder Challenge: Going Perceptual, in which seven developers competed for seven weeks to build the ultimate app prototype using the latest Ultrabook convertible hardware along with the Intel Perceptual Computing SDK and camera. From an interactive puppet theater to a futuristic user interface, all of the entries were fantastic, with some wide-reaching implications for where this field might be going.

Exciting times we live in

From virtual reality surgery to physical training for athletes to computer-generated tactile experiences, the field of haptic technology gives us a lot to look forward to. Perceptual computing walks hand in hand with it, and the possibilities of these two closely connected technologies are exciting.

What kind of opportunities do you see in the field of haptic technology? Perceptual computing? Please give us your thoughts in the comments. 
