THE INTEL® REALSENSE™ SPATIAL AWARENESS WEARABLE

How It Works

This project uses the Intel® RealSense™ Depth Camera (R200) to measure how far the objects in its field of view are from the user. The camera's field of view is divided into eight sections:

  • Three top sections: right, center, left
  • Three middle sections: right, center, left
  • Two bottom sections: right, left

In each section, the camera measures the distance to the closest point on the nearest object it detects. That minimum distance sets the intensity of the corresponding vibration actuator on the body: the closer the object in a particular section, the stronger the vibration (a sketch of this mapping follows the list below). The vibration actuators are placed on eight parts of the body:

  • Three on the chest: left chest, center chest, and right chest
  • Three at hip level: left hip, belly, right hip
  • Two on the legs: left leg and right leg
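
As a rough illustration of the depth-to-vibration mapping, the sketch below scans a depth frame once, keeps the nearest valid reading per section, and scales it to a 0-255 intensity. The resolution, section indexing, and near/far clamp distances are assumptions for illustration, not values from the original project.

    #include <algorithm>
    #include <array>
    #include <cstdint>
    #include <vector>

    constexpr int kWidth  = 320;        // assumed depth-stream resolution
    constexpr int kHeight = 240;
    constexpr uint16_t kNearMm = 500;   // closer than this: full-strength vibration
    constexpr uint16_t kFarMm  = 4000;  // farther than this: no vibration

    // Which of the eight sections pixel (x, y) falls in: the top and middle
    // rows are split into thirds, the bottom row into halves.
    int SectionOf(int x, int y) {
        int row = (y * 3) / kHeight;                     // 0 = top, 1 = middle, 2 = bottom
        if (row < 2) return row * 3 + (x * 3) / kWidth;  // sections 0-5
        return 6 + (x * 2) / kWidth;                     // sections 6-7
    }

    // One intensity (0-255) per section; closer objects give larger values.
    std::array<uint8_t, 8> Intensities(const std::vector<uint16_t>& depthMm) {
        std::array<uint16_t, 8> nearest;
        nearest.fill(kFarMm);
        for (int y = 0; y < kHeight; ++y)
            for (int x = 0; x < kWidth; ++x) {
                uint16_t d = depthMm[y * kWidth + x];
                if (d == 0) continue;                    // 0 means no depth reading
                int s = SectionOf(x, y);
                nearest[s] = std::min(nearest[s], d);
            }
        std::array<uint8_t, 8> out{};
        for (int s = 0; s < 8; ++s) {
            uint16_t d = std::clamp(nearest[s], kNearMm, kFarMm);
            out[s] = static_cast<uint8_t>(255 * (kFarMm - d) / (kFarMm - kNearMm));
        }
        return out;
    }

A linear ramp is the simplest choice; the actual system may use a different response curve or discrete vibration levels.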

Each vibration actuator is connected to a battery-powered, Wi-Fi-enabled microcontroller (Particle Core). The eight sections in the camera's field of view are mapped to the eight vibration actuators, which communicate over TCP, through a portable router, with the laptop that hosts the R200 camera. The complete hardware setup is:

  • Ultrabook™
  • Intel® RealSense™ Depth Camera (R200)
  • Vibration Actuators
  • Portable Wireless Router

In the portable router, fixed IP addresses are assigned to the laptop and the eight vibration actuators. The laptop runs an application that processes the depth data from the camera and acts as a TCP server handling communication with the vibration actuators. Each vibration actuator runs a TCP client application that connects to the server at a specific IP address and port number. Based on the fixed IP addresses, the server application identifies which vibration actuator belongs to which body part, and then feeds each actuator the appropriate data.
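
As a rough illustration of the client side, here is what the firmware on one Particle Core node might look like. The server address, port number, and motor pin are placeholders, and the actual wire protocol may differ; the sketch assumes the server sends one byte per update, holding the 0-255 vibration intensity for that node.

    // Illustrative Particle Core firmware for a single actuator node.
    #include "application.h"

    TCPClient client;
    IPAddress serverIP(192, 168, 1, 10);   // laptop's fixed IP (assumed)
    const int SERVER_PORT = 5000;          // assumed port number
    const int MOTOR_PIN   = A0;            // PWM-capable pin driving the motor

    void setup() {
        pinMode(MOTOR_PIN, OUTPUT);
    }

    void loop() {
        if (!client.connected()) {
            client.connect(serverIP, SERVER_PORT);   // (re)connect to the server
            delay(500);
            return;
        }
        while (client.available()) {
            // Each received byte is a 0-255 intensity for this body part.
            analogWrite(MOTOR_PIN, client.read());
        }
    }

Because each node connects from a fixed IP address, the server only has to look at the incoming client's address to know which body part it is driving.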


What Is the Goal of Spatial Awareness Wearable (SAW)?

Spatial Awareness Wearable (SAW) was originally conceived and built by a team of designers from the Experience Design and Development team in Intel's Perceptual Computing Group. The team started by asking whether the cameras being used for 3D scanning, gesture recognition, augmented reality, and so on could also help the visually impaired navigate cluttered and occupied spaces. The team was then challenged to work with developers, researchers, companies, and not-for-profits to further develop and research this area, building on the initial work done on SAW. The Intel® Developer Zone is part of that effort and is meant to support anyone looking to get started with this technology.

We are looking to reach beyond our original goal of navigating cluttered spaces. Our current investigations include face detection and recognition, various forms of user feedback, environmental objects, additional sensors working in concert with the current system, and context-sensitive behavior.


What's Here?

Here you will find code samples, guides to get you started using Intel® RealSense™ cameras to provide spatial awareness in wearables, and links for learning about and contacting our team for collaboration and support. All of our code, clothing designs, 3D-printable camera mounts and actuator boxes, and the plans for physically constructing the system we built are offered without restriction, as a reference for developers and researchers.

Build Something and Tell Us About It

We look forward to seeing where you go with this. Our goals are to help those with deep experience in accessibility bring inexpensive, easy-to-use tools to their work, and to help those with strong development backgrounds learn more about accessibility and how to contribute to development in this area. We believe this is one small step that can help developers turn ideas into reality. Please contact us, share what you're doing, let us promote awareness of your project via this site, and post on the forum. Take this, run with it, and tell us about it.