The Intel® RealSense™ SDK has been discontinued. No ongoing support or updates will be available.
During CES 2016 in Las Vegas in January, Intel announced the Intel RealSense Smartphone Developer Kit (SDK), an Android device with an embedded Intel® RealSense™ Camera ZR300 that supports the Google* Project Tango* developer ecosystem. The developer kit is currently open for reservation.
The Intel® RealSense™ Smartphone Developer Kit is powered by the Intel® Atom™ x7-Z8700 SoC (formerly Cherry Trail), which features 14 nm Intel architecture technology with 4 cores / 4 threads and Gen 8 Intel® HD Graphics, and the industry-leading Intel® RealSense™ Camera ZR300. The Developer Kit includes a 6” QHD (2560x1440) display. The device comes with 2GB of memory and 64GB of internal storage. It includes an 8MP rear camera and a 2MP front-facing camera. Figure 1 and Figure 2 show the front and back views of the Developer Kit, respectively.
Figure 1 The front view of the Intel RealSense Smartphone Developer Kit
Figure 2 The back view of the Intel RealSense Developer Kit
The Intel® RealSense™ Camera ZR300 was designed mainly for rear (world-facing) usages. It includes a laser projector, which emits a structured pattern of infrared light, and two IR imagers (left and right), which capture the scene. Depth and dimensional characteristics of objects are calculated from the point shift (disparity) between the left and right IR images. Besides the depth camera for computing high-density depth (>10 million points per second), the ZR300 also includes a wide field-of-view camera (VGA with >160-degree FOV) and a high-precision accelerometer-gyroscope combination for motion and feature tracking.
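As a rough illustration of the stereo principle described above, depth can be recovered from disparity as Z = f·B/d, where f is the focal length in pixels, B the baseline between the two IR imagers, and d the point shift. The sketch below uses hypothetical placeholder values for f and B; they are not ZR300 calibration data.

```python
# Sketch of stereo depth-from-disparity, the principle behind the
# ZR300's left/right IR imagers. The constants below are hypothetical
# placeholder values, not actual ZR300 specifications.

FOCAL_LENGTH_PX = 600.0   # hypothetical focal length in pixels
BASELINE_M = 0.07         # hypothetical distance between IR imagers (meters)

def depth_from_disparity(disparity_px):
    """Return depth in meters for a given pixel disparity (the point
    shift between the left and right IR images)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# A larger shift between the two images means the point is closer:
near = depth_from_disparity(42.0)   # ~1.0 m
far = depth_from_disparity(10.5)    # ~4.0 m
```

Note that depth resolution degrades with distance: a one-pixel disparity error matters far more at 4 m than at 1 m, which is why structured-light stereo systems like this are strongest at close and mid range.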
The ZR300 depth camera captures high-quality, dense depth data for 3D scanning of people, objects, and scenes, along with robust pose estimation. This information can be used for 3D rendering of people and objects, and for depth mapping and scene reconstruction of environments. The feature-tracking camera, the high-precision Inertial Measurement Unit (IMU), and rigorous software standards for synchronization between sensors (via time-stamping of data) enable Google Project Tango capabilities such as indoor navigation and area learning.
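The time-stamped synchronization mentioned above can be illustrated generically: because the IMU samples far faster than the camera, each frame is paired with the IMU reading closest to it in time. This is a conceptual sketch, not Intel's actual synchronization code; the sample rates are illustrative.

```python
import bisect

# Generic sketch of timestamp-based sensor alignment: for each camera
# frame, find the IMU sample whose timestamp is nearest. Illustrative
# only -- not Intel's synchronization implementation.

def nearest_imu_sample(imu_timestamps, frame_timestamp):
    """Return the index of the IMU sample closest in time to the frame.
    imu_timestamps must be sorted in ascending order."""
    i = bisect.bisect_left(imu_timestamps, frame_timestamp)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_timestamps)]
    return min(candidates,
               key=lambda j: abs(imu_timestamps[j] - frame_timestamp))

# The IMU runs much faster than the camera, so several samples
# bracket each frame (here a 200 Hz IMU against a single frame):
imu_ts = [0.000, 0.005, 0.010, 0.015, 0.020]
idx = nearest_imu_sample(imu_ts, 0.012)   # -> 2 (timestamp 0.010)
```

In practice, fused systems interpolate between the bracketing samples rather than snapping to the nearest one, but nearest-timestamp matching conveys the core idea.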
On top of this powerful hardware platform, which combines advanced silicon technologies such as the Intel® Atom™ x7-Z8700 SoC and the Intel® RealSense™ Camera ZR300, the device runs the Android operating system and supports the Google Project Tango SDK. Google Project Tango brings three core technologies to mobile devices: Motion Tracking, Area Learning, and Depth Perception. The motion tracking technology uses the IMU to calculate the device’s current position relative to where it started, so it can be used for indoor navigation without GPS. The area learning technology applies a process called Simultaneous Localization and Mapping (SLAM) to learn an area while tracking the device’s current position within it, which can be used to perform “drift correction” in long motion tracking scenarios. The depth perception technology uses the highly accurate ZR300 depth camera to generate a point cloud representing the depth information of objects. The Intel® RealSense™ technology potentially enables developers to implement rich user experience features such as face tracking, scene perception meshing, environment reconstruction, and depth-enabled photo features such as photo depth blurring, photo layer segmentation, measurement, photo parallax, and refocusing.
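The point cloud mentioned above is conventionally produced by back-projecting each depth pixel through the pinhole camera model. The following sketch shows that idea; the intrinsics (focal lengths and principal point) are hypothetical placeholders, not ZR300 calibration values, and this is not the Tango SDK's API.

```python
# Sketch of back-projecting a depth image into a 3D point cloud with
# the pinhole camera model -- conceptually how depth perception yields
# a point cloud. Intrinsics are hypothetical placeholders.

FX, FY = 600.0, 600.0   # hypothetical focal lengths (pixels)
CX, CY = 320.0, 240.0   # hypothetical principal point (pixels)

def depth_to_point_cloud(depth_rows):
    """depth_rows: rows of depth values in meters (0 = no measurement).
    Returns a list of (x, y, z) points in the camera frame."""
    points = []
    for v, row in enumerate(depth_rows):
        for u, z in enumerate(row):
            if z > 0:
                x = (u - CX) * z / FX
                y = (v - CY) * z / FY
                points.append((x, y, z))
    return points

# A single valid pixel two meters straight ahead of the camera:
depth = [[0.0] * 640 for _ in range(480)]
depth[240][320] = 2.0
cloud = depth_to_point_cloud(depth)   # -> [(0.0, 0.0, 2.0)]
```

Downstream features such as meshing, measurement, and layer segmentation all operate on point clouds of this form.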
With the Intel RealSense Smartphone Developer Kit (SDK), Android developers can now create a new class of end-user software applications all on a single mobile platform.
About the Author
Miao Wei is a software engineer in Intel Software and Services Group.
*Other names and brands may be claimed as the property of others.