Intel® RealSense™ Blog Series: Smart Robots and More with Developer Yuanbo She

The Intel® RealSense™ SDK has been discontinued. No ongoing support or updates will be available.

Intel® RealSense™ technology makes it possible for our digital worlds to interact with our physical, organic worlds in meaningful ways. Many of the projects that developers are creating step across boundaries that just a few years ago would have been impossible to imagine.

This definitely makes for some exciting developer possibilities. These technologies are changing the way we develop apps; for example, a user could choose to open a YouTube video simply by using a hand gesture, or post a new Facebook status with a blink of an eye and a voice command. With the release of the Intel® RealSense™ SDK for Windows Beta and Developer Kit, together with the Creative* Interactive Gesture Camera Developer Kit and its RGB and 3D depth-capturing capabilities, developers have a unique opportunity to usher in a whole new era of human and computer interaction.

The RealSense™ App Challenge 2014 is an exciting competition meant to spur developers from all over the world to forge new ground in the field of perceptual computing. Voice control, gesture recognition, facial analysis, and augmented reality are just some of the ideas we’re looking forward to seeing in this challenge. Evolved from the 2013 Perceptual Computing Challenges, the competition tasks developers with using the Intel® RealSense™ SDK for Windows Beta and Developer Kit, along with the Intel 3D gesture camera, to design innovative new ways that RealSense™ technology can be integrated into our computing experiences.

One of the 2013 challengers, Yuanbo She, graciously took some time out of his busy schedule to talk to us about his experience with that competition, his thoughts on perceptual computing, and his development plans for the future. Mr. She won an Early App Demo Submission award for Open Innovation with a project titled “Obstacle Avoidance and Moving Object Tracking with Smart Robot”. Watch the demo below:

More on this project from Yuanbo She:

“In this application, we focus on the robot's ability to avoid obstacles and track moving objects automatically. With the depth information from the interactive camera, the robot is able to "see" the obstacles and moving objects in its way, and then either avoid the obstacles and reach its destination automatically, or track the moving object.

Tools and Technology

  • Intel® Perceptual Computing SDK: captures the RGB-D and voice data, and translates voice and gestures into commands.

  • ROS (Robot Operating System): a distributed system that runs on Windows and Ubuntu and transports data among different systems and computing devices; it contains many open-source, easy-to-use packages for controlling robots, computer vision, sensor fusion, voice recognition, and more.

  • OpenCV: provides many computer vision algorithms; it's also the default computer vision tool in ROS.”
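The depth-based obstacle avoidance described above can be sketched in a few lines of Python. This is not code from Mr. She's project; it assumes a depth frame has already been captured (for example from the interactive camera via the SDK, or received over a ROS topic) as a NumPy array of distances in meters, and it simply splits the view into left, center, and right regions to pick a steering direction:

    import numpy as np

    # Hypothetical safety threshold: anything closer than 0.8 m counts as an obstacle.
    OBSTACLE_DISTANCE_M = 0.8

    def steering_command(depth_m: np.ndarray) -> str:
        """Pick a command from an HxW depth frame in meters (0 = no reading)."""
        depth = np.where(depth_m > 0, depth_m, np.inf)   # treat invalid pixels as far away
        w = depth.shape[1]
        regions = (depth[:, :w // 3], depth[:, w // 3:2 * w // 3], depth[:, 2 * w // 3:])
        # Fraction of each region that is closer than the threshold.
        left, center, right = (float(np.mean(r < OBSTACLE_DISTANCE_M)) for r in regions)
        if center < 0.05:                                # path ahead is mostly clear
            return "forward"
        return "turn_left" if left <= right else "turn_right"

    if __name__ == "__main__":
        # Synthetic 120x160 frame: open space with an obstacle 0.5 m ahead, slightly right of center.
        frame = np.full((120, 160), 3.0, dtype=np.float32)
        frame[40:100, 60:110] = 0.5
        print(steering_command(frame))                   # -> "turn_left"

A real robot would smooth this decision over time and combine it with the ROS navigation packages mentioned above, but the region test captures the basic idea of "seeing" obstacles from depth data alone.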

Tell us about your development background.

We are experienced in software and robot development. Our work focuses on fusing artificial intelligence with robotics to create intelligent robot applications. Natural human-robot interaction is one important area of intelligent robot research.

What got you interested in coding, particularly perceptual computing?

The RealSense SDK contains many natural human-robot interaction features and is easy to use, so I like using it to develop my intelligent robot applications.

What were the highlights for you in the 2013 Perceptual Computing Challenge?

Voice recognition, voice synthesis, and RGB-D data capture were the most important features in my robot application.

What were the key takeaways/lessons that you learned? Any Intel developer tools that helped you?

The SDK, the SDK documentation, and the demo code. The Intel Perceptual Computing SDK is easy to use, and its features are easy to integrate into our robot applications.

Any particular categories or markets which you are interested in developing apps for? 

We are interested in developing apps for robotics and the intelligent robot market. Our motivation is to found a startup in that market.

Tell us about the overall development experience: did everything go smoothly? What challenges did you come across, and how did you solve them?

Everything went smoothly because we are experienced in software development. The only trouble was that the Perceptual Computing SDK works on Windows, while our robot development environment is on Ubuntu. As a result, one part of our robot application is developed in Windows and the other part in Ubuntu running inside a virtual machine. If there were a Linux version of the Perceptual Computing SDK, development would go even more smoothly.
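As an illustration of that split (again, a hypothetical sketch rather than the project's actual code): the Windows side could publish recognized voice or gesture commands on a ROS topic, and the Ubuntu side would subscribe to them. The topic name /perc_command and the use of std_msgs/String are assumptions for this example:

    import rospy
    from std_msgs.msg import String

    def publish_commands():
        # Bridge node: forwards commands produced by the Perceptual Computing SDK
        # (voice/gesture recognition) to the rest of the ROS system.
        rospy.init_node("perceptual_command_bridge")
        pub = rospy.Publisher("/perc_command", String, queue_size=10)
        rate = rospy.Rate(10)  # publish at 10 Hz
        while not rospy.is_shutdown():
            command = "forward"  # placeholder; a real node would read the SDK's output here
            pub.publish(String(data=command))
            rate.sleep()

    if __name__ == "__main__":
        try:
            publish_commands()
        except rospy.ROSInterruptException:
            pass

On the Ubuntu side, a standard rospy subscriber on the same topic would turn those strings into motion commands for the robot.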

Since the last Challenge, tell us what you’ve been working on. How did the Challenge help you in your professional or personal development projects?

We are still working on intelligent robot development and our research, e.g., natural human-robot interaction, integration of artificial intelligence and robotics, vision-based robot navigation, and so on. The Perceptual Computing SDK and the camera saved us a lot of time in creating our demo with voice recognition, gesture recognition, RGB-D data capture, and so on.

How would you define what RealSense technology is?

Natural human-machine interaction technology.

Were there new capabilities enabled through the latest RealSense software or hardware solutions which piqued your interest and/or got you inspired?

Yes, the more stable and precise results from the SDK.

Do you have future plans for this project?

We want to introduce the features of the RealSense SDK into our intelligent robot applications (we are going to start a company to make intelligent robots), if it’s approved by Intel.

What advice would you give your fellow Challengers?

Just do it. It’s easy to learn and integrate into your applications.

Which other tools, documentation, and/or websites were the most helpful to the dev effort?

Google, OpenCV, the RealSense SDK documentation, and the example code.

What other perceptual computing projects are you currently involved in?

Interacting with robots via a myoelectric sensor is another perceptual computing project we're working on.

Where do you see perceptual computing going in the next five years?

It will change many people’s habits of interacting with machines.

How would you envision using Intel RealSense and perceptual computing for other types of projects?

We care mainly about this technology for intelligent robots. It’s perfect for natural human-robot interaction projects.

What do you find most interesting or exciting about perceptual computing?

The voice recognition and gesture recognition features.

Thanks again to Yuanbo She for participating, and we wish you the best of luck in this year’s Intel RealSense App Challenge! For more information, we invite you to visit the following resources:

