Why RealSense is REALLY Important

So, I'll start with a confession: I'm a geek. I mean really a geek. Like "I can recite Pi to 9 digits but I have no idea how many innings there are in a football game" geek. I'm also a gamer, big time - I love the fact that the world's biggest video game tournament pays its players ALMOST as much as the World Cup. So with those two monikers under my belt, you'd expect anything related to the Xbox Kinect to be a natural fit for me. For anyone who hasn't heard of it, the new Intel RealSense technology is similar to Kinect in that it combines an RGB camera with 3D depth mapping so that users can interact with their computers through gesture recognition. If this sounds familiar, you may have heard of something called Perceptual Computing last year. While some truly amazing things came out of that effort, it was honestly a trial balloon of sorts, to see if the concept had legs. But when we took a hard look at what we did, and at what our developer friends were able to come up with, we realized we were really on to something!
 
However, in all honesty, many of the first iterations of software created using Perceptual Computing (which has since been rebranded as Intel® RealSense™ technology) were mostly games. While a guy like me, a gamer and a geek (is that redundant?), loves that stuff, when I walked around Augmented World Expo it quickly became apparent that RealSense would be used for more than just "fun & games." When you combine it with other technologies such as augmented reality, virtual reality, and 3D printing, things really start to get interesting!
 
Take augmented reality: RealSense helps make it much more useful. For example, imagine a "Magic Mirror" in a ladies' dressing room. A shopper wearing a bikini or undergarments could stand in front of the "mirror" (which would actually be a large display connected to RealSense), and it would overlay outfits on her based on the selections she makes - not just the outfits available in her size, in that store, at that time. This would be a whole new paradigm for clothes shopping. Women would pick the styles they want, and the perfectly fitting clothes (because RealSense would know every measurement down to the millimeter) would be shipped to them a few days later. In fact, once a shopper gets her hands on her own RealSense-enabled All-in-One or 2 in 1, she could even do this in the comfort of her own bedroom, potentially even through a web browser. How cool would that be?
 
Or take virtual reality, which gets much better when connected with RealSense. Most virtual reality systems today are based on worlds that developers build by hand. Imagine how much easier it would be if those worlds could simply be based on scans made by a RealSense-enabled robot rolling through, as an example, the Louvre. It would be the difference between walking through the museum with a video camera and trying to capture the experience by painting a picture, taking two steps, then painting another picture! Not only could thousands (if not millions) more people "experience" the Louvre, but it would also become something that bed-ridden senior citizens who might never have been able to visit could finally enjoy.
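As a purely illustrative aside: this post predates the current open-source RealSense SDK, but here is a minimal sketch of what grabbing a single textured depth snapshot might look like with the later pyrealsense2 Python bindings. The output file name is hypothetical, and a real museum scan would stitch many such frames together rather than export one.

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()          # default config: depth + color streams
pc = rs.pointcloud()

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    color = frames.get_color_frame()

    pc.map_to(color)               # texture the cloud with the RGB image
    points = pc.calculate(depth)   # deproject depth pixels into 3D points
    points.export_to_ply("louvre_snapshot.ply", color)
finally:
    pipeline.stop()
```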
 
Then there is the world of 3D printing. RealSense not only brings the concept of "copy & paste" to the real world, it also enables the concept of "physical Photoshopping*" (Photoshop is a registered trademark of Adobe, but the verb form is commonly used to describe the act of improving a picture with photo editing software). Specifically, imagine a tool whose plastic handle is the wrong size for a person's hand. One could unscrew the handle, scan it in detail with a RealSense-enabled system, then scale it larger or smaller for a perfect fit before printing a replacement. Or instantly scan a beloved family pet so that a college-bound freshman could bring a reminder of home to a sterile and unfamiliar dorm room.
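The "scale it for a perfect fit" step is, at its core, just a uniform scaling of the scanned geometry. Here is a rough sketch of that idea; the file names and the plain-text point-list format are hypothetical, and a real workflow would operate on a full mesh rather than bare points.

```python
import numpy as np

# Hypothetical input: an N x 3 list of vertex positions (in millimeters)
# exported from a RealSense scan of the handle.
vertices = np.loadtxt("handle_scan.xyz")

scale = 1.15                       # make the handle 15% larger
centroid = vertices.mean(axis=0)   # scale about the model's center so it stays put
scaled = (vertices - centroid) * scale + centroid

np.savetxt("handle_scaled.xyz", scaled)
```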
 
Don't get me wrong, there will be plenty that Intel® RealSense™ technology will be able to enable all on its own. Not only could people enjoy more intuitive gaming, like we've already seen, but facial expressions could be loosely interpreted, so that when a user looks very puzzled the software could automatically open a help window. Or consider situations where a surgeon, a baker, or a sculptor wants to interact with software without TOUCHING anything (in the case of surgery because of germs, and in the others because their hands are dirty). Whether the surgeon wants to adjust a medical scan or the baker wants to turn the page of a virtual cookbook, both would be possible. In the video below, I show how kids can bring their real-life world into a computer game:
 
 
While the above example is very simple, you could imagine a more complex one where a toy company focused on creativity lets kids build a spaceship with its product, then fly the spaceship they created through an entire imaginary world. The possibilities are nearly endless, but blog posts never should be, so I'll end this one with a simple question:
 
What Do YOU Think Would Be the Biggest Opportunity for RealSense?
(post an Answer in the comments below or let me know on Twitter at: @CaptGeek)