Why #AWE2015 Was AWEsome!

I'm a geek. And like most geeks, there are few things we enjoy more than getting a little peek into the future. In the realm of trade shows, few provide a better preview of what is coming over the horizon than Augmented World Expo. Key among the reasons is that the central theme of the show - Augmented Worlds - is still firmly closer to science fiction than everyday life, at least for now.
 
For anyone unaware, there are two main reasons why Intel enjoys participating in AWE, beyond it simply being a cutting-edge tech show. First and foremost, it is because of our processors. When you look at the complexities of Virtual Reality and Augmented Reality (VR/AR), there are a couple of different dynamics at play. Taken together, they build a strong case for using the world's best processors to make the world's best VR/AR solutions.
 
One consideration is that both of them require very low latency. For the user to have the best possible experience, VR/AR require smooth performance. In the world of AR, the overlay on the real world needs to be seamless, or it becomes more frustrating than helpful. If a surgeon were using AR to see the exact extent of a tumor based on a recent CT scan, a quarter-second lag spike could mean the difference between nicking an artery and not. While that is one of the more extreme cases, any overlay that noticeably lags behind the real world would quickly become annoying.
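To put rough numbers on that, here is a tiny back-of-the-envelope sketch. The head-rotation speed and latency figures below are illustrative assumptions, not measurements; the point is just how quickly lag turns into visible misalignment:

```python
# Rough sketch: how far does an AR overlay drift during a lag spike?
# All numbers are assumptions chosen for illustration.

head_rotation_deg_per_s = 100.0   # a brisk but ordinary head turn (assumption)
latency_s = 0.25                  # the quarter-second lag spike from the example

# Angular error: the overlay is rendered for where your head *was*.
angular_error_deg = head_rotation_deg_per_s * latency_s
print(f"Overlay drift during the spike: {angular_error_deg:.0f} degrees")  # -> 25

# Even a fairly tight 20 ms motion-to-photon latency leaves a visible offset:
print(f"At 20 ms latency: {head_rotation_deg_per_s * 0.020:.1f} degrees")  # -> 2.0
```

A 25-degree error puts the overlay nowhere near its target, which is why low, consistent latency matters so much more here than raw average frame rate.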
 
In conjunction with latency is the complexity of the modeling. Really good AR will be seamless - in a perfect world, you wouldn't even be able to tell it was being overlaid. Think of watching Jurassic Park: it is very easy to get lost in the movie and forget that none of the dinosaurs are real. A big reason for that is the army of graphic artists painstakingly rendering those dinosaurs, with shadows and detailed textures, until you have trouble remembering they are fake. The zenith of AR will be "additions" to your environment so smooth and detailed that you quickly forget the little dragon walking around your desk isn't real. Of course, for many of those high-end movies, racks and racks of servers rendered each frame. Now we want both amazing graphics quality and extreme portability, so that you can have AR wherever you want, not just near an outlet. And for that, you're going to want to use the world's best processors (cue the Intel bong).
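As a quick illustration of how wide that gap is, the arithmetic below compares a common real-time refresh target with an assumed film-style render time. Both figures are ballpark assumptions, not benchmarks:

```python
# Quick arithmetic on the rendering gap described above.
# Figures are ballpark assumptions for illustration only.

target_fps = 90                      # a common VR/AR refresh target
frame_budget_ms = 1000 / target_fps
print(f"Real-time budget: {frame_budget_ms:.1f} ms per frame")   # ~11.1 ms

offline_frame_hours = 2              # assumed hours per frame on a film render farm
offline_frame_ms = offline_frame_hours * 3600 * 1000
print(f"Offline render:   {offline_frame_ms:,.0f} ms per frame")

# Roughly how much faster the mobile device has to be than the render farm
# was, per frame, to hit real time:
print(f"Gap: ~{offline_frame_ms / frame_budget_ms:,.0f}x")
```

Under those assumptions the gap is on the order of hundreds of thousands of times per frame, which is the whole case for putting serious compute in the device.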
 
That said, in conjunction with great processors, the other big need is to be able to quickly and easily pull the real world into your computer, so that it can figure out what it wants to add and do so "perfectly" - meaning following all the normal real-world rules, like obeying gravity and not passing through solid objects. In the case of a fun pet dragon to play with on your desk, any AR based on RGB alone will have trouble inferring depth. For example, your dragon could be walking along your desk and should appear to pass behind your coffee cup, but instead suddenly seems to be walking "in front" of the cup, because it is a white mug on a white surface and an RGB camera couldn't tell it was actually a cup and not just a strange design on your desk. We've all seen those "street art" chalk drawings done with forced perspective, so that a flat sidewalk appears to have real depth and people seem to interact with scenery that isn't there.
Stunning 3D chalk art by Nikolaj Arndt
 
To an RGB-based AR system, it could actually infer that the boy is walking over a deep pit, because that is what it looks like. But a system that can pull in a detailed point cloud, like Intel® RealSense™ cameras, would be able to tell that he is just walking on colorful, but flat, asphalt. And that is the other thing that piques Intel's interest in AR - how to use our depth-capturing technology to create better VR/AR solutions. In the case of the pet dragon, the depth camera could tell that the white mug is sitting on the white desk, and the dragon would correctly seem to walk behind it. It could even appear to jump up from behind, perch on the back edge of the cup, and steal a little of your coffee. Right down to the moment its mouth touched the surface, the computer could generate tiny "waves" across the top of the liquid with all the correct perspective, making it seem "all too real."
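For the technically curious, here is a minimal sketch of the occlusion idea described above. Everything in it is hypothetical stand-in data (a real pipeline would pull aligned RGB and depth frames from the camera's SDK); the point is the core test, where a virtual pixel is drawn only when it is closer to the camera than the real surface at that pixel:

```python
import numpy as np

# Hypothetical stand-ins for real sensor and renderer output:
#   real_depth:   per-pixel distance (meters) from a depth camera frame
#   dragon_depth: per-pixel distance of the rendered dragon (inf = no dragon)
#   dragon_rgb:   the rendered dragon's color layer
h, w = 480, 640
real_depth = np.full((h, w), 1.00)          # desk surface ~1 m away
real_depth[200:300, 300:360] = 0.85         # the white mug, slightly closer
dragon_depth = np.full((h, w), np.inf)
dragon_depth[220:280, 280:400] = 0.95       # dragon walking *behind* the mug
dragon_rgb = np.zeros((h, w, 3), np.uint8)
dragon_rgb[220:280, 280:400] = (40, 180, 60)

# Core occlusion test: draw the dragon only where it is nearer than the world.
visible = dragon_depth < real_depth

camera_rgb = np.full((h, w, 3), 255, np.uint8)   # stand-in for the RGB frame
composite = np.where(visible[..., None], dragon_rgb, camera_rgb)

# Where the mug (0.85 m) is closer than the dragon (0.95 m), the dragon's
# pixels lose the depth test and stay hidden - the cue an RGB-only system
# cannot get from a white mug on a white desk.
print("dragon pixels drawn:", int(visible.sum()))
```

With per-pixel depth, the white-mug-on-white-desk case stops being a guessing game: the mug wins the depth test regardless of its color.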
 
So these are the reasons why Intel is very interested in VR/AR:
  • They play to our strengths in processors
  • They benefit strongly from our depth-capturing technology
All of that is great, but what are we doing to help push the envelope? First off, we had several technologists come and give presentations on how Intel® RealSense™ technologies are helping to move the ball forward.
 
What would you like to see us bring to next year's AWE? If you went to AWE15, what were your best observations? We'd love to hear from you in the comments below!

 
