Computer Interaction Evolves

Welcome to part 2 of my 3 part series on how WPC gave us a “peek into the future.” If you missed it, part 1 was posted here:

And, when it is up, you’ll be able to get to part 3 here:

But, in the meantime, this section is about: How Computer Interaction is Evolving!

Remember Star Trek? I think the following was one of the best scenes in the entire history of the franchise:

I love it because, when it came out, it showed exactly how far behind "modern" computers (at the time) were from what we watched in the Star Trek shows. My only problem with the scene was that, given I don't think I ever saw a keyboard in any other episode of Star Trek, Scotty could certainly pound one with the best of them (which seemed unlikely). However, that oversight aside, it really highlighted how, in the future, starship captains will just talk to computers. While I don't think we are quite there yet (although the version of the Windows 8 preview I received had "Narrator" built right in), there were a number of cool things shown from "Microsoft Labs" that I found to be very interesting.

Perhaps the most interesting one was when they took an otherwise normal Kinect & hooked it up above a flat surface (it could be a large TV or it could just be a wall being projected upon). Now, from that vantage point, the Kinect was able to watch the movements of your arm & turn said surface into a virtual touch screen. And that, to me, really opened up a number of possibilities, because it looked like you didn't really have to "touch" the screen to make it work. An obvious use of this would be far more "reasonably priced" interactive monitors in places like the mall or the airport. Tie this together with some reasonable voice recognition ("Show me the Food Court" or "Highlight all the Shoe stores") and now you get a truly interactive experience. But other, less obvious examples could be in an Operating Room where the doctor needs to manipulate a medical scan - perhaps rotate it or zoom in - and he could do that all by gesturing near the screen, without actually touching it, keeping his hands nice and sterile for the rest of the surgery. Or, a teacher in a classroom could now manipulate the lesson on the screen, without ever having to touch it with her hands, making education more fun & engaging for the kids.
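Just to make that idea a bit more concrete, here's a rough sketch (my own guess at the mechanics, not anything Microsoft showed code for) of how a depth camera pointed at a flat surface could fake a touch screen: record a baseline depth map of the empty surface, then treat anything hovering just a few millimeters in front of it as a "touch." The get_depth_frame() function below is a stand-in for whatever depth-camera SDK you happen to have.

```python
# A minimal sketch, assuming get_depth_frame() returns a 2-D NumPy array of
# distances (in millimeters) from a Kinect-style depth camera aimed at a
# flat surface. Illustrative only - not Microsoft's implementation.
import numpy as np

TOUCH_MIN_MM = 5    # a fingertip must be at least this far off the surface...
TOUCH_MAX_MM = 25   # ...but no farther, or it's just a hover

def calibrate(get_depth_frame, frames=30):
    """Average several frames of the empty surface to build a baseline."""
    stack = np.stack([get_depth_frame() for _ in range(frames)])
    return stack.mean(axis=0)

def touch_mask(depth, baseline):
    """Pixels whose depth sits slightly in front of the calibrated surface."""
    height = baseline - depth          # positive where something is closer
    return (height > TOUCH_MIN_MM) & (height < TOUCH_MAX_MM)

def touch_point(mask, min_pixels=40):
    """Very rough 'clustering': the centroid of the touch blob, if any."""
    ys, xs = np.nonzero(mask)
    if len(xs) < min_pixels:
        return None
    return int(xs.mean()), int(ys.mean())
```

From there it's "just" a matter of mapping that centroid from camera pixels onto screen coordinates and feeding it to the UI as if it were a mouse click.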

Another cool thing they showed that used the Kinect was the ability to use it like an incredibly low-cost laser scanner. They actually showed it working live on stage, where they built a 3-dimensional model of a person sitting on a stool, and then that model could be sent to a 3D printer - like those from RepRap or MakerBot - to make a "near-instantaneous" (meaning maybe 5 or 10 minutes later) model of that person. Or, perhaps more interestingly, of an object - like, say, a small plastic part from your child's favorite toy. As a parent, I think this unfortunately happens to all of us at some point. Your kid "over-plays" with the best toy in the world & something breaks. In the past, that meant several minutes working with Crazy Glue (and trying not to glue your fingers together) and then waiting for it to dry for several hours (just to be safe), all to have it break again because it has been weakened. How much cooler would it be to put the pieces on a table, scan them from a few different angles, let some awesome software figure out where the break is and "virtually" repair the pieces, and then send that file to a small 3D printer to make a whole new piece right there? The whole thing might take just 20 or 30 minutes, and then it would be good as new. How cool would a future like that be?
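For the tinkerers out there, the very first step of that kind of scanning is turning each depth image into a 3-D point cloud. Here's a hedged sketch of that math using the standard pinhole camera model; the focal lengths and optical center below are made-up illustrative numbers (not real Kinect calibration values), and a real scanner would still need to align clouds captured from several angles and turn them into a printable mesh.

```python
# A minimal sketch: back-project a depth image into a 3-D point cloud.
# FX, FY, CX, CY are assumed intrinsics for a 640x480 depth sensor; real
# values come from your camera's calibration data.
import numpy as np

FX, FY = 525.0, 525.0   # focal lengths in pixels (assumed)
CX, CY = 319.5, 239.5   # optical center (assumed)

def depth_to_point_cloud(depth_mm):
    """Convert an (H, W) depth image in millimeters to an (N, 3) array of
    points in meters, skipping pixels with no depth reading."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth_mm / 1000.0                            # millimeters -> meters
    valid = z > 0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x[valid], y[valid], z[valid]], axis=1)
```

Stitch a handful of those clouds together, run a surface-reconstruction pass over them, and you have something a slicer can hand off to the printer.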

And, the last nifty (pardon my language) thing they showed around the future of interactivity was a remote collaboration tool. It was a combination of an overhead lamp with a built-in projector and camera. They did the demo from across the arena, but it could just as easily have been across the planet. How it worked was, as person A put a drawing on his table, person B was able to see it on hers (projected onto a blank piece of paper). Then person B could look at it & start drawing on it - showing how they had to change the design here and here, and add a new module there. As person B drew on her table, person A could see what she was drawing in real-time on his. At the end of the collaboration, everything could be captured digitally and combined into one file. But, beyond just playing a friendly game of Tic-Tac-Toe (it was part of the demo), they could do the same thing with solid objects. Person A took that same bust of the guy that was made for the previous demo involving the 3D printer & put it into the overhead's field of view. Almost like magic, it not only showed up on person B's table, but she could draw on the bust (coloring in the eyes, etc.) and the projector would then make those same modifications on person A's bust. If both sets of collaborators had a 3D printer, they could start making revisions of prototypes in near real-time (allowing for coffee breaks while the 3D printers print). By the end of a one-hour collaboration session, the team could finish their prototype of a 3-handled (120° separation) coffee cup for use on ships, one that would practically never tip over (the bottoms of the handles would be level with the bottom of the cup). Now teams working in different states or different countries could see an enormous leap forward in their ability to collaborate.
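If you wanted to play with the shared-drawing half of that idea yourself, the core plumbing is surprisingly small: every pen stroke becomes a little message sent to the other side, which redraws it under its own projector. Here's a hedged sketch of that plumbing; the newline-delimited JSON format (and the whole wire protocol) is my own assumption, not anything described in the demo.

```python
# A minimal sketch of streaming drawing strokes between two peers over TCP.
# The message format here is an assumption made purely for illustration.
import json
import socket

def send_stroke(sock, points, color="black"):
    """Send one stroke (a list of (x, y) points) to the remote peer."""
    msg = json.dumps({"type": "stroke", "color": color, "points": points})
    sock.sendall(msg.encode() + b"\n")

def receive_strokes(sock, draw_callback):
    """Read newline-delimited stroke messages and hand them to the renderer."""
    buffer = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        buffer += chunk
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            event = json.loads(line)
            if event.get("type") == "stroke":
                draw_callback(event["points"], event["color"])
```

The hard (and genuinely impressive) part of the real demo is everything around that plumbing: calibrating the camera to the projector, tracking the pen, and wrapping the drawings onto a physical object like that bust.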

Here’s a video of the full keynote (jump forward to about 1 hr 56 minutes to see the stuff mentioned here).

Also, my friend Eric Ligman posted a full blog post about the Day 2 Keynote if you want more info.
So that’s it - those were some of my observations on how Microsoft is helping the way we interact with computers evolve, both today and in the near future. But what do you think? Did any of these seem particularly interesting, or are there other things you’re looking forward to soon? Let me know in the comments below…

Thanks!
@CaptGeek
(or Eric Mantion, from Intel)
