Interview with Sixense: Perceptual Computing SDK and Puppets

Sixense, one of the challengers in the Ultimate Coder Challenge: Going Perceptual, was interviewed twice this past week at GDC 2013 about what they’re building for the Challenge and how they are working with the Perceptual Computing SDK.

“It’s been a great experience for us”

http://youtu.be/kU4bn7VNnMQ

In this first interview, Ultrabook Development Community Manager Bob Duffy interviews Creative Director Danny Woodall and CTO Alejandro Diaz. In the Ultimate Coder Challenge, all coders have been working with the newly re-released Perceptual Computing SDK. Sixense reports that they took the experience gained from creating a Portal 2 demo for CES and applied it to their current project, a puppeteering platform tentatively titled Puppet in Motion. They feel "lucky" to have had that experience before participating in the Challenge, since it has been a real boon to their current work, especially as they are building this particular program from the ground up for the contest.

Sixense is using the Unity engine, and since the SDK came with a Unity plugin, they had a good head start. Their Puppet in Motion app was a new build from the ground up, and as the team was somewhat new to Unity, it has been a very good learning experience.

The Puppet in Motion app is essentially a virtual puppet theater. It uses the Creative* Interactive Gesture Camera to track hand movement, letting users move their hands as they would when operating a sock puppet and see that motion reflected on screen. Sixense is working on several different story modes for the app, with the classic tale of "The Three Little Pigs" being showcased for the Challenge.

As this SDK is relatively new, it has been interesting for Sixense from a technical perspective. Originally, the team was a little skeptical about whether the SDK could accomplish what they needed, but once they got the Portal 2 demo working and this new project off the ground, they were very impressed.

How do you control a puppet's virtual movement? It's trickier than you might think, especially when dealing with the volume of raw data the gesture camera generates as well as the SDK's API. Sixense is using both of these resources to control their puppet and give it the most realistic, intuitive feel possible; they also experimented with using the depth buffer to support more traditional gestures for controlling the puppet.
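The article doesn't show how Sixense actually maps hand data to the puppet, so here is a minimal sketch of one plausible approach: derive a normalized "hand openness" value from palm-to-fingertip distances (the kind of per-finger tracking data the gesture camera can produce) and map it to the puppet's jaw angle. All function names, thresholds, and units here are illustrative assumptions, not the SDK's API.

```python
import math

def hand_openness(palm, fingertips, closed_dist=0.03, open_dist=0.10):
    """Estimate how open the hand is (0 = fist, 1 = fully open) from the
    average palm-to-fingertip distance. Coordinates are assumed to be in
    meters; the closed/open distances are illustrative guesses."""
    if not fingertips:
        return 0.0
    avg = sum(math.dist(palm, tip) for tip in fingertips) / len(fingertips)
    # Normalize into [0, 1] between the "fist" and "fully open" distances.
    t = (avg - closed_dist) / (open_dist - closed_dist)
    return max(0.0, min(1.0, t))

def jaw_angle(openness, max_angle=45.0):
    """Map normalized openness to a sock-puppet jaw rotation in degrees."""
    return openness * max_angle
```

In a Unity integration, a value like this would typically be computed each frame from the tracked hand and fed into the puppet rig's jaw bone rotation, smoothed over a few frames to hide tracking noise.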

As for the Challenge, Danny notes that "seven weeks is not a lot of time to get things done!", a sentiment all the Ultimate Coder challengers would probably share. They've really enjoyed working with Intel on this project, remarking that the "SDK is improving all the time."

In another interview, Ultrabooknews.com executive editor Ben Lang interviews Chip from Sixense about their puppet theater. Eventually, users will be able to choose from several different stories, using their hands to control the puppet, record their voices, share via social channels, and change the story however they like, a kind of "Choose Your Own Adventure" experience. Overall, the response has been very positive: people love being able to control the puppets intuitively, and it's simply a lot of fun. Sixense plans to take this "proof of concept" to the next level, developing it into a real product for the marketplace.

One thing that might be in the works for the contest: the ability to literally "blow the house down" (this is "The Three Little Pigs," after all) by blowing into the camera, which would trigger the house's destruction. The physics for this action are already in the platform, and the team is working to integrate this fun feature before the end of the contest.
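The article doesn't say how the "blow" would be detected. One common approach, sketched below purely as an assumption, is to treat sustained loud input on the camera's microphone as a blow gesture: accumulate consecutive loud audio frames and fire the house-collapse trigger once enough of them arrive. The class name and thresholds are hypothetical, not part of the Perceptual Computing SDK.

```python
class BlowDetector:
    """Illustrative sketch: count consecutive audio frames whose RMS level
    exceeds a threshold, and report a 'blow' once enough loud frames have
    been sustained. Thresholds here are illustrative, not tuned values."""

    def __init__(self, rms_threshold=0.3, required_frames=5):
        self.rms_threshold = rms_threshold
        self.required_frames = required_frames
        self.loud_frames = 0

    def update(self, samples):
        """Feed one audio frame (floats in -1..1); return True when a blow
        has been sustained long enough to knock the house down."""
        rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
        if rms >= self.rms_threshold:
            self.loud_frames += 1
        else:
            # Any quiet frame resets the streak, so brief noise spikes
            # don't trigger the effect.
            self.loud_frames = 0
        return self.loud_frames >= self.required_frames
```

In practice a detector like this would run once per audio frame, and a `True` result would simply invoke the collapse physics already built into the platform.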

The concept for Puppet in Motion actually came up when the team was discussing what to do with the gesture camera for CES. They had an idea for a marionette-style program: since the camera can track individual fingers, it would sit in front of the laptop facing up at the user's fingers, with virtual strings controlling the puppet on screen. However, as anyone who has attempted to operate a marionette can tell you, that's difficult to do with any degree of accuracy. A sock-puppet/hand-puppet feel seemed a more attainable challenge for the contest.

More about Sixense, Perceptual Computing, and Ultimate Coder Challenge

To find out more about perceptual computing, Sixense, and the Ultimate Coder Challenge: Going Perceptual, please visit the following resources:
