Ultimate Coder Challenge: Sixense Studios - Week 1

Hey everyone! 

We are thrilled to have been invited to participate in Intel’s Going Perceptual: Ultimate Coder Challenge!  What an amazing opportunity to work and collaborate with such talented groups and such cutting-edge technology.  We are very eager to see what all of these teams can create.

Who Are We?

Sixense Entertainment is the team behind the Sixense Tracking Technology, a complete hardware/software platform and SDK for six-degree-of-freedom motion tracking.  With encouragement from Valve Software, Sixense Studios (our team, the software development arm of Sixense Entertainment) developed the Portal 2 MotionPack for PC and the Portal 2 In Motion DLC for the PlayStation Move, which was awarded “Best Use of the PlayStation Move”. In addition, our MotionCreator software allows motion controls to be mapped to virtually *any* existing game. 

Our Portal 2 DLC is available as the Portal 2 MotionPack for the Razer Hydra on PC, and as Portal 2 In Motion for the Move motion controller on PS3.

Four of us from Sixense Studios will be participating in the contest: Danny will be directing the project, chip will be acting as designer, Dan will be providing our art, and Ali is our code wrangler. 

Be sure to check us out at http://www.sixense.com!

How Did We Get Here?

After shipping Portal 2 In Motion, we were approached by Intel to provide a demo for CES 2013 using the Creative Labs Gesture Camera and Intel’s Perceptual Computing SDK.

You can see our work for the CES demo in the following clip. It’s the video we shared for approval with our partners over at Valve. They approved it, of course, but mentioned that Danny should get some more sleep :)

We tossed around a few demo concepts before deciding to piggy-back on our motion-based mechanics by using a build from our own Portal 2 In Motion. After a successful show and a flawless keynote demo by Intel’s Achin Bhowmik, we began talking about how we might bring to life one of those early concepts: a puppeteering platform. With its unique ability to track individual finger movements, we believe that Perceptual Computing will allow us to advance virtual puppeteering in a way that has previously been impossible.  The Ultimate Coder Challenge came at a perfect time for us, and we are excited to use it as an opportunity to create the experience we’d been dreaming of.  Our internal pitch went over very well and started to generate some excitement about the project and its potential as a platform. Our plan is to use this 7-week challenge to flesh out that potential and create a proof of concept or “vertical slice” of the final product.

Where Are We Headed?

Puppet In Motion is our current internal name for this “puppet platform”.  We are still brainstorming ideas for a title … suggestions are welcome! 

At its core, Puppet In Motion is a freeform, creative experience enabling users to intuitively interact with virtual puppets. The Ultrabook provides the perfect platform for the experience we are envisioning.  You will be able to physically reach out and mimic real puppet gestures to control the virtual puppet on screen. We plan to let users create a scene (or multiple scenes) using the touchscreen and a set of “movie-set” components such as “foreground”, “backdrop”, “props”, etc. The microphone will be used to provide voices and sound effects for your puppet show.  We may even get crazy and incorporate the motion sensors as a way to affect the scene (earthquake!), and potentially add face/eye tracking as an additional puppet input!  Players will also be able to interact together in a multiplayer environment, opening up a number of possible scenarios, game modes, and, of course, sharing! 

Where Are We Now?

We began by investigating which game engine would be appropriate for the scope we have planned for the project.  We are already fairly familiar with the Perceptual Computing SDK from our CES work, where we integrated it directly into the Source engine for Portal 2.  We considered Source (naturally, since we know it well), Unreal, and Unity.  After experimenting with Unity and the Perceptual Computing SDK plugin integration, Ali was able to confirm that we could get the SDK data we needed into our Unity scene to drive a basic skeleton.  Having that groundwork established solves one of the big unknowns we had coming into this project.  So, we’ve settled on Unity after some very promising tests with an early puppet rig that Dan built. (You know you are onto something when Danny has the entire office in stitches lip-syncing his side of a Skype call with a virtual puppet via screen sharing!)  chip is getting familiar with Unity and is looking forward to diving in! 
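To give a flavor of what that early test looks like, here’s a minimal sketch of turning tracked hand data into a puppet jaw angle. To be clear, this isn’t our prototype code: the HandSample struct and the calibration numbers below are stand-ins for whatever the Perceptual Computing SDK’s Unity plugin actually hands us each frame.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical per-frame hand data; a stand-in for the real SDK output.
struct HandSample {
    float thumbTipY;    // vertical position of the thumb tip (meters)
    float fingersTipY;  // averaged vertical position of the four fingertips
};

// Convert the thumb-to-fingers gap into a jaw angle in degrees.
// minGap/maxGap are calibration values we'd expect to tune per user.
float JawAngleFromHand(const HandSample& hand,
                       float minGap = 0.01f, float maxGap = 0.10f,
                       float maxJawDegrees = 35.0f)
{
    float gap = std::fabs(hand.fingersTipY - hand.thumbTipY);
    float t = (gap - minGap) / (maxGap - minGap);   // normalize to 0..1
    t = std::clamp(t, 0.0f, 1.0f);
    return t * maxJawDegrees;                       // drive the jaw bone with this
}
```

In the prototype, a value like this would be applied to the jaw bone of Dan’s puppet rig inside Unity each frame, which is what makes the “talking hand” gesture feel so natural.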

Dan’s test puppet model is based on typical human topology, which should ensure good deformation.  This, however, created an issue with geometry clipping, because the puppet’s jaw and neck flex much farther than a real human’s.  He ended up removing the neck and separating the head from the body with some “float space” so the jaw can travel as far as it needs to.  Dan hasn’t done any rigging in a while, so he also had to brush up on basic skeleton creation and skin binding.  While getting back up to speed, he came up with some great ideas.  He saw an opportunity to make the puppets more animated than they are in real life: where a puppet’s facial features are normally static, we’ll be able to move them dynamically!  He also realized that an interactive virtual environment lends itself to customizable models, letting us merge the puppet space with the coolness of action figures and the customization of dolls. So he’s set up the rigging to allow the puppets’ hands and feet to be removable and interchangeable, which also makes for a very flexible base model.

What’s Next?

The biggest challenge we now anticipate (as far as the Perceptual Computing SDK is concerned) is dealing with finger bone inaccuracies as the user moves, extends, and clenches their fingers.  There are plenty of directions for Puppet In Motion that we can dream up in a blue-sky environment, but how well we can handle these inaccuracies will ultimately determine what we can do in 7 weeks, and therefore what we will focus on.  Next week, after exploring this issue, we should have a much clearer sense of where we are heading.  We’ll be sure to keep you up to speed on how we’re progressing.  Thanks for reading! 
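As a purely illustrative example of the kind of thing we might try (we haven’t settled on anything yet), a simple per-joint exponential moving average can damp frame-to-frame jitter at the cost of a little lag:

```cpp
// Illustrative sketch only -- not a chosen solution. Smooths one joint
// position with an exponential moving average to damp camera jitter.
struct Vec3 { float x, y, z; };

class JointSmoother {
public:
    explicit JointSmoother(float alpha) : alpha_(alpha) {}

    // Blend the new raw sample toward the previously smoothed value.
    Vec3 Filter(const Vec3& raw) {
        if (!initialized_) { smoothed_ = raw; initialized_ = true; return raw; }
        smoothed_.x += alpha_ * (raw.x - smoothed_.x);
        smoothed_.y += alpha_ * (raw.y - smoothed_.y);
        smoothed_.z += alpha_ * (raw.z - smoothed_.z);
        return smoothed_;
    }

private:
    float alpha_;               // 0..1: smaller = smoother but laggier
    Vec3  smoothed_{0, 0, 0};
    bool  initialized_ = false;
};
```

Whether something this simple holds up once fingers start clenching and occluding each other is exactly the kind of question we’ll be digging into next week.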

Sixense Studios

Sneak peek
