GDC 2013: Interview with the Code-Monkeys

This year at GDC, Ultrabook Community Manager Bob Duffy got the chance to talk with John McClothlan and Gavin Nichols of Code-Monkeys, a development team currently competing as challengers in the Ultimate Coder Challenge: Going Perceptual. This isn’t the Code-Monkeys’ first time in the Ultimate Coder rodeo; they took part in the first Challenge, held in 2012, which focused on building the “ultimate” Ultrabook app, one that could best show off what the device could really do. “Organized mayhem” was a good description of the team’s “Wind Up Football” app, a multi-player, multi-touch robot frenzy that made use of many Ultrabook features. The team not only integrated multi-touch, the light sensor, and GPS into their app, but also applied strong game design and computing concepts to bring a more intuitive user experience to the table.

This time around, the Code-Monkeys are still bringing the mayhem, but in a new game format: Stargate Gunship. This existing game was originally designed for mouse and touch input; for this Challenge, the team is retrofitting it with perceptual computing features. That means hand tracking for aiming, gestures for menu control and firing, voice control for buttons and weapon switching, and even experimentation with gaze tracking. In short, players can play the game without physically touching anything.
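The interview doesn’t go into implementation detail, but the basic idea behind hand-tracked aiming can be sketched roughly as below. This is a minimal illustration only, assuming the camera SDK reports a normalized hand position each frame; the function names, dead-zone, and smoothing values are hypothetical and not taken from the Code-Monkeys’ actual code.

```python
# Minimal sketch of mapping a tracked hand position to an on-screen aim point.
# The hand position is assumed to arrive as normalized (0..1, 0..1) coordinates
# from a depth-camera tracker; the dead-zone and smoothing constants below are
# illustrative placeholders, not values from Stargate Gunship.

SCREEN_W, SCREEN_H = 1920, 1080
DEAD_ZONE = 0.05      # ignore tiny jitters near the last aim point
SMOOTHING = 0.3       # exponential smoothing factor (0 = frozen, 1 = raw input)


def update_aim(prev_aim, hand_norm):
    """Convert a normalized hand position into a smoothed screen-space aim point."""
    target_x = hand_norm[0] * SCREEN_W
    target_y = hand_norm[1] * SCREEN_H

    # Dead zone: small hand tremors should not move the reticle.
    if (abs(target_x - prev_aim[0]) / SCREEN_W < DEAD_ZONE and
            abs(target_y - prev_aim[1]) / SCREEN_H < DEAD_ZONE):
        return prev_aim

    # Exponential smoothing keeps the reticle from snapping across the screen.
    new_x = prev_aim[0] + SMOOTHING * (target_x - prev_aim[0])
    new_y = prev_aim[1] + SMOOTHING * (target_y - prev_aim[1])
    return (new_x, new_y)


if __name__ == "__main__":
    aim = (SCREEN_W / 2, SCREEN_H / 2)
    # Simulated tracker samples standing in for real depth-camera frames.
    for sample in [(0.5, 0.5), (0.52, 0.5), (0.8, 0.3), (0.81, 0.31)]:
        aim = update_aim(aim, sample)
        print(f"aim point: ({aim[0]:.0f}, {aim[1]:.0f})")
```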

When asked about their experience with the Creative* Interactive Gesture Camera and the Perceptual Computing SDK, the team responded with a resounding “really fun to work with!” Gavin pointed out that the camera almost gives you too much depth data to work with, but that this is also incredibly helpful, because it gives you the freedom to shape the input into exactly what your game needs.

The perceptual computing controls the team has integrated into Stargate Gunship are part of the challenge of building on an existing UI. What have they learned in the process? One surprise was that these integrations actually made their original control scheme better and more intuitive. For example, in a typical first-person shooter (FPS), you aim toward the middle of the screen for accuracy. With perceptual computing input, the game takes on more of a shooting-gallery feel, which makes it much more fun and responsive. In addition, they found that “training” the voice control by reading to it before actually playing the game made recognition noticeably more accurate, something that might be worth building into the game as part of a training mode.
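That voice “training” observation suggests a simple calibration pattern: have the player read the command words aloud once, record the recognizer’s confidence for each, and then use per-command acceptance thresholds during play. The sketch below is only an illustration of that idea; the `recognize` callable and the margin value are hypothetical stand-ins, not part of the Perceptual Computing SDK.

```python
# Sketch of a per-command confidence calibration step for voice control.
# `recognize` is assumed to be any speech recognizer returning a confidence
# score in [0, 1] for a spoken phrase; the 0.8 margin is an arbitrary example.

COMMANDS = ["fire", "next weapon", "previous weapon", "pause"]
MARGIN = 0.8  # accept in-game utterances at 80% of the calibrated confidence


def calibrate(recognize):
    """Ask the player to read each command once and record the confidence."""
    thresholds = {}
    for command in COMMANDS:
        print(f'Please say: "{command}"')
        confidence = recognize(command)          # hypothetical recognizer call
        thresholds[command] = confidence * MARGIN
    return thresholds


def handle_utterance(command, confidence, thresholds):
    """Accept a recognized command only if it clears the calibrated threshold."""
    return confidence >= thresholds.get(command, 1.0)


if __name__ == "__main__":
    # Fake recognizer for demonstration: every calibration read scores 0.9.
    thresholds = calibrate(lambda phrase: 0.9)
    print(handle_utterance("fire", 0.85, thresholds))   # True: above 0.72
    print(handle_utterance("pause", 0.5, thresholds))   # False: below 0.72
```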

The team also discovered that for perceptual computing input controls to be as user-friendly as possible, the app needs to give visual feedback so that the player understands how to interact with the software. This is big for usability, and it is one of the hurdles perceptual computing has to clear before it can be adopted by the average consumer.
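One way to provide that feedback is a small on-screen overlay showing whether a hand is currently being tracked and where the tracked cursor sits. The sketch below is purely illustrative; the tracking states and drawing callbacks are placeholders rather than anything from Stargate Gunship or the SDK.

```python
# Sketch of simple visual feedback for perceptual input: show the player
# whether their hand is being tracked and where the tracked cursor is.
# The TrackingState values and draw_* callbacks are illustrative placeholders.

from enum import Enum, auto


class TrackingState(Enum):
    NO_HAND = auto()       # camera sees no hand
    TRACKING = auto()      # hand found, cursor follows it
    GESTURE = auto()       # a gesture (e.g. a fist to fire) was just recognized


def feedback_message(state):
    """Map the tracking state to a short hint shown on screen."""
    return {
        TrackingState.NO_HAND: "Raise your hand to aim",
        TrackingState.TRACKING: "Hand detected",
        TrackingState.GESTURE: "Gesture recognized!",
    }[state]


def draw_overlay(state, cursor_xy, draw_text, draw_cursor):
    """Render the feedback layer each frame.

    draw_text and draw_cursor stand in for whatever the game engine provides.
    """
    draw_text(feedback_message(state), position=(20, 20))
    if state is not TrackingState.NO_HAND:
        draw_cursor(cursor_xy)


if __name__ == "__main__":
    # Console stand-ins for real rendering calls.
    draw_overlay(
        TrackingState.TRACKING,
        (960, 540),
        draw_text=lambda text, position: print(f"text at {position}: {text}"),
        draw_cursor=lambda xy: print(f"cursor at {xy}"),
    )
```

You can watch the entire interview below: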

http://youtu.be/HQvjw2R0Lj8