Last week, most of the other contest participants and I ran around GDC showing off our work, so not much got done in terms of coding, but I for one learned one thing: people want to know how my head tracker works. So this week I will do my best to describe the algorithm and the process I used to come up with it.
Image courtesy of Flickr user bob_duffy
We’re fresh back from GDC and wow…what a great conference! We had so much fun meeting the other contestants, making fun of Lee, and showing off Stargate Gunship to hordes of Stargate fans. Any day you can make a fanboi literally squeak in delight – that’s a good day.
But one of the true high points of the conference was a real-world field test of a technology we’ve been tinkering with for the last several weeks:
Hey everyone, Chip from Sixense reporting in once again. Wow, I am beat. Two days after GDC and my feet are still throbbing. I slept most of Saturday away, and now here I am wide awake on Sunday night with a messed-up sleep schedule. This seems like as good a time as any to write our blog post. At least by writing it now, I can sleep in a bit longer tomorrow.
For week six, Chris and Aaron made the trek out to San Francisco for the annual Game Developers Conference (GDC), where they showed the latest version of our game, Kiwi Catapult Revenge. The feedback we got was amazing! People were blown away by the head-tracking performance we’ve achieved, and everyone absolutely loved our unique art style. While the controls were a little difficult for some, that gave us much-needed insight into how best to fine-tune the face tracking and the smartphone accelerometer inputs to make a truly killer experience.
Live From The USA
I write my penultimate blog post of the Ultimate Coder Challenge II from the comfort and isolation of my GDC hotel room, where I spent Saturday coding away on my Perceptucam app.