For 7 weeks we've seen 7 teams hack, code, build, and rebuild apps to leverage the Intel Perceptual Computing SDK using a convertible Ultrabook™. Our judges have taken the time to review, test, and score each Challenger team's work. The ink is dry, the scores are tallied, and the results are final. The Ultimate Coder awards are as follows.
For week six, Chris and Aaron made the trek out to San Francisco for the annual Game Developers Conference (GDC), where they showed the latest version of our game, Kiwi Catapult Revenge. The feedback we got was amazing! People were blown away by the head-tracking performance we've achieved, and everyone absolutely loved our unique art style. While the controls were a little difficult for some, that gave us some much-needed insight into how best to fine-tune the face tracking and the smartphone accelerometer inputs to make a truly killer experience.
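One common way to tame jittery tracking input like this is to run the raw samples through an exponential moving average before feeding them to the game. The sketch below is a hypothetical illustration of that idea, not the team's actual code; the class name and the `alpha` value are assumptions.

```python
# Hypothetical sketch: smoothing noisy head-tracking or accelerometer
# samples with an exponential moving average before using them as input.
# alpha and the class name are illustrative assumptions.

class InputSmoother:
    """Exponential moving average over a stream of (x, y) samples."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha   # 0 < alpha <= 1; lower = smoother but laggier
        self.value = None    # last smoothed sample

    def update(self, sample):
        if self.value is None:
            self.value = sample          # first sample passes through unchanged
        else:
            self.value = tuple(
                self.alpha * s + (1.0 - self.alpha) * v
                for s, v in zip(sample, self.value)
            )
        return self.value


smoother = InputSmoother(alpha=0.3)
for raw in [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]:
    smoothed = smoother.update(raw)
```

Tuning `alpha` is exactly the kind of trade-off the GDC feedback points at: a lower value hides camera noise but makes the controls feel sluggish.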
Ziggy plays guitar.
Well, here we are, nearing the end of the Perceptual Challenge. There's not much time left in the competition, so it's time to start locking things down and really testing the application. To that end, I've been concentrating on adding some gestures in to trigger the gesture creation, and on standard application housekeeping work such as creating an installer project.
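A detail that matters when wiring gestures to triggers is making sure a single held gesture fires an action exactly once rather than on every camera frame. One way to sketch that, as a purely illustrative example (frame counts and names are assumptions, not the app's real code):

```python
# Hypothetical sketch of debouncing a gesture trigger: require the gesture
# to be held for a few consecutive frames before firing, and don't fire
# again until the gesture is released. Thresholds are illustrative.

class GestureTrigger:
    def __init__(self, hold_frames=5):
        self.hold_frames = hold_frames  # frames the gesture must be held
        self.count = 0                  # consecutive frames seen so far
        self.armed = True               # ready to fire again?

    def update(self, detected):
        """Feed one frame's detection result; return True when triggering."""
        if detected:
            self.count += 1
            if self.armed and self.count >= self.hold_frames:
                self.armed = False      # fire once, then wait for release
                return True
        else:
            self.count = 0
            self.armed = True           # gesture released; re-arm
        return False
```

Holding the gesture for `hold_frames` frames fires the trigger once; it won't fire again until the detector reports the gesture released.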
Most of this week has been spent working on the interface library; you can see a demo of it here:
Bring me the Bacon.
With the realisation that I couldn't top my week three video, I decided the smart thing was to get my head down and code the necessaries to turn my prototype into a functioning app. This meant adding a front end to the app and getting the guts of the conferencing functionality coded.
I also vowed not to bore the judges and my fellow combatants to tears this week, so I'll stick mainly to videos and pictures.
Week Two brought with it some interesting challenges. We've finished our first sprint, and many of the features for Kiwi Catapult Revenge are now complete. We were pleased to see that the tasks we set for ourselves weren't too big a bite to take, and our growing knowledge of the capabilities of the perceptual computing camera and the PC SDK is making us comfortable with achieving our goal of eye/gaze tracking.
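At its simplest, gaze estimation from a face-tracking camera boils down to mapping where the pupil sits inside the detected eye region to a normalized gaze direction. The sketch below shows that mapping under stated assumptions: the function name is hypothetical, and the pixel coordinates would come from whatever eye/pupil detection the camera SDK provides.

```python
# Hypothetical sketch: map a detected pupil position within the eye's
# bounding box to a normalized gaze coordinate in [-1, 1] on each axis.
# The detector supplying these pixel positions is assumed, not shown.

def normalized_gaze(pupil_x, pupil_y, eye_left, eye_top, eye_right, eye_bottom):
    """Return (gx, gy) in [-1, 1], where (0, 0) is looking straight ahead."""
    cx = (eye_left + eye_right) / 2.0    # centre of the eye box
    cy = (eye_top + eye_bottom) / 2.0
    half_w = (eye_right - eye_left) / 2.0
    half_h = (eye_bottom - eye_top) / 2.0
    gx = (pupil_x - cx) / half_w
    gy = (pupil_y - cy) / half_h
    # Clamp in case the detector jitters outside the bounding box.
    return (max(-1.0, min(1.0, gx)), max(-1.0, min(1.0, gy)))
```

A real implementation would also need per-user calibration and smoothing over time, but the core geometry is this simple.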
Last weekend I finished off the beta version of Betray, and that means it's time to start thinking about building my GUI toolkit. Before we get going, let's strategize a bit about how I'm going to do this.
Welcome back to my humble attempt to rewrite the rule book on teleconferencing software, a journey that will see it dragged from its complacent little rectangular world. It's true we've had 3D for years, but we've never been able to communicate accurately and directly in that space. Thanks to the Gesture Camera, we now have the first in what will be a long line of high-fidelity, super-accurate perceptual devices. It is a pleasure to develop for this groundbreaking device, and I hope my ramblings will light the way for future travels and travellers.