This is the final week of the contest, and we think we’ve got a pretty solid game out of the six weeks we’ve been at this. This week we will focus on adding that last bit of polish to the experience. We know that Nicole suggested not adding more features, but we simply couldn’t resist. Our character Karl really needed to be able to breathe fire properly, and for the player to truly experience this we felt it necessary to control it with your voice. So, to breathe fire in Kiwi Katapult Revenge you can yell “aaaaaahhhhh” or “firrrrrrrre”, and the bird will throw flames from his mouth. This feature also lets you shoot lasers at the same time, doubling your firepower. Beware of timing, though, as there’s currently a slight delay. We plan on optimizing this feature as much as we can before our Friday deadline.
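We won’t walk through our actual audio code here, but the basic idea behind yell-to-breathe-fire is simple: measure the loudness (RMS amplitude) of each incoming microphone buffer and trigger the flames while it stays above a threshold. The sketch below is a minimal stand-in, not our implementation; the class name, threshold, and hold-frame count are all hypothetical and would need tuning against a real mic.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of one audio buffer (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

class YellDetector:
    """Fire while the mic level stays loud for a few consecutive buffers.

    Hypothetical sketch: `threshold` and `hold_frames` are made-up values;
    requiring several loud buffers in a row keeps short pops and clicks
    from triggering the flames.
    """
    def __init__(self, threshold=0.3, hold_frames=3):
        self.threshold = threshold
        self.hold_frames = hold_frames
        self._loud_run = 0  # consecutive buffers above threshold

    def update(self, samples):
        if rms(samples) >= self.threshold:
            self._loud_run += 1
        else:
            self._loud_run = 0
        return self._loud_run >= self.hold_frames
```

A quiet buffer resets the counter, so only a sustained yell keeps the fire going; that hold-frame delay is also one plausible source of the slight lag we mention above.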
We’ve been playing with “mouth open” detection from the perceptual computing camera as well, but the funny thing is that it may have been a lot of work for not much gain. We found that using audio detection was a bit more interesting, and mouth detection still needs more polish to really work well. That said, Steff gave a little overview of the technique we are using for mouth detection in this week’s video.
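For the curious, the usual way to turn mouth landmarks into an open/closed decision is a mouth aspect ratio: lip-to-lip height divided by corner-to-corner width. This is a generic sketch of that idea, not our tracker; the landmark names and the threshold are hypothetical stand-ins for whatever points a face tracker actually reports.

```python
import math

def mouth_open(landmarks, ratio_threshold=0.35):
    """Decide open vs. closed from four mouth landmarks.

    `landmarks` maps hypothetical point names to (x, y) pixel coords:
    the two mouth corners plus the mid top and bottom lip. The mouth
    counts as open when height/width exceeds `ratio_threshold`.
    """
    lx, ly = landmarks["left_corner"]
    rx, ry = landmarks["right_corner"]
    tx, ty = landmarks["top_lip"]
    bx, by = landmarks["bottom_lip"]
    width = math.hypot(rx - lx, ry - ly)
    height = math.hypot(bx - tx, by - ty)
    return (height / width) >= ratio_threshold
```

The ratio is scale-invariant, so it keeps working as the player leans toward or away from the camera; the hard part, as we note above, is keeping the landmarks locked on in the first place.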
We also have some words from our protagonist Karl Kiwi, shots from GDC, footage of fire breathing via voice and more insight from our team on how things have gone during the Ultimate Coder contest.
The game we’ve produced is really taxing the little Lenovo Yoga Ultrabook we were given for the contest. The integrated graphics and the overall low power on this little guy don’t allow us to do too much while running the perceptual computing algorithms. What runs great on our development PCs really kills the Ultrabook, so now we are optimizing as much as we can. Last week we showed some of the techniques Steff came up with to fine-tune the computer vision algorithms so that they are highly optimized, but we didn’t talk much about the other things we are doing to make this game play well on such a low-powered device.
Unity is a great IDE for creating games like this. It’s not without its struggles, of course, but considering what we’re trying to accomplish, it has to be the best development environment out there. The world (New Zealand) is constructed using Unity’s terrain editor, and the 3D assets were created largely in Blender (we love open source software!). We’ve been able to gain performance with tools like Beast lightmapping that allow us to have a rich-looking scene with nice baked shadows and shading. We’ve had some decent hurdles with multiple cameras for the main view, the uniSWF GUI, and the rear-view mirror (render to texture to accommodate the organic shape of the mirror object), but we’ve been able to handle this just fine. Most of the optimizations, so far, have concerned draw calls and lighting issues. We typically build apps for iOS/Android, so we tend to keep code and assets lean from the get-go. Still, we’ve got a bit more to go before we hand Kiwi Katapult Revenge to the judges.
This week’s Web Build
We are excited to share with you the latest build of the game. Since this is a web build it won’t allow you to use the perceptual computing camera, but we are building an installer for that piece which will be delivered to the judges at the end of the week. With this build you can still fly around using your phone as the controller via the Brass Monkey app for Android or iOS, and you can breathe fire by yelling. There are new updates to the UI and environment, and overall the only things missing from this build are power-ups and the perceptual computing camera input. We will have three kinds of power-ups in the final build of the game that we deliver on Friday.
Check out the playable demo of Kiwi Katapult Revenge!
What’s Next for Karl Kiwi and the Intel Perceptual Computing SDK?
We really had a great time using the new hardware and SDK from Intel, and we are definitely going to keep building on what we have started (remember, we did promise to release this code as open source). We have some optimization to do (see above). And looking back at our plan from the early weeks of the competition, we were reaching for robust feature tracking to detect whether the mouth is open or closed, the orientation of the head, and the gaze direction right from the pupils. All three share the same quality that makes them difficult: for specific feature tracking to work with the robustness of a controller in real time, you need to be confident that you are locked onto each feature as the user moves around in front of the camera. We have learned that finding trackable points and tracking them from frame to frame does not let you lock onto the targeted feature points you would need for something like gaze tracking. As the user moves around, the points slide around. How do we overcome the lack of confidence in the location of the trackable points? The Active Appearance Model (AAM).
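The key idea that makes AAM attractive here is that it fits a statistical shape model rather than trusting raw tracked points: any observed shape gets expressed as the mean shape plus a few weighted deformation modes, with the weights clamped to plausible ranges, so points can’t slide off the feature. This toy sketch shows just that constraint step with made-up numbers; a real AAM learns the mean and modes from training data and also models appearance, which we omit entirely.

```python
def constrain_shape(points, mean, modes, limits):
    """Project an observed shape onto a linear shape model, AAM-style.

    x ~= mean + sum_i b_i * mode_i, with each coefficient clamped to
    |b_i| <= limit_i. `points` and `mean` are flat coordinate lists,
    `modes` are orthonormal basis vectors (toy values, not trained).
    """
    residual = [p - m for p, m in zip(points, mean)]
    out = list(mean)
    for mode, limit in zip(modes, limits):
        b = sum(r * v for r, v in zip(residual, mode))  # projection coefficient
        b = max(-limit, min(limit, b))                  # clamp to plausible range
        out = [o + b * v for o, v in zip(out, mode)]
    return out
```

An observed shape that already fits the model passes through almost unchanged, while an implausible one (a tracker point drifting onto the cheek, say) gets pulled back toward the nearest shape the model allows.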
So, next steps are to see what kind of a boost we get on our face detection and head tracking using the GPU methods built into OpenCV. Haar cascades, feature detection, and optical flow should all benefit from utilizing the GPU. Then, we are going to implement AAM with and without face training to get a good lock on the mouth and the eyes. The idea behind implementing AAM without face training (or a calibration step) is to see how well it works without being trained to a specific face, in the hope that we can skip that step so people can just jump in and play the game. With a good lock on the facial features using AAM, we will isolate the pupil locations in the eyes and calculate whether the gaze vector intersects the screen and voilà! Robust gaze tracking!
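That last step, checking whether the gaze vector intersects the screen, is plain ray–plane intersection. Here is a minimal sketch under assumed conventions: a camera-centered frame with the screen on the plane z = 0, the eye somewhere in front of it, and a screen size in meters; the function name and all dimensions are illustrative, not from our code.

```python
def gaze_hits_screen(eye, gaze_dir, screen_w=0.5, screen_h=0.3):
    """Intersect a gaze ray with the screen plane z = 0.

    `eye` and `gaze_dir` are 3-D tuples in a camera-centered frame
    (x right, y up, z out of the screen toward the player), sizes in
    meters. Returns the (x, y) hit point if the ray lands on the
    screen rectangle, else None.
    """
    ex, ey, ez = eye
    dx, dy, dz = gaze_dir
    if dz == 0:
        return None  # gaze parallel to the screen plane
    t = -ez / dz
    if t <= 0:
        return None  # screen is behind the eye along the gaze direction
    x, y = ex + t * dx, ey + t * dy
    if abs(x) <= screen_w / 2 and abs(y) <= screen_h / 2:
        return (x, y)
    return None
```

The real difficulty isn’t this geometry; it’s getting a gaze direction stable enough from the pupil positions that the hit point doesn’t jitter, which is exactly why we need the AAM lock first.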
Where can you play the game once it’s done? Well, we haven’t yet decided how we want to distribute the game, and there are a ton more features we would like to add before we are ready for a production release. We are considering putting it on Steam Greenlight. So, with that, you will have to wait a little while before you can play Kiwi Katapult Revenge in all its glory. Let us know your thoughts: how would you distribute such a crazy game if it were up to you?
This has been a great experience for our whole team, and we want to thank Intel for having us be a part of the contest. We really like the game we came up with, and are excited to finish it and have it available for everyone to play. Thanks too to the judges for all the time and effort you’ve given us. Your feedback has only made our game better. Now best of luck choosing a winner! For those of you that have been following us during the contest, please do stay in touch and follow us on our own blog. From the team at Infrared5, over and out!