Is this really the final blog? It’s chip reporting in once again, and I can’t believe how quickly these eight weeks have gone by. It’s amazing to see how much the Ultimate Coders have been able to accomplish in such a short time. And that is truly an appropriate name: “Ultimate Coders”. I’m shocked, humbled, and inspired by what these teams have produced. We’re certainly proud of what our team has put together, and I hope you all are too.
There’s still this final week for one last push, and we’ll be working hard to put some finishing touches on Puppet In Motion before the deadline. We’ve come a long way even since GDC and I’m really excited to show you where we’re at.
Ali is pretty much finished with our codebase, having spent the last week optimizing and polishing. He was able to get the game to run at 60 fps while recording video. The recording feature now also mixes in-game audio with microphone audio, so playback includes both the in-game sounds and your voice. He polished the “hand openness” mouth control mode and added a toggle so you can easily switch between the two control modes. He also added a nifty debug feature that renders the depth buffer to the screen, letting the user see their own hand. This affects performance, though, and isn’t meant to be used while recording video.
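For the curious, the core of mixing two audio streams is just summing samples and clipping. Here’s a rough sketch of the idea (names and gains are illustrative, not Ali’s actual implementation) for 16-bit PCM buffers:

```python
import numpy as np

def mix_pcm(game: np.ndarray, mic: np.ndarray,
            game_gain: float = 1.0, mic_gain: float = 1.0) -> np.ndarray:
    """Mix two 16-bit PCM buffers sample-by-sample.

    Sums in a wider integer type to avoid overflow, then clips back
    into the valid int16 range before converting.
    """
    mixed = game.astype(np.int32) * game_gain + mic.astype(np.int32) * mic_gain
    return np.clip(mixed, -32768, 32767).astype(np.int16)
```

Summing in 32-bit first matters: adding two loud int16 samples directly would silently wrap around instead of clipping.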
Dan has been working hard to produce the volume of assets we’d originally scoped. We added a bunch of new plant assets to the scene this week and two of the three pig models are complete with procedural textures. Our legless, handless, blocky wolf finally has his limbs repaired with a few extra spans thrown in to round him out a bit.
This was the first time Dan modeled quadruped hind legs on an upright-standing animal. It’s more puzzling than you might think once you actually get into it, as the legs need an extra “thigh” piece added in to avoid an “alien” look and to achieve the cartoon aesthetic we want. The wolf’s arms were also reduced a bit, as we got feedback that they were looking a bit “pumped”.
Dan also got the straw house finished. It swaps with a destructible model when the player blows it down. Originally he had cut up the walls of the house for the debris and left them as open geometry, but this caused some rather hilarious violent explosions when running the simulation, so he patched up the open faces to fix this.
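For anyone wondering why open geometry makes a physics simulation explode: the solver can’t compute sensible volumes or collision hulls for a mesh with boundary edges. A quick way to spot them (a hypothetical sketch, not the tool we actually used) is to count how many triangles share each edge; in a watertight mesh every edge is shared by exactly two faces:

```python
from collections import Counter

def open_edges(triangles):
    """Return boundary edges: edges used by exactly one triangle.

    `triangles` is a list of (a, b, c) vertex-index tuples. A closed,
    watertight mesh returns an empty list; open geometry does not.
    """
    counts = Counter()
    for a, b, c in triangles:
        for edge in ((a, b), (b, c), (c, a)):
            # Sort so (a, b) and (b, a) count as the same edge.
            counts[tuple(sorted(edge))] += 1
    return [edge for edge, n in counts.items() if n == 1]
```

Patching up the faces, as Dan did, simply drives this boundary-edge count to zero.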
While I had stated at GDC that we’d hoped to be able to physically blow on the microphone to “blow the house down”, Ali informed me that blowing on the microphone is not a recognized “word” to the SDK and so we aren’t able to do this for the contest. I apologize if I got your hopes up. Currently we are triggering the house to get blown down by recognizing the words “house down”, as in “I’ll blow your house down!”. We’re still experimenting with this, but it seems to be working well. Hey Intel: being able to recognize someone blowing on the microphone would be a great feature to add to the SDK! We’d certainly use it!
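The SDK hands us recognized words, and on our side it comes down to matching a trigger phrase and firing a game event. A minimal sketch of that wiring (function and event names here are made up for illustration, not the SDK’s API):

```python
# Map recognized trigger phrases to game events (illustrative names).
TRIGGER_PHRASES = {"house down": "blow_house_down"}

def handle_recognized(text, dispatch):
    """Fire a game event when a recognized utterance contains a trigger phrase.

    `dispatch` is whatever callable routes events into the game.
    Returns the event name that fired, or None if nothing matched.
    """
    lowered = text.lower()
    for phrase, event in TRIGGER_PHRASES.items():
        if phrase in lowered:
            dispatch(event)
            return event
    return None
```

Matching on a substring like this is why “I’ll blow your house down!” works as naturally as the bare phrase.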
Next, Dan will be working on a destructible log cabin and a prettier brick house to replace the placeholders currently in the scene. We’re down to the wire, but I’m really looking forward to finally seeing all the final art in-game!
One challenge we have encountered in developing with the Creative Gesture Camera is that, when using the camera for the first time, each person holds his or her hand up differently. Sometimes when you hold your hand out in front of you in a natural position, the camera doesn’t recognize your hand’s orientation or openness correctly. To help with this, Danny has been putting together a tutorial that teaches the “preferred” posture by showing a virtual hand on screen that matches the user’s hand movements, orientation, and openness. Check it out: http://www.youtube.com/watch?v=868SgHF2Too
After calibration the user can select a puppet by simply reaching out with the virtual hand and grabbing the puppet. Overall, this seems to provide a much more intuitive and responsive experience.
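One simple way to make openness feel consistent across different hands (a sketch of the general idea, not our exact code) is to remap the camera’s raw openness value against the user’s own open and closed poses captured during the tutorial:

```python
def calibrate_openness(raw: float, open_ref: float, closed_ref: float) -> float:
    """Map a raw openness reading onto [0, 1] using per-user references.

    `open_ref` and `closed_ref` are the raw values sampled while the user
    held a fully open and fully closed hand during the tutorial.
    """
    span = open_ref - closed_ref
    if span == 0:
        return 0.0  # Degenerate calibration; treat as closed.
    t = (raw - closed_ref) / span
    return max(0.0, min(1.0, t))  # Clamp readings outside the calibrated range.
```

With each user’s own extremes as the endpoints, a half-open hand lands near 0.5 regardless of hand size or how the camera happens to read them.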
Danny has also been working on the interactive cutscenes we’d planned to introduce each of the pigs with. Due to time constraints, we chose to put one polished cutscene in-game rather than several that feel unfinished. We attached one of the pigs to a surfboard while still allowing the user to control the pig’s head. Here is an early version that we will keep improving. Surf’s up! http://www.youtube.com/watch?v=5u8DAt79pJ8
So where are we now? We had some pretty lofty goals when we started, and although we are happy with where we are for the scope of the contest, Puppet In Motion does not end here for us. We still plan on experimenting with eye gaze, head tracking, and facial gestures to add additional inputs to the puppets. We also realized pretty early on that multiplayer is very important for the user experience, but rather than force it in too soon, we are looking forward to adding it after the contest and allowing many users to participate in the puppet theater at the same time.
The “Puppet In Motion” Vision
The Ultimate Coder Challenge provided the perfect opportunity for us to produce a vertical slice of the “Storytelling” mode of what we envision for the final product. This mode allows users to enact, create, modify and share stories. For instance, someone could modify the Three Little Pigs story such that the three pigs get together and trap the wolf, or the wolf simply eats the pigs!
Here are a few additional modes that we have in the works:
- “Music Video” mode allows you to produce a music video with puppets. Players will be able to dress up and accessorize their puppets. You can then record the video by yourself or with friends online, and stitch the performances together.
- “Training” mode will be more realistic and instructional, and will train you to be a better puppeteer.
- “Sandbox” mode will allow you to assemble a set using backgrounds, foregrounds, and props so that you can create all sorts of videos including political satire, interviews, reviews, movie scene re-enactments, etc.
We would also like to support other puppeteering styles such as marionettes and rod puppets for more advanced users.
All I can say is “Wow”... This has been an amazing experience, and a lot of fun. We can’t thank Intel enough for inviting us to the Ultimate Coder Challenge: Going Perceptual and giving us this opportunity. Working with all of you has been a fantastic experience and certainly one I will never forget. It has been a lot of fun to work with such cutting-edge hardware and a pleasure to meet and compete with all of the other teams. If you ever have a chance to be in Intel’s Ultimate Coder challenge, I highly recommend it. You won’t regret it!
A big shout-out to Robert, Wendy, and Amy for inviting us to the challenge and for all of their fantastic work in making this a truly awesome experience! Good luck to all the teams; we are looking forward to seeing the final entries this coming week. I foresee very little sleep between now and then. See you all on the other side.
Thanks again, sincerely.
Sixense Studios - Danny, Ali, Dan, and chip