Hey everyone, Chip here again with Sixense Studios.
Wow, thanks for the great video, judges! It’s nice to put some voices to your comments and hear what you’re thinking a bit more in-depth. We had a great week with all of us in the same office for the first time since the contest began. This has been our most productive week yet and we’re inspired to push even harder with this project.
First, I want to clear something up. While marionettes are something we hope to add in the final version of our puppet platform, along with rod puppets, they are not something we intend to include in our initial “vertical slice” at the end of this contest. We are focusing on hand puppets for this proof-of-concept deliverable, and we’ve got some great stuff to show you!
This video highlights two big accomplishments. First, we’re controlling a puppet with the camera! Second, it shows that we’ve gotten video recording working in Unity, so our users will be able to record and share their creations. Yay!
We researched several different methods of recording video in Unity, from writing screenshots to disk to launching external tools such as CamStudio or VLC from Unity. It became apparent that the best solution would be one that writes the video out from within Unity itself. However, for performance reasons, that requires a native code plugin to handle the output to disk in an efficient, asynchronous manner. We were about to dive into implementing one when we discovered the AVPro Movie Capture Unity plugin, which does just that and more. We highly recommend it if you ever need to record video from within a Unity application. The time saved by not having to develop it from scratch was well worth the relatively low cost.
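To show why the asynchronous part matters: if the render loop writes each frame to disk itself, it stalls on I/O every frame. The usual fix is to hand frames to a background writer thread through a queue. Here's a minimal Python sketch of that producer/consumer pattern (the class name `AsyncFrameWriter` is ours for illustration; the real AVPro plugin does this in native code with an actual video encoder, not raw byte dumps):

```python
import queue
import threading

class AsyncFrameWriter:
    """Hand frames to a background thread so the 'render loop' never
    blocks on disk I/O. A sketch of the pattern only -- not the AVPro
    plugin's API."""

    def __init__(self, path):
        self.file = open(path, "wb")
        # Bounded queue: if the disk falls far behind, put() applies
        # backpressure instead of growing memory without limit.
        self.frames = queue.Queue(maxsize=64)
        self.thread = threading.Thread(target=self._drain, daemon=True)
        self.thread.start()

    def submit(self, frame_bytes):
        # Called from the render loop; normally returns immediately.
        self.frames.put(frame_bytes)

    def _drain(self):
        # Runs on the background thread, doing the slow disk writes.
        while True:
            frame = self.frames.get()
            if frame is None:          # sentinel: flush and stop
                break
            self.file.write(frame)

    def close(self):
        self.frames.put(None)
        self.thread.join()
        self.file.close()
```

The render loop just calls `submit()` with each captured frame and carries on; all the blocking work happens on the writer thread.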
As for the gesture tracking, we have noticed the camera data can be noisy and imprecise, leading to a lot of “jittery” input. We are using an exponentially weighted average to smooth out the data. While this introduces some lag, a good balance can be found between smooth and responsive controls. We would be curious to know if, and how, anyone else is dealing with the inaccuracies in the data they get from the camera.
We’ve also come quite a ways in deciding on exactly what we want to deliver. After much discussion and debate this week, we determined that Sixense Studios’ Playtime Puppet Theater will proudly present... <drumroll>... The Three Little Pigs!
We decided on this story/setting for a few reasons. First, there are only two real puppets we have to make: the wolf and a pig. True, there are three pigs, but we can use the same asset and rig and dress each pig slightly differently. Next, the entire story takes place in only three locations. So, we can show how users will be able to swap between these three locations using Ultrabook features without having to devote too much time and energy to creating assets for multiple locations.
The story begins at the straw house and introduces the first pig, who in our version of the story is “Surfer Pig”. We then move to the stick house, where we’ll meet “Lumberjack Pig”, and then the brick house, where we’ll find “Suburban Pig”. Here the wolf is introduced, and he chases the pigs back to the straw house. The wolf then starts blowing houses down, and we move through the three house locations once again until we’re back at the brick house, where the story ends. So, we decided that the backdrop to our story will be one long piece of art that scrolls left and right as the story progresses. Here’s some early concept art that Dan made:
As you can see, users will be able to tell the whole story with puppets in front of this backdrop simply by sliding it back and forth behind our theater’s proscenium.
From the art department this week, we have the beginnings of a wolf model and rig, and Dan is cranking out some set assets. I’m starting to assemble the puppet theater itself (with Dan’s assets), while Danny is working on puppet authoring tools so the artists can wire up puppet controls without needing engineering support. Next, he will be tackling some UI questions we realized we have to answer. For instance, how will we determine which hand (or which player) controls which puppet? How will users tell the program that a scene is “over” and it’s time to move to the next one? We envision this stuff working with perceptual computing features such as touch screen, facial recognition and voice recognition. Meanwhile, Ali is still hard at work on the gesture-based puppet controls, which will likely be an ongoing effort for him. We’re getting really close to having something playable ready to share, but we can’t pull back the curtain (ha!) quite yet.
Tune in next week! Same bat-time, same bat-channel...