Interview with San Francisco Intel Perceptual Computing Hackathon Winner Cyrus Lum

There were a lot of standout developers at the recent San Francisco Perceptual Computing Hackathon, one of whom was Cyrus Lum, a veteran of the industry with 25 years of experience in game production, development, and management roles at both publishing and independent development companies. Cyrus won the overall Hackathon Grand Prize of $3,000 with “My Pet Shadow,” a fun pet game that integrates both voice recognition and hand gestures. Here’s a video of My Pet Shadow in motion:

  

Cyrus graciously agreed to answer a few questions for us about his background, his work in perceptual development, and his entries in the ongoing Intel® Perceptual Computing Challenge. Without further ado, here’s Cyrus!

Tell us about your development background.

I’m an artist, programmer, designer, producer, and business development guy – a “one man band” of game development!  While it’s a tough thing to undertake, I do have one thing on my side - 25 years of experience in the video games industry…and in that time, I “MacGyvered” my way into learning or teaching myself how to do everything.  Most of my experience was in console development (Sony PlayStation, Nintendo, Microsoft Xbox, etc.).  Now, I'm doing mobile game development with a primary focus on using HTML5 as a cross-platform development technology.

What got you interested in coding, particularly perceptual computing?

I'm a storyteller - that's how I got into art.  I needed to create art to help tell my story.  With computers and programming, I was able to take my storytelling to the next level - a level of interactivity.  As a storyteller, I really want to enhance that suspension of disbelief - help my audience get lost in the story and the world I've created. Perceptual computing is the next step - it adds physical involvement to the experience.  This becomes a bridge from the virtual world to the real world - your actions take place in the real world, yet have an effect on the virtual world.  

I always had an interest in perceptual computing - my first experience was in the mid-1990s with a product called "The Ring Mouse" - it attempted to provide 3D position information through the use of dual ultrasonic transducers.  The game that came with the Ring Mouse was a simple ring toss game.  Even though it was simple, it was completely engaging to me as well as to anyone else I showed it to.  Virtual ring tossing was physically fun and curious!  

Later, I began hacking with the Nintendo Wii Remote - I created a Hip-Hop music game demo using the Wii Remote - allowing the player to dance, pose with the mic, etc. Then in 2009, I began getting interested in Augmented Reality and how you could blend the virtual world with the real world through overlays.  I used to joke with friends about my true inspiration for Augmented Reality - a 1980s sci-fi movie starring wrestler "Rowdy Roddy Piper" called "They Live" - he got these alien sunglasses that, when he wore them, would reveal which people were actually aliens!  Now we have Google Goggles!

The Hackathon – a 24-hour perceptual computing marathon. What got you interested in this? What were the highlights for you?

First of all, it was a great chance to work with the latest in perceptual computing technology - "Time of Flight" cameras.  I really wanted to explore the possibilities of this technology now that it has been made into an affordable product.  The ultimate highlight was actually coding and running a project using the technology!  It was fun just to activate the camera and run the demos - tracking my hands and fingers as well as playing with the head-coupled demos where you move a virtual camera with your head movements.  

Hackathons themselves are fairly interesting to me.  While I'm relatively new to the hackathon scene, I am amazed by and curious about the subculture surrounding hackathons - the many people who enter these, how often they enter them, and the bonds that start to form among the "regulars" on the scene.  In my short "career" in hackathons, I've won all three of the ones I've entered since May - taking 1st Place in two of them and Best Ported App for my game Zombie Coin Pusher in the other.  I almost feel like a professional poker player on the tournament circuit - haha!

Tell us about the app you created for the hackathon. What are your future plans for this app?

My app was called My Pet Shadow (see video above). You create a virtual pet shadow that joins you on a magical adventure through a world of shadows.

In a world of shadows, things are not always what they seem.  After a horrible accident, you find that you've awakened in a "netherworld" of shadows.  To protect you from the mysterious dangers and guide you back to the "Real World", you must raise, grow, and take care of your own shadow creature.  Once you have your shadow creature, you will embark on an adventure through a morphing and evolving world of shadows - trying to make your way back home - with help from your pet Shadow Creature.

The demo I created at the hackathon allowed you to "conjure" a shadow creature from your hand and ask it its name.  Then you can say your name and get the shadow to repeat it.  Once the formalities are done, you can call up a spinning food carousel.  There are three food items on the carousel - gesturing your hand in a circle around a food item on the carousel will feed it to the shadow creature.  Each food item has a different effect on the creature.  The grapes will grow a tail on the creature.  A banana will grow wings on the creature.  Figuring out what to feed your shadow allows you to "build" your creature for the adventure ahead.
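The circle gesture Cyrus describes can be recognized from tracked hand positions alone. As a minimal sketch - not the actual My Pet Shadow code, and independent of the PerC SDK's own gesture API - assume we receive a stream of (x, y) hand coordinates per frame; summing the signed angle the path sweeps around the food item's position tells us whether the hand has wound a full turn around it:

```python
import math

def angle_swept(points, center):
    """Total signed angle (radians) the path sweeps around `center`."""
    cx, cy = center
    total = 0.0
    prev = None
    for x, y in points:
        a = math.atan2(y - cy, x - cx)
        if prev is not None:
            d = a - prev
            # Unwrap to (-pi, pi] so crossing the +/-pi boundary doesn't spike.
            while d > math.pi:
                d -= 2 * math.pi
            while d <= -math.pi:
                d += 2 * math.pi
            total += d
        prev = a
    return total

def is_circle_gesture(points, target, min_sweep=0.9 * 2 * math.pi):
    """True if the hand path winds (nearly) a full turn around `target`."""
    return abs(angle_swept(points, target)) >= min_sweep

# Example: 33 samples tracing a circle of radius 50 around a food item at (100, 100).
path = [(100 + 50 * math.cos(t * 2 * math.pi / 32),
         100 + 50 * math.sin(t * 2 * math.pi / 32)) for t in range(33)]
print(is_circle_gesture(path, (100, 100)))  # True
```

A real implementation would also smooth out tracking jitter and require the hand to stay within a radius band around the item, but the winding-angle test is the core idea.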

The future plans for this app are to finish the app to include the actual adventure through the land of shadows!  I've also entered this app in the Perceptual Computing Challenge.

Speaking of the Intel Perceptual Computing Challenge… tell us about the apps you entered for the contest.

So, I entered 4 different apps:

My Pet Shadow:  as described above - but I'll try to include an adventure in the land of shadows.

Shadow Mind: A variation on My Pet Shadow for use in psychological tests - something similar to the Rorschach test.  Patients can use the shadow interface to build, interpret, and react to the abstract shadow creature while in the presence of a psychiatrist.  What does this shadow creature look like to the patient?  How do they react to it?  Interpret it?  Relate to it?

Rail World Battle: Remember the Addams Family TV show? The episodes where they would have these model trains race around and then finally collide head-on?  Remember how fun that was?  Well, now you can do that too - on a much grander scale, with a perceptual computing interface to boot! See video below:

 

As a player, you get to choose what kind of huge locomotive you want to use and what crazy "roller coaster-like" track you want to run on. Now, the game is on!

Race around on two tracks - shooting at ammo pick-ups to get more ammo, health pickups to get more health and shield pickups to get more shields.  Build up your shields and health - because you're gonna need it later for the "Ultimate Train Joust"! As you build up your health and shields, you can also shoot at the other locomotive to bring down their health and shields. After 4 laps around the two tracks, the two locomotives will be switched onto the same track for a "head-on collision" - the Ultimate Train Joust! Whoever has the most health and shields will come out the victor, receive a cash prize and move on to the next combatant. With your prize money, you can go to the locomotive shop and buy upgrades to make your locomotive more and more powerful!  This app will use the PerC SDK gesture and hand tracking to allow players to navigate and shoot at targets.

Zombie Coin Pusher: What happens when you take an arcade coin pusher and give it more action, like a pinball game? What happens when you add a perceptual computing interface where players can use their hands to drop coins on the field, use gestures to swat zombies away, or even use voice commands to activate special weapons?

...and then you take players on a wild adventure to Las Vegas, New York, and other cool places from around the world! 

…and then challenge them with how they’ll push coins to navigate around moving obstacles like speeding trains or opening drawbridges. 

Then, what happens when, on top of that, we throw in… Zombies! 

...Zombies who are chasing after some of the bonuses that you are trying to drop …bonuses like - survivors of the Zombie Apocalypse! 

Survivors that you must collect as you make your way around the world to the last safe stronghold on earth. 

What happens? …well, you end up with the game Zombie Coin Pusher!

Zombie Coin Pusher is an exciting game where the player can use their hands and voice to interact with this fun Zombie game! See video below:

Where do you see perceptual computing going in the next five years?

I see improvements in noise reduction in data capture, and better interfaces that don't require users to be so spatially conscious of the sensing area.  Multi-camera set-ups that allow for 360-degree sensing without occlusion.  Higher capture resolutions so we can start doing 3D scanning of objects visually.  Improvements in cameras to allow for a tighter coupling of the depth camera and the color camera - maybe by using a beam splitter so both sensing elements are in the same spot - then you can get depth information for a specific pixel no matter how close or far you are from the camera.

As far as perceptual computing uses go, while moving your body around is fun and cool now, too much of it can be fatiguing.  I'd like to see perceptual computing move towards more passive interaction - or rather sensing - like reading body language to allow programs to be more empathic to our emotional state.  Being able to tell if I'm in a happy, sad, or bored state can allow a program to know what content to serve up.  The same goes for reading my eye movement or monitoring my heart rate.  If a program can operate on or react to my more passive physical efforts, perceptual computing becomes more useful without all of the additional fatigue.

How about the Intel Perceptual Computing SDK?

It's been very cool working with the PerC SDK.  It really is the only way for a hobbyist to explore and invent new interactive experiences with the latest affordable "Time of flight" technology.

Thank you Cyrus Lum for opening up about your development background, your thoughts on perceptual computing, and sharing your apps with us – and congrats on your multiple hackathon wins! 
