Chris Allen is the president and co-founder of Infrared5 and Brass Monkey. Recently, he was one of the Challengers in the Ultimate Coder Challenge: Going Perceptual contest, winning, in tandem with Lee Bamber, the award for best overall blog. Chris graciously gave up some time in his very busy schedule to sit down and chat about his development background and interest in perceptual computing technology.
Tell us about your development background.
I always played with computers as a kid. I went to school for music, specifically jazz composition at Berklee College of Music, and around that time came the birth of the Internet. I got into web development with pure HTML and figured out how to do Java programming, mostly to promote the band I was in. At the same time, I was a computer salesman right out of college. They ended up laying off all their sales staff, so I started making websites for my former clients, a lot of freelance work. I got more and more into the programming aspect of web design and started landing clients like Novell, Cambridge, Wildfire Communications, Mass General Hospital, etc. Eventually, I built an open source server for video streaming along with a video streaming protocol, and that was something I became somewhat well known for. That led to a lot of consulting work, and out of it, I was able to start Infrared5. I still play music nowadays. There is really a big correlation between music and programming. Programming is so incredibly creative; it’s all about what we can do with this stuff and basically make art.
What got you interested in coding, particularly perceptual computing?
I’ve always been interested in what you can do with the things that are handed to you, along with a very strong interest in multiscreen applications and games. That’s the direction we went with the Ultimate Coder Challenge: Going Perceptual. It’s just cool to explore hand gestures and face tracking. The very fact that these types of things are going to be shipping on next generation devices is something we want to leverage to create amazing experiences. The genie is really out of the bottle; perceptual computing is coming down the pike for everyone.
Ultimate Coder Challenge: Perceptual Computing. What got you interested in this? What were the highlights for you?
It was a big investment, for sure. We had to pull people away and put them on this project, but for us, it was an opportunity. We got to take time away from client work and focus on something that would give us total creative freedom, along with the opportunity to experiment with some really neat tech. It also gave us the opportunity to be seen as experts in an emerging field. The chance to show off at GDC was a big one for us as well. One thing that wasn’t obvious at first was the sense of community among the Challengers, and our part in creating that community. Nobody had any hard feelings; we were all rooting for each other. Great cooperation and collaboration.
Tell us about your app you created for the Challenge. What are your future plans for this app?
The name of the app is Kiwi Catapult Revenge. We came up with the concept after seeing an actual news article about the native kiwi population in New Zealand being decimated by cats. The game’s protagonist is a kiwi who takes revenge on all the cats that killed his family. We really took this concept and just ran with it; we had an idea of how we wanted to use the Creative* camera to do the face tracking and see the world in 3D. This game was originally going to be a first-person-shooter-style driving game set in a post-apocalyptic world where you could use your head to aim. However, the artists on the team didn’t want to do a standard look and feel, and instead did a very unique cut-paper style. Future plans include looking at different ways to release it via app stores. There is a version live on Brass Monkey using game controllers and phones, but it’s not quite done yet; we need more time to finish it up and polish it.
What other contests or perceptual computing efforts are you currently involved in?
We are doing lots of client work that includes perceptual computing technology. Rebecca Allen, our CEO and creative director on the Ultimate Coder Challenge, is doing an art show right now with perceptual computing pieces in it. People can walk by one of the art pieces and it will detect their body movement; there is a gallery exhibit opening scheduled for September (watch this space for updates on this intriguing use of perceptual computing technology!).
Where do you see perceptual computing going in the next five years?
Perceptual computing is going to be integrated with everything else. For example, if you look at Google Glass, it’s targeting the everyday environment. You can walk by a store front, see something specific, and interact with it via gestures and devices you already have with you (phone, tablet, etc.). This is going to become commonplace. Our senses are going to become more integrated with our devices, and it’s intriguing to think about what can be done when everything is used in conjunction. There is a huge opportunity for perceptual computing to be used along with AI (artificial intelligence). The ideas and opportunities are truly exciting to think about.
How would you encourage developers who are interested in getting into this technology?
Perceptual computing isn’t about “Minority Report” kind of stuff. It’s about eye and voice and gesture tracking; so much more than what we can imagine right now. We are in the process of creating this technology; this is all brand new. It’s really still being defined; we are pioneers in this stuff! It would be a mistake to lock down definitions at this time. It’s definitely a very exciting time to be a developer. There are people now who are going to be known as the Mozarts and Pollocks of the future. It’s an exciting time to be at the helm, and I love what I do!