Week 2 of the Ultimate Coder Challenge is now behind us, and quite frankly, it’s amazing how much progress has been made in only two weeks: playable demos, workable frameworks, and actual renderings of 3D/perceptual computing technology in action have all been presented this week. Here’s what our challengers have been up to:
The Oscars gave Sixense a bit of motivation, but that was whisked away with the announcement that their same concept was already being worked on by a competitor; and to add insult to injury, the announcement was made at the PlayStation 4 showcase. Ouch! However, once they had time to sleep on it, they found the silver lining – proof of concept indeed. They’ve been working on adding realism to their puppet app, hooking up a test puppet to the Perceptual Computing SDK, and next week they plan to have a demo for us all to “ooh” and “aah” over. Read Sixense’s post here.
Lee gives us a video with highlights of how he managed to get the PerC camera to generate a 3D construct – I have to say, it’s incredibly cool to see this technology in action. He invites anyone with a Perceptual Camera set up to try the prototype themselves, and promises a few next steps: rounding off the 3D object and polishing it up into a realistic human head, complete with texture. You can read Lee’s entire post here.
Along with a quick video intro to their team, we also get to see the Code-Monkeys put their retrofitted PerC game, Stargate Gunship, through its paces (are those zombies in there?). There’s also an intriguing discussion about the evolution of touch screens and input, and where personal computing is today (and where it might be going). You’ll want to check out their entire post here.
A virtual pottery studio is what Simian Squared is aiming for, and so far, so good. One of the interesting challenges that these guys have come up against is the nature of clay itself, and translating that into digital format. It’s a bit tricky to figure out how to make the physical properties of a unique substance feel “real” when a user plays around using gestures in mid-air, to say the least. They’re also working on making the environment as tranquil as possible – a nice touch. Oh, and yes, there is a Ghost reference (how could there not be?). You can read their post here.
Eskil aims to solve the problem of hardware outpacing software, and vice versa, by building a platform that will respond to any kind of input: mouse, keyboard, touch, gesture, you name it. To this end, he’s written a lean library with a plugin interface so modules can be added as needed – talk about flexibility. Next week, he’s going to be tying up loose ends in this new platform, moving on to the new interface toolkit. You can read all about it here.
The team at Brass Monkey/Infrared5 has made impressive progress for just the second week, already releasing a playable demo of their game, Kiwi Catapult Revenge. They are working on face tracking with custom controllers (you’ll have to read the post to see what I mean), and they are cautiously optimistic that they will be able to implement gaze tracking as well, based on the level of data from the PerC camera. Game mechanics are flowing right along, and artwork (paper textures) seems to be right on target as well. You can read their post here.
Peter gives us a day-by-day write-up of how he’s developing Huda, his image editor, and I must say this was a fascinating post to read. This week, Peter roughs out the basic interface, gives us an idea of what his user interface is going to look like, puts in gesture recognition code, and plays around with the styling. Next week, we get to see him add in filter management and photo saving. Make sure you check out his Ultimate Coder musical playlist, too. Read Peter’s post here.
A few of our judges are at Mobile World Congress this week, and even though their plates are certainly full, they still took time out of their busy schedules to offer up some great insights on how the challengers did this week. Here’s a video update that three of our judges did together:
Here’s what our judges had to say:
Like most of us watching from the sidelines, Steve “Chippy” Paine is really impressed with how each team is confidently charging ahead with their projects, making incredible progress in only the second week of the competition. He gives some smart feedback on hardware and each team’s progress towards Perceptual Computing integration. You can read the entire post here.
“Reckless abandon meets a hard dose of reality” sums up Chris’s reaction to the work done in Week 2. He continues to be really impressed with the challengers’ work so far, and writes up an intriguing discussion on how perceptual computing is set to possibly disrupt how we as users will interact with our computers at the most basic of levels. You can read all of Chris’s thoughts here.
Nicole gives us a quick rundown of her impressions of each coder’s work, with a more in-depth reaction filmed in the video above (live from Mobile World Congress!). You can read Nicole’s update here.
Sascha gave each challenger team some very personalized advice/instruction/constructive criticism this week, focusing on visual effects, core interface design, environment, proof of concept, design, and frameworks. You can hear more of Sascha’s thoughts in the video above or read his entire post here.
Looking forward to week 3!
Next week is already Week 3, and with how much we’ve seen each challenger accomplish so far I’m definitely anticipating great things. You can follow along with this exciting competition at the official Ultimate Coder page, on Facebook, or via the Twitter hashtag #ultimatecoder.