Early Perceptual Demos, Hard Decisions, and Hard Falls in Week 3 of the Ultimate Coder Challenge

Just two weeks into the Ultimate Coder Challenge, we are seeing some amazing early work, bordering on science-fiction-like experiences. However, some Challengers are realizing there is a difference between what they want to do and what they believe they can do. The judges are starting to appreciate the technical challenges and the early decisions they force. And perhaps this perceptual computing stuff is harder than initially thought. Here’s a breakdown of Week 3 milestones and judge reactions. (To watch the action, see the new video section of the Ultimate Coder Challenge website.)

Perceptual Computing Work Revealed:

We saw amazing demos from the developers, and it looks like three Challengers are already showing Perceptual Computing features in their apps. The judges are impressed!

Real-Time 3D Video Capture: Lee’s video demo shows him being rendered in 3D virtual space in real time. Lee explains the challenges of head tracking and gaze tracking, but shows he is cracking that nut by grabbing the raw camera data and writing his own processing code (via 29 hours of straight coding). Who needs the SDK when you don’t sleep? (A rough sketch of the math behind this kind of depth-to-3D rendering follows the judges’ comments below.)

Steve: Lee throws up a very impressive video this week, the result of a straight 29 hour coding session.

Nicole: I loved the video, it was such a help for me to really understand how the SDK worked and what exactly the process looks like to debug something…. I really can’t stress how much I loved that video!

Chris: He's not only pushing perceptual computing to the limit but has decided to rewrite the conferencing network code too. He's also showing some vampire tendencies with the rising sun causing him serious damage. I worry, Lee. I really do.
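To make what Lee is pulling off a bit more concrete: rendering a person in 3D virtual space from a depth camera boils down to back-projecting each depth pixel through the camera’s pinhole model into a 3D point cloud, every frame. The sketch below shows that core math in Python with NumPy. To be clear, this is a generic illustration, not Lee’s code: the intrinsics (FX, FY, CX, CY) and the 320x240 stream size are placeholder assumptions, not values from the Perceptual Computing SDK.

```python
import numpy as np

# Hypothetical intrinsics for a 320x240 depth stream. Real values come from
# the camera's calibration data, not from this sketch.
FX, FY = 280.0, 280.0   # focal lengths in pixels (assumed)
CX, CY = 160.0, 120.0   # principal point (assumed)

def depth_to_points(depth_mm):
    """Back-project an (H, W) depth frame in millimeters to an (N, 3) point
    cloud in meters, using the standard pinhole model:
        x = (u - cx) * z / fx,   y = (v - cy) * z / fy
    Pixels with depth 0 (no reading) are dropped."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float64) / 1000.0   # mm -> m
    valid = z > 0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

# Stand-in frame; in a real app every frame comes from the SDK's depth stream.
frame = np.random.randint(500, 1500, size=(240, 320), dtype=np.uint16)
print(depth_to_points(frame).shape)   # (N, 3) cloud, rebuilt every frame
```

The math itself is cheap; the hard part is presumably doing it (plus texturing and rendering) every frame without stalling the camera pipeline, which may be where those 29 hours went.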

Floating Talking Puppets: Sixense may be showing us the most complete progress toward a Perceptual Computing app. They have fluid puppet movements, and the core mechanics already seem to be there. They are now moving on to the storyline for their app. Based on their feedback, the judges may be favoring Sixense already, but they have high expectations.

Chris: The Sixense guys have their puppets moving! This is wicked. They are moving on to actual story telling next. Serious progress.

Steve: Sixense are a big team so that factors into the final judging. We’re expecting a high quality result!

Touch-Free Photo Editing: While there’s no video this week, Peter “Wibble” O’Hanlon is also showing Perceptual Computing progress in his app. He’s decided which third-party libraries to use and is determining what features make sense for a hands-free photo editing app. Like Lee, Peter is a single developer, and the judges are impressed with his steady progress.

Nicole: I like your explanations of why you’ve chosen which libraries to speed your development time. You’re also making better progress than I expected; you’re dealing with quite a bit of data, and I do appreciate the step-by-step.

Chris: Pete is writing an application you control through waving your hands and there's no magic, no secret incantations. He's using the same tools we use day in and day out and that, to me, is amazing. There are also no fires or explosions, very little swearing, no tantrums or hissy fits, just constant, solid, back breaking slogging through the code and getting it done. By himself. Much respect.

Hard Decisions and Technical Roadblocks

All the developers are making progress on their apps. However, some are hitting early roadblocks. Not everyone can code 29 hours straight. To cope, some are working out bugs this week, some are shifting gears, and some have simply pushed the Perceptual Computing part of the project out to later. The judges seem to be keeping this in perspective. It’s only been two full weeks, after all.

Mangled Pottery: Simian Squared has also begun integrating Perceptual Computing into their virtual pottery-making app. However, they found their current approach produces some extremely messy clumps of digital clay. The judges recognize they are grappling with noisy data from the camera, but at the same time, they want to see the mess. (One generic way to smooth that noise is sketched after the judges’ comments below.)

Steve: … they’ll have to take this slightly fuzzy data, as everyone else is, and turn it into something usable and enjoyable. That means more processing and more delay so some cunning techniques will have to be employed in order to make it work.

Chris: … they mention piles of misshapen virtual clay, but there are no pics. Show us the carnage.

Nicole: Seems like you had a big coding week; I’m looking forward to the visualizations next week. I’m going to be particularly interested in whether the Ultrabook is going to be powerful enough for the real-time processing you’re going to require.
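Steve’s point about fuzzy data is worth unpacking. Raw depth frames from these cameras jitter from frame to frame, and one common (if basic) remedy is per-pixel temporal smoothing, bought at the price of exactly the extra processing and delay he mentions. The sketch below is a minimal exponential-moving-average filter in Python/NumPy; it illustrates the generic technique and is not Simian Squared’s actual approach.

```python
import numpy as np

class DepthSmoother:
    """Per-pixel exponential moving average over successive depth frames.

    alpha trades lag for stability: higher alpha tracks fast motion but
    keeps more noise; lower alpha gives smoother (but laggier) clay."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None

    def update(self, depth):
        depth = depth.astype(np.float64)
        if self.state is None:
            self.state = depth.copy()
        else:
            valid = depth > 0  # ignore dropout pixels rather than average them in
            self.state[valid] = (self.alpha * depth[valid]
                                 + (1.0 - self.alpha) * self.state[valid])
        return self.state

smoother = DepthSmoother(alpha=0.3)
for _ in range(10):  # stand-in for the camera's frame loop
    noisy = 1000 + np.random.normal(0, 30, size=(240, 320))  # fake noisy frame
    smooth = smoother.update(noisy)
```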

Eye & Head Tracking… Almost!: Infrared5 might be showing the most progress overall, with an interactive, smartphone-controlled demo already posted. However, the Perceptual Computing part has been a challenge. They are working toward clean head and eye tracking and have an approach in mind. The judges feel they are showing good progress and taking on a big challenge. If they solve it, it could be huge for Infrared5. (A naive pupil-tracking baseline is sketched after the judges’ comments below.)

Nicole: Love the progress that you’ve made, that we’re able to play your game and actually get to experience the progress first hand. What you’re attempting with the pupil and eye tracking, it’s totally fascinating to read; it’s the most interesting part of the SDK, but I know that it’s only meant to support Gaze Tracking. How you’re integrating different platforms to make it happen is really great.

Chris: they too are moving on rapidly and have a demo of their Kiwi Catapult Revenge game available. The biggest challenge for them? Eye tracking, it seems. I'm praying they crack this because I have my own nefarious needs for decent and cheap eye tracking.
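The post doesn’t detail Infrared5’s eye-tracking approach, but a common first step is pupil localization: in a cropped eye image the pupil is usually the darkest blob, so a crude estimate is the centroid of the darkest pixels. The Python/NumPy sketch below shows that naive baseline; real trackers layer corneal-reflection handling, temporal filtering, and per-user calibration on top of something like this.

```python
import numpy as np

def pupil_center(eye_gray, dark_ratio=0.2):
    """Naive pupil localization: treat the darkest pixels in a cropped eye
    image as the pupil and return their centroid (row, col), or None.

    The threshold sits dark_ratio of the way from the darkest to the
    brightest pixel -- crude, but enough for a first demo."""
    lo, hi = float(eye_gray.min()), float(eye_gray.max())
    thresh = lo + dark_ratio * (hi - lo)
    ys, xs = np.nonzero(eye_gray <= thresh)
    if ys.size == 0:
        return None
    return ys.mean(), xs.mean()

# Fake eye crop: bright background with a dark disc standing in for the pupil.
eye = np.full((60, 90), 200.0)
yy, xx = np.ogrid[:60, :90]
eye[(yy - 30) ** 2 + (xx - 55) ** 2 < 64] = 20.0
print(pupil_center(eye))   # ~ (30.0, 55.0)
```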

Eye Tracking Takes a Back Seat: Code Monkeys are making good progress but have had to make the hard decision to remove eye tracking from their game and focus on hand tracking instead. The judges seem to appreciate the pragmatic call. (A sketch of the kind of simple hand-to-aim mapping involved follows the judges’ comments below.)

Nicole: They had a lot of ambitious ideas about how they were going to change interaction with their game: voice commands, eye tracking for aiming, etc. I like that they are taking a more realistic approach to what they can get done, well, in the allotted time. Wouldn’t have minded a little more explanation around the thought process.

Chris: The Code Monkeys are focusing on input control and, to that extent, focusing on simplification. And their demo code is simple. Crazy simple. Work continues.
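For a sense of what “crazy simple” input code can look like, the sketch below maps a normalized hand position to a screen-space aim point, clamping so a hand drifting out of frame doesn’t wrap. This is a generic illustration, not the Code Monkeys’ code; the 0-to-1 coordinate convention and the 1366x768 Ultrabook screen are assumptions.

```python
def hand_to_aim(hand_x, hand_y, screen_w=1366, screen_h=768):
    """Map a normalized hand position (assumed 0..1 from a hand tracker)
    to integer screen coordinates for aiming."""
    x = min(max(hand_x, 0.0), 1.0)   # clamp so off-frame hands don't wrap
    y = min(max(hand_y, 0.0), 1.0)
    return int(x * (screen_w - 1)), int(y * (screen_h - 1))

print(hand_to_aim(0.5, 0.25))   # -> (682, 191), an upper-center aim point
```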

Waiting On Perceptual: Quel Solaar appears to have a complete beta of his UI framework; however, the Perceptual Computing features are not there yet. The judges appreciate being able to see where Eskil is going with his stunning UI work, but they are concerned about the lack of Perceptual features so far.

Steve: Eskil claims he’s finished his Beta which is fantastic progress for this stage but it looks like he hasn’t integrated the Intel PC hardware yet. As the project is based around demonstrations of Perceptual Computing he hasn’t quite reached Beta stage yet.

Nicole: I love it! I totally love what you’re doing! I finally feel like I understand your project, and the big data visualization you’re planning on showing to demonstrate the framework really is what the future looks like in the movies.

Judges Feel The Pain

Overall, the judges are becoming aware that this technology is a bit harder than it would seem. These developers are starting at ground zero, pulling data from two cameras and figuring out interesting UI experiences from it. Big teams have resources, but the indie guys can code 29 hours straight and make all the decisions themselves.

Three of the judges shot another video from CeBIT with some in-depth commentary for the Challengers, including, unfortunately, “a fall from grace” of sorts as Sascha debugs the booth furniture. Check out the video.

