Ultimate Coder Week 7 - The quest for a good night's sleep

As this is the final post, I'm not going to be showing code here. What I've been concentrating on since last week is tightening up on the application, testing it, and coping with those funny little niggles. I thought, instead, that I'd talk about what this contest has taught me and what I see happening from here on.

The first thing that I thought I'd mention is that I came into this contest with preconceived ideas about what Perceptual Computing (PerC) is, and you know what? I was wrong. Completely wrong, and I bet you are too. The problem is that there's no real hard and fast definition of what constitutes PerC. Sure, like me, you may have some ideas about Minority Report-style interfaces, or possibly something out of Star Trek, and, guess what, this isn't it.

It was only when I was really trying to get the camera code working that it struck me. If you look at the interfaces in films like Minority Report or Star Trek, they are touch based. They aren't really gesture based at all. Sure, films like Avengers Assemble show interfaces where bits are swiped backwards and forwards, and even grabbed, but they still aren't really PerC.

You may wonder why I say this. Well, one of the key things going into this competition was that we had to produce Perceptual applications and, you know what, we all failed. That's not to denigrate the efforts of any of the other competitors; they have all made huge strides forwards and produced some amazing applications. The problem is that the promise of PerC is not there yet. By this, I mean that there is still a lot of work to be done on the SDK side and on the camera side to bring PerC towards reality. Don't get me wrong here, these aren't huge problems, and Intel have done an amazing job, but this is merely the start of a long journey towards redefining how we interact with computers.

Okay, that's enough twaddle and rant for now, so on with the show. Pete, you set out to deliver a MIDI-style photo editing PerC application; have you succeeded? The answer to that, of course, is a huge no. As Eskil discussed a couple of weeks ago, the resolution of the camera just isn't enough to support manipulation of realistically sized UI elements. I could have given you immense buttons, but that would have been a real kludge.

How about voice control? Again, no. Well, yes and no. I managed to get voice recognition into Huda (took it out and then put it back), so you can add filters by name. That's still a long way from true voice recognition, though, because it only recognises a predefined set of commands in English (and has real problems with non-neutral accents like mine). What voice recognition really needs is the ability to recognise context and take appropriate action based on it. One thing I steered clear of, for instance, was trying to open pictures using voice control: the sheer cumbersomeness of navigating over the folders and pictures meant that this would have been unwieldy for users. Yet this is where PerC should really come into its own, and where we arrive at more of a definition of what PerC is. It's the combination of features that would have helped: it would have been great to combine gaze tracking with gestures to pick the relevant folders and images, and then open them using voice control.
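To make the "predefined set of commands" limitation concrete, here's a minimal sketch in plain Python (this is not the actual PerC SDK; the command names and the handle_utterance helper are purely hypothetical). A fixed grammar matches exact phrases, so a natural rephrasing of the same intent simply falls through:

```python
from typing import Optional

# A toy stand-in (NOT the Intel PerC SDK - all names are hypothetical)
# for a fixed command grammar: phrases in the grammar map to actions;
# anything outside it is rejected, however reasonable it sounds.
FILTER_COMMANDS = {
    "sepia": "SepiaFilter",
    "black and white": "MonochromeFilter",
    "sharpen": "SharpenFilter",
}

def handle_utterance(utterance: str) -> Optional[str]:
    """Return the filter for a recognised phrase, or None if the
    phrase falls outside the predefined grammar."""
    return FILTER_COMMANDS.get(utterance.strip().lower())

print(handle_utterance("Sepia"))             # in the grammar
print(handle_utterance("make it look old"))  # same intent, rejected
```

That second call is the whole problem in miniature: understanding that "make it look old" means "apply sepia" needs context, not a bigger lookup table.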

Now, if I'd had time, I'd have added gaze tracking and facial recognition, but there was none to spare if I wanted to get the core features in. As the contest went on, I had to ruthlessly cut features, so touch reordering of the filter list is out (a huge disappointment to me), as are most of the filters I wanted to add - there's a very limited subset in there. However, I did find time to make the application friendlier for touchscreen users (think tablet - a potentially huge market for this type of application).

That, to me, is the essence of PerC. It's not about gimmicks or trickery; it's not just about waving your hands in the air to control the Z-Beta-Death-Glider-7000-With-Humongo-FG in HALO 3024. PerC is the combination of inputs to provide appropriate and easy control of applications. As other competitors have noted, gesture is tiring, so control shouldn't rest on any one input: combine touch with voice and gaze - and hopefully, in future, with a camera of much better resolution.

Okay, that's a lot of talk, and it does sound negative. It isn't, though. PerC is something I'm hugely excited about. Intel has provided something truly wonderful for us to start with, and I believe it can only get better as the technology and SDK mature. It has the ability to offer new interactions well beyond what we touched on in this contest, in areas where computers are traditionally harder to operate (e.g. because the user has to wear thick protective gloves). More importantly, I can see this opening up a whole new set of UIs for users with some form of disability or impairment. This is an area that really needs to be looked at by disability charities sooner rather than later - let's get advice from the experts on what works for them and what doesn't.

Final thoughts

Okay, as this is my final entry, and in honour of all the work I've done so far, my track listing this week was just a shuffle of the albums I listened to in the previous weeks.

What this contest opened my eyes to is just how much innovation Intel is driving. Like me, you possibly thought of Intel as primarily a hardware supplier. It may come as a surprise to learn that Intel offers much, much more, and you really should take the time to explore it (did you know that Intel has its own app store called AppUp? If you didn't, you really should check it out).

As befits any piece of writing, I'd like to thank Bob Duffy and Wendy Boswell at Intel for their huge support during this contest; it's been an absolute joy working with them. I'd also like to thank, and wish the best of luck to, my fellow contestants - a finer group of people you could never hope to work alongside. The level of knowledge sharing and encouragement between us has been truly eye-opening; you'd normally expect people to be trying to sabotage each other's work. I don't know if they had the energy to read through my ramblings, but I certainly enjoyed reading their blogs and learned a huge amount from them.

Finally, I'd like to thank Sascha, Steve, Nicole and Chris, our esteemed judges. They have been a great source of encouragement and I'm really going to miss their blog posts each week.

As this week marks the delivery of Huda, you may wonder if that's the end of Huda development for me. The answer, of course, is that it isn't. However, this version of Huda was aimed at showcasing PerC; future iterations will change the emphasis to make it a much more fully rounded photo editing application. After all, I aim to wean Sascha off his ancient photo editor, and I also aim to make Huda cloud friendly (again, look at what Intel is offering in the cloud space). Let's face it: why should I restrict you to desktop photo editing when I can offer you web-based editing as well?

Thanks for reading, and thanks for your support.

Pete O'Hanlon. Perceptual convert.
