PC technology is rapidly evolving beyond what we’ve known for 30 years. The mouse and keyboard are no longer the only ways to interact with applications. To support further advances in this exciting field, Intel has released the Perceptual Computing SDK, a free framework developers can use for their applications.
The Intel Perceptual Computing SDK is a software/hardware platform for gesture, voice, and speech recognition. On the software side, it provides a set of libraries implementing detection and recognition algorithms, exposed through an abstraction layer of standard interfaces.
On the hardware side, the Perceptual SDK supports the Creative Camera.
Using the Intel Perceptual SDK simplifies the development of next-generation applications based on the NUI (Natural User Interface) paradigm.
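The layered design described above — concrete detection and recognition algorithms exposed to applications through standard interfaces — can be sketched in Python. All class and method names below are illustrative assumptions, not the Perceptual SDK's actual API:

```python
from abc import ABC, abstractmethod

# Illustrative sketch of the abstraction-layer idea: the application
# programs against a standard interface, while concrete recognition
# modules (gesture, voice, etc.) plug in behind it. These names are
# hypothetical, not the SDK's real classes.

class RecognitionModule(ABC):
    """Standard interface every recognition module exposes."""

    @abstractmethod
    def process_frame(self, frame: bytes) -> list[str]:
        """Analyze one frame of sensor data and return detected events."""

class GestureRecognizer(RecognitionModule):
    def process_frame(self, frame: bytes) -> list[str]:
        # A real implementation would run hand-tracking here.
        return ["swipe_left"] if frame else []

class VoiceRecognizer(RecognitionModule):
    def process_frame(self, frame: bytes) -> list[str]:
        # A real implementation would run speech recognition here.
        return ["command:open"] if frame else []

class Pipeline:
    """Dispatches each captured frame to every enabled module."""

    def __init__(self) -> None:
        self.modules: list[RecognitionModule] = []

    def enable(self, module: RecognitionModule) -> None:
        self.modules.append(module)

    def on_frame(self, frame: bytes) -> list[str]:
        events: list[str] = []
        for module in self.modules:
            events.extend(module.process_frame(frame))
        return events

pipeline = Pipeline()
pipeline.enable(GestureRecognizer())
pipeline.enable(VoiceRecognizer())
print(pipeline.on_frame(b"sensor-data"))  # ['swipe_left', 'command:open']
```

The point of the pattern is that application code only ever touches the `Pipeline` and the standard interface, so new recognition capabilities can be added without changing the application.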
Haptic technology revolves around creating a new kind of user interface that better connects our sense of touch with the form factors we use every day: smartphones, tablets, computers, etc. We get a brief glimpse of the possibilities that haptic technology offers when our phone vibrates in our pockets, letting us know that we’ve received a text message, or with the rumbling sensation we get from console game controllers.
There’s nothing like a bit of friendly competition to spark developers to new heights of innovation. Beginning on May 6, coders all over the world will have the chance to forge new ground by integrating voice control, gesture control, facial recognition, and augmented reality within PC apps in Phase 2 of the Intel® Perceptual Computing Challenge.
It is truly amazing what developers all over the world are managing to come up with in the field of perceptual computing. Perceptual computing is something that Intel® is putting a lot of time, energy, and resources into, as seen in the recently finished Ultimate Coder: Going Perceptual competition and the ongoing Intel® Perceptual Computing Challenge, which is about to kick off into its second phase on May 6th.
Felipe Pedroso, the Ultrabook/Windows Community Manager, will present the talk "Inovação na Experiência do Usuário: Apresentando o Intel Perceptual Computing SDK" (Innovation in User Experience: Introducing the Intel Perceptual Computing SDK) at TDC Floripa on May 26, 2013, at 11:10 a.m.
What will be presented?
Our sense of touch is vitally important; it's involved in virtually every interaction we have. It matters especially in human-computer interaction, where touch-enabled input controls and interactive experiences are making computing ever more tactile. As more devices adopt touch-enabled features, controls, and input methods to satisfy consumers' ever-growing appetite for tactile input, we're seeing new advances from a wide variety of fields.
For seven weeks we've watched seven teams hack, code, build, and rebuild apps that leverage the Intel Perceptual Computing SDK on a convertible Ultrabook™. Our judges have taken time to review the work, testing and scoring each Challenger team's project. The ink is dry, the scores are tallied, and the results are final. The Ultimate Coder awards are as follows.
Last week, our Challengers submitted the final version of their perceptual computing apps to the judges, and this week, it was testing time. Our judges took on the task of putting each project through its paces, and wrote up their first thoughts on each and every one – in addition to general thoughts on the Challenge and the future of perceptual computing. Are there clear winners for our judges? Not quite yet, and the final verdict won’t be in until next week when we announce the winners on April 24. Here’s what our judges had to say after their initial testing periods:
When you open Huda, you will see an interface that looks something like this:
The left-hand side of the interface is divided into two sections: Folders and Thumbnails.