I see things through a developer's lens; it is my job, after all. This was my fifth IDF, however, and I have to say that this year, more than in others, the software developer was a focus of the event. What follows is my experience and journey at IDF, with evidence and perspective on the software developer being a core ingredient of the IDF experience.
Keynotes: The first day started with Dadi giving the opening keynote. A theme of the keynote was the notion that all of us want technology to fit our lives, to seamlessly connect, and to be accessible regardless of where we are and what we are doing. As evidence of how the technology is adapting to us, Dadi held up two current Intel processors and showed multiple Ultrabook designs: a clamshell, a detachable screen, a fold-over screen, and a slide-out keyboard. The message is that the technology needs to adapt to the preferred use case of the user, and a large part of enabling that is the software. Dadi then showed us a new SDK for developers called the Perceptual Computing SDK, which lets developers go even further than touch and device-orientation sensors: with this SDK, voice recognition, facial recognition, and hand, arm, and body gestures can all be part of the experience. Dadi demoed a catapult game and a solar system app, both manipulated simply by moving your hands in the air. He then showed us integration with third-party systems to enable things like secure payment through NFC technology. Wrapping up, Dadi brought it all together, saying the experiences needed to let users seamlessly connect with data and applications are in the hands of developers, and that Intel needs to and wants to collaborate with developers to deliver those experiences.
On day two, Renee James took the stage. She discussed Transparent Computing, expanding on Dadi's notion that apps and their data need to connect with us and the devices we carry wherever we are. Peter Biddle then showed us new Intel services that will allow location-based messaging and photography that span devices and are not tied to a proprietary vendor. Renee discussed HTML5 as a key way Intel wants us to get to Transparent Computing, and showed off an interactive language book that lets kids quickly and easily learn Mandarin Chinese. She also had McAfee demonstrate a way of securing our content in the cloud: a photo security service that allows your images to be seen by your social network but not copied and shared.
On day three, Justin showed us the near future of computing. One thing he showed was how Intel has been able to turn radio from an analog medium into a digital one, and with that put WiFi on the chip. Another was secure authentication that let you hover your hand over a device to log into that device, as well as any and all services thereafter. One of the coolest demos was WiGig, where we saw devices that eliminate the need for wires. The demo had an external hard drive, an Ultrabook, and two monitors; without a single wire connecting the devices, the Ultrabook streamed a video from the external hard drive and displayed it (extended) across both screens. Finally, he showed how we can reduce the energy footprint of cell base stations by moving base station hardware into software. China Mobile showed how they were able to load-share base station workloads on Intel servers, and they are going to attempt to power 100 base stations from one Intel server.
Live Coding: After the keynotes, developers from our community showed off their solutions for bringing these experiences to life. Chris Skaggs of Soma Games showed off his approach to leveraging HTML5 to build applications that will run across Intel devices and platforms. Chris showed us a real-time earthquake tracking solution that leverages local storage, as well as a metric conversion app that can change its UI on the fly. Lee Bamber also took to the Live Coding stage and showed off his Transparent Computing framework called Freedom Engine, which allows a developer to write and manage one set of code, then package that app for nearly any platform and device on the market. Freedom Engine does a lot, and since HTML5 doesn't yet do everything, it will generate bytecode and drop in a virtual machine for many native targets as well as HTML5. So this is an early solution for Transparent Computing that looks to be adding more and more pure HTML5 support, plus native solutions where they matter most.
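The local storage trick Chris used relies on the standard HTML5 Web Storage API. Here is a minimal sketch of the pattern, with a hypothetical cache key and data shape (neither is from his actual demo): the app caches its last-fetched earthquake feed so the data survives page reloads.

```typescript
// Sketch of caching fetched data with the HTML5 Web Storage pattern.
// The key name and Quake shape below are hypothetical examples.
interface Quake {
  magnitude: number;
  place: string;
  time: number; // Unix epoch ms
}

// The small subset of the Web Storage API (window.localStorage)
// this sketch needs, so it also runs outside a browser.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const CACHE_KEY = "quakes:latest"; // hypothetical key name

// Web Storage only holds strings, so serialize the feed to JSON.
function cacheQuakes(store: KeyValueStore, quakes: Quake[]): void {
  store.setItem(CACHE_KEY, JSON.stringify(quakes));
}

// Read the cached feed back; an empty list if nothing was stored.
function loadCachedQuakes(store: KeyValueStore): Quake[] {
  const raw = store.getItem(CACHE_KEY);
  return raw ? (JSON.parse(raw) as Quake[]) : [];
}

// In-memory stand-in for localStorage so the sketch runs anywhere.
class MemoryStore implements KeyValueStore {
  private data = new Map<string, string>();
  getItem(key: string): string | null { return this.data.get(key) ?? null; }
  setItem(key: string, value: string): void { this.data.set(key, value); }
}
```

In a browser you would pass `window.localStorage` directly, since it satisfies the same `getItem`/`setItem` interface.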
A number of developers joined Intel for a private "Fellowship" meeting where we got up close and personal with Perceptual Computing. This technology packages Nuance* speech recognition together with facial and hand gesture recognition in one SDK. The result is the ability to create applications that you can interact with through hand and facial gestures and natural speech. The demo was amazing, and it exposed three levels of data that the developer can use: the prebuilt intelligent gesture recognition; one level lower, the mapped points of the face and hands; or lower still, the raw RGB and depth (3D) data being captured in real time. In the end, the developer has the data to build amazing apps that never require you to touch the device to operate it.
On the showroom floor we demoed 10 apps, including early versions of all the Ultimate Coder applications. Much of the Intel App Show for developers was also shot there, interviewing developers and bloggers as well as taping demos for the show. Seeing people play with sensor-enabled Ultrabooks was a real kick. People first thought, "this is just a small laptop," but when they saw these apps under development, they started rethinking their assumptions about what a PC is.
On day two we held a student hackathon where more than 20 students used their time at IDF to code projects and build touch and sensor features into their applications. Lee Bamber and I joined the students and did our best to work in features ourselves. It was a learning experience for me, and the result is a code sample I can share (see post).
Black Belt Dinner: At the Black Belt dinner, the best of the best of our developers joined us at the Exploratorium in San Francisco. Geeky people geeking out on geeky experiments and demos is something to see; I certainly could not keep up with the explanations of how and why resonance frequencies create patterns in sand, and so on. A number of Black Belts were given the floor, and we heard everything from how programming robotics helps students learn problem solving, to the question of whether something is a tool or a toy, meaning that coding "joy" and "fun" into even the most utility-driven apps rewards avid users. We also learned from Intel what makes an Ultrabook an Ultrabook, and the vision behind it. All I can say is that it has something to do with the kind of PC Tony Stark (Iron Man) would carry around with him.
Overall, it was an active, developer-engaged event from beginning to end. I learned a lot, and I came away believing that we need to do more developer-focused activities and provide opportunities for developers to show us what they would do, or want to do, with our technology. Hopefully more to come on that later.