Research at Intel 2011

The 9th annual Research@Intel was held at the Computer History Museum on June 7th and 8th, 2011. The event highlighted more than 40 research projects demonstrating the latest in technology innovation from researchers at Intel Corporation. Intel Labs pulled back the curtain to give media a peek at the future in the areas of cloud computing, visual computing, HPC, mobile computing, platform innovations, and security. Check back here over the next couple of weeks as we post a few videos and white papers from some of the most popular research demonstrations.

In case you missed it, Intel launched a new visual research center earlier this year. The Intel Science and Technology Center for Visual Computing (ISTC-VC) will develop innovations in lifelike computer graphics, natural user interfaces, and realistic virtual humans that will make people's technology experiences more immersive in the future. The goal is to drive visual computing applications that look, act and feel real, and to make the technology broadly accessible to consumers.

Here are a few of the demos from Research@Intel 2011:

Silicon Photonics
Lauren Jones from Intel Labs demonstrates the 50Gbps Silicon Photonics link, the first end-to-end silicon photonics link with integrated lasers. As the amount of digital information grows, new interconnects will be required to move data farther and faster around billions of connected systems. Silicon Photonics research aims to enable faster data communication over longer distances. Its potential for low cost and scalability could alleviate data bottlenecks and enable new architectures in consumer electronics, data centers, and high-performance computers. A wide range of applications is possible, from high-definition video transfer in the home to novel memory configurations that enhance server performance.

Photo Realistic Rendering (includes source - see Embree Sample)
Manfred Ernst and Sven Woop from Intel Labs demonstrate an interactive Monte Carlo ray tracing engine. It generates photo-realistic images of complex models within seconds and can display previews in real time. This technology is used for virtual prototyping, architectural visualization, movie production, online car configurators, and many other applications. Our highly optimized implementation on Intel Architecture allows people to generate photo-realistic images of virtual objects faster than previously possible. For more information, see Embree Sample.
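Embree itself is a heavily optimized native kernel library, but the Monte Carlo principle behind such renderers is simple: estimate a lighting integral by averaging many randomly sampled rays. Purely as an illustration (the scene, function names, and parameters below are made up, not Embree's API), here is a minimal Python sketch that estimates ambient occlusion at a surface point by shooting random hemisphere rays against a single occluding sphere:

```python
import math
import random

def ray_sphere(origin, direction, center, radius):
    # Nearest positive hit distance of a unit-direction ray against a sphere,
    # or None on a miss: solve |origin + t*direction - center|^2 = radius^2.
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-6 else None

def occlusion(point, normal, occluder, samples=256, rng=random):
    # Monte Carlo estimate of the unoccluded hemisphere fraction at `point`:
    # sample random directions above the surface, count rays that hit the
    # occluder (a (center, radius) sphere). More samples -> less noise.
    hits = 0
    for _ in range(samples):
        # Rejection-sample a uniform direction on the unit sphere.
        while True:
            d = [rng.uniform(-1.0, 1.0) for _ in range(3)]
            n2 = sum(x * x for x in d)
            if 1e-6 < n2 <= 1.0:
                break
        norm = math.sqrt(n2)
        d = [x / norm for x in d]
        if sum(a * b for a, b in zip(d, normal)) < 0:
            d = [-x for x in d]  # flip into the hemisphere above the surface
        if ray_sphere(point, d, *occluder) is not None:
            hits += 1
    return 1.0 - hits / samples  # 1.0 = fully open sky, 0.0 = fully blocked
```

A point directly beneath the sphere reports noticeably more occlusion than a distant point; a production engine applies the same averaging idea to full light transport, with vastly better sampling and SIMD-optimized ray/primitive kernels.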

Extreme Scale Systems Software
Rob Knauerhase from Intel Labs demonstrates "Runnemede". Extrapolation of supercomputing trends from tera- and peta-scale to exa-scale (a million trillion operations/sec) highlights the challenges of current technology for programming complexity, energy efficiency, and overall performance. "Runnemede" is an Intel Labs project funded in part by DARPA and executed with academic and industrial partners. Our research is exploring fine-grained event-driven execution models to exploit very-high-core-count systems. Using introspective observation and adaptation, our system dynamically schedules tiny "codelets" to cores according to control and data dependencies. The result is extreme-scale performance with low power consumption and high reliability.
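The codelet idea can be made concrete with a toy scheduler. The sketch below is a hypothetical illustration (not Runnemede's runtime): each "codelet" is a small callable, and it fires only once the codelets it depends on have completed, mimicking event-driven, dependency-ordered dispatch:

```python
from collections import deque

def run_codelets(codelets, deps):
    """Toy event-driven scheduler: run tiny 'codelets' in dependency order.

    codelets: {name: zero-argument callable}
    deps:     {name: set of codelet names that must complete first}
    Returns the execution order.
    """
    remaining = {n: set(deps.get(n, ())) for n in codelets}
    dependents = {n: [] for n in codelets}
    for n, ds in remaining.items():
        for d in ds:
            dependents[d].append(n)
    # Codelets with no unmet dependencies are immediately runnable.
    ready = deque(n for n, ds in remaining.items() if not ds)
    order = []
    while ready:
        n = ready.popleft()      # a real runtime would dispatch to an idle core
        codelets[n]()            # fire the codelet: its inputs are satisfied
        order.append(n)
        for m in dependents[n]:  # completion events enable dependent codelets
            remaining[m].discard(n)
            if not remaining[m]:
                ready.append(m)
    if len(order) != len(codelets):
        raise RuntimeError("dependency cycle among codelets")
    return order
```

In an extreme-scale system, the payoff is that the ready queue can feed thousands of cores at once, and an introspective runtime can also weigh power and reliability when choosing where each codelet lands.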

Work or Play Online with a Cast of Thousands (includes source - see Scalable Virtual Environments Sample)
Dan Lake from Intel Labs demonstrates a new virtual environment. Virtual environments have applications from gaming to disaster-response training. However, today's approach typically limits each environment to running on only one server. Intel researchers have developed a new software architecture that breaks the environment into separately executable components, which, when combined with a cloud computing model, allows applications to scale user experiences far beyond existing limits.

Improved Perception of Digital Content (New! - paper here)
Hans-Christian Hoppe from Intel Labs demonstrates XML3D. Today, the Web lacks non-proprietary mechanisms to model dynamic 3D scenes, render them in standard browsers, and interact with the content. In this demonstration we added extensions to Firefox and Chromium that can arbitrarily modify 3D scenes described by the XML3D HTML extension without retransmitting the scene itself. With this technology users can immerse themselves in a fully interactive 3D Web on any browser platform. See Resolution Enhancement Content for more information.

Steerable Sound
Dr. David Wessel showcases spherical loudspeakers that deliver not only best-in-class sound reproduction but also dynamic, steerable sound analogous to traditional acoustic musical instruments. This sound quality is made possible by the spherical nature of the speakers plus the signal processing enabled by parallel processing. A range of applications is demonstrated, including interactive sound installations, musical uses, and room acoustic measurements.

The Magic Mirror
Nola Donato from Intel Labs demonstrates the Magic Mirror. The Magic Mirror project aims to create a new shopping experience: a realistic avatar of the consumer dressed in the latest fashions. This demo shows our research in body tracking and our parametric human body model. We show a 3D avatar that tracks your movements in real time and demonstrate how to change the dimensions of the body using gestures.

Research within Intel's Academic Centers
Greg Leeming from Intel Labs discusses the Intel Science and Technology Center (ISTC) program. Intel launched the ISTC program to establish collaborative research centers within academia to drive the state of the art in specific technical areas. The first of these centers, the Intel Science and Technology Center for Visual Computing (ISTC-VC), was launched as a complement to the Intel Visual Computing Institute at Saarland University in Germany. Together, these centers engage 50+ researchers and 80 graduate students pursuing a broad research portfolio. Besides an overview of the ISTC program, the demo presents examples of this research and the impact it will have in the future.

Variation Aware Dynamic Adaptation
Jason Howard from Intel Labs demonstrates Variation Aware Dynamic Adaptation (V-ADAPT) on Intel Labs' Single-chip Cloud Computer running industry-standard benchmarks. The relentless pursuit of Moore's Law has enabled the era of many-core processors, but device scaling has led to greater on-die variability, which results in increased frequency and leakage variations among identical cores on the same die. Our proposed V-ADAPT technology exploits these variations to enable higher performance in the same power envelope, or increased energy efficiency for a given performance. For more information, see V-ADAPT whitepaper.
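One way to picture "exploiting variation" is as a scheduling problem: identical cores on the same die actually differ in maximum frequency and leakage power, so a variation-aware runtime can prefer the most efficient cores under a power cap. The sketch below is a hypothetical illustration of that idea, not V-ADAPT's actual algorithm; the data format and greedy policy are made up:

```python
def pick_cores(cores, budget_watts, work_units):
    """Toy variation-aware scheduler.

    cores:        list of (core_id, max_freq_ghz, power_w) as measured per
                  core; the values vary even among "identical" cores.
    budget_watts: total power envelope for the chosen cores.
    work_units:   number of parallel tasks, i.e. cores actually needed.
    Returns (chosen core ids, total throughput in GHz-equivalents).
    """
    # Rank cores by energy efficiency: throughput delivered per watt.
    ranked = sorted(cores, key=lambda c: c[1] / c[2], reverse=True)
    chosen, power, throughput = [], 0.0, 0.0
    for cid, freq, watts in ranked:
        if len(chosen) == work_units:
            break  # enough cores for the available parallel work
        if power + watts <= budget_watts:
            chosen.append(cid)
            power += watts
            throughput += freq
    return chosen, throughput
```

Against a variation-oblivious scheduler that picks cores arbitrarily, this kind of policy yields either more throughput at the same power or the same throughput at lower power, which is the trade-off the demo highlights.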

3D Internet with XML3D
Professor Philipp Slusallek demonstrates the future 3D Internet with XML3D. XML3D is a new technology, integrated into the Firefox and Google Chrome browsers, that extends HTML5 with the ability to support interactive and realistic 3D graphics without any plug-ins. XML3D was developed in the context of the Intel Visual Computing Institute at Saarland University in Germany. In contrast to previous approaches, XML3D builds on top of the current HTML5 Web technology stack, allowing millions of Web developers to integrate 3D functionality into their sites just as they use text, images, and video today, fully exploiting their existing skill sets. XML3D supports rendering of fully dynamic or animated 3D content, full user interaction, realistic materials, virtual characters, and 2D and 3D links for things like taking you to another 3D view, modifying existing or loading new 3D content from the net, or simply starting an animation.

Automatic Device Driver Synthesis
Mona Vij from Intel Labs demonstrates how to generate device drivers automatically. A device driver is the part of the operating system (OS) that is responsible for controlling an input/output (I/O) device. Currently, drivers are a primary source of bugs, and driver development is a major bottleneck for platform validation and time-to-market. Jointly with NICTA, Intel proposes to improve the driver development process by automatically synthesizing drivers from formal OS and device specifications. This approach improves driver reliability by reducing manual intervention and avoiding misinterpretation of device documentation by driver writers. Moreover, given a device specification, drivers can be generated automatically for all supported operating systems, eliminating the costs associated with porting drivers. See NICTA site and The Register article for more information.

Perceptive Environments
Terry O'Shea from Intel Labs demonstrates Perceptive Environments. With low-power Wi-Fi it is now possible to connect sensors directly to the cloud using inexpensive, off-the-shelf technology. This enables large-scale processing for enormous mixed sensor arrays in extreme and geographically diverse environments. Research trials are underway sensing structural movement of offshore oil rigs and weather conditions in seaports. Goals include the ability to connect from remote locations, reliability, low cost, and scalability. This demo shows individual environmental sensors and explains how the cloud can bring thousands of them together to show a bigger picture.

Intelligent Advertising Framework
Sanjay Addicam from Intel Labs demonstrates the Intelligent Advertising Framework (IAF), which integrates Anonymous Video Analytics (AVA) and data-mining technologies to achieve targeted, interactive advertising. It relates AVA viewership information to point-of-sale (POS) data and learns advertising models. Using these models, IAF can intelligently identify and show the most appropriate advertisement to the audience in real time. The audience can also interact and easily find interesting ad content based on IAF recommendations.
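The core of such a system is an "advertising model" that relates anonymous viewership to sales outcomes. As a deliberately simplified, hypothetical sketch (not IAF's actual model), the learned statistics could be per-segment conversion rates, and ad selection would just pick the best-performing ad for the segment the video analytics currently detects:

```python
def best_ad(segment, stats):
    """Toy targeted-ad selector.

    segment: audience segment reported by anonymous video analytics,
             e.g. "adult" or "teen" (labels here are invented).
    stats:   {(segment, ad): (impressions, conversions)} relating
             viewership counts to point-of-sale purchases.
    Returns the ad with the highest observed conversion rate for the
    segment, or None if the segment has no data yet.
    """
    rates = {}
    for (seg, ad), (shown, bought) in stats.items():
        if seg == segment and shown > 0:
            rates[ad] = bought / shown  # purchases per impression
    if not rates:
        return None
    return max(rates, key=rates.get)
```

A production system would additionally smooth low-count estimates and keep exploring under-shown ads rather than always exploiting the current best, but the lookup above captures the "right ad for the detected audience, in real time" loop.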

Smart Spaces
Alec Leckey from Intel Labs demonstrates PERSIST. A fundamental component of pervasive computing is the efficient management and use of context information. PERSIST is a smart space middleware framework developed by Intel Labs that improves management and use of context information. PERSIST's context model was designed to support personal smart spaces and manage context across all the personal devices of a user. The context engine can derive high-level context information from the raw sensor data and/or context history of all personal devices. In addition, it has a "user intent" system that discovers and manages models of user behavior. While a user preference normally specifies one action to perform when a context situation is met, PERSIST's user intent system specifies sequences of actions to perform based on past and current user behavior. The demo showcases this technology as an information and visualization tool for disaster-relief workers, a navigation system that learns user behavior from previous events and suggests re-routes, and a personalized smart workspace for intelligent conference/meeting management.
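The "user intent" idea can be illustrated with a minimal learner. The sketch below is a hypothetical first-order model, far simpler than PERSIST's actual system (which also conditions on context): it counts which action tends to follow which in the observed history and predicts the most frequent successor of the current action:

```python
from collections import Counter, defaultdict

class UserIntentModel:
    """Toy user-intent learner: predict the next action from the current
    one using bigram counts over the observed action history."""

    def __init__(self):
        self.bigrams = defaultdict(Counter)

    def observe(self, actions):
        # Record each consecutive (previous action, next action) pair.
        for prev, nxt in zip(actions, actions[1:]):
            self.bigrams[prev][nxt] += 1

    def predict(self, current):
        # Most frequently observed successor of `current`, or None.
        following = self.bigrams.get(current)
        if not following:
            return None
        return following.most_common(1)[0][0]
```

Chaining `predict` calls yields a whole sequence of anticipated actions, which is exactly what lets a smart space act ahead of the user rather than merely react to a single preference rule.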

High-Fidelity Planet Viewer
Wei Sun of Intel Labs demonstrates a novel, full-scale planetary rendering solution that delivers realistic visuals. Rendering planetary-scale objects is challenging due to geometric complexity and the need to manage huge amounts of data (terabytes or more). Much current work in this field relies on costly high-end discrete graphics hardware; the Intel Labs demo instead delivers real-time rendering on the fly using a less costly 2nd Generation Intel Core processor, without additional expensive hardware. For more information, see Planet Viewer white paper and Planet Viewer IDF presentation.

Smart Vehicle Research: Car, Cloud, and Phone Working Together
Vijay Kesavan from Intel Labs demonstrates combining Intel in-car platforms, cloud-based services, and smart phones into a Smart Vehicle. Observe an innovative one-touch pairing technique using NFC touch or barcode to securely pair and provision a previously unpaired smart phone with a car and cloud services. Once paired, the phone's application provides the user a virtual key fob, remote video surveillance, car alarm notification, and other compelling capabilities. For more information, go to SMART Vehicle Blog.

Many Core Application Research Community
Bob Noradki of Intel Labs highlights promising research results from some of the 80+ institutions worldwide using the Single-chip Cloud Computer (SCC). This 48-core concept processor developed at Intel Labs recently won the German Innovation Prize for Climate and the Environment for its energy-efficient architecture. He shows the SCC running applications on each core. In addition, David Lin and Ziyad Abdel Khaleq of Stanford present their research on the SCC.

Faster Web Apps with Data-Parallel JavaScript
Stephan Herhut from Intel Labs demonstrates data-parallel JavaScript. In a world where the web browser is the user's window into cloud computing, browser applications must leverage all available computing resources. Data-parallel JavaScript research aims to put the computing power of multi-core client systems into the hands of web developers while staying within the secure boundaries of a familiar programming paradigm.

Authentication of the Future
Vinay Phegade from Intel Labs demonstrates Authentication of the Future. Today's user authentication mechanisms have inherent weaknesses that can lead to online fraud. According to a recent report, 73% of people reuse their passwords, sharing them across sensitive web sites such as banks and non-sensitive web sites alike. Online service providers rely on various infrastructure protections because they cannot trust the client to behave and execute correctly. This demo shows how a trusted client can use advanced authentication and user-presence techniques to locally authenticate the user and then assert the user's identity to a remote service provider, improving both security and the user experience.
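The "locally authenticate, then assert" flow resembles a challenge-response protocol. The sketch below is an illustrative simplification only: it uses a shared HMAC key, whereas a real trusted client would hold an asymmetric key inside a hardware-protected environment and the presence check would be biometric or similar. No passwords travel to the server:

```python
import hashlib
import hmac
import os

def server_challenge():
    # A fresh random nonce per login attempt prevents replaying an old
    # assertion captured on the wire.
    return os.urandom(16)

def client_assert(shared_key, challenge, user_present):
    # The trusted client signs the challenge only after locally verifying
    # the user is actually present (stand-in for a biometric check).
    if not user_present:
        return None
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def server_verify(shared_key, challenge, assertion):
    # The service provider checks the assertion without ever seeing a
    # password; compare_digest avoids timing side channels.
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    return assertion is not None and hmac.compare_digest(expected, assertion)
```

Because the secret never leaves the trusted client and each assertion is bound to a one-time challenge, a phished or reused password is no longer the weak link, which is the security and usability gain the demo argues for.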

Cloud Based Ray Tracing on Handheld Devices
Daniel Pohl from Intel Labs demonstrates the capability to bring more realistic visuals to portable devices, beyond what is possible today. A cloud using Intel Many Integrated Core (MIC) processors renders complex effects using ray tracing, which can then be streamed to a variety of Intel clients, including small handheld devices.

High End Visualizations on Mobile Devices
Hans-Christian Hoppe from Intel Labs demonstrates a mobile device that uses our advanced volume rendering technique to display streamed data, providing excellent realism on platforms with high-end CPUs and integrated graphics. This could enable visualization on a new generation of mobile devices, and hence new usage scenarios; for example, end users in the medical field can now use their smartphones to view tomography data in an operating theater.

Technologies for Immersive Virtual Environments
Immersive virtual environments require integration and coordination of numerous real-time technologies, including human recognition (face, emotion, and gesture detection), 3D capture, model generation, and rendering, virtual-physical reality integration, real-time networking, and sensor and data-flow management. Intel Labs is funding research at the University of Illinois on an integrated framework of such components to support development and evolution of immersive applications. Professor David Raila demonstrates standalone and integrated components of the framework based on recent research, including robust 3D non-rigid face tracking for hands-free gaming and 3D human tracking for game and avatar control.
