INTEL® DEVELOPER ZONE

Intel® Perceptual Computing Showcase


NuiLogix - Perceptual Computing for Network-Based Device Control

Bryan Brown
United States

NuiLogix is a middleware app that adapts emerging perceptual computing technologies to device and machine control applications. It transports natural interaction sensor data (body coordinates, gestures, and recognized speech) over an Ethernet network for remote monitoring and control. NuiLogix uses industry-standard Ethernet-based communication protocols such as Modbus/TCP to accomplish this goal. Modbus/TCP is an open data communications standard that is ubiquitous in industrial automation: hundreds of off-the-shelf data acquisition and control devices on the market support a Modbus/TCP server interface, and many thousands of deployed devices worldwide communicate on Modbus networks. The goal of NuiLogix is to introduce emerging natural interaction sensor technologies to fields and applications that currently rely on conventional "hands-on" user interface techniques for machine and device control.
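Modbus/TCP wraps the classic Modbus protocol data unit (PDU) in a small TCP-specific header (the MBAP header). As an illustration only (NuiLogix itself is a C# app that uses a third-party Modbus driver, and the transaction, unit, and address values below are arbitrary), this Python sketch builds a standard "Read Holding Registers" (function 0x03) request frame:

```python
import struct

def read_holding_registers_request(transaction_id, unit_id, start_addr, quantity):
    """Build a Modbus/TCP 'Read Holding Registers' (function 0x03) request.

    Frame = MBAP header (7 bytes) + PDU (5 bytes), all fields big-endian.
    """
    # PDU: function code, starting register address, register count
    pdu = struct.pack('>BHH', 0x03, start_addr, quantity)
    # MBAP header: transaction id (echoed by the server), protocol id
    # (always 0 for Modbus), remaining byte count (unit id + PDU), unit id
    mbap = struct.pack('>HHHB', transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu

# Request 4 registers starting at address 0 from unit 0xFF
frame = read_holding_registers_request(1, 0xFF, 0, 4)
```

A server's response reuses the same MBAP framing, with the PDU carrying the function code followed by a byte count and the requested register values.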

App Specifications

The NuiLogix prototype is built on a client-server architecture. It runs a continuously executing Modbus/TCP server thread that responds to queries from local and remote clients. A Modbus/TCP client interface is also implemented, allowing NuiLogix to "push" data down to a local or remote server.

The logic layer references a custom class library that wraps the .NET DLL included in the Intel Perceptual Computing SDK (libpxcclr.dll). This wrapper approach allows NuiLogix to work with other NUI devices and present data over a common interface. The logic layer acquires coordinate, gesture, and voice data streams from the Creative gesture camera, translates the data into numerical representations appropriate for control applications, and maps it to Modbus register addresses.

NuiLogix is a C#/WPF application developed in Microsoft Visual Studio 2010 Professional. The low-level Modbus implementation uses a third-party driver. For the prototype, we created a lightweight user interface scaled so that multiple apps can be displayed on screen simultaneously for demonstration purposes. The UI allows different combinations of natural interaction data to be selected for control and monitoring applications.
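The source does not describe NuiLogix's actual register map, so the following Python sketch is purely hypothetical (the field names, value ranges, and register addresses are invented for illustration). It shows one way floating-point sensor values, such as hand coordinates, could be scaled into unsigned 16-bit values and associated with Modbus holding-register addresses:

```python
def to_register(value, lo, hi):
    """Clamp a float to [lo, hi] and scale it to an unsigned 16-bit value."""
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * 0xFFFF)

# Hypothetical register map: each holding-register address is paired with a
# function that extracts and scales one field from a sensor sample.
REGISTER_MAP = {
    40001: lambda s: to_register(s['hand_x'], -0.5, 0.5),   # meters from center
    40002: lambda s: to_register(s['hand_y'], -0.5, 0.5),
    40003: lambda s: to_register(s['hand_z'], 0.0, 1.0),    # depth, meters
    40004: lambda s: 1 if s['gesture'] == 'thumb_up' else 0,  # boolean flag
}

sample = {'hand_x': 0.0, 'hand_y': 0.25, 'hand_z': 0.4, 'gesture': 'thumb_up'}
registers = {addr: extract(sample) for addr, extract in REGISTER_MAP.items()}
```

Keeping the scaling and address assignment in one table like this makes it straightforward to expose different combinations of natural interaction data, as the prototype's UI does.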