Implementing Touch and Sensors for Windows* 8 Desktop Games: Confetti Interactive’s experiences developing "Blackfoot Blade"


Abstract

Blackfoot Blade from Confetti Interactive* is a 3rd person helicopter game that takes advantage of the touch and sensor capabilities of Ultrabook™ PCs running Microsoft Windows 8. Initial development came together quickly, but the team encountered issues with the way they had chosen to implement both touch and sensors, and those implementations needed to be reworked.

Tilting the Ultrabook to make the helicopter turn, strafe, advance, and retreat comes with the expectation of immediate responsiveness. Firing missiles or machine guns with touch widgets on the screen is only fun if the response is immediate and consistent.

Implementing touch and sensors is new territory for many PC game vendors, and sorting out the proper APIs and best practices took some experimentation to get right. This case study shares some interesting findings about what worked and what didn’t. Specifically, we look at the initial choice of WM_GESTURE vs. the WM_TOUCH API, why WM_TOUCH was deemed the superior choice for this project, and why we used polling calls instead of asynchronous calls for our sensor implementation.

Introducing Blackfoot Blade

Blackfoot Blade gives the gamer a battle against the ultimate enemy with the most advanced helicopter ever developed along with shockingly realistic landscapes and action-packed levels.

Image from the Confetti Interactive web page

Blackfoot Blade comes equipped with heat-seeking missiles and machine guns that demolish reinforced steel and enemy militia. The goal was to design an interface where the user could rely solely on the sensors and touch screen of an Ultrabook for flight and weapon controls, without needing the keyboard or mouse. Tilting the Ultrabook mimics the flight-stick controls of a real helicopter. A touch-first interface lets the user control the missile, machine gun, and flare weapons by touching weapon widgets on the touch screen, with additional raise, lower, and strafe movement via finger panning. The game also remains keyboard compatible for desktop play. And although it was designed for Windows 8 Ultrabooks and tablets, it runs on Windows 7 or 8 PCs with DirectX 11-capable GPUs.

Throughout the development process of Blackfoot Blade, Intel engineers worked with Confetti software developers to create an enticing game that takes advantage of the new touch and sensor capabilities of the 3rd Generation Intel® Core™ platform (Intel® architecture code name Ivy Bridge). The target operating environment was Windows 8 Desktop.

Blackfoot Blade was designed to use touch and sensor capabilities from the beginning. This brought on a bit of a learning curve, however, as this was Confetti’s first Windows 8 desktop app involving touch and sensors. The following paragraphs detail the challenges that Intel and Confetti engineers encountered and overcame while implementing the new touch and sensor capabilities.

Implementing Touch: WM_GESTURE vs. WM_TOUCH

Two touch event models are available to game developers writing for Windows 8 desktop. The two models are mutually exclusive and a game must declare which of the two it is prepared to handle. WM_GESTURE events are designed to interpret common touch-based gestures and turn them into distinct events (tap, pan, zoom, etc.) for the game to respond to. WM_TOUCH events provide an increased level of detail on individual touch contacts and movement, which allows the game to interpret gestures directly, but requires increased code complexity.

One of the first challenges the team encountered was properly implementing touch. Functionality and responsiveness are the top considerations. Both ‘panning’ and ‘single finger touch’ are required. Apps built using the WM_GESTURE event model have access to a built-in panning gesture recognizer. Also, the ‘single finger tap’ and ‘press and hold’ gestures are remapped to left and right mouse button press events, respectively, allowing the game to leverage existing event processing for mouse events. By using WM_GESTURE, we would save a lot of time since implementing WM_TOUCH requires working with raw inputs and messaging in order to interpret gesture and touch events.

For these reasons, we started with WM_GESTURE. What we found was that WM_GESTURE has to determine whether an input is a single press event or a pan event before it can process and assign an event message enumeration, introducing a delay in some events. If it is a pan event, the WM_GESTURE message processes it. If it is a tap or press-and-hold event, the event is passed off to the mouse message events. For the gesture API to determine that an input is a tap or press-and-hold event, it first has to rule out that it is a panning, zoom, or rotate event. All this extra processing causes lag. It became clear that this lag made WM_GESTURE unsuitable for the game, so we took another look at WM_TOUCH.
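The latency inherent in gesture recognition can be illustrated with a toy model. This is not the Windows recognizer itself, just a hedged sketch of why a recognizer that must rule out competing gestures cannot report a tap until the contact has already ended:

```cpp
#include <cmath>
#include <vector>

// Simplified, hypothetical model of gesture-recognizer latency: the
// recognizer buffers contact samples until it can rule out the other
// gestures, while a raw-touch handler can react to the first sample.
// All names and thresholds here are illustrative.

struct TouchSample {
    float x, y;
    int timeMs;  // timestamp of the sample
};

enum class Gesture { Undecided, Tap, Pan };

// A recognizer in the spirit of WM_GESTURE: it cannot emit "Tap" until
// the contact ends without significant movement, and cannot emit "Pan"
// until the contact moves past a threshold -- every decision arrives late.
Gesture classify(const std::vector<TouchSample>& samples, bool contactUp,
                 float panThreshold = 10.0f) {
    if (samples.empty()) return Gesture::Undecided;
    float dx = samples.back().x - samples.front().x;
    float dy = samples.back().y - samples.front().y;
    if (std::sqrt(dx * dx + dy * dy) > panThreshold) return Gesture::Pan;
    if (contactUp) return Gesture::Tap;  // only knowable after lift-off
    return Gesture::Undecided;           // still waiting -> perceived lag
}
```

Note how a tap is undecidable while the finger is still down: that waiting period is exactly the lag the team observed when firing weapons through the gesture path.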

WM_TOUCH requires taking every event and processing it. The level of detail available is very high. If desired, a game can track each contact and movement per finger, up to the number of touch points the touchscreen can report. Implementing WM_TOUCH results in a much finer control for touch actions; the messaging is immediate and therefore very responsive. With WM_TOUCH you can easily override the ‘single finger tap’ and ‘press and hold’ so they don’t default to the mouse button messages. The downside is that this work takes more implementation time; fast response is the upside. The resulting flexibility in the implementation choices was deemed by Confetti to be far superior to the WM_GESTURE implementation.

Once the decision to use WM_TOUCH was made, we defined the functionality requirements and implemented a touch event interpreter to handle them. The new implementation passed our tests, so we integrated it back into Confetti’s code. Both Intel and Confetti engineers were very pleased with the changes. A benefit of using WM_TOUCH is that while you can implement your interpreter to cover everything WM_GESTURE offers, you are under no obligation to implement any of it, and you are free to extend your concept of a gesture to whatever suits your functional requirements.
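A touch event interpreter of the kind described above might be sketched as follows. This is a hypothetical, platform-independent skeleton, not Confetti’s actual code: contacts are tracked by id (as WM_TOUCH reports per TOUCHINPUT structure), and tap vs. pan is decided directly from the raw down/move/up stream rather than through the built-in gesture-to-mouse mapping. The class and threshold names are invented for illustration:

```cpp
#include <cmath>
#include <map>
#include <utility>

// Hypothetical WM_TOUCH-style interpreter: one state record per active
// finger, keyed by the contact id the touch message reports.

struct Contact { float downX, downY, x, y; bool moved; };

class TouchInterpreter {
public:
    void onDown(int id, float x, float y) {
        contacts_[id] = {x, y, x, y, false};
    }

    // Returns the pan delta for this contact since its last position.
    std::pair<float, float> onMove(int id, float x, float y) {
        Contact& c = contacts_[id];
        std::pair<float, float> delta{x - c.x, y - c.y};
        c.x = x; c.y = y;
        float dx = x - c.downX, dy = y - c.downY;
        if (std::sqrt(dx * dx + dy * dy) > kPanThreshold) c.moved = true;
        return delta;
    }

    // True if the release should be treated as a tap (e.g. fire a weapon).
    bool onUp(int id) {
        bool tap = !contacts_[id].moved;
        contacts_.erase(id);
        return tap;
    }

private:
    static constexpr float kPanThreshold = 10.0f;  // pixels, illustrative
    std::map<int, Contact> contacts_;              // one entry per finger
};
```

Because the interpreter owns the classification, a tap can trigger game logic immediately on lift-off, and pan deltas flow every move event, with no detour through mouse message emulation.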

Implementing Sensors

Ultrabook PCs come with a variety of sensors. For Blackfoot Blade we needed to access the inclinometer sensor for tilt feedback. The inclinometer is a fusion sensor which uses multiple physical sensors to report tilt data. We needed x, y, and z tilt data for the turn, strafe, and advance and retreat controls for the helicopter. To work best with the Confetti engineers, Intel engineers needed to provide a project that could be easily dropped into the game and acquire the tilt data that could interact with the physics engine of the game.
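A mapping from tilt data to flight controls might look like the sketch below. The axis assignments, deadzone, and tilt range are assumptions for illustration; the actual game tuned these values against its physics engine:

```cpp
#include <cmath>

// Hypothetical mapping from inclinometer tilt (degrees about x, y, z) to
// normalized helicopter control inputs in [-1, 1].

struct Tilt { float x, y, z; };                    // degrees of tilt
struct Controls { float advance, strafe, turn; };  // each in [-1, 1]

// Deadzone suppresses small hand tremor; maxTilt saturates the control.
float normalizeTilt(float degrees, float deadzone, float maxTilt) {
    float mag = std::fabs(degrees);
    if (mag < deadzone) return 0.0f;
    float v = (mag - deadzone) / (maxTilt - deadzone);
    if (v > 1.0f) v = 1.0f;
    return degrees < 0.0f ? -v : v;
}

Controls tiltToControls(const Tilt& t) {
    return { normalizeTilt(t.x, 3.0f, 30.0f),    // pitch -> advance/retreat
             normalizeTilt(t.y, 3.0f, 30.0f),    // roll  -> strafe
             normalizeTilt(t.z, 3.0f, 45.0f) };  // yaw   -> turn
}
```

The deadzone is what makes the Ultrabook comfortable to hold: without it, the helicopter would drift with every slight movement of the user’s hands.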

Intel engineers started with the Microsoft Sensor Manager interface provided as part of the Microsoft Windows 8 SDK. To get a comprehensive understanding of how sensors work on Windows 8, we referred to a guide written by other Intel engineers, Accessing Microsoft Windows 8 Desktop Sensors, which includes a sensor manager sample.

Initial coding progressed smoothly and quickly, but Microsoft sample code is based on an asynchronous data design. When our implementation was run in a simple test application, the sensors would respond quickly with minimal lag. As soon as it was integrated into the Blackfoot Blade project the sensors appeared to become unresponsive. Initial testing focused on checking the sensitivity and the response times. The sensor parameters proved to be a false trail as these appeared to be within expected values. Time stamps attached to the events suggested the sensors were being read multiple times for every game frame. Similar problems were revealed when the original test application was modified to mimic running alongside a more graphically intensive workload. As the frame rate dropped, the sensors became less responsive even though the update messages should have been independent of the game loop.

Working within the test application, the team started to isolate the cause of the apparent input lag on the sensors. Debug messages were added to track when the input data was received. We were then able to determine that as the application’s frame rate dropped, the messages became grouped into bursts, with the duration between bursts related to the frame rate. Analyzing the call stack during an event notification showed the events were actually coming through Windows’ message dispatch system, although they never triggered a message to the Windows message procedure.

Instrumenting the actual drivers used for the sensors further showed that the driver was submitting notification events at regular intervals that matched the time stamps attached to the readings. It became apparent the Windows message loop was causing the messages to stack up. During a typical game, the application’s message loop would be processed once per frame, which at 30 frames per second is once every 33ms, while the sensors might be sending notifications at 2 to 3 times that rate. To further complicate matters, we found that if multiple sensors were being read, the driver would submit data from them in an interleaved manner but they would appear in batches when processed by the application. For example, this could manifest as multiple accelerometer messages arriving on one frame and all the inclinometer data arriving on a subsequent frame. Between the driver submitting the data and the application receiving it, the Windows kernel was doing various preprocessing of the data which was adding to the perceived lag.
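The batching effect can be reproduced with a toy simulation. The numbers below are illustrative, not measured from the actual drivers: the driver timestamps readings at a fixed interval, but the application only drains the queue once per frame, so readings arrive in bursts and the oldest reading in each burst carries nearly a full frame of latency:

```cpp
#include <vector>

// Toy model of the observed batching: readings produced every
// driverIntervalMs, drained only at frameIntervalMs boundaries.
// Returns the latency (ms) of each reading when it is finally processed.
std::vector<int> simulateLatencies(int driverIntervalMs, int frameIntervalMs,
                                   int totalMs) {
    std::vector<int> latencies;
    int nextReading = 0;  // timestamp of the next driver reading
    for (int frameEnd = frameIntervalMs; frameEnd <= totalMs;
         frameEnd += frameIntervalMs) {
        // Drain every reading the driver produced before this frame boundary.
        while (nextReading < frameEnd) {
            latencies.push_back(frameEnd - nextReading);
            nextReading += driverIntervalMs;
        }
    }
    return latencies;
}
```

With a 10 ms driver interval and a 33 ms frame, three to four readings pile up per frame, and the oldest is a full frame stale by the time the game sees it, which matches the unresponsiveness the team observed.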

Given that the driver appeared to be sending data at the requested intervals and Windows’ internal processing of the events was introducing problems, the decision was made to re-architect the engine to move most of the sensor input to a polling approach. With the problem isolated, the code redesign went smoothly. Infrequent event notifications for system changes, such as a sensor being removed, were still handled using the original system. The Windows sensor API allowed individual asynchronous data-changed messages to be turned off, and that data was instead read using polling methods. A more traditional game engine approach was taken in which the sensors were polled directly once per frame. Care still had to be taken to account for a potential delay of a frame or two before a sensor removed/inserted message was processed, but that did not cause unreliable behavior when polling. The simplification in other parts of the code made the tradeoff worthwhile.
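The per-frame polling approach can be sketched with a hypothetical stand-in for the sensor interface (the real code reads the Windows sensor API with the data-updated notifications turned off; the stub class here is invented for illustration):

```cpp
// Sketch of once-per-frame sensor polling. InclinometerStub is a
// hypothetical stand-in for the platform sensor object; in the real
// engine this would be a GetData-style read through the sensor API.

struct TiltReading { float x, y, z; };

class InclinometerStub {
public:
    void setLatest(TiltReading r) { latest_ = r; }
    TiltReading poll() const { return latest_; }  // always the newest sample
private:
    TiltReading latest_{0, 0, 0};
};

// One iteration of the game loop: poll exactly once, then update physics.
TiltReading updateFrame(const InclinometerStub& sensor) {
    TiltReading tilt = sensor.poll();  // one read per frame, no queue to drain
    // ... pass tilt to the physics engine here ...
    return tilt;
}
```

Because each frame reads only the newest value, intermediate samples are simply skipped rather than queued, so input latency stays bounded by one frame regardless of how fast the driver produces data.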

Once the re-architecting of the test application was complete, it was stress tested with multiple sensors sending information continually. The test application was forced to run at various rates, both below and above those required for a full game, to ensure response times remained consistent. The sensor time stamps were plotted against the current frame, and the result was smooth, continually changing data that closely matched the polling interval. The new sensor framework became a drop-in replacement for the original system and immediately fixed the unresponsiveness seen previously. The sample code that was used can be downloaded from GitHub.

Conclusion

Confetti Interactive faced issues while implementing touch and sensor capabilities. In both cases they had to abandon their original implementation and re-architect their software in order to correct lag (touch) and unwanted batching of data (sensors).

For touch, it was determined that using WM_GESTURE resulted in lag, making it a poor fit for games. The team abandoned WM_GESTURE and implemented touch using WM_TOUCH instead.

When implementing the code to take advantage of the sensors, the team first used the asynchronous model provided in the Microsoft sample code. Unfortunately, when used in a game, this resulted in input that was too slow and unresponsive, so the initial sensor code had to be replaced with a polling model instead.

In spite of the challenges posed by implementing touch and sensors, the developers felt that a team already familiar with the code could duplicate the implementation in a few days. While the initial development of Blackfoot Blade took four months, future projects will take much less time using the knowledge gained from these initial efforts.

In the end, the game turned out great – it has been enthusiastically viewed on demo floors.

Acknowledgements

Special thanks to Hugh Smith (Intel) and Leigh Davies (Intel), who worked closely with Confetti Interactive throughout their development cycle.

References

Intel Resources

Microsoft Resources

Confetti Interactive Website

About the Author

Gael Hofemeier is an Application Engineer at Intel Corporation. She is currently focused on Use Case References for Windows 8 Sensors and Intel® Active Management Technology.

Intel, the Intel logo and Ultrabook are trademarks of Intel Corporation in the U.S. and other countries.

*Other names and brands may be claimed as the property of others.

Copyright© 2013 Intel Corporation. All rights reserved.

Performance Notice

For more complete information about performance and benchmark results, visit www.intel.com/benchmarks

For more complete information about compiler optimizations, see our Optimization Notice.