iRacing and the Ultrabook - integrating touch and tilt sensor support in a cutting-edge video game

Download Source Code


touchtiltsamples.zip (20.65 KB)

Recently, iRacing.com partnered with Intel to see just what could be achieved by porting a top-of-the-line racecar simulator to the Intel Ultrabook™ platform. I was the lucky one selected to make it all happen. While this project was fun and educational, it was not as easy as I hoped. Not because the task is that difficult, but because the cutting edge APIs I was using were not sufficiently documented. I’m hoping to rectify that problem now by telling you about the problems I faced so you can avoid them in your projects.

We divided the iRacing/Intel Ultrabook project into three parts: get our game to run well on the Ultrabook platform, support the new available touch screens, and support the new advanced sensor package. The first challenge, getting iRacing to run on the Ultrabook platform, was really a freebie—it just worked out of the box. The only change we made was to dial back the graphics frame rate when on battery power to maximize battery life.

The touch screen interface project gave me a chance to really reevaluate how users interact with iRacing. Historically, iRacing relied heavily on keyboard and joystick input and left the mouse (and touch screen) by the wayside. Adding touch had the side benefit of enhancing the experience for all users by providing better mouse support in the user interface. To offer the best experience for tablet users, I made sure to remove any dependency on joystick or keyboard input. Now users can play iRacing from start to finish without ever having a keyboard attached to their device, which minimizes the hardware they need to use our game.

When designing the touch driving controls, I went through several versions, starting with the simple idea of letting the user touch the screen anywhere without restriction. I ended up limiting where users could touch and created several touch zones to help make things more clear. I also provided lots of visual cues because users tended to feel lost if they did not have clear direction as to what to do. The key point I took away from touch was that careful design was paramount. We went through many attempts that appeared good on paper but completely failed to work with our testers. There’s no such thing as spending too much time testing the interface when it comes to touch.

The biggest design issue with tilt was never knowing how users would pick up their devices. On top of that, the orientation sensors are located in different parts of the device and at different orientations depending on whether it is a tablet, laptop, or convertible. In the end, I came up with a system that worked quite well: calibrate based on the initial orientation when the user starts iRacing, and provide a simple 'zero' button on the display so they can shift their grip at any time while playing. Users can twist the laptop or tablet like a steering wheel almost a full 180 degrees to the left or right and still use 90 degrees of rotation forward and back for the throttle. The other design hurdle was that tilting the device makes it difficult to keep your eye on the display, so I discovered there's a trade-off between increased tilt sensitivity and decreased screen visibility.

I decided to base my touch code around the WM_TOUCH interface. This provided multi-touch support with a simple API and backwards compatibility with Windows* 7 touch hardware (http://msdn.microsoft.com/en-us/library/windows/desktop/dd562197(v=vs.85).aspx).

This API mirrors the WM_MOUSE event API that has been in Windows since the beginning. Many other tutorials go into the details of how WM_TOUCH works, so I will not explain it here. But I do want to cover some of the challenges most demos don’t show. So I put together a simple test application that allows users to draw on the display using as many fingers as their touch hardware supports. Below are several of the issues I ran into while developing touch support in iRacing and some pointers for you to avoid them.
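To make the later points concrete, here is a stripped-down sketch of the WM_TOUCH plumbing along the lines of what the test application does. Window creation and drawing are omitted, and the function names are purely illustrative; it assumes the Windows 7 (or later) SDK headers.

#include <windows.h>   // WM_TOUCH requires WINVER/_WIN32_WINNT >= 0x0601
#include <vector>

// Call once after creating the window to opt in to WM_TOUCH delivery.
// Without this, the window receives WM_GESTURE messages instead.
void EnableTouch(HWND hwnd)
{
    RegisterTouchWindow(hwnd, 0);
}

// Handler for the WM_TOUCH case inside the window procedure.
LRESULT OnTouch(HWND hwnd, WPARAM wParam, LPARAM lParam)
{
    UINT count = LOWORD(wParam);               // number of contacts in this message
    std::vector<TOUCHINPUT> inputs(count);

    if (GetTouchInputInfo((HTOUCHINPUT)lParam, count, inputs.data(), sizeof(TOUCHINPUT)))
    {
        for (const TOUCHINPUT &ti : inputs)
        {
            // Coordinates arrive in hundredths of a physical screen pixel
            // (see the DPI discussion below for the extra step scaled setups need).
            POINT pt = { ti.x / 100, ti.y / 100 };
            ScreenToClient(hwnd, &pt);

            if (ti.dwFlags & TOUCHEVENTF_DOWN) { /* finger ti.dwID pressed at pt */ }
            if (ti.dwFlags & TOUCHEVENTF_MOVE) { /* finger ti.dwID moved to pt   */ }
            if (ti.dwFlags & TOUCHEVENTF_UP)   { /* finger ti.dwID lifted        */ }
        }
        CloseTouchInputHandle((HTOUCHINPUT)lParam);
        return 0;   // handled
    }
    return DefWindowProc(hwnd, WM_TOUCH, wParam, lParam);
}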

The first issue I ran into with my touch implementation was how to shut off the default Windows touch visualizations. Those are the little circles, squares, and dashes that appear when the user completes a touch gesture. They get in the way when you are developing your own interface and not using the gesture engine provided by Windows. The only reliable way I found to turn them off was with the SystemParametersInfo(SPI_SETGESTUREVISUALIZATION) call. This turns on/off the global visualization setting found in Control Panel->Pen and Touch->Show visual feedback when touching the screen. This call will turn off gesture visualizations for all applications until you reboot your computer. It is possible to read the existing setting and restore it at shutdown, but this is not an ideal solution.
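For reference, the save/disable/restore pattern I am describing looks roughly like the sketch below. Treat it as a sketch: the SPI_*GESTUREVISUALIZATION constants come from the Windows 8 SDK's winuser.h, and the exact parameter convention is worth double-checking against the SystemParametersInfo documentation.

#include <windows.h>   // needs _WIN32_WINNT >= 0x0602 for the gesture visualization constants

static ULONG g_savedGestureVis = GESTUREVISUALIZATION_ON;   // fallback if the read fails

void DisableTouchVisualizations()
{
    // Remember the user's current setting so it can be restored on exit.
    SystemParametersInfo(SPI_GETGESTUREVISUALIZATION, 0, &g_savedGestureVis, 0);

    ULONG off = GESTUREVISUALIZATION_OFF;
    SystemParametersInfo(SPI_SETGESTUREVISUALIZATION, 0, &off, SPIF_SENDCHANGE);
}

void RestoreTouchVisualizations()
{
    SystemParametersInfo(SPI_SETGESTUREVISUALIZATION, 0, &g_savedGestureVis, SPIF_SENDCHANGE);
}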

The second issue I had was in reliably detecting that touch hardware was available. It turns out that the Windows Pen API call GetSystemMetrics(SM_DIGITIZER) will tell you whether touch hardware is present and ready. However, it will not indicate which display is touch capable, so on a multi-monitor setup your application may still end up on a screen where it cannot receive touch input.
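The check itself is a single call plus some bit tests against the NID_* flags; a small sketch, with an illustrative function name:

#include <windows.h>

// Returns true if a touch digitizer is present and ready. Note that this says
// nothing about which monitor the digitizer belongs to.
bool IsTouchReady(bool *multiTouch = nullptr)
{
    int caps = GetSystemMetrics(SM_DIGITIZER);

    bool ready      = (caps & NID_READY) != 0;              // device is up and running
    bool integrated = (caps & NID_INTEGRATED_TOUCH) != 0;   // built-in touch screen
    bool external   = (caps & NID_EXTERNAL_TOUCH) != 0;     // external touch screen

    if (multiTouch)
        *multiTouch = (caps & NID_MULTI_INPUT) != 0;        // supports more than one contact

    return ready && (integrated || external);
}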

The third issue I had was with the Windows DPI setting. Windows has long had an option to scale up the UI to improve visibility. This option has always relied on the application being 'DPI aware' to work properly (http://msdn.microsoft.com/en-us/library/windows/desktop/dd464646(v=vs.85).aspx). Since this was never a well-supported feature, Windows Vista* introduced a new mode that forces applications that are not DPI aware to scale up the UI anyway. This works fine, unless you are using touch. In that case, the touch messages are not properly translated from raw desktop coordinates to scaled Windows coordinates. I had to call PhysicalToLogicalPoint() followed by ScreenToClient() to translate the touch coordinates into something that mimics the mouse coordinates from the WM_MOUSE events. The other solution is to properly support the DPI aware API, but a DirectX* application may not need to be DPI aware, and retrofitting awareness into a legacy program may not be so simple anyway.
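Here is a sketch of that translation, assuming ti is the TOUCHINPUT record returned by GetTouchInputInfo() and hwnd is the touch window:

#include <windows.h>

POINT TouchToClient(HWND hwnd, const TOUCHINPUT &ti)
{
    // TOUCHINPUT reports physical screen coordinates in hundredths of a pixel.
    POINT pt = { ti.x / 100, ti.y / 100 };

    // Undo the DPI virtualization so the point matches what the WM_MOUSE events report.
    PhysicalToLogicalPoint(hwnd, &pt);

    // Then convert from (logical) screen space into this window's client space.
    ScreenToClient(hwnd, &pt);
    return pt;
}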

Microsoft added support for tilt sensors (accelerometers) in Windows 7 through the Sensor API (http://msdn.microsoft.com/en-us/library/windows/desktop/dd318953(v=vs.85).aspx). In Windows 8 they further expanded tilt support by adding a sensor fusion-based orientation sensor, which combines several sensors for a more stable orientation value. In my testing the orientation sensor was superior to a straight accelerometer, with smoother and more stable output. For our application we simply needed the tilt angle of the device and used it directly to control the steering and gas/brake of our vehicle. I have included a simple tilt sensor application that converts your device orientation to an angle that can be used directly for controlling your game.
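For readers who have not touched the Sensor API before, a polled accelerometer read looks roughly like the sketch below (error handling trimmed, and the orientation-fusion sensor path is omitted). It links against sensorsapi.lib and uses ATL's CComPtr purely for brevity; the function name is illustrative.

#include <windows.h>
#include <sensorsapi.h>
#include <sensors.h>
#include <atlbase.h>     // CComPtr

// Fills x/y/z with the current acceleration in g. Returns false if no
// accelerometer is available or the read fails.
bool ReadAccelerometer(double &x, double &y, double &z)
{
    CComPtr<ISensorManager> manager;
    if (FAILED(manager.CoCreateInstance(CLSID_SensorManager)))
        return false;

    CComPtr<ISensorCollection> sensors;
    if (FAILED(manager->GetSensorsByType(SENSOR_TYPE_ACCELEROMETER_3D, &sensors)))
        return false;

    ULONG count = 0;
    sensors->GetCount(&count);
    if (count == 0)
        return false;

    CComPtr<ISensor> sensor;
    if (FAILED(sensors->GetAt(0, &sensor)))
        return false;

    CComPtr<ISensorDataReport> report;
    if (FAILED(sensor->GetData(&report)))     // the polled path discussed below
        return false;

    PROPVARIANT pv;
    PropVariantInit(&pv);
    if (SUCCEEDED(report->GetSensorValue(SENSOR_DATA_TYPE_ACCELERATION_X_G, &pv)))
        x = pv.dblVal;
    PropVariantClear(&pv);
    if (SUCCEEDED(report->GetSensorValue(SENSOR_DATA_TYPE_ACCELERATION_Y_G, &pv)))
        y = pv.dblVal;
    PropVariantClear(&pv);
    if (SUCCEEDED(report->GetSensorValue(SENSOR_DATA_TYPE_ACCELERATION_Z_G, &pv)))
        z = pv.dblVal;
    PropVariantClear(&pv);
    return true;
}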

I hate to admit it publicly, but my biggest headache with tilt was a simple beginner mistake. When I started the tilt project, I immediately wrote a demo app that queried all the tilt sensor interfaces and logged them to the display. But the app had horrible latency and nothing seemed to work properly. I rewrote it three different times and stressed over it for several weeks before discovering the issue: when initializing COM, I had called the old CoInitialize() function rather than CoInitializeEx(). This left COM without multiple thread support and everything went to the dogs. Once I initialized COM properly, everything came up roses.
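For the record, the fix itself is a one-liner; a sketch of the before and after:

#include <objbase.h>

bool InitComForSensors()
{
    // CoInitialize(NULL) puts the thread in a single-threaded apartment, which is
    // what caused the latency problems described above. COINIT_MULTITHREADED is
    // what the sensor event callbacks need.
    HRESULT hr = CoInitializeEx(NULL, COINIT_MULTITHREADED);
    return SUCCEEDED(hr);
}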

Another issue was choosing the proper interface. The Windows Sensor API supports both a polled and a callback interface. I initially used the polling interface because it was simpler and fit our existing control code. But after doing some testing in the field on different hardware, I discovered that not all sensor drivers like being polled. In the end I had to bite the bullet and implement the callbacks.
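A condensed sketch of what the callback route involves: implement ISensorEvents, hand it to the sensor with SetEventSink(), and remember that OnDataUpdated() arrives on a sensor-owned thread rather than the game loop. Class and member names here are illustrative.

#include <windows.h>
#include <sensorsapi.h>
#include <sensors.h>

class SensorEvents : public ISensorEvents
{
    LONG m_ref = 1;
public:
    // --- IUnknown ---
    STDMETHODIMP QueryInterface(REFIID riid, void **ppv)
    {
        if (riid == __uuidof(IUnknown) || riid == __uuidof(ISensorEvents))
        {
            *ppv = static_cast<ISensorEvents *>(this);
            AddRef();
            return S_OK;
        }
        *ppv = nullptr;
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef()  { return InterlockedIncrement(&m_ref); }
    STDMETHODIMP_(ULONG) Release()
    {
        ULONG ref = InterlockedDecrement(&m_ref);
        if (ref == 0) delete this;
        return ref;
    }

    // --- ISensorEvents ---
    STDMETHODIMP OnDataUpdated(ISensor *, ISensorDataReport *report)
    {
        PROPVARIANT pv;
        PropVariantInit(&pv);
        if (SUCCEEDED(report->GetSensorValue(SENSOR_DATA_TYPE_ACCELERATION_X_G, &pv)))
        {
            // Runs on the sensor's thread: hand pv.dblVal (and the Y/Z values)
            // to the game's input code through something thread safe.
        }
        PropVariantClear(&pv);
        return S_OK;
    }
    STDMETHODIMP OnEvent(ISensor *, REFGUID, IPortableDeviceValues *)  { return S_OK; }
    STDMETHODIMP OnLeave(REFSENSOR_ID)                                 { return S_OK; }
    STDMETHODIMP OnStateChanged(ISensor *, SensorState)                { return S_OK; }
};

// Given an ISensor* obtained as in the polled example above:
//   sensor->SetEventSink(new SensorEvents());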

Windows 8 has a new feature that auto-rotates your display as you move your device around, which gets in the way when you are developing a tilt-enabled application. The latest platform SDK has an undocumented call, SetDisplayAutoRotationPreferences(), that turns this behavior off. You can force the orientation to landscape or portrait, but I haven't found a way to detect the current orientation, so on a tablet you may end up inverting the display with this call.
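A sketch of one way to wrap the call, loading it dynamically so the same binary still runs on Windows 7, where the export does not exist. The helper name is illustrative, and the ORIENTATION_PREFERENCE enum comes from the Windows 8 SDK's winuser.h.

#include <windows.h>

bool LockAutoRotation(bool lockToLandscape)
{
    typedef BOOL (WINAPI *SetRotationPrefsFn)(ORIENTATION_PREFERENCE);

    HMODULE user32 = GetModuleHandleW(L"user32.dll");
    if (!user32)
        return false;

    SetRotationPrefsFn setPrefs =
        (SetRotationPrefsFn)GetProcAddress(user32, "SetDisplayAutoRotationPreferences");
    if (!setPrefs)
        return false;   // pre-Windows 8, nothing to do

    // ORIENTATION_PREFERENCE_NONE restores the default auto-rotate behavior.
    return setPrefs(lockToLandscape ? ORIENTATION_PREFERENCE_LANDSCAPE
                                    : ORIENTATION_PREFERENCE_NONE) != FALSE;
}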

Finally, I found very little useful information on how to extract orientation as an angle from the available sensors. It turns out that the very handy atan2() function is a simple way to deal with this; here is a brief overview, and you can find a detailed example in my sample code. If you assume x points to the right, y points away from the viewer, and z points up, then roll is atan2(x, -z) and pitch is atan2(y, sqrt(x*x+z*z)). We feed both the x and z components into the pitch measurement so that we can still detect pitch when there's a significant amount of roll. If you combine this with code that rotates the y and z axes based on some initial reference orientation, you can eliminate any confusion about which way the sensor is mounted in your device.
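Expressed as code, the math above is just this (x, y, z being the gravity vector in the axis convention described; the reference-orientation rotation mentioned above is not shown here):

#include <math.h>

void TiltAngles(double x, double y, double z, double &rollDeg, double &pitchDeg)
{
    const double radToDeg = 180.0 / 3.14159265358979323846;

    // Roll: rotation about the axis pointing away from the viewer.
    rollDeg = atan2(x, -z) * radToDeg;

    // Pitch: using sqrt(x*x + z*z) keeps the pitch reading meaningful even when
    // the device is rolled well away from level.
    pitchDeg = atan2(y, sqrt(x * x + z * z)) * radToDeg;
}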

References


http://www.iracing.com
http://software.intel.com/en-us/articles/touch-and-gesture-for-windows-8-desktop/
http://software.intel.com/en-us/articles/designing-for-ultrabook-devices-and-touch-enabled-desktop-applications/
http://software.intel.com/en-us/articles/ultrabook-and-tablet-windows-8-sensors-development-guide/
