By Benjamin A. Lieberman, Ph.D.
The personal computing environment has become vast and complex. Offerings range from “wearable” computers to mobile devices and high-performance desktop systems. These platforms offer software developers a wide variety of options but often represent trade-offs among power consumption, screen size, touch-enabled interfaces, and portability. This is especially true for mobile game developers, who until now have had only tablet-sized screens (or smaller) within which to create their complex worlds. The highly mobile nature of these devices constrained what could be done on limited screen space. With the introduction of the portable all-in-one (pAIO) device, a wide range of new application and game development opportunities has arisen.
Note: AIO PCs consist of a monitor (ranging from 18 to 27 inches) with a motherboard built behind the screen. They have high-performance processors, high-definition (HD) resolution (720p or full 1080p), and a Bluetooth* wireless keyboard and mouse, and they include a built-in high-capacity battery that makes the device easily portable around the home. AIOs are one of the fastest-growing categories of PCs and are estimated to sell 20.2 million units this year, according to the research firm IDC. A big reason for their popularity is that AIOs let you do everything you expect of a PC—keep track of household expenses, do homework, play interactive games, browse the Web, chat with friends, watch TV and movies—all in a portable, clutter-free, stylish design with a big, bright display and no sacrifice of performance.
The pAIO devices enable game and application developers to take full advantage of the larger screen real estate, high-performance networking capability, and a multitouch user interface (UI), all in a slim-line, portable device that can be used in both tilted and flat-surface modes. The internal battery ensures continuity of experience, and built-in wireless networking allows roaming from one home location to another. The large HD display is supported by high-end graphics processors as well as a full multitouch-enabled user experience (UX). All of these features together offer an attractive package for developers looking to break free of the constraints of a single-user mobile device.
Portable All-in-Ones in Game Play
To illustrate the capabilities of pAIO devices, the UX development company IdentityMine chose to create a multiplayer game called Air Hockey, modeled after a classic air-hockey arcade game (see Figure 1). When designing this game, the team chose to take advantage of the pAIO multitouch interface to enable players to directly interact with the puck via an on-screen “handle” that the player can use to capture, drag, shoot, and block against an opponent on the opposite side of the screen. The game play is remarkably realistic, right down to the ability to “grab” and flick the puck into the opponent’s goal.
Figure 1. IdentityMine Air Hockey
As shown in Figure 2, the system developers effectively leveraged the Windows* 8 operating system (including the Microsoft* XNA* Game Studio framework) and associated software libraries as well as the open source Farseer* Physics Engine for collision detection and response.
Figure 2. Air Hockey application architecture
The game images and sounds (including a miniature Zamboni-like machine that resurfaces the ice during “intermission”—see Figure 3) were organized as system resources using standard Microsoft .NET resource management. The overall application was then built and packaged for distribution.
Figure 3. Intermission—Zamboni time!
Performance and Architecture
This deceptively simple game demonstrated several key architectural and performance considerations for developing games on the pAIO platform. In particular, the need to perform physics calculations and screen rendering rapidly and simultaneously presented a significant challenge for the development team. In one case, a cosmetic feature of the game caused surprising issues: when a player moved the puck, scratches were drawn into the “ice” to simulate movement across the surface. Although the feature did not affect game play, losing it would reduce the level of realism, so the development team concentrated its efforts on improving the responsiveness of the interface to player movements.
One key step in improving performance was to separate the physics calculations (e.g., puck speed, direction, rebounding) from the touch-management and screen-rendering threads using the DispatcherTimer class provided by the WPF dispatcher framework. The first attempt to solve the timing problem between the physics and the screen-refresh frames per second (FPS) involved moving the physics calculations to one scheduled thread and the scratch drawing to another, giving the physics calculations the higher priority. Splitting up these operations also allowed them to run in parallel on different processor cores, enabling faster response on screen.
However, thread execution turned out to be inconsistent with the FPS the player saw, so the game-play experience wasn’t improved. The development team next adopted a threadless implementation in which the screen-rendering system became the driving force behind game management. Operating under the assumption that screen rendering would run at a relatively constant frame rate, the system updated the physics N times for each screen render, so the physics updates always occurred N * FPS times per second, with a constant time-step for each physics update. In theory, the physics could become unrealistically slow if the FPS were to drop substantially, but that is preferable to the alternative, in which the simulation’s time-step falls out of line with the actual passage of time, leading to an unstable simulation.
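The render-driven, fixed time-step approach described above can be sketched as follows. This is an illustrative sketch, not IdentityMine’s actual code; the class, function names, and constants are hypothetical, and the rendering callback stands in for whatever the display system invokes each frame.

```python
# A render-driven game loop: the physics advances N times per rendered
# frame with a constant time-step, so simulated time tracks real time
# as long as the frame rate holds steady.

PHYSICS_UPDATES_PER_FRAME = 3          # N: physics steps per rendered frame
TARGET_FPS = 60
FIXED_DT = 1.0 / (TARGET_FPS * PHYSICS_UPDATES_PER_FRAME)  # constant step

class Puck:
    def __init__(self):
        self.x = 0.0                   # position in pixels
        self.vx = 120.0                # velocity in pixels per second

    def step(self, dt):
        self.x += self.vx * dt         # advance with the fixed time-step

def on_render_frame(puck):
    """Called once per rendered frame by the display system."""
    for _ in range(PHYSICS_UPDATES_PER_FRAME):
        puck.step(FIXED_DT)            # N fixed-size physics updates
    return puck.x                      # position used for this frame's render

puck = Puck()
for _ in range(60):                    # simulate one second at 60 FPS
    x = on_render_frame(puck)
print(round(x, 1))                     # → 120.0 (120 px/s for one second)
```

Because every physics step uses the same time-step, the simulation stays numerically stable even if the frame rate dips; the trade-off, as noted above, is that the simulated world slows down rather than destabilizing.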
The application uses the Windows Presentation Foundation (WPF) to render the visual interface. The team used the Microsoft XNA Game Studio framework to manage the game sounds and to supply the basic data types used by the Farseer Physics Engine. Within WPF, the puck and paddles are essentially just Image objects with render transforms applied based on their position and orientation in the Farseer-simulated world. The team recommends this technique to game developers working on larger-scale displays, because it avoids the unacceptable lag that can result from complex game play.
As for the multitouch aspect of Air Hockey, one optimization was to accept input only from touches that occurred over the game paddles. This eliminated the need to check for inadvertent touches, such as from a palm or thumb, thereby reducing the processing required. However, this is not a general solution, so game and other application developers will need to address this problem during system design. This is particularly true where players are required to interact with on-screen controls for extended periods (e.g., first-person action games). Other UI studies have shown that individuals will rest the palm or other parts of the hand on a touchscreen when interacting with it for long periods. The development team recommends creating controls that respond to short-duration events, such as tap or swipe gestures, rather than requiring extended, tiring interactions.
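The paddle-only input filter described above amounts to a simple hit test applied before any further touch processing. The sketch below is hypothetical (circular paddles and the coordinate values are assumptions), but it shows the idea: touches that miss every paddle are discarded immediately.

```python
# Hypothetical sketch of the input optimization: touch events are
# accepted only if they land on a paddle, so stray palm or thumb
# contacts are discarded before any further processing.

def make_paddle(cx, cy, radius):
    return {"cx": cx, "cy": cy, "r": radius}

def touch_hits_paddle(touch, paddle):
    """Circular hit test: is the touch point inside the paddle?"""
    dx = touch[0] - paddle["cx"]
    dy = touch[1] - paddle["cy"]
    return dx * dx + dy * dy <= paddle["r"] ** 2

def filter_touches(touches, paddles):
    """Keep only touches that land on one of the paddles."""
    return [t for t in touches
            if any(touch_hits_paddle(t, p) for p in paddles)]

paddles = [make_paddle(200, 100, 40), make_paddle(200, 700, 40)]
touches = [(210, 110), (500, 400), (195, 690)]  # second touch: stray palm
print(filter_touches(touches, paddles))         # → [(210, 110), (195, 690)]
```

Rejecting touches this early keeps the per-frame cost proportional to the number of paddles rather than to every contact the screen reports.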
As part of overall multitouch performance, the team noted during game testing that players tended not only to push the paddle forward during game play but also to “flick-retract,” ending the paddle’s movement going backwards. This observation required the development team to treat all motions equally to keep the game true to player expectations. In the game world, there are no discretely defined gestures (as would be found in the Apple* iOS or Windows 8 software development kits) other than the basic built-in tap gesture for activating the menu buttons. Rather than relying on a predefined flick or fling gesture, the game developers chose to deal with the literal motion of the player’s finger. This caused trouble within the physics simulation, because the touch point is anchored to the paddle by a rigid spring (for system stability reasons). If the player flicked his or her finger back and forth quickly enough, the spring would resonate opposite the actual movement, disrupting the overall experience.
Therefore, the team minimized the flicking-motion issues by finding the right degree of elasticity for the touch “spring.” The spring couldn’t be too responsive to a player’s motion, or the paddle would oscillate around the position where the user dragged his or her finger. In some cases, the visual would appear to move backwards toward the user, because at the last moment of input during a flick, the touchscreen observes the finger dragging back slightly as it lifts off the screen. Conversely, too little responsiveness would result in greater lag between where the objects are on the screen and where the user is attempting to drag them. The team found a good middle ground for the game play by adjusting the spring elasticity and observing the resulting experience.
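The tuning trade-off described above can be made concrete with a one-dimensional spring-damper simulation. This is a hedged sketch, not the game’s actual physics: the stiffness and damping constants are illustrative, and the simulation simply measures how far a paddle overshoots a stationary finger position under two different damping settings.

```python
# Sketch of the touch "spring" trade-off: too little damping and the
# paddle shoots past the finger and oscillates; heavier damping settles
# it but (if overdone) adds lag. Constants are illustrative only.

def settle_paddle(stiffness, damping, finger=100.0, steps=120, dt=1.0 / 60):
    """Simulate a 1-D paddle pulled toward a stationary finger position.

    Returns the final position and the maximum overshoot past the finger.
    """
    pos, vel = 0.0, 0.0
    overshoot = 0.0
    for _ in range(steps):
        accel = stiffness * (finger - pos) - damping * vel  # spring + damper
        vel += accel * dt                                   # semi-implicit Euler
        pos += vel * dt
        overshoot = max(overshoot, pos - finger)            # distance past target
    return pos, overshoot

# Underdamped: the paddle overshoots the finger and oscillates back,
# which on screen reads as the paddle "bouncing" against the drag point.
_, bouncy = settle_paddle(stiffness=400.0, damping=2.0)

# Better damped: the paddle approaches the finger with far less overshoot.
_, steady = settle_paddle(stiffness=400.0, damping=40.0)

print(bouncy > steady)   # → True
```

Sweeping the damping value while watching the resulting overshoot and settling time mirrors the team’s tune-and-observe process for finding the middle ground.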
Finally, the development team noticed issues with extended touch events (e.g., long touch) that had negative impacts on game play, particularly those gestures that the operating system recognized as its own events, such as program switch or closure. Two such cases—the right-edge-in swipe (which opens the charms bar) and the top-down swipe (which closes the application)—were interfering with game play and required special treatment.
Note: Microsoft has noticed this issue and is creating a special “kiosk” mode in the Windows 8.1 release.
One further observation worth noting from Air Hockey development is to avoid game designs that require the player to make frequent small motions, such as scribbling. Instead, game play should focus on recognizing regions where each player touches a particular part of the screen, especially when multiple players are acting at the same time. This focuses processing resources on game play rather than on determining correct player identification. The drawback to this approach is that it limits the on-screen region within which a particular player can interact, such as the small hockey paddle. Finally, the development team noted significant issues with long-touch events, which affected the physics calculations and overall system performance, as well as extraneous input from the touch-recognition components. All of these factors need to be considered during game development to maintain a high level of realism and expected game play.
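Region-based player identification can be as simple as mapping each touch to a player by where it lands on the shared screen. The sketch below is an assumption-laden illustration (a two-player split along the screen’s midline, with a made-up screen height), not the game’s actual code.

```python
# Illustrative sketch of region-based player identification: rather than
# tracking which finger belongs to whom, each touch is assigned to a
# player by which half of the screen it lands in.

SCREEN_HEIGHT = 1080   # assumed display height in pixels

def player_for_touch(touch_y, screen_height=SCREEN_HEIGHT):
    """Top half of the screen belongs to player 1, bottom half to player 2."""
    return 1 if touch_y < screen_height / 2 else 2

touches = [(300, 200), (900, 880), (512, 600)]
owners = [player_for_touch(y) for _, y in touches]
print(owners)   # → [1, 2, 2]
```

A lookup like this is constant-time per touch, which is why restricting each player to a fixed region frees processing resources for the game itself.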
Air Hockey was developed using Windows 8 and the Microsoft .NET application framework. There were several advantages to using this approach over dedicated game console development (e.g., Xbox*). In particular, PC development is better supported, with familiar languages and framework packages. There is also a wider pool of talent from which to draw, and many high-end UX graphics-conversion tools are available. Moreover, the cost of testing hardware and certification cycles for console games can be high compared to game development for PCs.
Other advantages that Windows 8 offers include multiresolution support for a variety of devices; snap-view mode, which allows an application user to return to the screen last viewed (e.g., a difficult-to-access game level); and the ability to have multiple processes running in simultaneous windows. These features are not available to console game players or mobile tablet users.
The pAIO multitouch-enabled computing device represents the next step in portable, interactive computing. The advantages of long battery life, high storage capacity, an extended HD screen, and a multitouch interface provide developers with a unique opportunity to craft games, applications, and multimedia entertainment for the entire family.
IdentityMine is focused on UX, from both a design and an engineering perspective. Specializing in natural UIs, IdentityMine is experienced in building rich, custom applications across a variety of industry verticals and enterprises, including game design, retail, media, aviation, automotive, and hospitality.
The IdentityMine team has extensive experience with large-format displays, projection walls, tablets, touchscreens, and several mobile device platforms. Using a variety of input modalities, including multitouch, voice, and camera-based vision, IdentityMine delivers solutions for technologies such as Windows 8, Windows Phone, Xbox LIVE* applications, and Kinect* for Windows.
- Farseer Physics Engine for management of collision detection and response: http://farseerphysics.codeplex.com
- Development resources:
- Codeplex (http://www.codeplex.com): Microsoft’s free, open source hosting site, where you can find things like the physics engine used in Air Hockey
- MSDN Code Gallery (http://code.msdn.microsoft.com): Microsoft’s code examples showcasing the features of its platform
- Windows 8 Game Development Guide: http://msdn.microsoft.com/en-us/library/windows/apps/hh452744.aspx
- GameDev.net (http://www.gamedev.net/page/index.html): A site with resources on general game development
- Review stylus research from Intel’s Dr. Daria Loi: “Pointing the Way: Designing a Stylus-driven Device in a Mobile World,” http://software.intel.com/en-us/articles/pointing-the-way-designing-a-stylus-driven-device-in-a-mobile-world
- Video: Evan Lang of IdentityMine Discusses the Air Hockey Game for the All-in-One PC