The proliferation of Android* devices has opened a new world for game development, bringing with it both the touchscreen's “magical” sense of interaction and a raft of new challenges for designers. Throughout the history of video games, developers have had to adapt titles to mice, keyboards, joysticks, gamepads, and other types of input devices, sometimes with less than ideal results. Touchscreens add yet one more component to the mix, with the accompanying concern of associated costs for development and quality assurance, especially for existing titles. In the following pages we'll identify some of the common issues encountered, along with potential solutions.
The Fundamentals of Touch
When starting down this road, it is important to evaluate what strengths the respective interface medium brings to the table and how to play to those strengths. Some game designs will be more easily applicable than others, but even the most complex, touch-unfriendly title still has a spectrum of potential “good” and “poor” touch adaptations. So, aside from the compelling market share of touchscreen Android devices, what makes them great to use?
The simplest answer can be illustrated by the joy of a finger-painting child. It is highly intuitive for humans to reach out and touch things, to move them about with our fingers. We see the interface react and change based on our inputs, and we (as users) may immediately comprehend and appreciate the correlation between our action and the result on the screen. This illusion of tactility is the beauty of touch, and any game should first start from a position of trying to use that feeling, capturing that intuitive sense of motion and the user's simple joy at seeing an interface mechanic react the way they expect. A well-crafted touch interface can be a plaything and a game unto itself.
The interface mechanics should feel organic, natural, and always smooth. Frame rate becomes a core component of design at this point, making a simpler and more smoothly reactive interface preferred over a slower one with more “eye candy.” Where there is no touchscreen, a menu interface at 15 frames per second is practically as usable as one at 30 fps. On a touchscreen, however, the “chunky” nature of a low frame rate experience becomes psychologically distasteful, especially in scrolling and motions where smoothness is expected. Plus, on most devices the ability to record touch inputs is diminished at reduced frame rates, leading to input errors and further increasing user frustration. But even without this, the psychological effect cannot be overstated; a jerkier interface may not actually degrade its usability very much, but on a touchscreen it will strongly detract from its appeal. This is the flipside of our intuitive, childlike comprehension of touchscreens: the user will not only have an immediate expectation of how things should work, but they will be disappointed if that expectation goes unmet.
In the real world of game development, frame rates are difficult to guarantee, and Android devices span a very broad spectrum of relative performance. Using chip-family detection to configure the game’s graphical settings to the performance of the device at first run can be helpful, along with simplifying those interface areas where touch scrolling and similar interactions may be common. Replacing a real-time 3D menu background with a quality screenshot can make an interface far more usable, with only a minor reduction in visual quality.
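As a rough illustration of that first-run configuration step, the sketch below picks a default graphics tier from a chip-family string and a core count. The specific family names and thresholds here are assumptions for illustration only, not a real device database; on Android the raw values would come from `android.os.Build.HARDWARE` or by parsing `/proc/cpuinfo`.

```java
// Sketch: choose a default graphics tier at first run, to be saved in the
// game's settings and overridable by the user. The chip names and core-count
// cutoffs are illustrative assumptions, not an exhaustive device database.
public class GraphicsTier {
    public enum Tier { LOW, MEDIUM, HIGH }

    public static Tier pickTier(String chipFamily, int cpuCores) {
        String chip = (chipFamily == null) ? "" : chipFamily.toLowerCase();
        // Known-fast families (or many cores) get HIGH; everything else
        // defaults conservatively so menus and scrolling stay smooth.
        if (chip.contains("tegra3") || cpuCores >= 4) return Tier.HIGH;
        if (cpuCores >= 2) return Tier.MEDIUM;
        return Tier.LOW;
    }
}
```

Defaulting unknown hardware to a lower tier follows the advice above: a smoothly reactive interface beats extra eye candy, and users with fast devices can always turn settings up.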
Beyond the amorphous considerations of user psychology, more concrete issues come to the fore. Primarily, where does the user touch the screen, and how are they intended to use those touch points? How many fingers will the player use at once? More than two is generally not advisable if the player is also expected to hold the device at the same time. Certain “two players on a tablet” games may call for many simultaneous touch points, and most high-end devices support ten or more points. However, some multi-touch devices only support two simultaneous points, which can be troublesome if the user wants to quickly transition one of two active points to a new location: if the release of the second touch is not detected by the device before the new second-touch location is pressed, the new touch point may be ignored. Detecting the presence of multi-touch on a device is trivial, but detecting support for more than two points is currently problematic on Android, beyond the “android.hardware.touchscreen.multitouch” family of manifest feature flags. Hopefully, this will improve in time.
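One conservative approach is to plan the control scheme around the coarse guarantees those feature flags do provide. In the sketch below, the three booleans would come from `PackageManager.hasSystemFeature(...)` with the feature strings `"android.hardware.touchscreen.multitouch"`, `"...multitouch.distinct"`, and `"...multitouch.jazzhand"`; they are passed in as plain parameters here so the decision logic is visible and testable on its own.

```java
// Sketch: derive a safe assumed number of simultaneous touch points from
// the Android <uses-feature> capability flags. The flags only give coarse
// buckets (2 vs. 5+), which is exactly the limitation described above.
public class TouchCaps {
    public static int assumedMaxPoints(boolean multitouch,
                                       boolean distinct,
                                       boolean jazzhand) {
        if (jazzhand) return 5;   // jazzhand: 5+ independently tracked points
        if (distinct) return 2;   // two independently tracked points
        if (multitouch) return 2; // basic multitouch: two points, possibly merged
        return 1;                 // single-touch only
    }
}
```

A game could then enable its “two players on a tablet” mode only when the assumed count is five or more, and fall back to a two-finger control scheme otherwise.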
Unless only a single device is being targeted, screen configuration and scale are important issues. An interface designed to feel comfortable on a 4:3 aspect device may be a little odd on a 16:9 device, and it can be challenging to adapt the same interface across all screen sizes from 4” up through recent 12” designs. Users may not be holding the device in the same way, their thumbs may naturally fall in different places, control locations for a tablet may be very poor on a phone (or vice versa), and at all times one wishes to avoid the user's hands visually blocking game play.
Testing and design iteration are the only way to truly address these issues and polish the results, but a number of options can help customize the experience to the individual user. On initial startup, a game can ask the users to place their thumbs where they fall naturally, and then configure the interface to those positions. Similarly, all major touch platforms feature screen size and DPI detection of some kind, allowing the developer to at least make “ballpark” decisions and select between “tablet” and “phone” interfaces. The best interfaces should be completely scalable; however, this is challenging as the range of device sizes is ever widening.
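That “ballpark” decision can be as simple as computing the physical diagonal from the reported metrics. On Android the inputs below would come from `DisplayMetrics` (`widthPixels`, `heightPixels`, `xdpi`/`ydpi`); the 6.5-inch tablet cutoff is an assumption to be tuned through testing.

```java
// Sketch: a ballpark tablet-vs-phone decision from reported screen metrics.
// The 6.5" cutoff is an illustrative assumption, not a platform rule.
public class ScreenBucket {
    // Physical diagonal in inches from pixel dimensions and dots-per-inch.
    public static double diagonalInches(int widthPx, int heightPx, double dpi) {
        double wIn = widthPx / dpi;
        double hIn = heightPx / dpi;
        return Math.sqrt(wIn * wIn + hIn * hIn);
    }

    public static boolean isTablet(int widthPx, int heightPx, double dpi) {
        return diagonalInches(widthPx, heightPx, dpi) >= 6.5;
    }
}
```

For example, a 1280×800 panel at 160 dpi works out to roughly a 9.4" diagonal (tablet interface), while 800×480 at 240 dpi is under 4" (phone interface).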
Zero Feedback: Buttons Without Tactility
Until recently, every major video game interface in history had a significant commonality: when a user pressed a button, they could feel that button depress in some way, giving them immediate knowledge that their command had been given, even before the resulting (acoustic and visual) feedback from the game. One of the major challenges of touchscreen game design lies in replacing this tried-and-true interface mechanic with a smooth pane of glass, devoid of any true tactile feedback. Consider the immediate losses from this change: every button on a keyboard or joystick has a natural boundary that the user can detect to help them maintain contact, or make an accurate choice of which button(s) to actuate. Additionally, most keys and buttons are slightly indented, allowing the user to naturally center their digits on the respective key and improve their accuracy. Without the direct “feel” of pushing down a button and immediately feeling it physically depress, the player is disconnected from the certainty of their action, and relies entirely on the input scan rate of the device and the response of the game for feedback. Depending on acoustic and visual feedback alone inherently adds latency, a lag time that can make certain types of games more challenging. Fighting games are perhaps some of the best known for their demanding button and joystick requirements, thanks to very fast multi-player action; but even the classic shooters of the arcade benefitted from the use of rapidly usable, tactile buttons.
So how, then, do we make our titles work with only a smooth pane of glass? At the time of this writing, there is no ideal solution, only a process of adaptation. Dedicated consumer gaming devices still include “real” buttons, but the majority of touchscreen devices do not. Some platforms provide API support for external game controllers, both wired and Bluetooth, which are worth considering. However, at the end of the day, our games must be purely touchscreen-compatible to appeal to a reasonable market share. There are a few strategies to help mitigate the issue.
For one, some devices have a haptic feedback option available: basically the use of the device's vibrate function. Almost all phones support this, but a number of tablets do not. Supporting this option is a good idea and will improve navigation of menus and other areas, as it gives an immediate “feel” of confirmation without requiring sound (the user may be playing in a quiet public environment). But this is not a magic bullet: it is not universally supported, the degree of feedback is limited to a slight vibration, and the latency is greater than that of a physical button press.
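Because each pulse is relatively long and slow, it helps to rate-limit haptics so that rapid taps don't queue overlapping vibrations. The sketch below shows one such gate; on Android the actual pulse would be fired with `Vibrator.vibrate(ms)` (after checking availability) or `View.performHapticFeedback(...)`, and the 50 ms minimum gap used in the example is an assumption to tune by feel.

```java
// Sketch: rate-limit haptic pulses so a burst of button taps produces
// distinct confirmations rather than one continuous buzz. The minimum
// gap between pulses is configurable; 50 ms is only a starting guess.
public class HapticGate {
    private final long minGapMs;
    private boolean fired = false;
    private long lastPulseMs;

    public HapticGate(long minGapMs) { this.minGapMs = minGapMs; }

    // Returns true when the caller should actually fire the vibration now.
    public boolean tryPulse(long nowMs) {
        if (fired && nowMs - lastPulseMs < minGapMs) return false;
        fired = true;
        lastPulseMs = nowMs;
        return true;
    }
}
```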
Acoustic feedback is also worthwhile: a small “schtick” noise accompanying successful use of a button can be helpful, even in addition to a haptic pulse. It makes a good supplement to an interface, improving the user experience, but it only mitigates the issue without solving it. In practice, acoustic feedback can't be guaranteed if the user is playing in a quiet environment, and it inherently has a degree of latency that can be problematic for games where reaction time to a random event is critical.
But the biggest interface challenge that remains unsolved by these techniques is that of keeping the user's fingers or thumbs on the correct interface locations during fast-paced game play. Due to the lack of “feel” of any virtual button, it is easy to lose track of button position, leaving the player frantically tapping for a fire button in the wrong place, creating a frustrating user experience. Think this through carefully during the initial design process. If possible, for a given game, try to avoid the scenario of critical fixed buttons entirely and make the important interface components follow the user's fingers as they drag around the screen. Particularly for “virtual joystick” type input areas, allow the user to continue dragging around the screen, even outside the assigned joystick area, as long as they maintain a continuous drag motion without “releasing” their touch from the screen.
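The “follow the finger” joystick described above can be sketched as a small state machine: the stick's center snaps to wherever the touch first lands, and dragging past the nominal ring still reads as full deflection instead of dropping the input. The class and coordinate conventions below are illustrative assumptions, not a platform API.

```java
// Sketch of a "floating" virtual joystick. The center is set where the
// touch lands; the player may drag anywhere on screen without losing
// control, as long as the drag is continuous. Coordinates are pixels.
public class VirtualJoystick {
    private final double radius;   // deflection distance for full input
    private double centerX, centerY;
    private boolean active = false;

    public VirtualJoystick(double radius) { this.radius = radius; }

    public void onTouchDown(double x, double y) {
        centerX = x; centerY = y; active = true;
    }

    public void onTouchUp() { active = false; }

    // Returns {dx, dy}, each in [-1, 1], clamped to unit length: dragging
    // far outside the ring still yields full deflection, never a dropout.
    public double[] axes(double x, double y) {
        if (!active) return new double[] { 0, 0 };
        double dx = (x - centerX) / radius;
        double dy = (y - centerY) / radius;
        double len = Math.sqrt(dx * dx + dy * dy);
        if (len > 1.0) { dx /= len; dy /= len; }
        return new double[] { dx, dy };
    }
}
```

Feeding `MotionEvent` down/move/up coordinates into these three methods gives the behavior described above: the player never has to find a fixed joystick location mid-battle, because the joystick finds them.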
Finally, there are benefits to using “long-touch” mechanics, where a user touches and holds a given location until some event occurs. This can be a context-sensitive menu, or another advanced option. Generally, long-touches are not very intuitive, so they are best left to more advanced and optional functionality within the game. That said, they can make some content far more convenient to access for those advanced users, or allow for an additional confirmation process for certain types of game play. Long-touches should always be accompanied by the maximum appropriate use of haptic, acoustic, and visual feedback, to be sure the user is aware that their long-touch has succeeded (particularly for the case of drag-and-drop type activity, etc.).
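Long-touch detection itself reduces to two checks: the press must be held past a time threshold, and must not wander beyond a movement “slop” radius (otherwise it is a drag, not a hold). The 600 ms and 20 px values below are assumptions; Android's `ViewConfiguration` exposes comparable platform constants (`getLongPressTimeout()`, `getScaledTouchSlop()`) that a real implementation could use instead.

```java
// Sketch: detect a long-touch as "held past holdMs without moving more
// than slopPx from the initial touch point". Times are in milliseconds.
public class LongTouchDetector {
    private final long holdMs;
    private final double slopPx;
    private long downTime;
    private double downX, downY;
    private boolean tracking = false;

    public LongTouchDetector(long holdMs, double slopPx) {
        this.holdMs = holdMs;
        this.slopPx = slopPx;
    }

    public void onDown(long timeMs, double x, double y) {
        downTime = timeMs; downX = x; downY = y; tracking = true;
    }

    // Any move beyond the slop radius cancels the pending long-touch.
    public void onMove(double x, double y) {
        if (Math.hypot(x - downX, y - downY) > slopPx) tracking = false;
    }

    // Poll each frame while the touch is held.
    public boolean isLongTouch(long timeMs) {
        return tracking && (timeMs - downTime) >= holdMs;
    }
}
```

When `isLongTouch` first returns true is exactly the moment to fire the haptic, acoustic, and visual confirmation discussed above.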
Some platforms have more of an issue with hardware variability than others, but unless you are only releasing for a single device family, this challenge will come up. While Android device consistency has improved over the last couple of years, there are still meaningful differences in the quality of touch controllers, the number of available multi-touch points, input control latency, and audio latency, along with all the usual aspects of varied hardware designs. For hybrid devices, such as smartphones with keyboards, some thought should be given to handling the events sent by the Android OS when the keyboard becomes physically enabled (sliding out, etc.). This can give a transparent transition from the on-screen keyboard to hardware keyboard input, freeing up more of the screen for game play.
Beyond this, developers of musical, timing-based games should be aware that certain platforms and devices may have poor internal audio latency, making music-to-action synchronization very difficult for owners of those devices. On some platforms, lower-level API options (such as Android's “native” activities) may make for improved input and audio latency with certain types of hardware, or future OS revisions. As time goes on, platform- and hardware-specific audio issues should be minimized by improved testing and higher expectations on the part of device manufacturers; but for the moment, developers should be aware of the possible issues.
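One common mitigation for rhythm games is to schedule each sound slightly ahead of its beat by an estimated output latency, while judging the player's input against the audible beat time rather than the trigger time. The sketch below assumes the latency estimate is obtained elsewhere (measured with a calibration screen, or configured per device); the class and its numbers are illustrative.

```java
// Sketch: compensate for audio output latency in a timing-based game by
// handing sounds to the mixer early, so they are *heard* on the beat.
// The latency estimate would come from calibration; here it is a number.
public class BeatScheduler {
    private final long latencyMs;

    public BeatScheduler(long estimatedLatencyMs) {
        this.latencyMs = estimatedLatencyMs;
    }

    // When to trigger the sound so it becomes audible at beatTimeMs.
    public long triggerTimeFor(long beatTimeMs) {
        return beatTimeMs - latencyMs;
    }

    // Judge player input against the audible beat, within a timing window.
    public boolean onBeat(long inputTimeMs, long beatTimeMs, long windowMs) {
        return Math.abs(inputTimeMs - beatTimeMs) <= windowMs;
    }
}
```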
Many of the fundamental tenets of game design remain true when applied to touchscreens, and should always be kept in mind:
Make the user feel brilliant and skillful.
In general, making your players feel awesome and incredibly skilled will result in an improved play experience. This is actually not the same as the game being “easy;” it's more like giving the impression of difficulty while being “deceptively easy.” Yes, there is also a place for mercilessly hard-core game play, and some of us still revere that standard; but it is less accessible, especially with all the other challenges inherent to touchscreens and small devices. At least strive to make the user happy in the early game, which leads into the next point.
The first few minutes of game experience are the most critical.
You want to impress your new players as quickly as possible. Fifteen minutes has long been the rule of thumb in game development, but the length is arguably much shorter in the mobile space where users have an even shorter attention span. Make the first two minutes the best that you can. If your new player is spending those minutes being frustrated while learning your obfuscated interface, fewer players will stick around to find your “good” game play. Granted, we don't all make trivial, casual games; but even with an incredibly complex title, there is still a “best-case” balance to be found in how you introduce the user to your game and teach them to play.
Make the most cohesive, native-feeling game experience you can.
Regardless of which platform was the original target for your game, when you bring it to touch, try to make it feel as though it was built from the very beginning as a touchscreen game. Trivially bolted-on touch interfaces usually give a poor user experience and will be reflected in sales and reviews. Real-world game development has limitations, and sometimes the time or budget are not available for a truly “native” re-tooling of the game mechanics and interface. But even within these limitations, strive to make the most native-feeling and polished experience you can, and the results will pay off in the long run.
A Look Ahead
Lastly, a few thoughts for the future. The world of mobile devices is quickly evolving, and new interfaces and use cases will emerge in the near future. It is already possible to plug a standard phone into a TV via HDMI, controlling a TV “display” with the phone as a “controller.” This case will only become more convenient with the evolution of wireless HDMI standards. Additionally, some phones already have integrated small projectors, potentially allowing big screen game experiences with the use of a phone, a white wall, and a compact bluetooth game controller. High-performance Android-powered TVs also exist, along with various set-top boxes, all of which have the potential to increase the market size for your game.
Beyond this, it's possible that “screens” themselves may eventually give way to virtualized interfaces displayed within specialized glasses, or even projected onto the user's retina. These potential types of “wearable computing” are worth considering, as they make the case for more Xbox* Kinect*-style “gestural” interfaces, which would still be compatible with many touch UI designs, but would have even less physical feedback mechanics (no haptics, etc.).
Wherever the future will take mobile gaming, it's sure to be an incredible journey.
John Bergman - CEO, Guild Software Inc. An experienced online-game executive, developer and designer, John founded Guild Software in 1998 for the sole purpose of pursuing the next generation in persistent, evolving online worlds. After shepherding the company through the process of developing a complete MMO engine from scratch, Guild Software successfully launched "Vendetta Online" across major retail in 2004.
More recently, the title gained added fame as the first traditional PC MMO to transition to mobile, launching with extensive TV coverage from partners Verizon and Motorola in 2011. Outside of his work life, John also enjoys mentoring new game developers and would-be startups, and sits on the game-related advisory boards of several universities.
INFORMATION IN THIS DOCUMENT IS PROVIDED IN CONNECTION WITH INTEL PRODUCTS. NO LICENSE, EXPRESS OR IMPLIED, BY ESTOPPEL OR OTHERWISE, TO ANY INTELLECTUAL PROPERTY RIGHTS IS GRANTED BY THIS DOCUMENT. EXCEPT AS PROVIDED IN INTEL'S TERMS AND CONDITIONS OF SALE FOR SUCH PRODUCTS, INTEL ASSUMES NO LIABILITY WHATSOEVER AND INTEL DISCLAIMS ANY EXPRESS OR IMPLIED WARRANTY, RELATING TO SALE AND/OR USE OF INTEL PRODUCTS INCLUDING LIABILITY OR WARRANTIES RELATING TO FITNESS FOR A PARTICULAR PURPOSE, MERCHANTABILITY, OR INFRINGEMENT OF ANY PATENT, COPYRIGHT OR OTHER INTELLECTUAL PROPERTY RIGHT.
UNLESS OTHERWISE AGREED IN WRITING BY INTEL, THE INTEL PRODUCTS ARE NOT DESIGNED NOR INTENDED FOR ANY APPLICATION IN WHICH THE FAILURE OF THE INTEL PRODUCT COULD CREATE A SITUATION WHERE PERSONAL INJURY OR DEATH MAY OCCUR.
Intel may make changes to specifications and product descriptions at any time, without notice. Designers must not rely on the absence or characteristics of any features or instructions marked "reserved" or "undefined." Intel reserves these for future definition and shall have no responsibility whatsoever for conflicts or incompatibilities arising from future changes to them. The information here is subject to change without notice. Do not finalize a design with this information.
The products described in this document may contain design defects or errors known as errata which may cause the product to deviate from published specifications. Current characterized errata are available on request.
Contact your local Intel sales office or your distributor to obtain the latest specifications and before placing your product order.
Copies of documents which have an order number and are referenced in this document, or other Intel literature, may be obtained by calling 1-800-548-4725, or go to: http://www.intel.com/design/literature.htm
Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors. Performance tests, such as SYSmark and MobileMark, are measured using specific computer systems, components, software, operations, and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products.
Any software source code reprinted in this document is furnished under a software license and may only be used or copied in accordance with the terms of that license.
Intel and the Intel logo are trademarks of Intel Corporation in the US and/or other countries.
Copyright © 2012 Intel Corporation. All rights reserved.
*Other names and brands may be claimed as the property of others.