Application input modes are crucial to accessibility

Think of how many times you’ve picked up your mobile device today to do something. If you’re anything like the “average” user, you might be surprised at how often you actually do this. A recent study found that we check our mobile devices upwards of 100 times throughout the day:

  • People look at their phones the most during the peak hours of 5pm to 8pm
  • During these hours, 75% of users are actively using their devices
  • This drops to a quarter of active users between 3am and 5am
  • On average, users check their phones nine times an hour
  • This increases to once every six seconds for the “highest frequency” users

While there is probably room to be concerned about how many times a day we’re checking our mobiles (seriously), this also brings up the crucial issue of accessibility in applications. User experience thought leader Luke Wroblewski gives a quick video lesson below on the importance of intuitive input modes:

As Luke points out in the video, we definitely use our mobile devices a lot throughout the day, and often in situations where we’re looking for a distraction: lines, waiting rooms, etc. Other things are demanding our attention, so a tricky interface that needs a lot of babying isn’t going to last very long. Input modes that let users jump right into getting things done are where developers have to aim their UX efforts, skipping as many unnecessary steps as possible to get to the task at hand.

What makes an app something we keep using, let alone keep on our devices at all? We’ve probably all been through this drill: look for an app, download it, give it about one minute (tops) to perform the task we downloaded it for, and then essentially fish or cut bait. Our apps have a finite amount of time to impress us with their functionality, and we are fully aware that there are many, MANY other apps waiting in the wings that could step in to do the job better.

Luke gives several examples of good input modes in applications in his video, including Google+ and Twitter, which both make it as easy as possible for the user to instantly start sharing all sorts of content. He points out three top things to remember:

  • Autofocus on primary input controls
  • Surface suggestions that prompt input
  • Augment primary input controls
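The second principle, surfacing suggestions that prompt input, can be sketched as plain logic. The snippet below is a hypothetical TypeScript illustration (the function name and ranking strategy are my own assumptions, not from Luke's video): given a user's recent entries, it offers the freshest matches for whatever they have typed so far, so the input field invites action instead of sitting empty.

```typescript
// Hypothetical sketch: surface suggestions that prompt input.
// Recent entries are ranked newest-first so fresher items appear on top.
function surfaceSuggestions(history: string[], typed: string, limit = 3): string[] {
  const prefix = typed.toLowerCase();
  const seen = new Set<string>();
  const matches: string[] = [];
  // Walk history from newest to oldest, skipping duplicates.
  for (let i = history.length - 1; i >= 0; i--) {
    const entry = history[i];
    const key = entry.toLowerCase();
    if (key.startsWith(prefix) && !seen.has(key)) {
      seen.add(key);
      matches.push(entry);
      if (matches.length === limit) break;
    }
  }
  return matches;
}
```

In a real app the same list would be wired to autofocus on the primary input control, so the suggestions appear the moment the screen loads.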

Luke Wroblewski’s series of touch design principles videos talks about this very topic in Touch Design Principles Part 3: Gestures and Discoverability:

“Touch gestures are implemented without on-screen control prompts (arrows, icons, etc.). They are, quite literally, invisible. How can we help people discover the touch options available to them within programs? There’s no visible interface element saying “hey, click on me and I’ll help you do something!””

Luke talks about several different visual cues that can help people discover touch-based input actions, such as removing other options, faded edges, animations, and “just in time” educational pop-ups that are triggered by a certain action. Of course, since each OS is different, it’s important to familiarize yourself with standard controls so they don’t conflict with what you’re going to ask your users to do within the app.
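The “just in time” educational pop-up idea boils down to showing a hint the first time a gesture fires and staying quiet afterwards. Here is a minimal sketch of that bookkeeping, assuming a hypothetical `GestureHintTracker` class and gesture names of my own invention:

```typescript
// Hypothetical "just in time" hint tracker: returns the educational tip
// the first time a given gesture is triggered, and null on every repeat.
class GestureHintTracker {
  private shown = new Set<string>();

  hintFor(gesture: string, hints: Record<string, string>): string | null {
    // Unknown gestures and already-seen gestures get no pop-up.
    if (this.shown.has(gesture) || !(gesture in hints)) return null;
    this.shown.add(gesture);
    return hints[gesture];
  }
}
```

The once-only check is what keeps the cue educational rather than annoying: the pop-up teaches the invisible gesture exactly once, then gets out of the user’s way.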

There’s a wholly different perspective behind designing an app vs. using it, and sometimes usability can get lost in translation. In this video series, Luke gives us an inside look at what the end user is really looking for when your app catches their eye and gets to that elusive and risky download stage. These guidelines can serve as both transformational and cautionary signposts for developers who want to create a better user experience when designing their apps.

For more in this 2-in-1 series from Luke Wroblewski, check out the following links:


