With the introduction of Windows 8* and touch-enabled computers, like Intel Ultrabook™ devices, where touch is an additional input method, users have new ways of interacting with applications and software. Historically, touch was limited to niche devices requiring specialized software; today, many manufacturers are creating devices that support these new input methods.
The Ultrabook is unique in that it includes not just touch, but also a keyboard and trackpad in a laptop format for traditional usage. Users have the option of using the keyboard as they have in the past, but can also use touch as a source of input.
Now, with Ultrabook convertibles, the user experience is only getting better. A convertible is a laptop when you need it and a tablet when you want it. Convertibles offer unique modes to create, collaborate, and communicate; they merge the standard keyboard with flexible hinges and add the power of touch to give you a unique Windows 8 experience.
Developing a touch interface for a convertible is the same as developing one for a tablet. Users are not always in situations where touch is optional; in some cases, such as when a device’s keyboard is hidden or removed, they may be in a touch-only environment. If an application depends on mouse behaviors, such as hovering to reveal commands or actions, users will not be able to use the software. To make it usable, we have to ensure that the controls are large enough to support touch and that their placement makes sense given the expected usage patterns.
Touch Design Principles
At the Build conference in 2011, where Windows 8 was introduced, its designers discussed the set of principles they used to develop the framework for making touch a first-class citizen. These include:
- Touch should be natural and intuitive. Touch input is captured by the user touching the screen to select and manipulate objects and controls. This means that the end user does not require special training to interact with the application. For example, a great feature on Ultrabooks is the ability to create and use a touch password to log in. Not only is it more personal, it is faster and more secure than some of the other modes of authentication.
- Direct and engaging. Touching a control manipulates that control, not something in a different area of the interface, and gives the user feedback that something is happening. This could include sounds or vibration. Software with features like inertia and momentum provides a more realistic experience and is more in tune with the real world. For instance, flicking a finger on a control across the screen moves a picture, but it behaves with friction and inertia to come to an eventual rest. If users try to do something that is not supported, like moving a control to a location on the screen that doesn’t work, it moves slightly but snaps back into place.
- Portable and consistent. Applications should follow industry standards and implement gestures in a consistent way. Gestures and manipulations mean the same thing across different applications. For example, a pinch gesture zooms out or resizes the control to be smaller, while the expand gesture does the opposite. Touching and dragging performs a panning manipulation. Think carefully before creating new custom gestures, especially if one of the system gestures already does the same thing.
- Not intrusive. The controls that are designed to be touch aware are easy to access and fit in a logical place in the interface design. Manipulation of these objects does not obscure or prevent completion of tasks.
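The inertia and friction behavior described above can be sketched in a few lines. This is an illustrative model, not Windows 8 API code; the friction factor, frame time, and rest threshold are assumed values chosen for the example:

```python
# Illustrative sketch of flick inertia: a flick gives the object an initial
# velocity that decays by a per-frame friction factor until it comes to rest.

def settle_flick(position, velocity, friction=0.95, dt=1 / 60, rest_speed=1.0):
    """Advance a flicked object until friction brings it to rest.

    position:   starting coordinate in pixels
    velocity:   initial flick speed in pixels/second
    friction:   per-frame velocity multiplier (0 < friction < 1)
    dt:         simulated frame time in seconds
    rest_speed: speed below which the object is considered stopped
    """
    while abs(velocity) > rest_speed:
        position += velocity * dt   # move by this frame's displacement
        velocity *= friction        # friction bleeds off momentum
    return position

# A flick at 2000 px/s drifts the object forward before stopping.
final = settle_flick(position=0.0, velocity=2000.0)
```

The geometric decay means the object travels a bounded distance (roughly the initial per-frame step divided by the friction loss), which is what makes the motion feel like it has real-world momentum.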
Sensors Recommended for the Ultrabook
The following table provides information about the new sensors that are recommended for convertibles. It is up to the OEMs to decide which sensors are included in their specific models and usages.
Sensor Usage Models Delivering Software Opportunity
Using sensors to enable devices to respond to environmental factors has been commonplace for some time, although it has mostly existed in background processes and other forms that don’t necessarily gather a lot of attention from users. Accelerometers have long been used to protect hard drives and other moving parts when the sensor detects that the device is being moved or dropped.
The next generation of usages could be more in the foreground, directly impacting (or creating) the user experience, such as in the following examples:
- Security. Watchdog applications could potentially sound an alarm in response to movement of the Ultrabook while it is being used to display a presentation at a conference or while left unattended in a coffee shop. If the device suspects theft because of moving away from the owner’s cell phone, for example, sensitive data could be locked down, a text message could be sent as an alert, and GPS could track the device to help get it back to its rightful owner.
- Adapting to context. Utilities could adapt the system to specific pre-set GPS locations such as home, work, and elsewhere to control factors such as the visibility of alerts from social media, whether sharing is enabled for specific files, whether the webcam is enabled, etc. Similarly, when the system is in specific altitude ranges, it could automatically disable Wi-Fi* to comply with air-travel regulations.
- Lifestyle and travel. Augmented reality applications using geographical location and compass bearing could overlay point-of-interest (POI) information on an image captured in real time by the Ultrabook’s camera, providing a virtual tour guide. In conjunction with conventional navigation functionality, a pedometer could calculate distance travelled and average speed, as well as the calories burned by the effort. Geo-tagging could add location information to vacation photos.
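The pedometer arithmetic mentioned above can be sketched as follows. The stride length and calories-per-step factor are illustrative assumptions for the example, not values from any sensor API:

```python
# Sketch of pedometer-style calculations: distance, average speed, and an
# estimate of calories burned, from a raw step count and elapsed time.

def walk_stats(steps, elapsed_s, stride_m=0.75, kcal_per_step=0.04):
    """Return (distance in km, average speed in km/h, calories burned).

    stride_m and kcal_per_step are rough per-person assumptions; a real
    application would calibrate them for the user.
    """
    distance_km = steps * stride_m / 1000.0
    hours = elapsed_s / 3600.0
    speed_kmh = distance_km / hours if hours > 0 else 0.0
    calories = steps * kcal_per_step
    return distance_km, speed_kmh, calories

# 10,000 steps over two hours: 7.5 km at 3.75 km/h, about 400 kcal.
dist, speed, kcal = walk_stats(10_000, 2 * 3600)
```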
- Gaming and entertainment. As described elsewhere in this paper, sensors provide for modalities such as motion input that are well suited to games. Depending on specific system capabilities, it may be possible for some sensor-related functionality to be handled by the microcontroller firmware, freeing processor resources for demanding tasks such as real-time 3D rendering.
With the advent of Ultrabook™ convertible devices, UX designers can be more creative than ever. UX designs should provide an optimal user experience in both tablet and laptop modes and support multiple orientations, form factors, and screen resolutions. UI components have to be carefully designed to support both touch and traditional mouse/keyboard input.
Considerations for Multi-Touch
Touch has certain limitations, like the size of fingers and type of screen technology that can capture the touch action. As a result, to function effectively in this environment we need to provide larger targets and more space between targets. This necessarily limits how much we can put on a screen, and understanding the ergonomics of the device helps to determine where to place the most commonly used controls.
For the Ultrabook, which includes a keyboard and a trackpad (similar to a mouse) as well as touch, we need to be aware of how users will typically use it to perform their work.
The placement of controls on the page needs to reflect the expected usage patterns for the device. When the user reaches over the keyboard, the easiest touch targets are near the edges of the screen, on the top and sides—good things to keep in mind when laying out controls.
Some of the controls typical of previous versions of Windows that are tuned for mouse and keyboard, such as ribbons and menus, present challenges when designing for touch due to their size and placement. For instance, if a menu relies on hover (an event that fires when the user moves the mouse pointer over a control), there is no corresponding touch event, which renders the menu useless in a touch-only environment.
While applications built for previous releases had varying levels of awareness of the possibility of touch, now that it’s a reality we need to adjust and work with the technologies. Here are some common controls and approaches for adapting them to a touch-enabled environment.
Buttons respond to click events, so developers don’t need to do much to enable them to work with touch. However, developers need to consider that hover is not supported so rendering effects, like highlighting the button when a mouse-over event occurs, won’t be reflected with touch.
The main consideration is to ensure that buttons are large enough to support touch: make them at least 23 x 23 px, and larger is better. Second, include enough margin between controls so that a touch on one isn’t mistaken for an action on another.
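Because a comfortable touch target has a physical size, the pixel size of a control should scale with screen density. A minimal sketch of the conversion (25.4 mm per inch); the 96 and 144 DPI figures are common example densities, not values from any specific device:

```python
# Sketch: converting a physical touch-target size to pixels. A target that is
# comfortable for a fingertip has a fixed physical size, so the number of
# pixels needed grows with the screen's dots per inch (DPI).

def mm_to_px(mm, dpi=96):
    """Pixels needed to render `mm` millimeters at the given DPI."""
    return round(mm / 25.4 * dpi)

# The 23 px minimum above corresponds to roughly 6 mm at 96 DPI;
# a 144 DPI panel needs about 34 px to cover the same physical area.
standard = mm_to_px(6)        # ≈ 23 px at 96 DPI
high_dpi = mm_to_px(6, 144)   # ≈ 34 px at 144 DPI
```

The practical takeaway is that hard-coded pixel sizes that feel fine on one screen can shrink below a usable physical size on a denser one, so target sizes should be computed from DPI rather than fixed.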
Menus have been around since the days of DOS, when users could select from a list of choices. Their advantage is that they provide a simple, hierarchical way to organize the commands a user may need. The challenge is that, as applications add functionality, the number of commands buried in menus becomes cumbersome.
In a traditional non-touch window, the standard distance between menu items is fairly small, but with touch you need to ensure there’s more margin in order to minimize accidental selection of the wrong commands.
In XAML-based frameworks such as Windows Presentation Foundation (WPF), you can override a menu’s ControlTemplate to specify how you want it to look and behave.
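A minimal sketch of what such an override might look like. This template is deliberately stripped down — a production MenuItem template also needs the submenu popup, highlight triggers, and icon/gesture columns — and the padding and height values are illustrative, not official guidance:

```xml
<!-- Sketch: an implicit Style that retemplates MenuItem with extra padding
     and a minimum height so each item is a comfortable touch target. -->
<Style TargetType="MenuItem">
  <Setter Property="Padding" Value="12,10"/>
  <Setter Property="Template">
    <Setter.Value>
      <ControlTemplate TargetType="MenuItem">
        <Border Background="{TemplateBinding Background}"
                Padding="{TemplateBinding Padding}"
                MinHeight="40">
          <ContentPresenter ContentSource="Header"
                            VerticalAlignment="Center"/>
        </Border>
      </ControlTemplate>
    </Setter.Value>
  </Setter>
</Style>
```

When only spacing needs to change, setting Padding and MinHeight in a Style (without replacing the whole template) is usually enough and preserves the control’s built-in behavior.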
For more information on Ultrabooks
Intel, the Intel logo, and Ultrabook are trademarks of Intel Corporation in the US and/or other countries.
Copyright © 2012 Intel Corporation. All rights reserved.
*Other names and brands may be claimed as the property of others.