In his ongoing series on implementing touch features in Ultrabook apps, Luke Wroblewski gives us a thoughtful look at how various touch factors work when integrated into working apps. So far in this series, we’ve looked at basic touch design principles, what postures and touch targets should look like for an optimal user experience, and how developers can build discoverability into their apps so touch features are easier to find. In his fourth video, Luke covers location detection, something that has been a ubiquitous part of smartphone and tablet app design for quite a while now and is becoming even more powerful with the advent of Ultrabooks™. In this article, we’re going to learn more about location detection, why it’s important, and go over basic principles of implementing it in Ultrabook apps.
We’re all familiar with location-enhanced apps on our mobile phones and tablets; in fact, many of us have checked in to wherever we might be going via Facebook, Twitter, Foursquare, and other apps that use location detection as part of the overall user experience. There has been a virtual explosion of location-based mobile services for smartphones and tablets, partly due to our immersion in social media and the perceived need to broadcast what we’re doing and where; but location detection is more than just a “see what I ate for lunch today” update – or at least it has the potential to be. With next-generation Ultrabooks, there are many opportunities for designers and developers to integrate location detection into their applications using GPS, which delivers highly accurate location data without draining the battery (unlike my smartphone, which seems to drain juice like nobody’s business).
In previous videos, Luke shared a storm chaser social media application called Tweester with us; this is an app that storm chaser enthusiasts use to connect with other people, chat about current storms, and collaborate on possible storm interceptions. It’s really a very well-done app and shows off various Ultrabook sensors incredibly well, including location detection. Tweester has been re-engineered for touch support, and Luke continues to build on that Ultrabook implementation with location.
Tweester is designed for the Ultrabook, but it goes the other way as well: Tweester and the Ultrabook are a perfect fit for storm chasers, who need something relatively hard-core and mobile to do what they do. Location detection seemed like a logical next step for the app, especially since storm chasers move around quite a bit and need something light, powerful, responsive, and portable. Tweester on an Ultrabook fits that bill.
A Tweester user initially had to enter their location into the application manually, which was a bit clunky. Imagine being out in a field watching a tornado a few miles away and having to take the time to do that – not the best use of your time (personally, I believe the best use of my time would be driving far, far away from that area, but that’s just me). Capturing location data accurately enough to be useful can be tricky, and manual entry is inconvenient and error-prone – a real problem when other people depend on your updates for safety.
In the Tweester redesign for the Ultrabook and touch-enabled Windows 8, Luke added a simple callout with permission-based location detection that appears when the user first logs on (you’ve probably seen this in other apps with location features). Once permission is granted, the app detects the user’s location automatically, taking manual entry (as well as the user) out of the location detection process. With location awareness built into the design, location is automatically appended to a user’s update with no data entry required. This not only cuts down on typing, it can actually capture far more information, making for a much richer experience.
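The permission-gated flow Luke describes can be sketched roughly as follows. This is a minimal illustration, not Tweester’s actual code: the `LocationProvider` interface, `StatusUpdate` shape, and `composeUpdate` function are all hypothetical stand-ins for whatever sensor API the real app uses.

```typescript
// Hypothetical shapes standing in for Tweester's real data model.
interface Coordinates { latitude: number; longitude: number; }

interface LocationProvider {
  // Resolves with coordinates; in a real app this would be the
  // platform's geolocation API, gated behind a permission prompt.
  requestPosition(): Promise<Coordinates>;
}

interface StatusUpdate { text: string; location?: Coordinates; }

// Ask for location once the user has granted permission; if they
// declined, post the update without location rather than forcing
// manual entry back into the flow.
async function composeUpdate(
  text: string,
  provider: LocationProvider,
  permissionGranted: boolean
): Promise<StatusUpdate> {
  if (!permissionGranted) return { text };
  const location = await provider.requestPosition();
  return { text, location };
}
```

The key design point is that the fallback path is graceful: denying permission simply yields an update without coordinates, instead of reintroducing the manual entry the redesign was meant to remove.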
The dedicated Maps section of Tweester is where location detection really starts to shine; for example, you can pinpoint exactly where you are in relation to a current storm or other weather situation. Tapping the map icon gives the user a wealth of information. Say you have a buddy fifty miles away who has spotted a possibly developing tornado. You could use Tweester to map out an intercept route and get there as quickly as possible, pulling in storm information as you go: current weather updates from local weather stations, for example, or newly added points of interest from your fellow storm chasers.
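Under the hood, features like “who’s closest to this sighting” come down to distance math on coordinates. As a rough sketch (assuming raw great-circle distances; a real app would lean on a maps service for actual routing), sightings can be ranked by proximity with the haversine formula:

```typescript
interface Point { latitude: number; longitude: number; name: string; }

// Great-circle distance between two points via the haversine formula.
// Purely illustrative -- road routing would come from a maps service.
function distanceKm(a: Point, b: Point): number {
  const R = 6371; // mean Earth radius, km
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(b.latitude - a.latitude);
  const dLon = toRad(b.longitude - a.longitude);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.latitude)) * Math.cos(toRad(b.latitude)) *
      Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Sort reported sightings by proximity to the chaser's position.
function nearestSightings(me: Point, sightings: Point[]): Point[] {
  return [...sightings].sort((x, y) => distanceKm(me, x) - distanceKm(me, y));
}
```

With positions captured automatically, this kind of ranking happens without the user ever typing a coordinate.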
The map can also give users the power to store historical data, which is a very intriguing use of location detection. All the information Tweester users upload – statuses, images, and so on – can be surfaced when they return to an area, giving them past data that can be extraordinarily useful for future storms. All they have to do within Tweester is enable the integrated reminder feature; when they’re back in that geographical location, the local data automatically comes up.
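One simple way to implement that “surface past data when you return” behavior is to bucket entries by a coarse geographic grid cell and look up the cell you’re standing in. The cell size, data shapes, and function names below are assumptions for illustration, not Tweester’s actual design:

```typescript
// Round coordinates to a coarse grid cell (~11 km at 0.1 degree) so
// past entries can be looked up when the chaser returns to the area.
interface Entry { cell: string; note: string; }

function cellKey(lat: number, lon: number, precision = 1): string {
  return `${lat.toFixed(precision)}:${lon.toFixed(precision)}`;
}

function saveEntry(store: Entry[], lat: number, lon: number, note: string): void {
  store.push({ cell: cellKey(lat, lon), note });
}

// On arrival, surface any notes recorded in this grid cell before.
function recallHere(store: Entry[], lat: number, lon: number): string[] {
  const here = cellKey(lat, lon);
  return store.filter(e => e.cell === here).map(e => e.note);
}
```

A production app would likely use a proper spatial index or geohash with neighbor lookups (plain rounding misses matches that straddle a cell boundary), but the principle is the same: the location sensor supplies the key, and the user never has to remember where they filed anything.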
As you can see from this brief overview and Luke’s video, location detection is much more than a way to get from Point A to Point B on a map. It’s another layer that can make any app a much richer experience, and paired with the processing power of an Ultrabook, it feels truly intuitive.