Augmented cognition is about understanding the state of a user’s brain and using that understanding to manage the user’s interaction with a computer. For example, if a user were receiving too much information in image form to process it effectively, you might trigger an audio alert to ensure that he responds to another pressing matter. In this way, the user avoids becoming overloaded with information and is in a better position to act appropriately.
The commoditization of smartphone hardware is just the beginning. Plunging prices of integrated “system on a chip” devices, paired with free, Linux-based operating systems like Android, have enabled not just cheap devices, but cheap cloud-based devices. We’ve already seen this in phone products like the Sony Ericsson LiveView and in home appliances like the Sonos home music system. These examples are just the initial, telltale signs of a huge new wave of cheap devices about to invade our lives—a zombie apocalypse of electronics, if you will.
Until recently, Josh Clark’s charts of thumb-sweep ranges represented the state of the art in understanding touch interactions. In creating his charts, Josh surmised that elements at the top of the screen—and especially those on the side opposite the thumb, or in the upper-left corner for right-handers—were hard to reach, and thus, designers should place only rare or dangerous actions in that location. Since then, we’ve seen that people stretch and shift their grip to reach targets anywhere on the screen, without apparent complaint. The iPhone’s Back button doesn’t appear to present any particular hardship to users. So the assumption behind those charts—or at least the theory underlying them—seems to be wrong. But are there other critical constraints at work? I am starting to think that it’s time for us to start designing for fingers and thumbs instead of for touch.
If you were to draft a profile for a UX thought leader, you'd likely come up with something that closely resembled Kim Lenox. Known for resetting the parameters of everyday problem solving, Kim has devoted her career to making life—if not the world—better through user experience design.
Modern mobile experiences must answer to steep user expectations with rich and secure interactions regardless of context. As designers, we negotiate a razor-thin margin between too little (restricting features and content to fit small screens) and too much (complicating interactions with irrelevant web-legacy elements). Yet our users’ horizons extend far beyond a single screen. The experience we build has to inhabit the multiple touch points within their daily digital ecosystem. Content is the central component.
From ATMs to Siri to the button text in an application user interface, we “talk” to our tech—and our tech talks back. Often this exchange is purely transactional, but newer technologies have renegotiated this relationship. Joscelin Cooper reflects on how we can design successful human-machine conversations that are neither cloying nor overly mechanical.
As I transitioned from academia to industry, I discovered that while mobile UX was discussed, it wasn’t discussed from the same broad frame of reference that I was used to within the confines of a research-based institution. Although more recent mobile UX conversations I have found myself in have undoubtedly benefited from the ongoing smartphone revolution, overall I still find these conversations to be needlessly driven by tactical adoration and lacking a conscious consensus regarding the fundamental principles of the mobile user experience.
One thing we know is that the iPad is not simply a larger iPhone, nor is it a smaller computer. Developers have been quick to port their apps from the iPhone to the iPad to ensure they don't miss out on this trend, but there are big differences in the underlying specs and form factor of the iPad that make this a fundamentally different user experience.
The conclusion of the Nielsen Norman Group’s April 2010 study of iPad usability is that the iPad has problems and that more standards are the solution. Yes, the iPad is imperfect, but resorting to standards as the solution is an antiquated reaction that fails to consider how interactive systems have evolved. We’re not Usability Engineers anymore (not most of us, anyway); we’re User Experience Designers. Experience is more than just usability.
Mobile is here to stay, with its own set of rules and constraints. At the same time, it’s a rapidly evolving platform, with new technologies and capabilities being added by the quarter. We can’t design for mobile like we used to do for posters and Web pages. So what toolkit and mindset does a mobile designer need to thrive?
It’s a common misconception that UX for mobile is all about creating something for users on the go—users with little time, checking in on their mobile on the train or while waiting at the bus stop. But today’s mobile user is so much more than that, with the rise in tablet usage further contributing to the growth and variety of their needs. No longer can UX practitioners expect to satisfy the mobile user with added pinch-and-zoom functionality or bigger call-to-action buttons; these things are expected, and don’t improve UX.
While the mobile opportunity has been clear for some time, how best to tackle it remains a subject of debate. In particular, when building software for mobile, should we invest in Web-based solutions or native apps? Yes.