Dec 30, 2013

About ten years ago, usability testing of software got a lot easier, thanks to improved tools that let us see our products in the hands of users, along with the faces, voices, and actions of test participants. But then along came mobile devices, and usability testing of apps again became difficult.

Research needed an expensive facility

Around the turn of the century, it was possible to measure how an interface performed in the hands of customers, but that required an expensive lab that had cameras, microphones, and seats for observers behind one-way glass. In those days, only large companies had a budget for usability labs, so smaller companies made design decisions based on best guesses and opinions.

Due to budget constraints, I had no access to a usability lab. Instead, I would talk to software developers about the user interfaces we were building. “We can put text in the interface to explain how it works,” I’d say, and: “We can use different controls so it becomes more obvious how to use the product.” The developers and I were all interested in quality, but we didn’t always agree on what quality might look like. We relied on our opinions and personal biases to predict how software interfaces would perform in the hands of customers.

Then research got easy and affordable

One day, almost a decade ago, I heard about TechSmith Morae. This software was a game changer because it could turn any computer into a usability lab. TechSmith’s product evangelist, Betsy Weber, and her user-research colleague, Josie Scott, attended industry conferences and talked tirelessly about this miracle product that could record someone’s actions—clicks and typing—along with everything on a computer screen, plus their face and voice. All we had to do was plug in a camera and microphone, because these were the days before laptops had built-in cameras and microphones. Usability practitioners embraced this product. We gave people tasks to complete while we used Morae to watch them in action.

Suddenly, we could invite developers and other stakeholders to watch live user testing from an adjacent room, where they could watch and listen in near real time. They could see the participant’s puzzled expressions. They could see where the user was mousing, what they looked at and clicked, and what they overlooked. We could also record everything and then, from the recordings, make video clips to show which parts of our software caused participants to struggle. Suddenly, it was easy to help every team member understand the plight of their customers.

One of my earliest participants, during a half-hour usability test, went from blaming herself to expressing extreme dislike of the product. For the development team, I made a video clip of the key points to show how the participant’s attitude toward the product changed from neutral to extreme dislike, over 28 minutes. The participant gave us permission to use the video for product research, but not for this blog, so I’ve paraphrased the video’s story here:

Test participant after 2 minutes
Test participant after 8 minutes
Test participant after 19 minutes
Test participant after 28 minutes
This video was incredibly persuasive, because it showed the participant’s emotional reactions. When testing a product’s usability, it’s humbling to see the product fail and awful to see it cause frustration and anger. But the point is to identify and fix problems before customers encounter them, and to get the development team thinking about involving users during the design process. (Usability testing also allowed me to show user delight at innovative new features that worked well, to reinforce that our user-centered design process was working.)

Around the same time that TechSmith released Morae, Web 2.0 enabled the development of competing tools that partially overlapped with Morae’s features. Most of these tools worked only on web pages, not on installed apps. Most also did not let researchers see and hear their users in action, as you can read in the descriptions of testing tools from 2009.

Mobile made research difficult again

Much as we love mobile devices, they’ve made usability testing harder. Diverse operating systems and the free movement of mobile devices present challenges that we haven’t seen for almost a decade. While the tools that assess websites still work, there are no tools that provide rich data about installed apps. We’re back to external cameras and the rigs that hold them, and we have to ask participants to keep their phones in a fixed position for the camera. So we’re back to expensive labs and special equipment. What we need is software that can do on a mobile phone what software can already do on a laptop computer: capture and transmit

  • the app’s screen
  • the participant’s voice
  • the participant’s facial expressions
  • the participant’s taps or gestures
  • the participant’s typing or speech-to-text input
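To make the requirement concrete, the streams above can be modeled as a simple record, with a helper that reports which streams a given tool fails to capture. This is an illustrative sketch only; the type and function names are my own, not from any real product.

```typescript
// Hypothetical model of the data streams a mobile usability-testing
// tool would need to capture. All names here are illustrative.
interface CaptureSession {
  screen: boolean;     // the app's screen
  voice: boolean;      // the participant's voice
  face: boolean;       // the participant's facial expressions
  touches: boolean;    // taps and gestures
  textInput: boolean;  // typing or speech-to-text input
}

// List the required streams a given tool fails to capture.
function missingStreams(session: CaptureSession): string[] {
  return (Object.entries(session) as [string, boolean][])
    .filter(([, captured]) => !captured)
    .map(([name]) => name);
}

// Example: a web-only tool that records the screen and taps, but
// not the participant's camera, microphone, or typed input.
const webOnlyTool: CaptureSession = {
  screen: true,
  voice: false,
  face: false,
  touches: true,
  textInput: false,
};
```

Running `missingStreams(webOnlyTool)` shows exactly the gap described below: the camera and voice data that reveal how participants feel.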

Ironically, smartphones, many tablets, and most hybrid devices have the required camera and microphone. Unfortunately, as of January 2014, no company offers software that can capture and transmit this data from mobile devices that run the Android, iOS, Windows, or BlackBerry operating systems. It’s especially the camera and voice data that help researchers understand how participants feel—are they puzzled, frustrated, or delighted?

Not giving up

Development teams tend to be made up of tenacious and skilled people—including business analysts, designers, and developers—and they’ll follow the evidence. As practitioners, we want development teams to let go of the old ways and evolve toward evidence-based, user-centered design. And we’ll continue to look for ways to measure the empirical performance and the emotional impact of our designs, through usability testing.

Usability testing is an excellent way to show development teams what’s usable. Building a product that is measurably more usable leads to persuasive case studies that show the benefit of usability and user experience. It’s too bad that easy measurement tools are currently missing for apps that run on mobile devices. But a few challenges haven’t stopped us before, and they won’t stop us now.

Dec 02, 2013

Online, I found a lecture about user-experience design that Jeremy Lyon gave to software developers at Stanford University. Lyon explained how important it is for the software’s visual design to reinforce the software’s use and meaning. He listed five principles that he applies:

  • Balance.
  • Rhythm.
  • Dominance.
  • Motion.
  • Unity.

For fun, I decided to use these five visual-design principles to assess a recent project I worked on with a team of developers. In this article, I’m only illustrating a small portion of our design—an application that helps people set up a corporate event, issue formal invitations, and then track all related communication that results.

Balance

Lyon told his students that balance reduces friction from the extra processing the brain must do when elements are out of balance, visually. In this illustration, the group of boxes is imbalanced because one button sticks out beyond the boxes:

These elements are visually unbalanced.

In contrast, in our design we visually balanced the boxes and button:

The elements are visually balanced.

Rhythm

A common way to create visual rhythm, Lyon said, is by repetition. When a visual detail gets repeated, it increases predictability. One example of visual rhythm is a list; another is a data-entry form that has a series of boxes. Rhythm comes from a regular visual beat: consistent spacing, consistent weight, consistent colour, and so on. During development, the rhythm of our software’s user interface was disrupted by the uneven spacing of the boxes and labels:

These elements are not rhythmically spaced.

During development, we noticed this problem, and we fixed it by aligning and evenly spacing the boxes and labels.

Dominance

In some user interfaces, one element needs to be dominant. In an online store, the Buy button may be larger. In your email Inbox, new email may have more weight by using bold text or colour. In the first example, above, the button draws attention to itself because it is placed outside the visually balanced boxes. In the example, below, the large and bold text, “Event details,” signals that the other elements are lower in the organizational hierarchy. This provides cues that help people understand and complete the task faster.

The larger, bolder element has dominance.

Motion

Movement can reinforce meaning by signalling relationships and hierarchy between elements. In his lecture, Lyon named various types of motion: zooming, sliding, scrolling, and panning. In our design, we used motion to “slide in” additional boxes in the middle of the set. This technique is called progressive disclosure, and this illustration shows how, in our design, two new boxes slide out below a larger box:

The new boxes show they are related to the one they slide out of.

After sliding out, the user can enter additional details. Most users of this software want to accept the company’s default start time for events:

Two boxes and their labels slid out below the larger box.
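The show-and-hide logic behind progressive disclosure can be sketched as a small state model: child fields stay hidden until the user expands their parent. This is a minimal illustration under assumed names (`Field`, `DisclosureGroup`), not the code of our actual product.

```typescript
// Minimal sketch of progressive disclosure: fields are hidden
// until the user expands their parent field.
interface Field {
  label: string;
  visible: boolean;
}

class DisclosureGroup {
  constructor(private parent: Field, private children: Field[]) {}

  // "Slide out" the child fields below the parent.
  expand(): void {
    this.children.forEach((c) => (c.visible = true));
  }

  // Hide the child fields again.
  collapse(): void {
    this.children.forEach((c) => (c.visible = false));
  }

  // Labels the user currently sees, parent first.
  visibleLabels(): string[] {
    return [this.parent, ...this.children]
      .filter((f) => f.visible)
      .map((f) => f.label);
  }
}
```

For instance, a “Start time” field could start alone on screen; calling `expand()` reveals its hour and minute boxes, matching the slide-out behaviour in the illustration.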

During the design stage, we realized that motion alone wasn’t clear enough to signal that the additional elements were part of a group, which leads to the next point.

Unity

Unity is Lyon’s word for visually showing which elements belong together. To provide visual grouping, we could have used spacing, colour, shape, weight, placement, and more. For example, in this illustration, you’ll see two groups of circles, two groups of triangles, and two groups of squares.

You see two groups of circles, two groups of triangles, and two groups of squares.

Lyon also identified enclosure—providing a visual fence—as a way to group objects. In our design, we used colour and enclosure to visually identify the group:

Colour and enclosure visually identify the group.

Success!

By assessing our work, I confirmed that we successfully applied all five visual-design principles that Lyon listed in his lecture. Below is the as-designed portion of the software on which the above illustrations are based.

The example as designed.

If you want to hear about these five visual-design principles—and more—from the source, you can watch the Jeremy Lyon lecture about mobile user-experience design on YouTube.