Photos help user personas succeed

If your user persona includes an image, which type of image helps the team produce designs that are more usable?

[Image: frank-long-style-user-persona-pic]

The illustration on the left?  Or the photo on the right?

According to Frank Long’s research paper, “Real or Imaginary: The effectiveness of using personas in product design,” photos are better than illustrations. Teams whose user personas include a photograph of the persona produce designs that rate higher when assessed with Nielsen’s heuristics for UI design.

Frank Long compared the design output of three groups, drawn from his students at National College of Art and Design (NCAD) in Ireland, in a specific design project. Over the five-week project, two groups used user personas of different formats. One group was the control group, so they worked without user personas. The experiment looked for differences in the heuristic assessments of their designs.

Photos—versus illustrations—are one of the ways I’ve engaged project teams with the user personas that I researched and wrote for them. Here’s a teaser:

Usability of a potential design

Three-quarters of the way through a Five Sketches™ session, to help iterate and reduce the number of possible design solutions, the team turns to analysis. This includes a usability analysis.

[Image: generative-design-stage-3]

After ① informing and defining the problem (without judgement) and ② generating and sketching lots of ideas (without judgement), it’s often a relief for the team to start ③ analysing and judging the potential solutions, taking into account the project’s business goals, development goals, and usability goals.

But what are the usability goals? How can a team quickly assess whether potential designs meet those usability goals? One easy answer is to provide the team with a project-appropriate checklist.

Make your own checklist. You can make your own or find one on the Internet. To make your own, start with a textbook that you’ve found helpful and inspiring. For me, that’s About Face by Alan Cooper. To this, I add things that my experience tells me will help the team—my “favourites” or my pet peeves. In this last category I might consult the Ribbon section of the Vista UX Guide, the User Interface section of the iPhone human-interface guidelines, and so on.

[Video: /wp-content/uploads/2009/04/make-usability-checklists.wmv]

Heuristics at the design stage

On the IxDA’s discussion list for interaction designers, Liam Greig posted his “human friendly” version of a heuristics checklist, based on Nielsen’s originals and the ISO standard on the ergonomics of human-system interaction.

Here are just the headings and the human-friendly questions, which are useful at a project’s design stage.

A design should be…

  • transparent. Ask: Where am I? What are my options?
  • responsive. Ask: What is happening right now? Am I getting what I need?
  • considerate. Ask: Does this make sense to me?
  • supportive. Ask: Can I focus on my task? Do I feel frustrated?
  • consistent. Ask: Are my expectations accurate?
  • forgiving. Ask: Are mistakes easy to fix? Does the technology blame me for errors?
  • guiding. Ask: Do I know where to go for help?
  • accommodating. Ask: Am I in control? Am I afraid to make mistakes?
  • flexible. Ask: Can I customize my experience?
  • intelligent. Ask: Does the technology know who I am? Did the technology remember the way I left things?
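The checklist above can also double as a lightweight scoring aid when a team compares candidate sketches. Here is a minimal sketch of that idea in Python (my own illustration, not part of Greig’s checklist; the 0–2 scoring scheme and all names are assumptions):

```python
# Liam Greig's ten heuristic headings, captured as a checklist.
HEURISTICS = [
    "transparent", "responsive", "considerate", "supportive", "consistent",
    "forgiving", "guiding", "accommodating", "flexible", "intelligent",
]

def rate_design(scores: dict[str, int]) -> int:
    """Sum per-heuristic scores (0 = fails, 1 = partial, 2 = meets).

    The 0-2 scale is an assumed convention for illustration only.
    """
    missing = set(HEURISTICS) - scores.keys()
    if missing:
        raise ValueError(f"unscored heuristics: {sorted(missing)}")
    return sum(scores[h] for h in HEURISTICS)

# Compare two hypothetical candidates from a Five Sketches session:
sketch_a = {h: 2 for h in HEURISTICS} | {"forgiving": 1, "flexible": 0}
sketch_b = {h: 1 for h in HEURISTICS} | {"transparent": 2, "guiding": 2}

print(rate_design(sketch_a))  # 17
print(rate_design(sketch_b))  # 12
```

A higher total only frames the team’s discussion of the candidates; it shouldn’t replace that discussion.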

Sometimes, at the analysis end of a Five Sketches™ ideation-design session, the design participants see more than one path forward. Use heuristics to frame the discussion of how each path rates, to reach a decision faster.

The heuristics can be about more than just usability. You can also assess coding costs, maintenance costs, code stability…. And, of course, you must also assess potential designs against the project’s requirements.

From napkin to Five Sketches™

In 2007, a flash of insight hit me, which led to the development of the Five Sketches™ method for small groups who need to design usable software. Looking back, it was an interesting journey.

The setting. I was working on a two-person usability team faced with six major software and web products to support. We were empowered to do usability, but not design. At the time, the team was in the early stages of Nielsen’s Corporate Usability Maturity model. Design, it was declared, would be the responsibility of the developers, not the usability team. That left me with this challenge:

How to get usable products
from software and web developers
by using a method that is
both reliable and repeatable.

The first attempt. I introduced each development team to the usability basics: user personas, requirements, paper prototyping, heuristics, and standards. Some developers went for usability training. In hindsight, it’s easy to see that none of this could work without a formal design process in place.

The second attempt. I continued to read, to listen, and to ask others for ideas. The answer came as separate pieces, from different sources. For several months, I was fumbling in the metaphorical dark, having no idea that the answer was within reach. Then, after a Microsoft product launch on Thursday, 18 October 2007, the light went on. While we sat on bar stools, the event’s guest speaker, GK Vanpatter, mapped out an idea for me on a cocktail napkin:

  1. Design requires three steps.
  2. Not everyone is comfortable with each of those steps.
  3. You have to help them.

The quadrants on his napkin sketch represent conative preferences, or preferred problem-solving styles.

I recognised that I already had an answer to step 3, because I’d heard Bill Buxton speak at the 2007 UPA conference, four months earlier. I could help developers be comfortable designing by asking them to sketch.

It was easier said than done. Everyone on that first team showed dedication and courage. We had help from a Vancouver-based process expert who skilfully debriefed each of us and then served us a summary of remaining problems to iron out. And, when we were done, we had the beginnings of an ideation-and-design method.

Since then, it’s been refined with additional teams of design participants, and it will be refined further—perhaps changed significantly to suit changing circumstances. But that’s the story of the first year.

Blended usability approach “best”

I received a brochure in the mail, titled Time for a Tune-up: How improving usability can improve the fortunes of your web site. It recommends this blend of usability methods:

  • Expert reviews focus on workflows and give the best results when the scope is clearly defined.
  • Usability studies are more time-consuming than expert reviews, but are the best way to uncover real issues.
  • Competitor benchmarking looks at the wider context.

The brochure was written by online-marketing consultants with Web sites in mind, but its content is also relevant to other development activities.

Effectiveness of usability evaluation

Do you ever wonder how effective expert reviews and usability tests are? Apparently, they can be pretty good.

Rolf Molich and others have conducted a series of comparative usability evaluation (CUE) studies, in which a number of teams evaluate the same version of a web site or application. The teams chose their own preferred methods—such as an expert review or a usability test. Their reports were then evaluated and compared by a small group of experts.

What the first six CUE studies found

About 30% of reported problems were found by multiple teams. The remaining 70% were found by a single team only. Fortunately, of that 70%, only 12% were serious problems. In one CUE study, the average number of reported problems was 9, so a team would be overlooking 1 or 2 serious problems. The process isn’t perfect, but teams found 80% or more of the serious problems.

Teams that used usability testing found more usability problems than expert reviewers. However, expert reviewers are more productive (they found more issues per hour), as this graph from the fourth CUE study illustrates:

[Image: CUE study 4 results]

Teams that found the most usability problems (over 15 when the average was 9) needed much more time than the other teams, as illustrated in the above graph. Apparently, finding the last few problems takes up the most time.

The CUE studies do not consider the politics of usability and software development. Are your developers sceptical of usability benefits? Usability studies provide credibility-boosting video as visual evidence. Are your developers using an Agile method? Expert reviews provide quick feedback for each iteration.

To learn more about comparative usability evaluation, read about the findings of all six CUE studies.