Agile design and usability: Will a prototype do?

The Agile Manifesto includes twelve principles. These two are related:

Deliver working software frequently [in short sprints that last] from a couple of weeks to a couple of months, with a preference to the shorter timescale.

Working software is the primary measure of progress.

From discussions with other practitioners, I know that software teams can get dogmatic about these principles, and often work in sprints of one, two, or three weeks to ensure they deliver frequently. Since the classic software-development problem has been delivering the wrong features too late, I understand the desire to deliver working software frequently, with more frequent opportunities for customer feedback.

When the “deliver frequently” principle becomes a fixation, though, other things can get bruised in the rush to produce working software. Consider design and usability research. Sometimes a system’s features are complex and interrelated. Sometimes an elegant and simple solution is elusive. (Technical limitations can contribute to this.) Sometimes a weak design needs testing to make sure it is usable. To ensure design and usability work gets the time it needs, this work is often moved to preceding and successive sprints. This sets the designer and usability analyst apart from the rest of the team.

Here are my ideas to help design and usability work integrate better into short sprints:

  • Ensure a designer understands the entire set of features before the first sprint. Ask for a broad-strokes design for the whole, and share it, so the team has a vision for the user experience. Before a sprint completes, compare the working software to the user-experience vision.
  • Agile tries to keep the deliverable as simple as possible, but simple features from different sprints may not combine into a good user experience. This is a defect, to be fixed like any other bug, before the sprint completes.
  • Ensure that usability and design bugs are resolved by testing with users to validate the changes.
  • When you need work done that substantially shapes the user experience but that is not related to a specific feature, regard this user-experience work just as you regard database, architectural, or other foundational work. Foundation work is easier for the team to prioritise when planning a sprint.

The above suggestions work within the “deliver working software frequently” principle. I have another suggestion that pushes at Agile’s envelope.

It works!

I think the words “working software” are too restrictive. In my opinion, other deliverables can be just as valuable as working software, especially for teams that use very short sprints. A deliverable such as a paper prototype or a low-fidelity interactive prototype is as legitimate a measure of progress as a piece of code, and it allows all team members on an Agile team to focus on design, prototyping, and validation together instead of pushing the designer or usability analyst one sprint ahead of, or behind, the rest of the team.

Leaner, more agile

This week, I’m attending a few days of training in agile software development, in an Innovel course titled Lean, Agile and Scrum for Project Managers and IT Leadership.

My first exposure to agile was in Desiree Sy’s 2005 presentation, Strategy and Tactics for Agile Design: A design case study, to the Usability Professionals Association (UPA) annual conference in Montreal, Canada. It was a popular presentation then, and UPA-conference attendees continue to be interested in agile methods now. This year, at the UPA conference in Portland, USA, a roomful of usability analysts and user-experience practitioners discussed the challenges that agile methods present to their practice. One of the panellists told the room: “Agile is a response to the classic development problem: delivering the wrong product, too late.” There was lots of uncomfortable laughter at this. Then came the second, thought-provoking sentence: “Agile shines a light on the rest of us, since we are now on the critical path.” Wow! So it’s no longer developers, but designers, usability analysts, and other practitioners who are holding up the schedule?

During this week’s training, I’m learning lots while looking for one thing in particular: how to ensure agile methods accommodate non-developer activities, from market-facing product management, to generative product design, to early prototype testing, to usability testing, and so on.

I’m starting to suspect that when agile methods “don’t work” for non-developers, it’s because the process is wagging the dog (or that its “rules” are being applied dogmatically). I think I’m hearing that agile isn’t a set of fixed rules—so not a religion—but a sensible and flexible method that team members can adapt to their specific project and product.

Effectiveness of usability evaluation

Do you ever wonder how effective expert reviews and usability tests are? Apparently, they can be pretty good.

Rolf Molich and others have conducted a series of comparative usability evaluation (CUE) studies, in which a number of teams evaluate the same version of a web site or application. Each team chooses its own preferred method, such as an expert review or a usability test, and a small group of experts then evaluates and compares the teams’ reports.

What the first six CUE studies found

About 30% of reported problems were found by multiple teams; the remaining 70% were found by a single team only. Fortunately, only 12% of that 70% were serious problems. In one CUE study, teams reported an average of 9 problems each, so a typical team would overlook only one or two serious problems. The process isn’t perfect, but teams found 80% or more of the serious problems.
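The arithmetic behind that estimate can be sketched in a few lines. This is a rough, illustrative reading only: it assumes the 70% and 12% shares apply uniformly to a typical team’s report, whereas the CUE reports themselves count problems across the pooled set of all teams.

```python
# Back-of-envelope sketch of the CUE figures quoted above.
# Assumption (mine, not the studies'): the percentage shares apply
# uniformly to a typical team's report of ~9 problems.

reported_per_team = 9        # average problems reported per team (one CUE study)
single_team_share = 0.70     # share of problems found by only one team
serious_within_those = 0.12  # serious problems among single-team findings

# Share of all reported problems that are serious AND easy to miss
# (i.e. found by just one team):
missable_serious_share = single_team_share * serious_within_those  # = 0.084

# Scaled to a typical team's report size:
overlooked = reported_per_team * missable_serious_share  # ≈ 0.76

print(round(missable_serious_share, 3), round(overlooked, 2))
```

On this crude reading, a typical team misses roughly one serious problem, which is in the neighbourhood of the “one or two” quoted above.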

Teams that used usability testing found more usability problems than teams that relied on expert review. Expert reviewers, however, were more productive, finding more issues per hour, as this graph from the fourth CUE study illustrates:

CUE study 4 results

Teams that found the most usability problems (over 15 when the average was 9) needed much more time than the other teams, as illustrated in the above graph. Apparently, finding the last few problems takes the most time.

The CUE studies do not consider the politics of usability and software development. Are your developers sceptical of the benefits of usability work? Usability studies provide credibility-boosting video as visual evidence. Are your developers using an Agile method? Expert reviews provide quick feedback for each iteration.

To learn more about comparative usability evaluation, read about the findings of all six CUE studies.