The ease of user research goes in cycles

The tools of user research have evolved substantially over the past three decades, and need to evolve more.

Here’s a history from last century through today, based on my experience.

User researchers have had to learn to test

  • computer software using expensive usability labs,
  • desktop software by using other desktop computers,
  • smartphone apps by using apps,
  • household appliances and outdoor digital experiences the hard way.

Back when usability labs were expensive

At the end of last century, it was possible to measure how software’s user interface performed in the hands of customers, but it required an expensive lab that had cameras, microphones, and seats for observers behind one-way glass. In those days, only large companies had a budget for usability labs, so smaller companies made design decisions based on best guesses and opinions.

I worked as a usability practitioner for a small software firm with 30 developers. The company was still low on the usability capability-maturity model, and I had no access to a usability lab. Instead, I would talk to software developers about the user interfaces we were building. “We can put text in the interface to explain how it works,” I’d say, and: “We can use different controls so it becomes more obvious how to use the product.” The developers and I were all interested in quality, but we didn’t always agree on what quality might look like. We relied on our opinions and personal biases to predict how software interfaces would perform in the hands of customers. I had no evidence.

Then research got easy and affordable

One day, after the turn of the century, I heard about TechSmith Morae. This software was a game changer because it could turn any computer into a usability lab. TechSmith’s product evangelist, Betsy Weber, and her user-research colleague, the late Josie Scott, attended industry conferences and spoke tirelessly about their miracle product, which could record someone’s actions—clicks and typing—along with everything on a computer screen, plus their face and voice. We did have to plug in an external camera and microphone, because back then laptops did not have them built in. Usability practitioners at small software companies embraced Morae. We gave users tasks to do while we used Morae to watch them in action.

Suddenly, we could invite developers and other stakeholders to watch live user testing from an adjacent room, where they could watch and listen in near real time. They could see the participant’s puzzled expressions. They could see where the user was mousing, what they looked at and clicked, and what they overlooked. We could also record everything and then, from the recordings, make video clips to show which parts of our software caused participants to struggle. Suddenly, it was easy to help every team member understand the plight of their customers.

One of my earliest participants, during a half-hour usability test, went from blaming herself to expressing extreme dislike of the product. For the development team, I made a video clip of the key points to show how the participant’s attitude toward the product changed from neutral to extreme dislike, over 28 minutes. The participant gave us permission to use the video for product research, but not for this blog, so I’ve paraphrased the video’s story here:

[Video stills: the test participant after 2, 8, 19, and 28 minutes]
This video was incredibly persuasive because it showed the participant’s emotional reactions. When testing a product’s usability, it’s humbling to see the product fail and awful to watch it cause frustration and anger. But the point is to identify and fix problems before customers encounter them, and to get the development team thinking about involving users during the design process. (Usability testing also allowed me to show user delight at innovative new features that worked well, to reinforce that our user-centred design process was working.)

Around the same time that TechSmith released Morae, Web 2.0 enabled the development of competing tools that partially overlapped with Morae’s features. As cameras and microphones became standard in laptops, these tools let researchers see and hear their users in action. I conducted remote usability tests using Skype video calls, but I had to ship each participant a camera and microphone.

Mobile made research difficult again

Much as we loved early mobile devices, they made usability testing harder, because no software existed that could share the user’s screen.

Diverse operating systems and the free movement of mobile devices presented challenges that we hadn’t seen for almost a decade. There were no tools that provided rich data about installed apps, and no way to capture taps and swipes. To observe users in action, we fastened a smartphone to a large rig so it would remain in view of an overhead camera pointed at the screen. This artificial environment interfered with the two-handed and portable experience of real-world use.

Meanwhile, there was an explosion of apps for smartphones. People learned to turn to their phones rather than their computers. “There’s an app for that.”

Finally, around 2015, there were apps that could run on a smartphone and access its front-facing camera, microphone, and touchscreen simultaneously, to capture and transmit

  • the app’s screen actions,
  • the participant’s voice,
  • the participant’s facial expressions,
  • the participant’s taps and gestures,
  • the participant’s typing,
  • the participant’s speech input.

This rich data let usability researchers test apps on smartphones the same way Morae had allowed us to test software on desktop computers.

It was especially the camera and voice data that helped researchers to understand participants’ feelings—were they puzzled, frustrated, or delighted?

The cycle repeated again as software moved into the cloud. Initially marketed as “SaaS” (Software as a Service), such online services are now commonplace.

The advent of applications like UserZoom, and of asynchronous services that record users’ interactions on websites, allows us to continue gathering rich data with which to assess how our software performs in the hands of users.

Interestingly, since customer data “in the cloud” is often on servers to which the software company has access, there’s another substantial source of data to slice and dice.

Once data is in the cloud, with user permission, server logs and database transactions can answer questions such as “How often does situation X occur?” or “Is there a need for function Z?”
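As a minimal sketch of that kind of analysis: assuming a server emits one JSON object per user event (the event names below are invented for illustration), counting how often a given situation occurs is a one-liner over the log.

```python
import json
from collections import Counter

# Hypothetical server log lines, one JSON event per line.
# The field and event names here are assumptions, not a real schema.
log_lines = [
    '{"user": "u1", "event": "export_pdf"}',
    '{"user": "u2", "event": "export_pdf"}',
    '{"user": "u1", "event": "search"}',
    '{"user": "u3", "event": "search"}',
    '{"user": "u2", "event": "search"}',
]

# "How often does situation X occur?" — tally each event type.
counts = Counter(json.loads(line)["event"] for line in log_lines)

print(counts["search"])      # how often users searched
print(counts["export_pdf"])  # evidence of demand for an export function
```

In practice the same question would be asked of a database or analytics service rather than an in-memory list, but the principle is identical: the evidence is already there, waiting to be counted.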

Changing development and design culture

I’ve seen that regular exposure to user experiences changes a team’s culture from one that

  • uses guesses to make decisions about future products and designs

to a culture that

  • uses evidence to make decisions about future products and designs.

Let me tell you about a significant but quiet personal victory. After two years of consulting with a client’s development team, always pointing them to evidence and data, one day a manager proposed a new function. Developers immediately spoke up to ask: “What does the data tell us about what users need?” I didn’t interrupt the discussion, but I was very proud of them.

New channels, new devices

Over the most recent decade, digital experiences have migrated to all manner of “device”—think of refrigerators (“Remember to buy milk”), smart-home systems (“Hey, Alexa”), share-bikes (“This is your route”), and more. How are user researchers testing these apps and services in natural settings? For smaller companies, it’s back to testing the hard way, with external cameras and microphones, for now.

As digital experiences move into other new arenas, challenges for user researchers will always arise. But the challenges haven’t stopped us before, and they won’t stop us going forward.
