Unreliability of self-reported user data

Many people are bad at estimating how often and how long they’re on the phone. Interestingly, you can predict who will overestimate and who will underestimate their phone usage, according to the 2009 study, “Factors influencing self-report of mobile phone use” by Dr Lada Timotijevic et al. For this study, a self-reported estimate is considered accurate if it is within 10% of the actual number:

Percentage of respondents in each group who underestimated, accurately estimated, or overestimated:

Number of phone calls     Underestimated   Accurate   Overestimated
High user                 71%              10%        19%
Medium user               53%              21%        26%
Low user                  33%              16%        51%

Duration of phone calls   Underestimated   Accurate   Overestimated
High user                 41%              20%        39%
Medium user               27%              17%        56%
Low user                  13%              6%         81%

If people are bad at estimating their phone use, does this mean that people are bad at all self-reporting tasks?

Not surprisingly, it depends how long it’s been since the event they’re trying to remember, among other factors. Here are some findings that should convince you to be careful with the self-reported user data you collect.

What’s the problem with self-reported data?

On questions that ask respondents to remember and count specific events, people frequently have trouble because their ability to recall is limited. Instead of answering “I’m not sure,” people typically use partial information from memory to construct or infer a number. In 1987, N.M. Bradburn et al. found that U.S. respondents to various surveys had trouble answering such questions as:

  • During the last 2 weeks, on days when you drank liquor, about how many drinks did you have?
  • During the past 12 months, how many visits did you make to a dentist?
  • When did you last work at a full-time job?

To complicate matters, not all self-reported data is suspect. Can you predict which data is likely to be accurate and which is not?

  • Self-reported Madagascar crayfish harvesting—quantities, effort, and harvesting locations—collected in interviews was shown to be reliable (Julia P. G. Jones et al., 2008).
  • Self-reported eating behaviour by people with binge-eating disorders was shown to be “acceptably” reliable, especially for bulimic episodes (Carlos M. Grilo et al., 2001).
  • Self-reported condom use was shown to be accurate over the medium term, but not in the short term or the long term (James Jaccard et al., 1995).
  • Self-reported numbers of sex partners were underreported, and sexual experiences and condom use overreported, a year later when compared with self-reports made at the time (Maryanne Garry et al., 2002).
  • Self-reported questions about family background, such as a father’s employment, result in “seriously biased” research findings in studies of social mobility in the Netherlands—by as much as 41% (Jannes de Vries and Paul M. de Graaf, 2008).
  • Participation in a weekly worship service is overreported in U.S. polls: polls say 40%, but attendance data says 22% (C. Kirk Hadaway and Penny Long Marler, 2005).
Can you improve self-reported data that you collect?

Yes, you can. Consider these approaches:

  • Decomposition into categories. Estimates of credit-card spending get more accurate if respondents are asked for separate estimates of their expenditures on, say, entertainment, clothing, travel, and so on (J. Srivastava and P. Raghubir, 2002).
  • For your quantitative or qualitative usability research, or other user research, it’s easy to write your survey questions or your lines of inquiry so they ask for data in this decomposed form.
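
To make decomposition concrete, here is a minimal Python sketch of tallying decomposed responses. The spending categories, question wording, and response amounts are all hypothetical, and this is only one way to structure it:

```python
# Instead of one "How much did you spend last month?" question,
# ask one narrow question per category and compute the total yourself.
DECOMPOSED_QUESTIONS = {
    "entertainment": "Last month, about how much did you spend on entertainment?",
    "clothing": "Last month, about how much did you spend on clothing?",
    "travel": "Last month, about how much did you spend on travel?",
}

def total_estimate(responses):
    """Sum the per-category estimates into the overall figure you want."""
    return sum(responses[category] for category in DECOMPOSED_QUESTIONS)

# Hypothetical respondent:
responses = {"entertainment": 120, "clothing": 80, "travel": 200}
print(total_estimate(responses))  # 400
```

The respondent only ever sees the narrow per-category questions, which are easier to recall accurately than the grand total.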

  • Real-time data collection. Collecting self-reported real-time data from patients in their natural environments “holds considerable promise” for reducing bias (Michael R. Hufford and Saul Shiffman, 2002).
  • That finding is from 2002. Social-media tools and handheld devices now make real-time data collection more affordable and less unnatural. For example, use text messages or Twitter to send reminders and receive immediate, private responses.

  • Fuzzy-set collection methods. Fuzzy-set representations provide a more complete and detailed description of what participants recall about past drug use (Georg E. Matt et al., 2003).
  • If you’re afraid of math but want to get into fuzzy sets, try a textbook (for example, Fuzzy-Set Social Science by Charles Ragin), audit a fuzzy-math course for the social sciences (auditing is a low-stakes way to get things explained), or hire a math or sociology/anthropology tutor to teach it to you.
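
The fuzzy-set idea, roughly, is to record a recalled quantity as a graded range rather than forcing a single number. Here is a minimal Python sketch using a triangular membership function and a hypothetical recall of “about 10 times, certainly between 5 and 20”; it illustrates the representation, not the specific method in Matt et al.:

```python
def triangular_membership(x, low, peak, high):
    """Degree (0.0 to 1.0) to which x belongs to the fuzzy set
    'about peak, certainly between low and high'."""
    if x <= low or x >= high:
        return 0.0          # outside the range the respondent is sure about
    if x < peak:
        return (x - low) / (peak - low)    # rising edge
    if x > peak:
        return (high - x) / (high - peak)  # falling edge
    return 1.0              # x is exactly the "about" value

# Hypothetical recall: "about 10 times, certainly between 5 and 20"
print(triangular_membership(10, 5, 10, 20))  # 1.0
print(triangular_membership(15, 5, 10, 20))  # 0.5
print(triangular_membership(4, 5, 10, 20))   # 0.0
```

Compared with a forced point estimate, this keeps the respondent’s uncertainty in the data instead of discarding it at collection time.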

Also, when there’s a lot at stake, use multiple data sources to examine the extent of self-report response bias, and to determine whether it varies as a function of respondent characteristics or assessment timing (Frances K. Del Boca and Jack Darkes, 2003). Remember that your qualitative research is also one of those data sources.

Researching usability research

I’m conducting ethnographic research into how usability analysts regard usability research.

How will I conduct this research?

I’m conducting a form of community-based participatory research, so members of the community—the research subjects—will help me set the questions or lines of inquiry and will influence the research methods. This is appropriate, since I’m researching people who research—people likely to have a greater awareness of research epistemology and of the range of methods that can be used.

Want to participate?

If you have conducted any usability research at all, and you want to participate, please contact me by commenting. (Look for the comment link immediately below.) These comments are private. I’ll be at the 2009 UPA conference in Portland this week, June 8-12, if you want to meet in person.

Replies so far: 3. I have slots for only 8 more.

Software UX/GUI design in education

I was wondering whether the “design” of web sites and software is anything more than “intermediation” (inserting a layer between the user and the raw data), whether “intermediation” is just a synonym for “information architecture,” and whether “design” must therefore be something greater—something that includes the emotional impact of the experience. Or is that last phrase merely another way to say “user-experience design”?

Apparently, it was a day for wondering, because, next, I thought about the many excellent software developers I’ve worked with, and wondered how they would respond to my apparently pointless musings. Then I wondered: would the opinions of my software-development colleagues be informed by their formal education or their work experience, attendance at conferences, or professional development reading? [For me, as a usability practitioner and CUA, it’s all of the above.]

What core competencies are taught?

After this, I wondered how much software developers are formally taught, in school, about user-experience design and user-interface design.

A quick online search led me to the course lists, summarised in the table below, for the different program types offered where I live. I’ve highlighted the two courses that specifically mention interface design. There’s no mention of usability, or of the all-encompassing user experience. There is one program at Capilano College that includes user-experience design, and my own course, Fundamentals of user-interface design, is offered only every two years through one of SFU’s continuing studies programs. Also, I’ve noticed an increase in the proportion of software-development students at monthly Vancouver User Experience events. So change is in the wind.

What’s the situation in your community of practice?

It seems to me there’s a hole in the bucket, but we can mend it. The answer is simple: go back to your school and ask to sit as an industry representative on the academic-advisory committee. The local chapter of your professional association can help open doors. Once appointed to the committee, participate in a curriculum review. This is a slow, formal, and somewhat political process—but it works. It’s a great way for experienced software developers and interaction designers to improve our communities of practice. And it looks good on a resume.

Bachelor degree, Computer science, Simon Fraser University

Artificial Intelligence:
CMPT310 Artificial Intelligence Survey
CMPT411 Knowledge Representation
CMPT412 Computational Vision
CMPT413 Computational Linguistics
CMPT414 Model-Based Computer Vision
CMPT417 Intelligent Systems
CMPT418 Computational Cognitive Architecture
CMPT419 Special Topics in Artificial Intelligence

Computer Graphics and Multimedia:
CMPT361 Introduction to Computer Graphics
CMPT363 User Interface Design
CMPT365 Multimedia Systems
CMPT368 Introduction to Computer Music Theory and Sound Synthesis
CMPT461 Image Synthesis
CMPT464 Geometric Modeling in Computer Graphics
CMPT466 Animation
CMPT467 Visualization
CMPT469 Special Topics in Computer Graphics

Computing Systems:
CMPT300 Operating Systems I
CMPT305 Computer Simulation and Modeling
CMPT371 Data Communications and Networking
CMPT379 Principles of Compiler Design
CMPT401 Operating Systems II
CMPT431 Distributed Systems
CMPT432 Real-time Systems
CMPT433 Embedded Systems
CMPT471 Networking II
CMPT479 Special Topics in Computing Systems
CMPT499 Special Topics in Computer Hardware

Information Systems:
CMPT301 Information Systems Management
CMPT354 Database Systems I
CMPT370 Information System Design
CMPT454 Database Systems II
CMPT456 Information Retrieval and Web Search
CMPT459 Special Topics in Database Systems
CMPT470 Web-based Information Systems
CMPT474 Web Systems Architecture

Programming Languages and Software:
CMPT373 Software Development Methods
CMPT383 Comparative Programming Languages
CMPT384 Symbolic Computing
CMPT473 Software Quality Assurance
CMPT475 Software Engineering II
CMPT477 Introduction to Formal Verification
CMPT480 Foundations of Programming Languages
CMPT481 Functional Programming
CMPT489 Special Topics in Programming Languages

Theoretical Computing Science:
CMPT307 Data Structures and Algorithms
CMPT308 Computability and Complexity
CMPT404 Cryptography and Cryptographic Protocols
CMPT405 Design and Analysis of Computing Algorithms
CMPT406 Computational Geometry
CMPT407 Computational Complexity
CMPT408 Theory of Computer Networks/Communications
CMPT409 Special Topics in Theoretical Computing Science
MACM300 Introduction to Formal Languages and Automata with Applications

Certificate, Software systems development, BC Institute of Technology

SSDP1501 Systems Foundations 1: application development, OOP, C#, Java, fundamentals of programming and program design
SSDP2501 Systems Foundations 2: web-based applications, architecture, web design principles, HTML, XHTML, CSS
SSDP3501 Systems Foundations 3: medium- and large-scale applications, dynamic web technologies, project management, relational databases, security issues of web applications
SSDP4001 Specialty Topics: enterprise-scale applications, ASP.NET, advanced Java
SSDP5001 Projects: practical experience with an internal and an external software-development project

Certificate, Software engineering, University of British Columbia

IE535 Software Teamwork: Taking Ownership for Success
IE520 Introduction to Practical Test Automation
IE523 Agile Development Methodologies
IE527 Applied Practical Test Automation
IE507 Object-Oriented Methods: Object-Oriented Modelling and Development with UML
IE526 Principles and Components of Successful Test Team Management
IE503 Requirements Analysis and Specification: A Practical Approach
IE505 Software and System Testing: Real-World Perspective
IE504 Software Architecture and Iterative Development Process: Managing Risk through Better Architecture
IE510 Software Configuration Management: Controlling Evolution
IE502 The Software Engineering Process
IE506 Software Project Management
IE509 Software Quality Assurance: More Than Just Testing
IE511 Software Team Project
IE525 Strategic Test Analysis and Effective Test Case Design
IE528 Testing for the Global Market
IE508 User Interface Design: Designing an Effective Software Interface

This sugar packet is a movie

Whether it’s ethnographic research, usability research, or marketing research, I’ve learned that the best insights aren’t always gleaned from scheduled research.

Here’s a photo of impromptu research, conducted by Betsy Weber, TechSmith’s product evangelist. I was her research subject. Betsy recorded me pushing sugar packets around a table as I explained how I’d like Camtasia to behave.

Jerome demos an idea to Betsy. Photo by Mastermaq

Betsy takes information like this from the field back to the Camtasia team. There’s no guarantee that my idea will influence future product development, but what this photo shows is that TechSmith listens to its users and customers.

The ongoing stream of research and information that Betsy provides ensures better design of products that will be relevant and satisfying for TechSmith customers down the line.