Design better online-video chatting

Last year I worked with a team whose members were mostly on a different continent. Since my job as a usability analyst and interaction designer often requires me to influence, motivate, and give feedback on completed work, I quickly adopted online video chat so I could see my teammates’ non-verbal communication cues.

In the course of my work, I spent many hours chatting online with team members in Australia, India, and Canada. I experimented with camera locations and different video software. I also read about the research of David Nguyen and John Canny in Face-to-Face: Empathy Effects of Video Framing. The researchers explain how the right use of cameras makes an online experience as good as a face-to-face experience. And I combined this with research presented by Byron Reeves and Clifford Nass in The Media Equation: How People Treat Computers, Television and New Media Like Real People and Places. The authors explain how, in many ways, the human brain cannot distinguish between an online experience and a live, in-person experience.

I realised that it’s not just about how I communicate with my team members. As an interaction designer, I can improve the user experience of online video chat and online video calls—for example, in live Support calls—by considering:

    • What is needed to give the illusion of eye contact?
      Since the callers aren’t in the same space, eye contact isn’t real, but it can be simulated, with all the benefits that ensue from actual eye contact. To address this, place the other caller’s video close to your camera.
    • How do we minimise the false non-verbal cues that online experiences can introduce?
      Poor camera position creates cues that aren’t really there, but the viewer’s brain still processes and reacts to them. False cues from apparently looking down can convey boredom, submissiveness, or disrespect. False cues from apparently looking up can convey daydreaming, making things up, or aloofness. To address this, place your camera – and the other caller’s video – at eye level.
    • What exactly should the video include in its frame?
      To get results that are equivalent to a face-to-face meeting, what’s in the frame is critical. For live online video calls, the common heads-only frame is undesirable. To address this, include both your face and your shoulders in the frame.

Since a lot of the above information is best conveyed visually, here’s a video to explain it:

Three common problems with video calls: lack of eye contact, incorrect eye level, and poor framing (distance from the camera).

Usability testing distant users

When a product’s users are scarce and widely dispersed, and your travel budget is limited, usability testing can be a challenge.

Remote testing from North America was part of the answer for me. I’ve never used UserVue because the users I needed to reach were in Africa, Australia, South America, and Asia—continents that UserVue doesn’t reach. Even within North America, UserVue didn’t address the biggest problems I faced:

  • My study participants commonly face restrictive IT policies, so they cannot install our pre-release product and prerequisites.
  • I need to prevent study participants from risking their data by using it with a pre-release product.
  • There’s no way to force an uninstall after the usability test. Who else will see our pre-release?

Instead, I blended a solution of my own from Morae, Skype, Virtual Audio Cable, and GoToMeeting. I used GoToMeeting to share my desktop, which addresses all three of the problems listed above. I used Skype for video and audio, and Virtual Audio Cable to redirect the incoming voice from Skype to Morae Recorder’s microphone channel. Morae recorded everything except the picture-in-picture (PIP) video. It worked. However, my studies were sometimes limited by poor Internet bandwidth to my study participants’ isolated locations.

Amateur facilitators. I realise this is controversial among usability practitioners, but beggars in North America can’t be choosers about how they conduct usability tests on other continents. I developed a one-hour training session for the team of travelling product managers. Training included a short PowerPoint presentation about the concepts, followed by use of Morae Recorder with a webcam and headset while role-playing in pairs. The main points I had to get across:

  • Between study participants, reset the sample data and program defaults.
  • When you’re ready to start recording, first check that the video and audio are in focus and recording.
  • While you facilitate, do not lead the user. Instead, try paraphrasing and active listening (by which I mean vernacular elicitation). Remember that you’re not training the users, so task failure is acceptable—and useful to us.

I had a fair bit of influence over the quality of the research, since I developed the welcome script and test scenarios, provided the sample data, and analysed the Morae recordings once they arrived in North America. Because of the poor Internet bandwidth to my study participants’ isolated areas, the product managers had to ship me the Morae recordings on DVD, by courier.

It worked. I also believe that amateur facilitation gave the product managers an additional opportunity to learn about customers.