This is the Lady Gaga of research methodologies:

 

“I’m your biggest fan,” you say to your design.

Your users sit down and say “I’ll bring him down, bring him down, down.”

You look back at your participant: “No he can’t read my poker face.”

 


Setup

If you run your usability test right, your user won’t guess your intentions right off the bat. They’ll work through the tasks while you learn what can be improved in your design. Usability tests are core to determining how your design could function in real-world conditions. The crux of this process involves building a prototype and prompting the user to complete a series of tasks.

Headcount

Aim for 5-6 solid user tests to gain confidence in a design direction. If you are building a feature that may differ between personas or user types, try to test with 4-5 users within each major persona; a feature spanning three personas, for example, means roughly 12-15 sessions.

Basics

In a usability test, you will ask your participant to try to use a prototype or low-fidelity mockup of your product. They will attempt to complete a set of tasks using this mockup while “thinking aloud” about what makes sense to them, what frustrates or confuses them, and what they expected to see along the way.

Usability tests provide you with clear validation to move forward in the design process. The questions and tasks you set up should be focused on the stage you’re at.

In product validation stages, you should be clarifying and validating the minimum requirements for a feature or user experience. In design discovery and design validation phases, you will start by testing the user flow with low-fidelity interfaces. Then, as you ramp up fidelity, you are testing users’ ability to accomplish full tasks independently, without your intervention.

How to Present Each Task

Outline a list of tasks to complete in your interface. For instance, “run this report,” “update a software plugin,” or “share a file with another user.”

Break up tasks into distinct processes that take 2-5 minutes to complete. Try to avoid using the same terminology as the micro-copy in your interface. Similarly, avoid tasks that just ask users to hunt for a button. Define an end goal that the user will need to use the prototype to accomplish.

For instance, a task that asks “Export a list of all invoices that are more than 45 days overdue” will help you test your design far better than one that asks “Filter a report to show overdue invoices, then adjust it to show only invoices that are more than 45 days past due.” The second task hands users the order of operations and the matching micro-copy.

If you are in an earlier stage of the research cycle, you can use more open-ended questions that focus on what the user suspects the goals of this interface are. For example, you could ask: “What does this interface allow you to do?”

Instructions & Thinking Aloud

How to Think Aloud

When conducting any sort of usability test, you’re going to have to instruct participants to vocalize their internal thought processes. This is not something most people feel comfortable doing naturally. Even when prompted, users may ignore the request or quickly stop talking aloud as they work through a task.

To help users get used to thinking aloud, I recommend walking through a simple thinking-aloud demo. Take out your smartphone and walk through taking a photo and texting it to someone on your team, thinking aloud as you go to show what that might look like. Then ask them to do the same and give them a nudge to open up. Laugh a little and break the tension by calling out how awkward or weird it feels.

I try to explain the thinking-aloud process as follows (in this case, for a click-through InVision-style prototype):

  • “So during this next part, we are going to use a prototype of our software. This isn’t totally functional; in fact, a lot of things won’t work, and that’s ok! To start, I’ll give you brief instructions on what you’ll do with this prototype, and then while you click around, I’ll ask that you vocalize your own stream of thoughts. I know it seems weird, but it’s really important you tell me what you think, read, and feel along the way.
  • For instance, if I showed you this camera app [pull out phone camera], you might say, ‘I see this button, and I know you want me to take a photo, so I’ll click that.’ Or you’ll say, ‘I’m looking for the option to share the photo, but all I see are these weird icons.’ Does that make sense?
  • I know it sounds odd, and it might take a minute to get used to, but there isn’t a wrong way to do this. And I’ll prompt you to keep talking if you get a little quiet. Sound good?”

Throughout the process, be sure to prompt them if they get quiet or if they are reading something intently. Say things like “Tell me what you’re thinking” or “Talk to me a bit about this…” After they complete all the tasks, high fives and a round of giggles are in order.

Instructions

When in person, and wherever possible, print out your set of tasks, with one task per page. Ask participants to read the instructions aloud and then continue to think aloud during each task. This consistent reinforcement helps.
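
If you already keep your task list in a digital document, a minimal sketch like the one below can turn it into one-task-per-page sheets ready to print. It is plain Python; the task wording and the task_sheets.txt filename are placeholders, and it relies on form-feed characters, which many plain-text printing tools treat as page breaks.

    # Minimal sketch: write one usability-test task per printed page.
    # Replace the placeholder tasks with your own end-goal tasks.
    tasks = [
        "Export a list of all invoices that are more than 45 days overdue.",
        "Share the exported report with another user on your team.",
        "Update the reporting plugin to the latest version.",
    ]

    with open("task_sheets.txt", "w") as f:
        for number, task in enumerate(tasks, start=1):
            f.write(f"Task {number}\n\n{task}\n")
            if number < len(tasks):
                f.write("\f")  # form feed: starts a new page when printed as plain text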

Prompting participants directly (rather than via printed sheets) can work, but it requires you to be very consistent in how you deliver each instruction. I find this works only if you are able to break tasks down into smaller steps. Usually, I phrase each instruction as “How would you do this?” or “Could you show me how you would do this?” or “Talk me through how you would do XYZ.”

Participants should feel free to ask you questions while they complete the test. However, refrain from responding with long answers; instead, gently turn the question back to them.

For example:

  • Participant: Where should I look for the export option?
  • Researcher: Hmm, where do you think it should be?
  • Participant: I’m not sure…
  • Researcher: If you had to guess…
  • Participant: I was expecting it here. But it’s not there…
  • Researcher: …

Again, don’t guide them. Stay calm and give them space. Encourage them to guess or try to find the solution on their own. Repeat that there are no wrong answers. 

If time is slipping away or users are getting frustrated, you may need to skip ahead or provide slightly more guidance to help them narrow down choices. If they ask for help repeatedly, prompt them: “Where else could that button be?” or “Where else could you look for that option?” or “What buttons do you see?”

After they complete the task with help, ask them, “What would you change about that?” Do your best to avoid giving away answers or using words that match the micro-copy.

Logistics

Whether you are talking to someone in person or moderating a session online, it’s best to introduce yourself as a user researcher, not a designer or product manager.

If it’s reasonable, clarify that you 1) are a researcher and 2) did not work on this particular project directly, but are responsible for bringing feedback back to the team. With users who appear uncomfortable giving critical feedback, or who have a track record of not voicing issues until later, it can help to go as far as saying, “To do my job well today, I need to bring back suggestions to the designers and engineers working on this tool.”

Tools

I find it unbelievably valuable to use a tool to record a user’s screen and their reactions during tests. 

One option is Lookback, which records the user’s screen and captures video of the user as they work through the outlined tasks. You can also use built-in screen recording software like QuickTime or the screen recording tools on iOS/Android. Chorus.ai can record the video and audio of a remote conference call for later review.

Lastly, for quick testing using InVision prototypes, I’ve had great success with InVision’s Live Share feature, combined with one team member on your side recording their screen and the phone call. 

In the tools section, I go into significant detail on these tools and how to best use them.


Next: