1. Ask open-ended questions
- Ask questions that cannot be answered with “yes” or “no” and allow for a richer explanation
- Don't give participants canned choices (as framed by you); let them explain in their own words
- Watch out for “or” — you may find yourself framing choices, “do you think X or do you think Y?”
- Instead ask, “what do you think about topic Z?”
2. Ask for specific, past examples
- “Can you tell me about the last time that happened?”
- “Can you walk me through a specific example?”
- People tend to self-report how they would like to be and act, rather than what they actually did
- Concrete past examples are the best way to try to uncover true behavior when observation is not an option
- Generalizations wash away the specific, human incidents that we care about (there is no “average” experience)
- If you hear the words “typically,” “generally,” or “usually,” you know you need to back up and ask for specifics again
- “And let's go back to the specific example we started with... what happened next?”
3. Continue to ask why
- Don’t fill in the gaps with your own assumptions about what a person was doing or how they were thinking
- “Why is that hard?”
- “Why did you do that?”
- “Why do you say it feels like a game?”
- Don't worry if it sounds like a stupid question—you can even say “this might sound like a stupid question...” then ask why
4. Let the participant lead
- Don’t adjust a person’s point of view or correct them!
- Sometimes a participant will answer a question in a different way than you expected: their top-of-mind reaction helps indicate what is important to them
- Follow up on the participant's point of view before restating or reframing your question
- Don't lose the opportunity to understand a new perspective by writing it off — be willing to identify, accept, and further investigate users' different ways of thinking about the product/service/workflow
5. Allow for uncomfortable silence
- Ask a question and allow your participant time to internalize it and answer how they will
- It takes some practice: don't feel the need to fill the silence or interrupt a participant's thought
- If somebody can't, or won't, answer a question, they will let you know
Bonus — if you're evaluating a specific feature or product:
6. Never, ever, ask “do you like it?”
- It violates rule #1
- There's no cost to saying “yes” even if the answer is no
- “Liking” or “not liking” isn't meaningfully correlated with behavior, use, or understanding
- “Do you like doing your taxes, or taking out the trash?” This question yields no useful information about your specific product context or the goals you're trying to achieve
- A user may like your feature or product but have 15 or 20 things that are more important to them; this question gives no insight into your product's value
This guide was originally written by Dave Hora and used with permission from Dave's Research Company.
For an in-depth guide to interviewing users, please go straight to the source: Steve Portigal’s book, Interviewing Users.