At a Glance
Screener questions identify specific contributors for your tests. These best practices will help your team write screener questions that find the right contributors.
Are you interested in learning more about writing great test plans? Register to attend one of our upcoming virtual Live Training Events. You'll learn from a Platform expert how to structure the flow of your test to make it clear to contributors and how to get the best insights possible.
Click on the following header titles to skip to that section of the article:
- When should you test with your exact target market?
- Guidelines for screener questions
- How to check that your screener is capturing the right users
- Screeners based on familiarity with a product
- Screeners based on the frequency of use
- Screeners based on industry or occupation
- Screeners that deal with personal information
When should you test with your exact target market?
Many UX thought leaders encourage researchers not to be too granular about the users included in their studies. After all, the vast majority of products should be clear and intuitive enough that anyone can figure them out.
However, there are many circumstances in which researchers need insights from a particular type of user, because only those users can say whether the tool would be helpful in their work.
If you're in one of those circumstances, and you're testing remotely, you need to use screener questions—multiple-choice questions that can either eliminate users from taking part in your study or give them access to it.
UserTesting's Research Team knows firsthand how important (and challenging) it is to write solid screener questions. Below are some guidelines and examples to help you get just the right user for your next remote, unmoderated user test.
Guidelines for screener questions
Many of the guidelines for writing good screener questions are the same as the guidelines for writing great Multiple Choice questions:
1. Always provide a "None of the above," "I don't know," or "Other" option in case you've forgotten to include an answer that applies to the user, or the user is confused by the question. This is especially important in screeners: if users don't have this option and pick an answer at random, they might accidentally end up in your test.
Poor example: Which of the following social networks do you have an account with?
Should be...: Which of the following social networks do you have an account with?
- None of the above
2. Provide clear and distinct answers that don't overlap each other.
Poor example: How many salespeople does your team support?
- 1 - 10
- 10 - 30
- 30 or more
Should be...: How many salespeople does your team support?
- 1 - 10
- 11 - 29
- 30 or more
- I'm not sure
3. Avoid asking leading questions; users will be inclined to give you the answer they think you want instead of the one that applies to them. Instead of asking a direct question, instruct users to select the option that most closely applies to them, followed by a list of statements. This is the most neutral way to phrase most screeners, and it encourages honest answers because it's less obvious which answer is desired.
Poor example: Do you like shopping online?
- Of course, I do it often
- I never do it
Should be...: What are your thoughts on shopping online?
- I like shopping online
- I do not like shopping online
- I'm indifferent about shopping online
- I don't shop online
4. Avoid asking yes/no questions so that people can't guess the "right" answer.
Poor example: Do you work for Microsoft?
Should be...: Which of the following companies do you work for?
- Microsoft
- None of the above
5. Avoid double-barrelled questions so that you give people time to process and respond to one thing at a time.
Poor example: How dissatisfied or satisfied are you with the pay and work benefits of your current job?
Should be two questions: How dissatisfied or satisfied are you with the pay of your current job? How dissatisfied or satisfied are you with the work benefits of your current job?
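Guideline 2's advice about distinct, contiguous answer ranges can even be sanity-checked mechanically before a test launches. The following is a hypothetical Python sketch, not part of any UserTesting feature; the helper name and the numeric bounds (including the stand-in upper bound of 100 for "30 or more") are assumptions for illustration:

```python
# Hypothetical helper that flags overlapping or gapped numeric answer ranges.
# Bounds are inclusive integers; the option lists below mirror the salespeople
# example from guideline 2 and are purely illustrative.

def check_ranges(options):
    """options: list of (low, high) tuples with inclusive integer bounds."""
    issues = []
    ordered = sorted(options)
    for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
        if lo2 <= hi1:
            issues.append(f"overlap: {lo1}-{hi1} and {lo2}-{hi2}")
        elif lo2 > hi1 + 1:
            issues.append(f"gap: nothing covers {hi1 + 1} to {lo2 - 1}")
    return issues

# The poor example's options share the boundary values 10 and 30:
print(check_ranges([(1, 10), (10, 30), (30, 100)]))  # two overlap warnings
# The corrected options are distinct and contiguous:
print(check_ranges([(1, 10), (11, 29), (30, 100)]))  # []
```

A check like this catches the subtle off-by-one problems ("10 - 30" followed by "30 or more") that are easy to miss when writing options by hand.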
How to check that your screener is capturing the right users
Suppose you need someone with a particular background (like a medical degree) or someone who is going through a specific experience (like shopping for a new car). In that case, we recommend that, in addition to screeners, you use the first task of your test to verify this:
"You indicated in the screener questions that you are currently shopping for a new car. Please describe what kind of car you are looking for, where you have looked so far, etc."
Sometimes, just listening to a user describe their experience can let you know if they're the right fit.
Screeners based on familiarity with a product
One of the most common kinds of screener questions captures a user's level of familiarity with a product or brand. Sometimes researchers need fresh users to test a new tutorial for their app; other times they want insight from their most frequent users.
Whatever the case, don't ask point-blank whether users fit the mold; people are naturally inclined to say yes just to please you! Instead, ask users to indicate their familiarity, and define the different levels of understanding in the answer options.
Screeners based on the frequency of use
Similar rules apply to the related, and equally popular, frequency-of-use screener. As with experience levels, it's essential to define frequency in concrete terms, not just "rarely," "sometimes," "often," etc.
Another common screener related to frequency of use concerns how recently a user has participated in a specific activity. For example, many e-commerce product researchers prefer to hear from users who purchase items online often, and many travel product researchers want to hear from those planning a trip within the coming year.
In those cases, it may be wise to create two screeners: one to confirm that they purchase items online/have an upcoming trip and then a follow-up screener to determine time frames.
Screeners based on industry or occupation
Another occasion when multiple screeners might be needed to reveal a single characteristic would be when you need users within a particular occupation.
For example, a massage therapy retailer might want to hear from people in the massage therapy industry.
Obviously, massage therapy is a very specific profession, and it would be hard to come up with an exhaustive list of options inside of one screener question. But you also want to avoid asking a yes/no question, so you might start by listing broader professional categories, including Health (which would encompass massage therapy). In a follow-up screener, have users indicate their role within the Health industry.
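The two-screener funnel described above can be sketched in code. This is a hypothetical illustration, not a UserTesting feature; the question wording, option lists, and accepted answers are all assumptions made for the example:

```python
# Hypothetical two-stage screener funnel: a broad occupation question first,
# then a follow-up that narrows within the matching category. All question
# text, options, and "accept" sets below are invented for illustration.

SCREENERS = [
    {
        "question": "Which category best describes your occupation?",
        "options": ["Education", "Finance", "Health", "Technology",
                    "None of the above"],
        "accept": {"Health"},
    },
    {
        "question": "Which best describes your role in the Health industry?",
        "options": ["Nurse", "Physician", "Massage therapist", "Pharmacist",
                    "Other"],
        "accept": {"Massage therapist"},
    },
]

def qualifies(answers):
    """answers maps question text to the option the user selected."""
    for screener in SCREENERS:
        if answers.get(screener["question"]) not in screener["accept"]:
            return False  # screened out; later questions are never reached
    return True

print(qualifies({SCREENERS[0]["question"]: "Health",
                 SCREENERS[1]["question"]: "Massage therapist"}))  # True
print(qualifies({SCREENERS[0]["question"]: "Finance"}))            # False
```

Chaining screeners this way keeps each individual question short and non-leading while still isolating a very specific profession.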
Screeners that deal with personal information
The last type of screener that the UserTesting Research Team relies on frequently involves users providing sensitive information, such as their income, race, Facebook profile, or body type.
If the study requires the contributor to disclose sensitive personal information during the test, it's important to forewarn them with a screener question. We recommend asking these types of screener questions first so that contributors don't waste time if they don't want to opt in.
If your study involves Protected Health Information (“PHI”), please review our article on collecting insights under HIPAA. Only accept users who are willing to be open about this personal information.