How to Avoid and Handle Imposter Participants (UserZoom)

Learn strategies for writing effective screeners and asking questions during the test in order to validate that your participants are who they say they are.

 


We understand the importance of having high-quality participants you can trust when collecting feedback and insights. We continually invest in and update our IntelliZoom quality solutions, designed to ensure you receive high-quality feedback by flagging "imposter participants" (individuals who misrepresent themselves and give poor-quality feedback) before their responses reach your dashboard.

There are also many ways to help reduce your exposure to imposter participants when you build a test, from how you recruit participants to data collection and analysis. This article covers best practices and how to report participant feedback issues to UserTesting.

 

Avoiding imposter participants

The most important way to avoid bad participants is to set up your screeners, scenarios, and test questions to help minimize exposure to bad actors.

 

Recruitment

  • Join the IntelliZoom Panel. Experiencing the process firsthand as a participant - from completing screener questions and reviewing Welcome Pages to taking tests - is one of the best ways to improve your own tests.
  • Write good screener questions. One of the best ways to reduce imposter participants is to make it hard for an unqualified participant to easily guess the correct answer. Here are some techniques we recommend:
    • Avoid asking leading questions. Participants will be inclined to give you the answer they think you want instead of the one that actually applies to them. Rather than asking direct questions, instruct participants to select the option that most closely applies to them from a list of statements; this is the most neutral way to phrase most screeners. Because the desired answer is less obvious, participants are more likely to answer honestly.
    • Avoid asking yes/no questions. Make it hard for participants to easily guess the "right" answer.
    • Add a screener question about fraud. Add a screener question that reminds people that if they misrepresent themselves, they will be subject to removal and/or non-payment.

      Example

      Do you agree to share your own thoughts and experiences? As a reminder, if you misrepresent yourself in any way, you could risk not getting paid for this test and/or being removed from the panel based on UserTesting's Terms of Service.

      • I agree (Must select)
      • I do not agree (Disqualify)
       
    • Embed “dummy” responses. When listing out multiple answers to a screener question, it’s good practice to include multiple incorrect and/or fake answers, especially when the participant can select multiple responses.
    • Include product or industry-specific questions and jargon. Create screener questions that only someone with specific knowledge about your product, experience, or industry would be familiar with.
    • Ask participants to validate their experience. You can request that participants validate their experience or demographics as part of a test (e.g., by sharing their LinkedIn profile). However, you must notify participants in the screener questions that you will ask them to share sensitive personal information, allowing them to opt out.

      Example:

      During this test, you will be required to share your LinkedIn profile. This information will only be used for the purposes of validating your experience and will not be shared. Do you consent to provide this information?

      • Yes, I consent to share my LinkedIn profile to participate in this test. (Must select)
      • No, I do not consent to share my LinkedIn profile to participate in this test. (Disqualify)
    • Utilize the saved screener questions library. Writing multiple complex questions designed to weed out imposters each time you create a test takes time. Save your high-quality screener questions to speed up your test creation process and make them available for others on your team, so best practices are easy to deploy. However, you should adjust these questions - especially the dummy answers - periodically to keep unwanted participants from recognizing patterns in your questions.

💡Please review the Screener questions best practices article for additional tips and examples.

  • Don’t give away too much information when you provide a Welcome Page. Remember, the Welcome Page is shared with the participant after they pass the screener questions but before they start the test. Giving away too much information allows an imposter participant to get familiar with the topic before seeing any questions.
  • Recruit your own. Recruiting your own participants can help with quality if you are confident you know who you are inviting to the study. However, if you are sharing a test link on social media, as part of an intercept study, or other approaches where you don’t know the participant's identity, you will need to validate that within the study. Please be conscious of asking for sensitive personal information.

 

Data collection

Even with preventative measures in place, it is still a good idea to build your unmoderated test plan so you can identify invalid responses quickly:

  • Validating identity. Asking participants to validate or expand on their qualifications at the beginning of the test is a good practice. Focusing on industry- or product-specific topics not easily found online is a great way to identify fraudulent responses quickly.
  • Video capture. Capturing visual cues can signal that someone is misrepresenting their identity or experiences. Turn on Camera View to record participants' reactions as they provide feedback. Some imposters are reluctant to turn on their cameras, and video also makes it easier for UserTesting to track down serial imposters.
  • Moderated sessions. Imposter participants can be easier to spot during live moderated sessions or interviews, though extracting yourself from the conversation may be uncomfortable when something feels off. Signals to look for include participants who:
    • Stall more than a typical participant
    • Fake connectivity issues
    • Occasionally contradict themselves
    • Seem to be reading content rather than sharing their own thoughts
    • Appear to have multiple other windows open on their computers

If you still have doubts about the validity of a participant's responses after probing, politely excuse yourself from the conversation. Simply state that you think there was a mismatch with the person you were hoping to speak with, thank them for their time, and flag the session to UserTesting immediately (see Analysis for details).

 

Analysis

The last way to identify imposter participants is during the analysis phase. Here, you are looking for signals that a response is invalid:

  • Review visual cues and responses to validation questions at the beginning of the test. 
  • You might also note that some responses are especially brief, lack depth and detail, or have an unusually high number of pauses.
  • Answers across multiple questions may be inconsistent in terms of chronological details. 
  • While it’s hard to identify an outlier with a small sample size, healthy skepticism is warranted if one of the responses is completely different from the other perspectives expressed, especially in conjunction with some of the other signals mentioned previously (e.g., brevity).

 

Handling suspicious activity

  • If a participant doesn’t meet your expectations for any reason, you can exclude that participant from your study. Please review our help article on excluding or including participants to learn more.
  • Please provide detailed feedback to our Support team on the reason you are excluding a participant from your study, especially if you believe the participant is an imposter. For instance, you can note that the participant didn’t follow instructions, was unclear, gave insufficient feedback, used ChatGPT to generate a response, didn’t match the requested demographics, etc. We take this feedback seriously. We investigate the participant and remove them from the panel if they’ve violated the Terms of Service or the Code of Conduct they agreed to when signing up.

Contacting Support

Lastly, you can reach out to Support to help remedy issues with participants who provide invalid feedback. Availability of support channels (online chat, email, and phone) depends on your subscription. You can reach our support team here.

 

Related Content

