Avoid and handle imposter participants (UserTesting)

Learn strategies for writing effective screeners and asking in-test questions to validate that your participants are who they say they are.

This article applies to: UserTesting

Avoiding imposter participants

The most important way to avoid participants who misrepresent themselves is to set up your screeners, scenarios, and test questions to minimize exposure to bad actors.

Defining audiences

  • Join the UserTesting Network.
    • Understanding the participant experience, from completing screener questions and reviewing scenarios to actually taking tests, is one of the best ways to improve your tests.
  • Write effective screener questions.
    • One of the best ways to reduce imposter participants is to make it hard for an unqualified participant to easily guess the right answer.
    • In addition to following our screener best practices, here are some techniques we recommend:
      • Embed “dummy” responses.
        • Include answers that are fake or made up.
        • These answers can consist of a list of products a person might use in their current role, brands of food a person might feed their dog, or a treatment they might receive for a medical condition.
        • If the person selects the dummy answer, they are "rejected" from the test.
      • Include product or industry-specific questions and jargon.
        • Create screener questions that only someone with specific knowledge of your product, experience, or industry could answer.
      • Ask participants to validate their experience.
        • Request that participants validate their experience or demographics as part of the test (e.g., share their LinkedIn profile).
        • However, you must notify participants in a screener question that you will ask them to share sensitive personal information, allowing them to opt out.

        Example:

        During this test, you will be required to share your LinkedIn profile. This information will only be used to validate your experience and will not be shared. Do you consent to provide this information?

        • Yes, I consent to share my LinkedIn profile to participate in this test. (Must select)
        • No, I do not consent to share my LinkedIn profile to participate in this test. (Disqualify)

        Read more about asking for sensitive personal information and PII.
      • Utilize saved screener questions.

        • Save your high-quality screener questions to speed up your test creation process, and make them available for others on your team to deploy best practices.

        • However, you should adjust these questions, especially the dummy answers, periodically to prevent unwanted participants from recognizing patterns in your questions.

      • Add a screener question about fraud.

        • Add a screener question that reminds people that if they misrepresent themselves, they will be subject to removal and/or non-payment.

          Example:

          Do you agree to share your own thoughts and experiences? As a reminder, if you misrepresent yourself in any way, you could risk not getting paid for this test and/or being removed from the panel based on UserTesting's Terms of Service. 

          • I agree (Must select)
          • I do not agree (Disqualify)

          💡 Take our on-demand course for additional screener best practices.

  • Don’t give away too much information when you write a scenario.
    • Remember that the scenario is shared with the participant after they pass the screener questions but before they start the test.
    • Giving away too much information allows an imposter participant to get familiar with the topic before seeing any questions.
  • Save high-quality participants to test with in the future.
    • Marking trusted participants as favorite contributors is a great way to gather feedback from people you already know are reliable.
  • Recruit your own.
    • Using your own participants via Invite Network or Custom Network can help with quality if you are confident you know who you are inviting to the study.
    • However, if you are sharing a test link on social media, as part of an intercept study, or through other approaches where you don’t know the participant's identity, you will need to validate identity within the study.
    • Be conscious of asking for sensitive personal information.

 

Data collection

Even with preventative measures in place, it is still a good idea to build an unmoderated test plan that helps you quickly identify invalid responses:

  • Validating identity.
    • It is a good practice to ask participants to validate or expand on their qualifications at the beginning of the test.
    • Focusing on industry- or product-specific topics that are not easily found online is a great way to quickly identify fraudulent responses.
  • Video capture.
    • Capturing visual cues that signal someone is misrepresenting their identity or experiences may also help.
    • Turning on Contributor View to capture someone’s reaction as they provide feedback can help. Imposters are often reluctant to turn on their cameras, and camera footage makes it easier for UserTesting to track down serial imposters.
  • Moderated sessions.
    • Imposter participants during live moderated sessions or interviews can be easier to spot, but extracting yourself from the conversation may be uncomfortable when something feels off. The following are signals to look for:   
      • Stalling more than a typical participant
      • Faking connectivity issues
      • Occasionally contradicting themselves
      • Appearing to read content rather than sharing their own thoughts
      • Seeming to have multiple other windows open on their computer

If you have doubts about the validity of a participant's responses after probing, you should politely excuse yourself from the conversation. Simply state that you think there was a mismatch with the audience you were hoping to speak with, thank them for their time, and flag the session to UserTesting immediately (see the Feedback analysis section below for details).

 

Feedback analysis

The last way to identify imposter participants is during the analysis phase. Here, you are looking for signals that a response is invalid:

  • Review visual cues and responses to validation questions at the beginning of the test. 
    You might also note that some responses are especially brief, lack depth and detail, or have an unusually high number of pauses.
  • Answers across multiple questions may be inconsistent in terms of chronological details. 
  • While it’s hard to identify an outlier with a small sample size, healthy skepticism is warranted if one of the responses is completely different from the other perspectives expressed, especially in conjunction with some of the other signals mentioned previously (e.g., brevity).


Handling suspicious activity

  • Be sure to rate contributors who provide poor feedback for moderated and unmoderated studies. We manually review all 1- and 2-star rated sessions.
  • You can also report a problem for any session that did not meet your expectations and flag whether the participant didn’t follow instructions, was unclear, provided insufficient feedback, used ChatGPT to generate a response, didn’t match the requested demographics, and so on.
  • We take this feedback seriously and will work with you to remedy the situation. We also investigate the participant and remove them from the network if they’ve violated any of the Terms of Service or the Code of Conduct they agreed to when signing up.

 

Rating sessions

  • With UserTesting, you can rate individual sessions.
  • Our team of specialists always reviews one- and two-star rated sessions. Adding a reason for a poor rating helps the team reviewing the session know what to look for.
  • Participants with bad ratings are less likely to receive invitations to tests. Conversely, giving positive feedback to good participants helps them continue to receive tests in their dashboard, and is encouraged.

 

Reporting a problem

  • In addition to rating the participant, you can report a problem by selecting the Actions button next to the session.
  • This feedback also gets sent to our Support team to help them troubleshoot issues you may experience. If the participant is found to have violated any of the Terms of Service or the Code of Conduct, we will remove the participant and refill that session when possible, or if not, refund the session. 

 

Contacting Support

  • Lastly, you can reach out to Support to help remedy issues you have with contributors who provide invalid feedback.
  • Availability of support channels (online chat, email, and phone) depends on your subscription.

 
