How to Avoid and Handle Imposter Contributors (UserTesting)

Learn strategies for writing effective screeners and asking questions during the test to validate that your contributors are who they say they are.

 


We understand the importance of having a high-quality network you can trust when collecting real human insight. We continually invest in and update Contributor Network solutions designed to ensure you receive authentic, high-quality feedback by flagging "imposter contributors" (individuals who misrepresent themselves and provide poor-quality feedback) before their responses reach your dashboard.

There are also many ways to reduce your exposure to imposter contributors when you build a test, from how you recruit contributors to how you collect and analyze data. This article covers best practices and how to report contributor feedback issues to UserTesting.

 

Avoiding imposter contributors

The most effective way to avoid imposter contributors is to set up your screeners, scenarios, and test questions to minimize your exposure to bad actors.

 

Defining audiences

  • Join the Contributor Network. Understanding the contributor experience - from completing screener questions and reviewing scenarios to actually taking tests - is one of the best ways to improve your tests.
  • Write good screener questions. One of the best ways to reduce imposter contributors is to make it hard for an unqualified contributor to guess the right answer. In addition to following our screener best practices, here are some techniques we recommend:
    • Embed “dummy” responses. When creating screener questions, include answers that are fake or made up, such as a fictitious product a person might use in their current role, a made-up brand of dog food, or a nonexistent treatment for a medical condition. If the person selects the dummy answer, they are rejected from the test. For example, a question like “Which of the following tools do you use in your current role?” might list several real tools alongside one invented name marked (Disqualify).
    • Include product or industry-specific questions and jargon. Create screener questions that only someone with specific knowledge about your product, experience, or industry would be familiar with.
    • Ask contributors to validate their experience. You should request that contributors validate their experience or demographics as part of a test (e.g., share their LinkedIn profile). However, you must notify contributors in a screener question that you will ask them to share sensitive personal information, allowing them to opt out. Read more about asking for sensitive personal information and PII here.

      Example:

      During this test, you will be required to share your LinkedIn profile. This information will only be used for the purposes of validating your experience and will not be shared. Do you consent to provide this information?

      • Yes, I consent to share my LinkedIn profile to participate in this test. (Must select)
      • No, I do not consent to share my LinkedIn profile to participate in this test. (Disqualify)
    • Utilize saved screener questions. Writing multiple complex questions designed to weed out imposters each time you create a test takes time. Save your high-quality screener questions to speed up your test creation process and make them available for others on your team. However, you should adjust these questions (especially dummy answers) periodically to keep unwanted contributors from recognizing patterns in your questions.
    • Add a screener question about fraud. Include a question that reminds people that if they misrepresent themselves, they will be subject to removal and/or non-payment.

      Example:

      Do you agree to share your own thoughts and experiences? As a reminder, if you misrepresent yourself in any way, you risk not getting paid for this test and/or being removed from the panel under UserTesting's Terms of Service.

      • I agree (Must select)
      • I do not agree (Disqualify)

💡 Take our on-demand course for additional screener best practices.

  • Don’t give away too much information when you write a scenario. Remember that the scenario is shown to contributors after they pass the screener questions but before they start the test. Giving away too much information lets an imposter contributor get familiar with the topic before seeing any questions.
  • Save high-quality contributors to test with in the future. Use favorite contributors to capture feedback in future studies. This is a great way to gather feedback from trusted contributors.
  • Recruit your own. Using your own contributors via Invite Network or Custom Network can help with quality if you are confident you know who you are inviting to the study. However, if you are sharing a test link on social media, as part of an intercept study, or through other approaches where you don’t know the contributor's identity, you will need to validate that identity within the study. Please be mindful when asking for sensitive personal information.

 

Data collection

Even when you have taken measures to keep imposter contributors from getting through, it is still a good idea to build an unmoderated test plan that lets you quickly identify invalid responses:

  • Validating identity. Asking contributors to validate or expand on their qualifications at the beginning of the test is a good practice. Focusing on industry- or product-specific topics that are not easily found online is a great way to quickly identify fraudulent responses.
  • Video capture. Capturing visual cues that signal someone is misrepresenting their identity or experiences can also help, so turn on Contributor View to record someone’s reaction as they provide feedback. Some imposters are generally more reluctant to turn on their cameras, and video makes it easier for UserTesting to track down serial imposters.
  • Moderated sessions. Imposter contributors can be easier to spot during live moderated sessions or interviews, but extracting yourself from the conversation may be uncomfortable when something feels off. Signals to look for include:
    • Stalling more than a typical contributor
    • Faking connectivity issues
    • Occasionally contradicting themselves
    • Reading content rather than sharing their own thoughts
    • Appearing to have multiple other windows open on their computer

If you have doubts about the validity of a contributor's responses after probing, you should politely excuse yourself from the conversation. Simply state that you think there was a mismatch with the audience you were hoping to speak with, thank them for their time, and flag the session to UserTesting immediately (see Reporting a problem for details).

 

Feedback analysis

The last opportunity to identify imposter contributors is during the analysis phase. Here, you are looking for signals that a response is invalid:

  • Review visual cues and responses to the validation questions at the beginning of the test.
  • Watch for responses that are especially brief, lack depth and detail, or have an unusually high number of pauses.
  • Check whether answers across multiple questions are inconsistent in their chronological details.
  • While it’s hard to identify an outlier with a small sample size, healthy skepticism is warranted if one response is completely different from the other perspectives expressed, especially in conjunction with some of the other signals mentioned above (e.g., brevity).

 

Handling suspicious activity

  • Be sure to rate contributors who provide poor feedback in both moderated and unmoderated studies. We manually review all 1- and 2-star rated sessions.
  • You can also report a problem for any session that did not meet your expectations, flagging, for example, that the contributor didn’t follow instructions, was unclear, provided insufficient feedback, used ChatGPT to generate a response, or didn’t match the requested demographics. We take this feedback seriously and will work with you to remedy the situation. We also investigate the contributor and remove them from the network if they’ve violated the Terms of Service or the Code of Conduct they agreed to when signing up.

 

Rating sessions

With UserTesting, you can rate individual sessions. One- and two-star rated sessions are always reviewed by our team of specialists. Adding a reason for a poor rating helps the team reviewing the session know what to look for. Contributors with poor ratings are less likely to receive test invitations. Conversely, giving positive feedback to good contributors helps them continue to receive tests in their dashboard - and is encouraged.

 

Reporting a problem

In addition to rating the contributor, you can report a problem by selecting the Actions button next to the session.


This feedback also gets sent to our Support team to help them troubleshoot issues you may experience. If the contributor is found to have violated the Terms of Service or the Code of Conduct, we will remove them from the network and refill that session when possible or, if not, refund it.


The quality of our network is of the utmost importance and something we’re incredibly proud of. We aim to keep it that way and will continue to invest in solutions that allow us to deliver on that promise.

 

Contacting Support

Lastly, you can reach out to Support to remedy issues with contributors who provide invalid feedback. Availability of support channels - online chat, email, and phone - depends on your subscription.

 

