Inside the participant experience: What testers see and do

Understanding the participant experience is key to designing effective tests. This article walks you through what participants see when they take your test, helping you create clear and engaging tasks that yield valuable insights.

This article applies to: UserTesting


Why the participant experience matters

When setting up a test, it’s easy to focus on what you need from participants—but have you considered what the experience looks like from their perspective? A well-designed test isn’t just about gathering insights; it’s about ensuring that participants clearly understand the tasks, feel engaged, and can provide thoughtful, high-quality feedback. By stepping into their shoes, you can improve test clarity, reduce confusion, and ultimately collect better responses. A smooth participant experience leads to more accurate insights, fewer dropouts, and a higher likelihood of useful feedback.

 

 

How participants join a test

Before participants can provide feedback, they must find and access your test. Understanding this process helps ensure that your study reaches the right people and has a smooth start.

How participants are recruited

  • Participants may come from the UserTesting Contributor Network or your own audience (via Custom Network or Invite Network).
  • If you’re using UserTesting’s network, the platform automatically selects contributors who match your criteria based on demographics, screener questions, and availability.
  • If you're testing with your own audience, you'll need to distribute the test link manually.

Receiving and accepting an invitation

  • Once selected, participants receive an invitation to take the test.
  • This invitation includes key details such as:
    • Test type (unmoderated video, survey, card sort, etc.)
    • Estimated time to complete
    • Any special requirements (e.g., specific devices, actions to complete before starting)

[Image: participant dashboard showing a test invitation]

  • Participants must accept the test and may need to answer screener questions to confirm they qualify. 
    [Image: example screener question]
  • If they don’t meet the criteria, they’ll be redirected elsewhere, ensuring only the right contributors move forward.
  • Participants do not see your test details, such as: 
    • Title of test
    • Scenario
    • Folder name
    • Company name

💡Tip: If you’re running a multipart test, such as a diary study, use the “Other requirements” field to let participants know which test they are taking. This information appears on the card participants see before answering any additional screener questions.


Preparing for the test

Before starting, participants may need to complete any special requirements listed in the invitation, such as switching to a specific device or finishing setup actions before the test begins.

What participants see

  1. First, participants see what tests are available:
    [Image: a participant dashboard on a mobile device]
  2. If you have included screener questions, that’ll be the first thing participants see.
    [Image: example screener question]
  3. If participants meet your screening requirements, they’ll see your test scenario next.
    If you provide a starting URL, it is added to the scenario, and participants open it before starting the tasks; otherwise, they proceed directly to the tasks.
  4. Then, participants will see any tasks or questions you’ve added to your test.
    Here is an example: [Image: a task as participants see it]
 

 

How participants provide feedback

Unmoderated tests (Think-out-loud tests)

  • Once participants begin the test, they’ll work through a series of tasks designed to capture their thoughts, behaviors, and reactions.
  • Ensuring these tasks are clear and engaging helps maximize the quality of their responses.

Types of tasks participants encounter

Participants may be asked to complete a variety of task types, including:

  • Tasks: Participants follow specific instructions to complete an action, such as navigating a website, testing a feature, or exploring a prototype.
  • Verbal response: Participants answer a question or share feedback using their voice, typically in a think-aloud format, while completing a task.
  • Multiple-choice and rating scale questions: Quick responses that help quantify opinions.
  • Written response questions: Open-ended questions where participants explain their thoughts in more detail.
  • Card sort: Participants organize information into categories that make sense to them.
  • Tree test: Participants navigate a simplified menu structure to evaluate the findability of content.
  • Five-second test: Participants view an image or design for five seconds and then recall their impressions, helping measure first impressions and clarity.
  • Camera task: Participants record a video using their device’s camera to share their thoughts and facial expressions or demonstrate a real-world action.

How participants record their feedback

  • For unmoderated usability tests, participants complete tasks while recording their screen and voice. 
  • The UserTesting platform captures:
    • Where they click, scroll, or navigate
    • What they say aloud about their experience
    • How they interact with the test content
  • For written surveys or card sorting exercises, participants type their responses or categorize items based on the given instructions. (A rough sketch of what this captured data can look like follows this list.)
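To make the captured data concrete, here is a minimal, purely illustrative sketch of what one unmoderated session record might contain. Every field name here is hypothetical; this is not UserTesting’s actual data format or export API.

```python
# Purely illustrative: a hypothetical record of one unmoderated session.
# Field names are invented for explanation -- NOT UserTesting's data format.
session = {
    "screen_recording": "session-1234.mp4",  # screen + voice, uploaded after the test
    "events": [                              # where the participant clicked or scrolled
        {"time_s": 12.4, "type": "click",  "target": "nav > Pricing"},
        {"time_s": 15.1, "type": "scroll", "page_position": 0.4},
    ],
    "spoken_feedback": "I expected pricing to be in the top menu...",
    "written_answers": {"task_3": "The checkout felt slow."},
}
```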

Challenges participants might face

A well-structured test minimizes confusion, but common participant struggles include:

  • Vague or unclear instructions – Ensure each task explains what is expected in simple terms.
  • Too many steps in a single task – Break complex instructions into smaller, digestible parts.
  • Too long or overly repetitive tests – Keep your test brief and suited to the test type so participants stay engaged and don’t fatigue.
  • Technical issues – Ensure your test works smoothly across different devices and browsers.

💡Note: Participants can report problems if they encounter an issue during testing. If two or more participants report a problem, your test sessions will be placed on hold. To learn more, view our article What does it mean if a session has a “Problem” or is “On Hold”?

Once the test is complete:

  • Participants see a confirmation screen, and any recorded video uploads automatically.
  • Their submission is reviewed for quality, and once approved, they receive compensation.
  • They can then check their dashboard for new test opportunities.

 

Interaction tests

Unlike traditional usability tests, Interaction Tests do not record video or audio of participants, focusing instead on captured interactions and structured responses. An Interaction Test can include the following task and question types:

  • Instructions: A text-based prompt that guides participants on what to do next. Instructions help set context and ensure participants understand their tasks before proceeding.
  • Navigation task: Participants are asked to find specific content or complete a journey within a website or app. This helps evaluate usability, ease of navigation, and potential friction points.
  • Figma task: Participants interact with a Figma prototype to test design functionality, user flow, and overall usability before full development.
  • QXscore: A proprietary quantitative metric that evaluates the overall user experience by combining multiple usability factors, such as task success, efficiency, and user satisfaction.
  • Multiple choice: Participants select one or more options from a predefined list to quantify their preferences, behaviors, or opinions.
  • Written response: Open-ended questions that allow participants to provide detailed, qualitative feedback in their own words.
  • Rating scale: A scaled response format (e.g., 1-5 or 1-7) where participants rate their experience, satisfaction, or perceived difficulty.
  • NPS (Net Promoter Score): A standardized survey question asking participants to rate how likely they are to recommend a product or service on a scale from 0 to 10, helping measure overall satisfaction and loyalty. (See the worked example after this list.)
  • Matrix: A structured question format that allows participants to rate multiple items across the same scale, making it useful for comparing attributes or evaluating multiple aspects of an experience simultaneously.
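Because NPS comes up in several test types, it’s worth seeing the arithmetic once. The sketch below uses made-up scores and the standard NPS formula (percentage of promoters, scores 9–10, minus percentage of detractors, scores 0–6); it is a worked example only, not how the platform computes the metric internally.

```python
# Worked NPS example with made-up scores from 10 hypothetical participants.
scores = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]

promoters = sum(1 for s in scores if s >= 9)   # scores 9-10
detractors = sum(1 for s in scores if s <= 6)  # scores 0-6
# Passives (7-8) count toward the total but toward neither group.

nps = (promoters - detractors) / len(scores) * 100
print(f"NPS = {nps:.0f}")  # (5 - 2) / 10 * 100 = 30
```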

 

Surveys

Unlike other test types, UserTesting Surveys do not record video, audio, or screen interactions. Instead, the platform captures:

  • Responses to all survey questions – Including multiple-choice, written responses, rating scales, NPS, and matrix questions.
  • Time to complete the survey – How long participants take to finish.
  • Survey drop-off rate – If applicable, tracking where participants abandon the survey. (A small worked example follows this list.)
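If you ever want to check drop-off yourself from exported response counts, the arithmetic is straightforward. The question labels and counts below are hypothetical, and this is not a UserTesting API; the sketch just shows per-question drop-off and the overall completion rate.

```python
# Hypothetical per-question response counts, in the order the questions
# appear (e.g., tallied from an exported results file). NOT a UserTesting API.
responses = {"Q1": 200, "Q2": 188, "Q3": 150, "Q4": 147}

labels = list(responses.keys())
counts = list(responses.values())

# Drop-off between consecutive questions
for i in range(1, len(counts)):
    drop = (counts[i - 1] - counts[i]) / counts[i - 1] * 100
    print(f"{labels[i]}: {counts[i]} responses ({drop:.1f}% dropped off)")

# Overall completion rate: finishers divided by starters
completion = counts[-1] / counts[0] * 100
print(f"Overall completion rate: {completion:.1f}%")
```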

 

Live Conversation

Live Conversations provide real-time insights and allow moderators to probe deeper into participant responses, unlike unmoderated tests that rely on pre-set tasks and questions. Each recorded session captures:

  • Participant’s video and audio – The entire session is recorded, capturing both the participant and moderator’s interactions.
  • Screen sharing (if enabled) – If the participant shares their screen, their navigation and interactions are recorded.
  • Moderator and participant dialogue – The full conversation, including follow-up questions and real-time feedback.
  • Session duration – The length of time the conversation lasts.

To view the participant’s experience, see How do I participate in a Live Conversation on a computer?

 

 

Best practices for creating a smooth participant experience

A well-structured test leads to better insights, and that starts with making sure participants can easily understand and complete their tasks. Here are some best practices to ensure a smooth experience for contributors:

Write clear, concise instructions

  • Avoid jargon or overly technical language.
  • Use direct, simple sentences to explain what participants need to do.
  • If a task has multiple steps, break them into numbered lists for clarity.

Set realistic expectations

  • Let participants know how long the test will take and what kind of tasks they’ll complete.
  • Provide context where needed, but don’t overload them with unnecessary details.

Test your test

  • Before launching, preview your test to check for clarity and potential confusion.
  • Run a pilot session with a colleague or a small sample group.

Use a mix of task types

  • Keep participants engaged by combining different task formats, such as video responses, multiple-choice questions, and written feedback.
  • Ensure that tasks are appropriately timed so participants don’t feel rushed or overwhelmed.

Avoid leading questions

  • Keep questions neutral to avoid influencing participant responses.
  • Instead of “Did you find this feature useful?” try “What did you think about this feature?”

 

 

Contributor Support Center

Want to learn more about the participant experience? Check out these helpful articles from our Contributor Support Center.

 

 
