UserTesting best practices

Review these UserTesting best practices to get the most value from contributor feedback.

This article applies to: UserTesting

 


Audience size

  • We recommend a sample size of 5 contributors per audience, per test.
  • A range of 5–8 contributors per group (or per audience, if your test includes multiple audiences) works for most tests.
  • For more information, check out our What sample size do I need? article.

 

 

Write effective screener questions

To learn more about screener questions and finding the right audience, check out the Best practices for Screeners course in the UserTesting University.

Include None of the above, I don’t know, or Other among your answer options.

  • This helps in case you’ve forgotten to include an answer that applies to the user.
  • This also helps if the user is confused by the question.

 

Provide clear and distinct answers that don’t overlap each other.

  • Providing answers that overlap or are very similar can skew your data. 
  • For example:
    • If the answer options are 1 - 10, 10 - 20, and 20 - 30, a user whose answer is 10 or 20 would match two of the options.
    • A more effective way to write these options is 1 - 10, 11 - 20, and 21 - 30.

 

Avoid asking leading or Yes/No questions.

  • A leading question subtly encourages participants to answer in a particular way.
    • Small differences in wording can have a large impact on your data.
    • Non-specific wording leads to ambiguity and confusion in your study.
    • Words like could, should, or would can produce significantly different results in a question, even though they sound similar.
  • Similar to leading questions, avoid using questions with Yes or No answers.
    • Some participants in online research are trained to recognize these questions as disqualifying points.
    • They are encouraged to select Yes to qualify for the study.
    • This potentially biases the resulting data.

 

Use screener questions to collect extra information about your participants.

  • This is a great way to better understand the context in which the test was taken.
  • Use multiple-select questions for this.

 

Validate your screener questions by asking a confirmation question at the start of your test.

  • For example, “You indicated in the screener questions that you are currently shopping for a new car. Please describe what kind of car you are looking for, where you have looked so far, etc.”
  • Sometimes, just listening to a user describe their experience can let you know if they’re the right fit.

 

 

Write great questions for unmoderated tests

  • Avoid leading or biased questions.
  • Use rating scale and other metrics questions.
    • These question types will validate and confirm your verbal or task questions.
    • Rating scales, written, multiple-choice, and multiple-select questions are more quantitative and will help you understand what is happening and to what magnitude.
  • Use qualitative questions to capture the why and how.
    • This is best accomplished through task-based questions or deep verbal questions.
    • Through these questions, you gain empathy with your contributors and uncover core human insights.
  • If you need to emphasize something in a question or task, use formatting shortcuts such as bolding or italicizing.
  • Plan for about 15-20 minutes for an unmoderated session.
  • Learn more about writing great tasks here.

 

 

Preview and pilot your test

  • First, preview your test yourself.
    • This allows you to see if your test flows as expected.
    • Previewing your test also lets you make sure everything is worded exactly as you'd like.
  • Then run a pilot of one contributor (per audience).
    • Pilot tests help to verify that:
      • All assets included in the study work properly.
      • All tasks are made clear to contributors.
      • You get the desired feedback from the study.
    • Review the results of the pilot and make any updates that are needed before sending the test to a larger number of contributors.
  • Skipping previewing and piloting your test can result in a wasted test or a test that gets put on hold because assets weren’t entered correctly. Save time and effort, and preview and pilot. 

Pro tip: Launch the same study by adding more contributors.

  1. In the dashboard, click on the 3 dots to the right of the project you want to modify.
  2. From the dropdown, select Add contributors.
  3. Add the desired number of contributors.
  4. Select Add contributors to launch the additional sessions.

 

 

Create similar tests

  • If you want to run a similar test, save time by using the Create similar test option from the drop-down menu for any test you'd like to replicate.
    • Once you make a copy, make any changes necessary, then launch it to a new set of contributors. 
    • Pro tip: If you need to change a website study to an app study, you'll have to create a new study.
  • To create a similar test, follow these steps:
    1. In the dashboard, click on the 3 dots to the right of the project you want to duplicate.
    2. From the dropdown, select Create similar test.
    3. Edit the unlaunched copy of the test as needed.
    4. Change the name of the test; otherwise, it will have the same name as the original with (Copy) appended to the title.

 

 

Save test drafts for future use

  • By creating a draft of a test, you can edit, finish, or launch it at a later time.
    • Drafts are automatically saved as you edit your test plan.
    • You can also create drafts to have colleagues review a study before launching it.
  • Access drafts from your dashboard. You can sort for just drafts by selecting the Drafts filter in the navigation menu.
  • From there, you can Share your test plan with colleagues or Notify (via Slack integration or email) team members about the results.
  • Pro tips: 
    • Give the study a descriptive title to easily find it in your drafts.
    • Avoid editing the same draft in multiple tabs to make sure all changes are saved.

 

 

Watch contributor videos at high speed

  • While they may not be funny cat videos, contributor videos are easily the most fun part of gathering human insights.
  • Watching every video is time-consuming, and your team may not have time to do so.
  • To save time, watch your videos at up to 3x speed.
    • This allows you to pick out memorable insights even faster.
    • To speed up the video, follow these steps:
      1. Select 1x at the bottom of your video player.
      2. Choose the speed you'd like to use.
      3. If you hear or see something unclear, slow down the video to normal speed.


 
