Learn how to create a pilot test and why it's an important step in your testing journey.
This article applies to: UserTesting
On this page:
- Testing out your script
- Five things to check for in a pilot test
- Making changes to your test plan
- Adding more contributors
Testing out your script
- UserTesting's Research Team has learned that one of the key ingredients of a great study is performing a pilot test.
- In a pilot test, just one (1) contributor goes through the test plan; the team then watches the video, noting any challenges the contributor encountered or ways the script could be improved.
A successful pilot test is one in which:
- The contributor answered all of the questions
- The script didn't accidentally use confusing terms or jargon that made the contributor stumble
- The contributor evaluated the correct areas of the website, app, or prototype
If any of the above is not true, alter the test as needed and try again with another contributor. Continue iterating on your test script until contributors can successfully complete the test and you collect the feedback you need.
Note: If you need to change the task order or remove unnecessary tasks, we advise creating a duplicate test so that your Metrics tab continues to report results accurately.
Five things to check for in a pilot test
1. Do your tasks and questions make sense to contributors?
While watching a contributor's video, focus on how they read the tasks and execute instructions:
- Do they understand all the terminology used?
- Are they providing verbal feedback that directly answers your questions?
- Is there ever a time when you wanted them to demonstrate something, but they only discussed it?
Note: Often, an edit to your tasks can keep contributors on track. If they misunderstand your terms, questions, or assignments, rephrase them until they can be easily understood.
2. Can contributors adequately answer your questions?
Here's what to look for. Do the contributors...
- Provide answers that contain a sufficient amount of detail?
- Present quality feedback (both in terms of content and sound quality) that can be clipped and shared with others?
- Adequately address your goals and objectives for the study?
Note: If not, consider breaking up these questions into individual tasks before launching the full study.
3. Can contributors complete all required steps (e.g., logging in to a specific account, interacting with the correct pages)?
- There’s nothing worse than sitting down to watch contributor recordings, only to discover that all of the contributors checked out the live site when they should've been reviewing a prototype.
- Or that contributors couldn't log in because of a glitch in your app.
Note: Often, an extra sentence of instructions, or a URL placed where contributors expect to find it on the site or app, can make all the difference.
4. Are all the links in the script functioning properly?
A pilot test is a perfect opportunity to verify that all the URLs included in your study are functioning properly and accessible to your contributors.
5. Are the screener questions landing the contributors you need?
- When the type of contributor is important to your study, the pilot test is a great way to tell whether your screener questions capture the right demographic group.
- Add a question to your pilot test that asks contributors to describe the trait that aligns with the demographic group you're targeting, such as a job title or an industry.
Note: Continue revising your screener questions until you've recruited the best group for your study.
Making changes to your test plan
Based on your initial pilot test results, you may need to update a word or two in your test plan.
To make minor changes to your test plan, follow these steps:
- Open the Actions menu on your test results page and select Edit test details.
- If you found that the contributor wasn't the right fit for your test, review and edit the screener as appropriate. (Go to the Sessions tab and click Edit Screener.)
To make significant changes to your test plan:
- Navigate to the Actions menu and select Duplicate, then revise the new draft as needed.
Note: Creating a duplicate test when making significant changes to your test plan will help ensure that your Metrics tab accurately captures test results.
Please note that pilot tests will count towards any usage limits on your account.
- If you're on a plan that limits the number of tests, we suggest reusing a single test for your pilot and then adding additional contributors to it.
- However, if you need to add or remove tasks, we recommend creating a duplicate test so that your Metrics tab shows the results accurately.
Adding more contributors
- Remember, just one (1) contributor goes through the pilot test.
- After successfully conducting a pilot test, add more contributors to the existing study.
- If you have multiple audiences in your test, run the pilot with each of those audiences, including audiences distinguished by the type of device (e.g., desktop, smartphone) they'll be asked to test with. You won't be able to add additional audiences or devices to your test.
- Demographic requirements cannot be changed; only the screener questions can be edited.
- If you have to make substantial changes to your pilot, duplicate the test, make your changes, and launch another pilot test.
To add more contributors, follow these steps:
- Go to the same Actions menu and select Add contributors.
- If you have multiple audiences, use the drop-down to select the number of contributors you wish to add to each audience.
- Select Add Contributors to launch the additional sessions.