At a Glance
This article explains how to use UserTesting’s tree testing tool, a usability feature that can help you evaluate the findability of topics on a website.
Tree testing is available on the following subscriptions:
|Flex plan|Seat-based plan|
|---|---|
|Advanced ✓ (add-on)|Professional ✓ (add-on)|
|Ultimate ✓|Premium ✓ (add-on)|
We recommend most users start with the Tree Testing template available in the Template Gallery. To learn more about tree testing—why it’s useful and when you should do it—read our "What is tree testing?" article. If you prefer to build the tree test right in the UserTesting Platform, see our article on “Integrated Tree Testing.”
Note: UserTesting information architecture (IA) features, such as tree testing, are included in our Ultimate subscription plans and can be purchased as add-ons for Professional, Premium, and Advanced plans.
Why should you run a tree test via UserTesting?
Running your tree test via UserTesting can deliver both qualitative and quantitative insights:
- Results from a tree test show how many people could find the information you wanted them to find, how many could not, and the paths people took through your content before settling on an answer. You can use these results to make more informed decisions about how the information on your website or app is labeled and organized.
- Recordings of the contributors' sessions will give you behavioral insights into those who complete your tree test and answer questions such as “Did people find any task particularly confusing?” and “Why did people enter the tree one way and not another?”
- Tree testing is included with our Ultimate Edition subscriptions. For customers on our Advanced, Professional, and Premium Editions, our Information Architecture Testing package, which also includes card sorting, can be added for an additional cost. Please contact your account team for more information.
Note: Tree testing is not optimized for mobile.
How to set up a tree test in the UserTesting Platform
1. Create a new unmoderated test in UserTesting.
One of the tasks you will be able to add to your test is a tree test task. To start your test, select Create test > Create a test.
Next, click Website.
You'll have two options to reach your audience: Build audience (when you want to recruit contributors from the UserTesting Contributor Network) and Create link (when you want to generate a link to share with anyone outside the UserTesting Platform).
2. Select your sample size and target audience in UserTesting.
To get quantitative insights when running a tree test, it’s best to opt for a larger sample size than you would for a standard test, because you want to derive meaning from the number of contributors who complete each task in a particular way. We recommend distributing the test to 30–50 contributors to have some statistical confidence in your data. (See MeasuringU's website for resources about statistics and user experience.) Additionally, you’ll get quite a bit of value from watching the videos of just a handful of those contributors.
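The 30–50 recommendation can be sanity-checked with a quick confidence-interval calculation. Below is a minimal Python sketch of the adjusted-Wald interval, a method MeasuringU recommends for small-sample task success rates. The function name and example numbers are illustrative only; they are not part of the UserTesting Platform.

```python
import math

def adjusted_wald_interval(successes, n, z=1.96):
    """Adjusted-Wald (Agresti-Coull) confidence interval for a task
    success rate; well suited to the small samples typical of tree tests.
    z=1.96 corresponds to a ~95% confidence level."""
    # Adjust the counts before computing the proportion.
    p_adj = (successes + z * z / 2) / (n + z * z)
    n_adj = n + z * z
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# With 30 contributors and 24 correct answers, the interval is still
# fairly wide -- one reason to recruit 30-50 rather than 5-10.
low, high = adjusted_wald_interval(24, 30)
print(f"Success rate is between {low:.0%} and {high:.0%} (95% confidence)")
```

Doubling the sample size (at the same observed success rate) noticeably narrows the interval, which is the statistical payoff of the larger recommended sample.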
If your account limits you to 15 contributors per test, launch one test with 15 contributors, then create a copy of that unmoderated test and launch it to 15 more. Alternatively, you can keep all the results in one test by inviting 30–50 of your own Invite Network contributors to take the test.
(Remember to pilot the study to a couple of contributors first to validate that you’ve set up the test correctly.)
For more information about finding contributors who are a best fit for your testing goals, see our other Knowledgebase articles that cover how to target an ideal contributor audience.
3. Include starting URL and provide some context.
In the Starting Instructions section of the Select Audience page in the UserTesting Platform, choose a URL to give contributors. If you don’t have a specific site you want contributors to start on, use the A blank page option. Note that this starting URL will not link to your tree test (that link is added in the Test Plan section of your test), because you want to provide more context and instruction before users reach the tree test activity.
Also, provide some context in the Scenario field, such as:
“Please note: One of the activities in this test will be conducting a tree test. This activity helps companies determine how they should structure and organize items.”
4. Build out your test plan, adding a tree test at the appropriate point in your test.
You may want to ask some background questions before having contributors tackle a tree test. For example, you may want contributors to rate how familiar they are with the tree test's topic. Do not include too many tasks before the tree test because you want to (1) avoid biasing the contributor before they take the test and (2) give the contributor ample time to complete the tasks.
5. Set up the tree test.
In the tree test task itself, follow the link to the tree testing app. The link opens in a new tab, where you can create a new tree test and enter the test details.
After you've entered the test details, begin building your tree.
- You can include several levels in your tree, but the tree can contain a maximum of 300 nodes.
- Each node's text is limited to 100 characters.
- If you would like to copy a tree from a previous test, go to the Test Details accordion tab for that test and click Clone Test at the bottom of the tab.
Once you’ve set up the tree of your test, add questions that the contributors are to answer about the tree you just created.
We recommend a maximum of 10 questions for a single tree, though there is no limit to the number you can ask. Remember that the more tasks someone completes using the same tree, the more the results will be biased by their familiarity with that particular tree.
Select the right answer for each question—you can have multiple correct answers to account for the fact that the same information may be available in different parts of your site or app.
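To make the tree-size limits above concrete, here is a small, hypothetical Python sketch of a tree structure that enforces them. The `TreeNode` class and `count_nodes` helper are illustrative only and not part of UserTesting's tooling; the 300-node and 100-character limits are the figures stated above.

```python
# Illustrative sketch only: the 300-node and 100-character limits come
# from the article, but this data structure is not UserTesting's API.

class TreeNode:
    MAX_LABEL_CHARS = 100  # per-node text limit

    def __init__(self, label, children=None):
        if len(label) > self.MAX_LABEL_CHARS:
            raise ValueError(
                f"Node label exceeds {self.MAX_LABEL_CHARS} characters")
        self.label = label
        self.children = children or []

def count_nodes(node):
    """Total nodes in the tree, including the root."""
    return 1 + sum(count_nodes(child) for child in node.children)

MAX_NODES = 300  # whole-tree limit

# A toy tree: root plus two branches.
tree = TreeNode("Home", [
    TreeNode("Products", [TreeNode("Laptops"), TreeNode("Phones")]),
    TreeNode("Support", [TreeNode("Contact us")]),
])

assert count_nodes(tree) <= MAX_NODES
print(count_nodes(tree))  # prints 6 for this toy tree
```

A check like this is handy if you draft a large tree in a spreadsheet first and want to confirm it will fit before entering it in the app.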
6. Once you are done drafting the questions, your test is ready to publish.
By publishing the test, you are generating a URL to distribute to test contributors.
After your test is published, you'll see a message to that effect, along with the publicly accessible link that you'll distribute.
From here you can edit test details, clone the test, or unpublish it so that contributors can no longer access it. Setting project results to Public will generate a standalone results page with a separate URL you can share with other stakeholders, including stakeholders who do not have a UserTesting account.
Note: Creators can access, edit, and view results for their own tests. They cannot view or access tests launched by others on the account. Admins can publish and unpublish tests created by others, just as the creator can. However, admins cannot edit tests created by others. Once a test is deployed, admins can also view and share results.
7. Next, copy your link, return to the Build Test Plan page, and paste it into the Tree test URL box (within the Tree test task section).
8. Include follow-up questions to get additional insights from users.
Finally, add tasks for any follow-up questions you want to ask in order to evaluate the contributor’s experience of doing the tree test activity.
For instance, you might ask people the following:
- How easy or difficult did you find this tree test?
- Which labels, if any, were easy or difficult to understand?
- In general, how would you assess the labels in terms of finding the information you wanted?
Results of your tree test
Once the contributors have completed your test, the results will be visible in the Platform. Watch videos of contributors completing the tree test on the Sessions tab.
To view the quantitative results of your test, return to the tree testing app where you initially created the tree.
Click the Projects drop-down and select My Projects.
Locate the tree test project you want to analyze and click Details. The Details page appears.
Selecting the Results tab reveals a summary of your results as well as the individual tasks. This includes correct and incorrect answers, paths taken, and average time on task (how long contributors spent on the task). You can export this data as a .csv or .xls file.
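If you export the results, you can aggregate them yourself. The Python sketch below computes per-task success rates and average time on task from a small inline sample. The column names (`task`, `correct`, `time_on_task_s`) are assumptions for illustration, so check them against your actual export's headers before adapting this.

```python
# Sketch of aggregating an exported results file; the column names
# here are hypothetical, not UserTesting's actual export schema.
import csv
from collections import defaultdict
from io import StringIO

# Stand-in for an exported .csv file.
sample_export = StringIO(
    "task,correct,time_on_task_s\n"
    "Find pricing,1,24\n"
    "Find pricing,0,41\n"
    "Find pricing,1,18\n"
    "Contact support,1,12\n"
)

stats = defaultdict(lambda: {"n": 0, "correct": 0, "total_time": 0.0})
for row in csv.DictReader(sample_export):
    s = stats[row["task"]]
    s["n"] += 1
    s["correct"] += int(row["correct"])
    s["total_time"] += float(row["time_on_task_s"])

for task, s in stats.items():
    print(f"{task}: {s['correct'] / s['n']:.0%} correct, "
          f"avg {s['total_time'] / s['n']:.1f}s on task")
```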
You can also view results for individual contributors by selecting the Participants tab.