At a Glance
This article explains how to get quick insights using the Metrics tab in the UserTesting Platform.
The Metrics tab is available only on certain subscriptions.
Given unlimited time and bandwidth, you would watch every contributor video in its entirety before analyzing and sharing results, annotating throughout to build as complete an understanding as possible of each contributor's experience with your product or service.
But what if you don't have time to watch each video? Starting your analysis at the Metrics tab in the Platform can help you manage your time effectively while still generating the insights you're looking for.
Types of metrics
The tab reflects the tasks you included when building your test plan. Some common measurements are...
- Rating scale questions
- Multiple choice questions
- Verbal questions
- Written questions
Each metric type also features a Watch task button that lets you view contributors completing the task.
1. The rating scale question metric tabulates how contributors feel about a particular task. In this example, a 7-point rating scale ranges from "Very difficult" (1) to "Very easy" (7).
2. The answers to multiple choice questions are plotted in a pie chart, with each segment corresponding to an answer tabulated in the legend beside the chart.
3. The “verbal” question metric includes a keyword map that displays the adjectives contributors used to describe their experience completing the task. The adjectives are plotted according to how favorable (green) or unfavorable (red) the users found the task, or whether they gave a neutral reaction (gray).
4. The "written" response metric features “Smart tags” and “Time on task” values in addition to the answers contributors gave to the question.
Advantages of using the Metrics tab
The advantages the Metrics tab provides in terms of generating meaningful insights from your test results include...
Faster data analysis
As noted above, the Metrics tab tabulates the answers contributors gave to rating scale, multiple choice, and written-response questions. If interactive path flows are available on your account, those paths appear on the Metrics tab too, showing you in at-a-glance summaries what contributors actually did on your site or app and helping you understand why they took the actions they did.
A method for homing in on what parts of the videos to watch first
Reviewing the data in the Metrics tab first gives you an overview of each contributor's experience and can help you more quickly determine which parts of their videos to watch first, so you can "get to the good parts." Doing so allows you to focus on the results and insights that most impact your project.
The ability to quantify the responses
Quantifying contributor feedback allows you to better interpret that feedback and to...
- Clearly identify and communicate opportunities to your project team.
- Better compare results across contributors, both from within a test and across tests that use the same metrics.
Types of tests
The metrics found in the Metrics tab can help with all types of tests, but they are especially useful for tests that are...
- Quantitatively focused. If numbers are important to your stakeholders, then including metrics in your tests will help give weight to the findings.
- Stages within a benchmark-test project. Using metrics makes it easier to compare feedback against the baseline test of your benchmark project and to see over time whether design changes are being well received.
- Part of a quick design sprint. The insights gained from metrics on the Metrics tab can guide any design iteration you undertake for the next round of testing.