How to Analyze and Share Results: Overview

At a Glance

Quickly and easily analyze and share your findings. This article shows you how to gather and assess your data, highlights best practices, and explains how to use metrics and dashboards to gain actionable insights.



    • Analyze Your Results
    • Sentiment Analysis
    • Notes and Tags
    • Mark as Important
    • Compiling Data
    • Harness the Power of the Spreadsheet
    • Share Your Findings
    • Learn More

Analyze Your Results


Once your test results are in, it’s time to analyze them.

Before you begin any analysis, think back to your objectives. Keeping your study objectives front of mind will help you sift through the wealth of contributor observations and gain actionable insights into your contributors’ experience.

Continue to ask yourself, “Is this directly relevant to my objectives?” For example, if you’re testing the usability of a checkout process, contributor comments about their interest in buying a certain product might be an interesting topic (and one worth exploring in another study), but you should refrain from pursuing that area of analysis if it’s unrelated to your current objectives.

With these goals in mind, it’s time to jump into the data and videos. Here are some steps to help you get started.


Ideally, when analyzing your test results, you’d watch all the contributor videos in their entirety, making annotations throughout. This gives you a deeper, more holistic understanding of each contributor’s experience and yields insights that best serve your testing objectives.

But that approach is manageable only if you have the time for such a commitment. If you’re pressed for time, you can use metrics to quickly see a high-level summary of the data from rating scale, multiple choice, and written response questions. The metrics in the UserTesting Platform are “subjective” measures: ease of use, difficulty, preference, and time on task. These can uncover trends in the contributors’ experiences without requiring you to watch every video.

You can access your study's metrics by selecting the Metrics tab in a study or by exporting the results through the Export to Excel option.


When summarizing the data, look for clear trends, interesting patterns, or surprising responses. Do the findings satisfy your objectives? Did any of the trends defy expectations? Remember that metrics don’t always tell the whole story—several low or mixed ratings might be disappointing, but they likely indicate that the recorded sessions contain valuable insights. Investigate these moments by clicking each rating to see what experiences led to it.

Please see our “Using the Metrics tab” Knowledgebase article to get an overview of the Metrics tab—why it’s used and what types of tests it best supports.

Sentiment Analysis

When you review a completed session in the UserTesting video player, sentiment analysis gives you immediate access to moments of positive or negative reactions. This feature works with the Transcripts tool to process phrases and sentences in natural language.

(Note: You will need an advanced, ultimate, or premium subscription to access this feature.)


If your plan allows, you may have access to smart tags, including smart tags for written tasks. With smart tags, you can capture a total of 10 tags within the feedback, covering positive, negative, and neutral moments. For written tasks, smart tags are added inline in the Written Response section of the Metrics tab and are color-coded according to the sentiment expressed at that moment.

Notes and Tags

While reviewing your videos, create notes to bookmark or highlight interesting moments or observations related to your objectives. You can use them to capture points where the contributor gets stuck or frustrated, finds something really useful, answers a question, or reaches a particular part of the process.

A tag is a set of keywords used to categorize notes into patterns or trends. Tags should be guided by your objectives, and refer to the specific outcomes or behaviors that you’re interested in (e.g., did the contributors notice the call-to-action that you’re testing). See some of our recommended tags for organizing notes and clips.

Just add a “#tag” to any note or clip summary. Adding an end timestamp to an annotation automatically turns the annotation into a clip.

As you go through a video of a study session, create and tag notes wherever you observe an event or behavior relevant to your objectives. When creating a note, write down what you observed, when it happened, and what may have caused it (and what may have resulted from it). When you’re later reviewing your data or sharing it with a colleague, this process helps clarify the context of your observations.
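If you later export your tagged notes, a small script can tally how often each tag appears, turning scattered observations into a quick frequency view. This is a minimal sketch, not a UserTesting feature; the note summaries below are invented for illustration.

```python
import re
from collections import Counter

# Hypothetical note summaries exported from a study (illustrative only)
notes = [
    "Contributor got stuck on the coupon field #confusion #checkout",
    "Loved the one-page order summary #delight",
    "Couldn't find the call-to-action #confusion",
]

# Pull every "#tag" out of each note and count occurrences
tag_counts = Counter(
    tag.lower() for note in notes for tag in re.findall(r"#(\w+)", note)
)

for tag, count in tag_counts.most_common():
    print(f"#{tag}: {count}")
```

A tally like this makes the most common tags (and therefore the most common behaviors) easy to spot before you dig back into individual videos.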

Mark as Important

As you create clips, there’s a star icon that you can select to mark notes and clips as important.


A filled star means the clip has been flagged as important; a hollow star means no one has yet weighed in on the clip’s significance (or you or a colleague consider the clip insignificant).

When a contributor has a notably positive or negative moment, mark the clip containing that moment with the corresponding sentiment by clicking the face icons:


Here are some suggestions for using these clip features:

    • The clips you mark as important immediately transfer to the Highlight Reel Editor.
    • Pay attention to clips that you or colleagues have marked as important. Doing so can save you time later.
    • Focus on the clips with the most engagement, then share these key insights with stakeholders and decision-makers.

Compiling Data

As you gather observations, you may notice patterns where contributors encounter issues. Documenting these patterns will create discussion points for team meetings and identify opportunities for improvement.

Having identified and tabulated patterns and trends in the contributor videos, see if these insights answer the questions you posed at the start of the study. Summarize your notes. What issues are most contributors encountering? Dig deeper into these issues and try to understand why they occurred.

In one example study, a few contributors found the homepage of a website attractive, while the majority thought otherwise.

While you may be focused on the issues contributors are encountering, it’s also important to recognize the things that people love. Paying attention to both helps you avoid trying to “fix” something that’s not broken.

Harness the Power of the Spreadsheet

It’s easy to gather, analyze, and share your findings from the UserTesting dashboard. At a glance, you can see time-on-task measurements, responses to metrics questions, and more.

However, if you want to perform a more detailed analysis, download the data from your study into an Excel spreadsheet by selecting the Export to Excel option. This can be especially helpful for compiling findings from multiple studies and placing the results side by side. In a spreadsheet, you can break down patterns in tests with a large number of contributors and compare responses from different demographic groups.
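As a rough illustration of the side-by-side comparison a spreadsheet enables, the sketch below averages task ratings per demographic group from rows like those in an export. The column names and data here are invented for this example and are not the actual export format.

```python
from collections import defaultdict
from statistics import mean

# Invented rows standing in for an exported results sheet (illustrative only)
rows = [
    {"contributor": "A", "age_group": "18-34", "ease_rating": 4},
    {"contributor": "B", "age_group": "18-34", "ease_rating": 5},
    {"contributor": "C", "age_group": "35-54", "ease_rating": 2},
    {"contributor": "D", "age_group": "35-54", "ease_rating": 3},
]

# Group ratings by demographic, then average each group
by_group = defaultdict(list)
for row in rows:
    by_group[row["age_group"]].append(row["ease_rating"])

averages = {group: mean(ratings) for group, ratings in by_group.items()}
print(averages)
```

The same grouping can be done directly in Excel with a pivot table; the point is simply that exported data lets you slice results in ways the dashboard summary doesn’t.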

Share Your Findings

Now that you’ve analyzed the data, get ready to share your insights.

UserTesting highlight reels, which let you string together clips into a 2–3 minute reel, offer an easy way to capture and share with your team the most relevant behaviors and trends emerging from the test results.

See our “Creating Highlight Reels” article, which gives step-by-step instructions on making a reel of clips to share with team members and other stakeholders.

Other Ways to Share Your Videos

From the Sessions tab inside your dashboard, there are several options for sharing your videos with colleagues or stakeholders. To share multiple videos, check the boxes next to the individual videos, then click the Actions button at the bottom of the tab. A pop-up menu appears containing a Share videos option. You can also:

    • Click the Download videos option to save the video asset(s) to your local computer, at which point you can decide how to distribute them.
    • Share individual clips from within larger videos. Return to the video player menu where you made and tagged clips, and click Share from the drop-down Options menu to the right:



Sharing research findings with stakeholders and colleagues in multiple departments can be a great way to promote a user-centered culture in your company.


To recap the tips and best practices for analyzing and sharing results covered in this article:

    • Keep your objective in mind when reviewing data and videos.
    • Use metrics to quickly see a high-level summary of your data.
    • Create and tag notes wherever you observe an event or behavior that is relevant to your objectives.
    • String together clips of your observations into a 2–3 minute highlight reel and share your findings.
    • Present the findings to your stakeholders.

Learn More

Need more information? Read these related articles.

Want to learn more about this topic? Check out our University course:
