At a glance
This glossary describes a range of terms and features associated with the UserTesting Platform.
A/B test
A quantitative research process in which two alternative versions of a webpage or app screen are displayed to randomly chosen users. UserTesting's qualitative approach helps to overcome a primary deficiency inherent to A/B testing: such tests do not account for WHY users make the choices they make.
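To make the mechanics concrete, here is a minimal, illustrative sketch (not part of the UserTesting Platform) of how an A/B test randomly assigns users and compares outcomes. The function names and the 1-in-3 conversion pattern are invented for illustration only:

```python
import random

def assign_variant(user_id, seed=42):
    """Deterministically assign a user to variant 'A' or 'B' (hypothetical helper)."""
    rng = random.Random(f"{seed}:{user_id}")  # per-user seeded RNG for stable assignment
    return "A" if rng.random() < 0.5 else "B"

def conversion_rate(outcomes):
    """outcomes: list of (variant, converted) tuples -> rate per variant."""
    totals = {"A": [0, 0], "B": [0, 0]}  # [conversions, impressions]
    for variant, converted in outcomes:
        totals[variant][1] += 1
        totals[variant][0] += int(converted)
    return {v: (c / n if n else 0.0) for v, (c, n) in totals.items()}

# Simulated outcomes: every third user "converts" (invented data).
outcomes = [(assign_variant(uid), uid % 3 == 0) for uid in range(1000)]
print(conversion_rate(outcomes))
```

The numbers tell you WHICH variant performed better, but, as noted above, not WHY; that is where qualitative feedback comes in.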
Additional country filters
An extension of the standard Countries demographics filter (Canada, India, United Kingdom, and the United States) found in the Filters section of the Select Audience page, the Additional country filters list provides the means to target test contributors in a variety of countries.
Add to Favorites
A link on the Sessions tab, next to the name of each contributor who completed your test, that allows you to add that contributor to your network of favorite contributors.
Assets
Select the Assets feature when building a test plan in the UserTesting Platform. An asset can be an image, an ad, a message—any type of content you want tested and evaluated.
Audience
UserTesting uses the term "audience" to refer to the group of contributors you're either in the process of targeting or have decided on for your test.
Turn on this feature on the Build Test Plan page when you want your test to compare two things. The order of the compared sections alternates for each contributor, thereby reducing the bias that can occur when assets are presented in the same sequence.
Camera task section
Add camera tasks to an unmoderated-test test plan to instruct contributors to use their camera when performing specific test activities. Camera tasks are especially useful during customer environment and context testing.
Card sort
In a card sort, contributors sort “cards” containing different items into groups. This research methodology helps you discover how people understand and categorize information. There are three primary types: closed, open, and hybrid.
Click map
As part of the overall visualization of a contributor's journey through a site or app when completing a task, a click map shows precise moments where the contributor interacted with the screen (i.e., where and at what point the contributor clicked on the screen). Using a click map, you can immediately detect those screen elements contributors clicked the most.
Collaborator
The term "Collaborator" describes a license/account role in both the seat-based and Flex subscription models. This account role grants user-specific privileges to access certain features of UserTesting test sessions.
Custom Confidentiality Terms
Whenever you want to present customized confidentiality terms to contributors, in addition to the UserTesting Terms of Service contributors have already agreed to, switch on the Confidentiality terms toggle on the Select Audience page in the Platform. You can then drag and drop a PDF file of your customized terms into the field that appears after you activate the toggle. Custom terms are presented to contributors after they pass your screeners and before they take the test.
Contributor
Formerly referred to as a "participant," a "contributor" is the term given to someone who participates in a UserTesting test of your product or service. A contributor can come from any of the three networks: the UserTesting Contributor Network, Custom Network, and your Invite Network.
Contributor View
This feature helps you remotely capture facial expressions outside of a lab environment. Use Contributor View to see how contributors react to products, apps, and messaging. The Contributor View is activated on the Build Test Plan page in the Platform.
Custom Network (formerly "My Panel")
Allows you to invite anyone to join your contributor network. UserTesting takes care of test distribution or the scheduling of Live Conversations. UserTesting also handles notifications and incentive payments. Custom Network allows you to test again and again with the same contributors. (Note: Customers on Premium and Ultimate subscription plans have access to Custom Network.)
Dashboard
The page you land on after logging in to the UserTesting Platform, the Dashboard is designed to help you easily build, launch, and manage all your tests in one place. You can access your account settings, and through the vertical navigation menu, you can switch among workspaces, folders, tests, drafts, and highlight reels in a single view. The Insights page, situated in the middle of the Dashboard, displays the results of your tests and is where you go to review video clips of contributor sessions.
Demographic filters
Also referred to as "Test Filters," or just "Filters," this refers to a group of criteria found on the Platform's Select Audience page that help you target contributors who best fit your testing goals. Age, Income, Job role, Languages, and Web expertise are some of the most commonly used filters.
Details tab
One of the primary tabs on the Insights page presenting the results of a completed test, the Details tab displays the demographics and screeners used to select your audience and the details of your test plan (the questions and tasks you wrote to make up your test plan).
Diary study
A diary study is one in which you interact with the same contributor over a period of time. For example, you might run a test when a contributor first downloads a piece of software, then run a weekly test with that same contributor throughout their first month of using it.
Discovery interview
Discovery interviews allow you to learn about a customer’s background, goals, expectations, and attitudes about current experiences. Run discovery interviews when you are early in the process of defining a product or service.
Empathy Map
An Empathy Map is a collaborative visualization used to articulate what we know about a particular type of user. A well-thought-out map can provide insights into your customers and their needs.
Exclude contributors from prior tests
This option allows you to control how much access past contributors have to a new test that you are looking to fill. Once you click on this filter, the names of the ten most recent tests launched by you or anyone working in the account are listed. Check one or more of these test names to exclude people who participated in these past tests from qualifying for the new one.
See Fresh Eyes
Export to Excel
A tool that allows you to consolidate and transfer the results of a test to an Excel spreadsheet. This procedure is designed to make analyzing and sharing results more efficient.
See Metrics tab
Favorite Contributors
A feature in the Platform that allows you to create and manage networks of contributors you want to test with again in future tests. Favorite Contributors allows you to track favorite test contributors, create networks of test contributors who match stricter requirements, and use the same contributors across tests.
To access this feature, go to the Sessions tab for one of your completed tests and click the Add to favorites link.
Five second test
The Five second test is one of the Tasks and questions options on the Build Test Plan page. The contributors are shown a visual—a landing page, a shopping cart, an image—for five seconds, after which they'll be asked three questions to recall their impressions and understanding of what they saw.
Flex plan
The Flex plan is one of two primary subscription models (seat-based being the other) that defines testing capacity and usage, as well as permissions, on the UserTesting Platform. There are three types of testing capacity you can purchase on the Flex plan, and each capacity level consists of Session Units, which are consumed whenever account members launch tests.
The advantage of the Flex plan over the seat-based alternative is flexibility: you can continue to use the Platform even after you have met or exceeded the usage you committed to when purchasing the subscription up front.
Flex plan subscriptions
There are three kinds of subscriptions associated with the Flex Plan model: Essentials, Advanced, and Ultimate.
Flex plan licenses
Fresh Eyes
A default account setting for the UserTesting Contributor Network that prevents repeat contributors from completing more than one of your tests (at the account level) in a given time frame. Test filters allow you to override this setting at the test level.
See Test Frequency
Highlight reel
A highlight reel is a curated selection of video clips you've created. These reels can include multiple clips from the same test or clips from multiple tests. And you can use highlight reels to show your team common responses and essential feedback from your end users.
Information architecture (IA)
Information architecture (IA) is the practice of organizing and structuring the content of websites, web and mobile apps, and social media software. The goal is to help users find information and complete tasks. To do this, you need to understand how the pieces fit together to create the larger picture, and how items relate to each other within the system.
Intent path
Found on the Insights page Metrics tab, the Intent path is an interactive visualization that groups specific customer behaviors together, based on each contributor's intent, and evaluates the web elements users engage with. Like the Sentiment path and the Path filter, the Intent path is overlaid on the Interactive Path Flow (IPF).
Interactive Path Flow (IPF)
IPFs are visualizations that show how contributors behave when navigating a website or prototype. In addition to navigation, these path visualizations are useful when testing other multistep experiences: onboarding flows, creating an account, and finding content. Note that an IPF is produced only when a contributor navigates across at least two screens within the task-based experience.
Invite Network (formerly My Recruit)
This network option provides you the ability to create and launch a study with anyone, anytime—on-demand. The feature is entirely self-service, helping you tap into insights from employees, existing customers, industry experts, and people within unique demographics. Unlike with Custom Network, you are responsible for compensating and scheduling the people you invite.
There are different licenses/roles depending on whether you have a seat-based or Flex plan subscription model.
Seat-based plan: For each of the three seat-based plan subscriptions (Startup, Professional, Premium), the licenses/roles are Unlimited, Standard and Light, Collaborator, and Admin. Licenses have to be purchased on the subscription in order for them to be assigned to team members.
Flex plan: For each of the three Flex-plan subscriptions (Essentials, Advanced, Ultimate), the licenses/roles are Collaborator, Viewer, and Account Admin.
Live Conversation
A Live Conversation interview is a moderated two-way video conference interview across all platforms, including mobile. You can run Live Conversation tests with contributors from the UserTesting Contributor Network, Custom Network, or your Invite Network. To begin the process of scheduling a Live Conversation, go to the Dashboard and click the Schedule a Live Conversation option in the Create test drop-down menu.
Metrics tab
One of four tabs (or five, depending on the type of subscription you have) in the Platform's test results page that displays the results of a UserTesting test. Here, you'll find data and data visualizations for the metrics-based tasks you included in your test plan—results ranging from multiple choice and rating scale questions to the more qualitative-driven results of written and verbal contributor responses. You can export the results using the Export to Excel link found with each metric.
Moderated test
This is a live conversation in which the moderator asks questions, and instructs and directs contributors on executing the study’s tasks. A moderated test can be moderated by the customer or by UserTesting.
A moderated test is useful when you want to ask contributors follow-up questions in real time, or if you need to intervene in real time to resolve any issues the contributors may be having.
Navigation menu
Refers to the vertical menu to the left of the Dashboard. Contains links to sections containing tests, drafts, and highlights, and links to the Template Gallery and the Template Library. If you have multiple accounts, you can go to the menu to toggle between them. The menu also lists all folders and workspaces belonging to your account(s).
Needs assessment study
A needs assessment study allows customers to raise and talk about the gaps in the current service or product they are using, or to rank possible features based on how well those features are expected to meet their needs. Run a needs assessment when you have identified new offerings or features and need feedback from users about how they expect these new offerings or features to work.
The group of potential test contributors recruited and maintained by UserTesting. Network members (called “contributors”) have access to an online Dashboard that notifies them of tests, and often have the mobile app pre-installed on their mobile devices.
Note taking
You may have your own system or methods for taking notes during a Live Conversation, but you, as the moderator, and your colleagues can also use the Platform’s note-taking feature to capture real-time observations during a Live Conversation session.
Omnichannel study
An omnichannel study captures activities that involve more than one channel or device. As such, channels within an omnichannel customer experience need not be exclusively digital or physical. For example, if a customer visits a retailer’s website to check on whether an item is in stock at a nearby store and then visits the brick-and-mortar store to purchase that item, they have completed an omnichannel experience.
Other requirements field
Allows you to inform contributors of special requirements before they agree to take your test. Access this field on the Select Audience page in the Platform. (The field is displayed to contributors before they answer any screener questions, so take care when writing this field not to give away the correct answer to a screener.)
Path filter
The Path filter is the search tool within the Interactive Path Flow that allows you to filter for specific pages, intents, or sentiments. A great means for quickly focusing on specific contributor behaviors, the Path filter is available through the Flex plan's Advanced and Ultimate subscriptions.
Path tracing
Switch on the Path tracing toggle, located above the Interactive Path Flow visualization, and you'll be able to highlight paths that go through particular screen steps. You can also highlight a single contributor’s path.
Premium
The highest subscription level within the seat-based model.
Premier Support
If your subscription includes Premier Support, you'll have access to concierge-level assistance that includes access via chat and a phone line to our veteran product and technical/Support teams. You’ll also have access to an unlimited number of 30-minute Power Sessions with our dedicated research consulting team.
Professional Services
UserTesting's Professional Services team guides you through your research projects or conducts the research on your behalf. Among other services, a Professional Services expert can review your test plan or create one for you, and recruit contributors from the UserTesting Contributor Network.
Prototype
More evolved than a design concept, a "prototype" is an early, semi-functional design such as a sketch or wireframe. Contributors provide feedback on its labeling, organization, and navigation by describing what they expect to see or happen.
Qualitative data
The results of a qualitative test explain the "why?"—what were the reasons users did what they did—behind the numbers of quantitative data. You can collect qualitative data before, during, and after you assign tasks and questions designed to generate quantitative numbers. Having customers/contributors describe verbally or in writing their user experience—telling a story about that experience—is an example of qualitative insight (though having the user/customer show you their experience is even better).
Quantitative data
This type of data tells you WHAT users or contributors did when using or testing your product. Examples of quantitative-driven tests are A/B and preference tests, card sorts, and surveys. In and of themselves, these numbers-driven results don't tell you the WHY behind user behavior. That's where qualitative-driven tasks and questions come in handy. Ideally, numbers (quantitative) and narratives (qualitative) reinforce each other, working together to produce customer feedback at its most meaningful and actionable.
Quick Answers tests
This type of test is geared towards customers who need insights and feedback fast with as little heavy lifting as possible. For Quick Answers tests, the tasks and questions are written for you, and designed to provide feedback on the most common product and marketing challenges, everything from validating a concept to testing how easy or difficult it is to use your website. The test results include recorded video feedback of customers or prospects as they complete the tasks and answer questions.
Saved screener question
A collection of screener questions you've created, found on the Select Audience page of the Platform. Use saved screener questions to target specific audiences again and again. Doing so will generate the feedback you need to make the right decisions as you develop products, campaigns, and other digital experiences.
Screener questions
Screener questions identify specific contributors for your tests. Over time, your team learns which questions work well for finding the best contributors.
Seat-based plan
For this subscription model (sometimes called “Insight Core”), admins for the account need to assign all members to a user seat. There are three subscription types (Startup, Professional, and Premium) and various licenses under this model; until each team member is assigned a license, they'll be unable to create or launch tests.
Secure prototype hosting
Allows customers to upload an HTML prototype accessible by test contributors only during the time of testing, adding an extra layer of security to your top-secret designs. The prototype is hosted by UserTesting.
Sentiment path
This interactive path feature is laid on top of the Interactive Path Flow (IPF) visualization tool. As part of the feature, a "sentiment indicator" reveals the number of positive (green) and negative (red) sentiments expressed by contributors at each screen.
If your subscription has it activated, the Sentiment path tool will be visible on the Metrics tab for all task-based questions.
Session
This refers to the response of a single contributor to a UserTesting test. A session consists of the video plus the metrics of a recorded test, as well as the video of a single interview in Live Conversation.
Sessions tab
One of the tabs on the test results page, this tab displays the video clips, organized by username, of each contributor who completed your test. You can watch, download, and share video clips; publish screener questions; add more sessions to your test; or add specific contributors to a network of favorite contributors.
See Add to Favorites, Metrics tab, Details tab, Summary tab
Session Unit
A unit of measurement that tracks the testing capacity used and available on a Flex plan account. Different kinds of tests consume different amounts of Session Units. You can manage the usage of such units through the Session Unit rate card.
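As a rough sketch of how this accounting works, the snippet below models a Session Unit balance being drawn down as tests launch. The test types and per-session unit costs are invented for illustration; actual rates come from your account's Session Unit rate card:

```python
# Hypothetical rate card: unit cost per contributor session, by test type.
# These numbers are illustrative only, not actual UserTesting rates.
RATE_CARD = {"unmoderated": 1, "live_conversation": 3}

def launch(balance, test_type, sessions):
    """Return the remaining Session Unit balance after launching a test."""
    cost = RATE_CARD[test_type] * sessions
    if cost > balance:
        raise ValueError("insufficient Session Units")
    return balance - cost

balance = 100
balance = launch(balance, "unmoderated", 5)        # 100 - 5*1 = 95
balance = launch(balance, "live_conversation", 2)  # 95 - 2*3 = 89
print(balance)  # 89
```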
This is a targeted test plan with a recommended limit of five tasks. You'll get back five-minute video responses of all contributors completing your test.
Specific contributors
A setting on the Select Audience page that allows you to enter the usernames of certain contributors from the UserTesting Contributor Network whom you want to participate in your test(s). Select Specific contributors when you have a particular contributor in mind that you want to test. You'll then be prompted to add their username and any other requirements in the space provided.
Summary tab
With the Summary tab—found on the Platform's test results page—you can create an overview of a test’s results and share it with colleagues. The tags, notes, and clips that you created when reviewing videos of a test's contributor sessions are imported into the Summary tab, making it easier and faster to access and share insights.
Tasks and questions
Completing tasks and answering questions are the activities contributors perform during the test so that you can get answers to your research questions.
See Test plan
Template Gallery
A repository for the templates you can use to create a test plan and build a test. You can also access Quick Answers templates, which come with questions and tasks already loaded, from this gallery. Access the gallery from the navigation menu on the Dashboard.
Testing capacity
Testing capacity is a measure of how much the UserTesting Platform is used on a Flex-plan subscription. Testing capacity is not unlimited; it is purchased up front, and more can be added later if needed. It includes committed usage, flex usage, or add-on usage purchased throughout the subscription period.
See Flex plan
Test Frequency
Found on the Select Audience page of the Platform, this option allows you to manage how often a contributor can take a test presented by your account. It can be especially useful if you want to prevent contributors from taking more than one of your tests—keeping feedback “fresh.”
See Fresh Eyes
Test plan
This informs the structure of your test and includes the questions and tasks to be used, as well as the demographics you decide are best for recruiting contributors.
Test results page
The page within the UserTesting Platform Dashboard where you go to analyze and review the results of completed tests. The page consists of the Sessions, Summary, Metrics, and Details tabs.
Tree test
A type of test in which contributors test the organizational structure of your site and/or app and how easy that structure is to navigate.
Transcript
In the UserTesting environment, a transcript refers to a machine-generated written version of everything said in a UserTesting session. It’s displayed in the video player in a separate tab.
Unlimited
This seat-based account license has the greatest range of permissions, most notably the ability to run an unlimited number of tests. This user type is available on all three seat-based subscriptions: Startup, Professional, and Premium.
A subscription type under the Flex plan that gives you access to the Platform’s various capabilities (e.g., targeting, test-creation, research, reporting, and sharing capabilities).
See Flex plan subscriptions
Unmoderated test
Unlike its “moderated” counterpart, this type of test has no moderator conducting the test. The contributor decides when and where they would like to complete the test, and records their feedback out loud.
Unmoderated tests are useful when you want to gather feedback quickly. And having no moderator often helps to reduce bias in the contributor feedback.
See Moderated test
Usage and history dashboard
Found on the Settings page, the Usage and history dashboard displays the testing usage summary for accounts subscribing to the Flex plan.
The Dashboard is divided into three sections: Usage summary; Top usage by team member; and Top usage by workspace, with the Summary reflecting the number of Session Units available for and used by your account/subscription. You have the option to export the data as a CSV file.
Usability test
A type of test, either moderated or unmoderated, that tests the functionality of your design. Want to know whether the purchase flow of your website is structurally sound and easy to navigate? Run a usability test.
UserTesting Contributor Network (formerly "UserTesting Panel")
The group of test contributors recruited and maintained by UserTesting. Network members (known as “contributors”) have access to an online dashboard that notifies them of tests, and often have the mobile app preinstalled on their mobile devices.
UserTesting University
The UserTesting University training site is the place to go for on-demand, interactive lessons and live training events (Customer Labs). Learn about methodologies, use cases, and Platform features so as to collect actionable insights for your own projects, and to help everyone in your company scale their capabilities.
Viewer
The Viewer is the Flex plan equivalent of "Collaborator" for the seat-based plan. A Viewer has access to many features on the UserTesting Platform (e.g., view videos, conduct analysis, create clips, and create highlight reels).
See Collaborator, Flex plan
Workspaces
Workspaces allow you to effectively organize and control access to your templates and tests across projects and teams. Workspaces can be either open or private, and can be created by account Admins or anyone on the account with an Unlimited license.