At a Glance
Completing tasks and answering questions are the activities contributors perform during the test so that you can get answers to your research questions. This is Part 5 of a series of articles.
Overview
While completing tasks, contributors will speak their thoughts out loud so you understand their thought processes and the reasoning behind their actions. A couple of examples of tasks are:
- "Imagine you are thinking about buying a Bluetooth headset. Use Google to do the research you normally would."
- "After using this webpage, what are three words that describe how you feel about this company?"
Can’t wait to get started?
We offer a variety of templates with editable tasks so that you can get started quickly. These are great shortcuts to help you write your first test. From the Create test drop-down menu in the Dashboard, select Customize a template.
Want to start fresh? Below is a quick overview of the tasks available on the Build Test Plan page. Review our "Setting Up Tasks" article for more support.
If you would like in-depth information on the various task types, please see our "Tasks Overview" article. Additionally, our "Writing Great Tasks" article will help strengthen your task-writing skills.
- Short test: This feature helps you get focused feedback from larger sample sizes. Only available on the Flex plan.
- Assets: This feature allows you to include a URL, an image, or a video or audio file that contributors see or hear during the test. Display an asset and follow it up with additional tasks and questions regarding that asset.
- Task Groups: The options in the Task groups and Tasks and results sections help streamline the test-creation process and organize contributor answers for at-a-glance discoveries. Select the task templates relevant to your desired outcome (e.g., findability, content, design evaluation) and the results will be autogenerated in your Metrics tab.
- Task: Sometimes referred to as a "blank task," this is an open field where you direct contributors to complete tasks. It is best to frame tasks as a goal you want the person to accomplish, such as "Spend 1 minute finding the ATM closest to Pike Place Market."
- Verbal Response: These questions prompt contributors to provide a spoken answer at a specific point in the test. This task type is great for asking interview questions, such as "Tell us about the last time you were online looking for an airline ticket." This can then be followed with a URL asset pointing to your prototype and a task asking, "With your previous answer in mind, how would you find that same ticket using this site?"
- Multiple Choice: This question allows you to ask a question and have the contributor select from a set of answers you provide. You decide if you want to allow the contributor to select only one answer or if they can select multiple answers.
- Rating Scale: This type of question allows you to ask a question and provide a scale with defined endpoints, such as "very difficult" to "very easy." You define the endpoints and can select from a number of different sizes for the scale. Be sure to use consistent scale sizes throughout the test (e.g., all 7-point scales). Rating scale and multiple choice questions are useful for collecting discrete numbers across your contributors ("8 of 10 contributors rated the task as very easy.").
- Written Response: This question prompts contributors to type their answers into a text box, making it easier to analyze their responses. Note that their verbal responses are recorded at the same time, so a contributor may speak in more detail than what they type into the box.
- Card Sort and Classic Card Sort: The card sort task allows you to set up an open, closed, or hybrid card sort in the UserTesting Platform. The classic card sort task allows you to set up your card sort using our external tool, ia.usertesting.com. Use this task to learn how contributors interpret and categorize information.
- Tree Test and Classic Tree Test: Run a tree test when you want to evaluate the structure and labeling of your site or app, to learn whether contributors would be able to find specific items based on how well (or not) you organized and labeled your content, menus, and calls-to-action (e.g., "Add to Cart").
Notes about tasks:
- If you need inspiration, select the link under Popular tasks to review and use a bank of common and popular tasks and questions.
- There is no limit to the number of tasks in a test; however, tests should take 15–20 minutes to complete. Consider how long you expect contributors to spend on each task and limit the number of tasks accordingly.
- It is not currently possible to save individual tasks for reuse, but this article provides guidance on an alternative approach.
- Tasks should not require a contributor to provide any sensitive Personally Identifiable Information (PII). If you are a covered health entity and have signed a Business Associate Agreement (BAA) with UserTesting, you may collect Protected Health Information (PHI); read our article about collecting insights under HIPAA to learn more.
- Separate follow-up questions from the tasks themselves. Primarily, this ensures that contributors see and answer each of your questions. Secondarily, because each task and question is tagged in the recording, you can skip directly to the specific question and answer in the video. This makes it easier to review the results and create clips to share.
Next Quick Start section: "Launch a Study"
Learn More
Need more information? Read this related article:
Want to learn more about this topic? Check out our University courses.