The QXscore™ is a UserTesting metric that evaluates digital experience quality on a 0–100 scale, combining behavioral and attitudinal measures.
This article applies to: UserTesting
QXscore™ is available on the following subscriptions:

| Advanced | Ultimate | Ultimate+ |
| -------- | -------- | --------- |
| ✓ | ✓ | ✓ |
What is a QXscore?
- QXscore™, short for Quality Experience Score, is a metric developed by UserTesting that quantifies the quality of a digital experience on a scale from 0 to 100 by equally weighting behavioral measures (task success) and attitudinal measures (ease of use, appearance, trust, and loyalty).
- It is designed to give UX professionals a single quantitative measure of experience quality that stakeholders can easily understand.
- We recommend including no more than six tasks. While the platform allows more, exceeding this limit can lead to participant fatigue, lower data quality, and reduced completion rates.
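The equal weighting described above can be sketched in code. The snippet below is an illustrative assumption, not UserTesting's actual formula: it assumes the four attitudinal ratings arrive on a 1–7 scale, rescales each to 0–100, and averages the behavioral and attitudinal halves 50/50. The function name and rating scale are hypothetical.

```python
def qxscore_sketch(task_success_rate, ease, trust, appearance, loyalty):
    """Illustrative composite score; NOT UserTesting's exact formula.

    task_success_rate: fraction of tasks completed successfully (0.0-1.0).
    Attitudinal ratings are assumed to be on a 1-7 scale (an assumption;
    the platform's actual question scales may differ).
    """
    behavioral = task_success_rate * 100
    # Rescale each 1-7 rating to 0-100, then average the four measures.
    attitudinal = sum((r - 1) / 6 * 100
                      for r in (ease, trust, appearance, loyalty)) / 4
    # Equal weighting of the behavioral and attitudinal halves,
    # per the definition above.
    return round(0.5 * behavioral + 0.5 * attitudinal, 1)

# 80% task success with mostly-positive ratings lands in the high 70s:
print(qxscore_sketch(0.8, ease=6, trust=6, appearance=5, loyalty=5))  # 77.5
```

Under this sketch, a perfect study (100% task success, all 7s) scores exactly 100, and the behavioral half can never contribute more than 50 points on its own.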
How to set up a QXscore
1. Select Create test from the top navigation bar. This opens the full menu of available test types. From the dropdown, select Unmoderated test.
2. When the prompt appears, choose Use the new test experience. This ensures you have access to the latest features, including enhanced task design and improved participant usability.
3. Select Interaction test. This mode captures screen activity only, making it ideal for tasks that do not require think-aloud feedback.
4. In the test builder, select Add task. From the task list, choose QXscore to include this standardized experience measurement in your test.
5. Provide at least two navigation tasks with clear success/failure criteria; these are required to collect the behavioral data for the QXscore.
6. Add a Starting URL for the key navigation tasks you want to test.
7. Add the Success URL that contributors need to land on to complete the task successfully. Consider using “Contains” rather than “Exact match”: with an exact match, a contributor can land on the success page, but if the URL differs even slightly, UserTesting won’t recognize the contributor as successful.
8. Within each task, create a scenario so contributors can interact with your prototype, site, or app in context. If contributors are using a prototype, let them know that not everything will be clickable. Provide a clear endpoint for the task so contributors know when to move on. For example:
“The following is a prototype – keep in mind not everything will be clickable. Imagine you are interested in understanding your options for a subscription plan. Show us how you would decide between the subscription plans. STOP before enrolling and move on to the next task.”
9. Add follow-up questions. Although follow-up questions for individual tasks are not included in the QXscore calculation, you can ask contributors to rate each task's ease of use and explain why they rated it easy or difficult to gather more insight.
10. At the end of all tasks, contributors are presented with the standardized QXscore attitudinal questions, which assess the overall experience's ease of use, trust, appearance, and loyalty.
11. Here is an example of the results from a completed study.
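The “Contains” versus “Exact match” behavior described in step 7 can be illustrated with a minimal sketch. The function below is hypothetical and only demonstrates the matching concept; it is not the platform's implementation.

```python
def is_success(landed_url, success_url, mode="contains"):
    """Hypothetical sketch of success-URL matching; not UserTesting's code.

    mode="exact"    -> the landed URL must equal the success URL exactly.
    mode="contains" -> the success URL only needs to appear within the
                       landed URL, so tracking parameters or trailing
                       slashes don't cause a false failure.
    """
    if mode == "exact":
        return landed_url == success_url
    return success_url in landed_url

# A slight URL difference fails an exact match but passes "contains":
landed = "https://example.com/plans/?utm_source=email"
target = "https://example.com/plans"
print(is_success(landed, target, mode="exact"))     # False
print(is_success(landed, target, mode="contains"))  # True
```

This is why “Contains” is usually the safer choice when the success page may be reached with extra query parameters or path variations.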