The QXscore™ is a UserTesting metric that evaluates digital experience quality on a 0–100 scale, combining behavioral and attitudinal measures.
This article applies to: UserTesting
What is a QXscore?
- QXscore™, short for Quality Experience Score, is a metric developed by UserTesting that quantifies the quality of a digital experience on a scale from 0 to 100.
- It is a composite that equally weights behavioral UX measures (task success) and attitudinal UX measures (ease of use, trust, appearance, and loyalty).
- It was designed to give UX professionals a single quantitative measure of UX quality that is also easy for stakeholders to understand.
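For intuition only, the sketch below shows one way an equally weighted behavioral/attitudinal composite could be computed. The function name, the assumed 1–7 rating scale, and the normalization are illustrative choices; UserTesting's actual QXscore calculation is not reproduced here and may differ.

```python
from statistics import mean

def qxscore_sketch(task_success_rates, ease, trust, appearance, loyalty,
                   scale_min=1, scale_max=7):
    """Illustrative QXscore-style composite (NOT UserTesting's actual formula).

    task_success_rates: per-task success rates, each between 0 and 1.
    ease, trust, appearance, loyalty: mean attitudinal ratings on an
    assumed scale_min..scale_max Likert scale.
    Returns a 0-100 score that weights behavior and attitude equally.
    """
    # Behavioral half: average task success, expressed on a 0-100 scale.
    behavioral = mean(task_success_rates) * 100

    # Attitudinal half: normalize each rating to 0-1, average, scale to 0-100.
    def normalize(rating):
        return (rating - scale_min) / (scale_max - scale_min)

    attitudinal = mean(normalize(r) for r in (ease, trust, appearance, loyalty)) * 100

    # Equal weighting of the behavioral and attitudinal halves.
    return 0.5 * behavioral + 0.5 * attitudinal

# Example: two tasks with 80% and 60% success, attitudinal ratings around 5-6 out of 7.
print(round(qxscore_sketch([0.8, 0.6], ease=6, trust=5.5, appearance=6, loyalty=5), 1))
```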
How to set up a QXscore
1. Choose Create test in the platform, then Interaction test.
2. Select Add questions, then choose QXscore in the panel that opens up.
3. Add at least two navigation tasks with clear success/failure criteria; this behavioral data is required to calculate a QXscore.
4. Add a Starting URL for the key navigation tasks you want to test.
5. Add the Success URL that contributors need to land on to complete the task successfully. Consider using “Contains” rather than “Exact match”: if a contributor reaches the success page but the URL differs slightly, “Exact match” won’t count them as successful. (A sketch after these steps illustrates the difference.)
6. Within each task, create a scenario so contributors can interact with your prototype, site, or app in context. If contributors are using a prototype, let them know that not everything will be clickable. Provide clear end-points for the task so contributors know when to move on. For example:
“The following is a prototype – keep in mind not everything will be clickable. Imagine you are interested in understanding your options for a subscription plan. Show us how you would decide between the subscription plans. STOP before enrolling and move on to the next task.”
7. Add follow-up questions. Follow-up questions for individual tasks are not included in the QXscore calculation, but you can still ask contributors to rate each task's ease of use and explain why they found it easy or difficult for additional insight.
8. At the end of all tasks, contributors answer the standardized QXscore attitudinal questions, which assess the overall experience's ease of use, trust, appearance, and loyalty.
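As a side note on step 5, the sketch below illustrates the practical difference between “Contains” and “Exact match” success criteria. The url_matches helper and the tracking-parameter example are hypothetical stand-ins, not UserTesting's matching logic.

```python
def url_matches(visited_url, success_url, mode="contains"):
    """Simplified illustration of success-URL matching (not UserTesting's code).

    mode="exact"    : the visited URL must equal the success URL exactly.
    mode="contains" : the success URL only needs to appear within the visited URL.
    """
    if mode == "exact":
        return visited_url == success_url
    return success_url in visited_url

success = "https://example.com/plans/confirmation"

# A contributor lands on the right page, but with a tracking parameter appended.
visited = "https://example.com/plans/confirmation?utm_source=email"

print(url_matches(visited, success, mode="exact"))     # False - contributor not counted as successful
print(url_matches(visited, success, mode="contains"))  # True  - contributor counted as successful
```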