QXscore™ use cases

Learn about different ways to use QXscore™ to measure the quality of digital experiences across time, platforms, and competitors.

Note: The use cases shared in this article are illustrative examples based on supported testing methods and customer scenarios. They are designed to help teams understand how QXscore can be applied across different research goals, but they do not represent real customer data. 

This article applies to: UserTesting

 

On this page:

  • Understanding QXscore™
  • Use case: Benchmarking a live flow for usability health
  • Use case: Comparative benchmarking (you vs. a competitor)
  • Use case: Tracking design iterations over time
  • Use case: Improving navigation and information architecture
  • Related content

Understanding QXscore™

QXscore™ is a composite metric that combines behavioral (task success) and attitudinal (ease, appearance, trust, loyalty/NPS) data into a single score out of 100, allowing teams to:

  • Identify friction points in navigating a live site
  • Understand customer attitudes toward an experience
  • Track progress over time and benchmark against competitors with a single, easy-to-compare score
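
To make the composite idea concrete, here is a minimal Python sketch. The 50/50 split between the behavioral and attitudinal components, the equal weighting of the four attitudinal dimensions, and the function and parameter names are all assumptions made for illustration; UserTesting's actual QXscore calculation is proprietary and may weight its inputs differently.

    # Illustrative only: the weights, normalization, and names below are
    # assumptions for this sketch, not UserTesting's actual QXscore formula.

    def illustrative_qx_score(task_success_rate, ease, appearance, trust, loyalty):
        """Blend behavioral and attitudinal inputs into one 0-100 score.

        task_success_rate: share of tasks completed successfully (0.0-1.0)
        ease, appearance, trust, loyalty: attitudinal ratings scaled to 0.0-1.0
        """
        behavioral = task_success_rate                           # behavioral component
        attitudinal = (ease + appearance + trust + loyalty) / 4  # equal weights (assumed)
        return round(100 * (0.5 * behavioral + 0.5 * attitudinal), 1)  # assumed 50/50 split

    # Example: 80% task success with moderately positive attitudinal ratings.
    print(illustrative_qx_score(0.80, ease=0.75, appearance=0.70, trust=0.65, loyalty=0.60))
    # -> 73.8

The exact weights matter less than the structure: one behavioral component and one attitudinal component, each normalized, rolled into a single number that can be compared across tests and over time.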

 

 

Use case: Benchmarking a live flow for usability health

Scenario:

A retail client runs a summative usability test on their live checkout flow to generate a baseline QXscore. They use this score to track performance across quarterly releases.

Key Attributes:

  • Real users on the live site
  • 30+ participants in an Unmoderated Interaction test
  • Tasks structured to reflect a realistic purchase flow
  • Metrics captured: QXscore, NPS, SUPR-Q, time on task

Outcome:

  • The initial QXscore highlighted friction in the shipping selection step, where users were confused by delivery options and pricing.
  • After simplifying the layout and labels, the next test showed a 9-point improvement in QXscore, with better task success and satisfaction.
  • The team now includes QXscore in their quarterly UX health reports to track progress and guide checkout optimizations.
     

 

Use case: Comparative benchmarking (you vs. a competitor)

Scenario:

A financial services company tests its online loan application flow alongside a competitor's flow using matched task structures in separate QXscore-enabled tests.

Key Attributes:

  • Live site testing on both brands through an Unmoderated Interaction test
  • ~35 participants per test
  • Tasks structured for behavioral scoring (success, time)
  • Attitudinal measures captured (trust, ease, appearance)

Outcome:

  • The client’s flow received a QXscore of 68 vs. the competitor’s 78.
  • On closer review, the competitor had an easier document upload process, and participants found its fees more transparent.
  • The results informed a redesign focused on simplifying document upload and improving fee transparency.
     

 

Use case: Tracking design iterations over time

Scenario:

A SaaS company evaluates its onboarding flow before and after a redesign. Each iteration is tested using the same task flow and QXscore structure.

Key Attributes:

  • Pre/post comparison using identical tasks in an Unmoderated Interaction test
  • Prototype used pre-launch, live site used post-launch
  • 40 participants per round
  • Tasks measured: sign-up, walkthrough, first-use setup

Outcome:

  • The first test round on the prototype produced a QXscore of 62, with task failures and low scores for ease and trust during account setup and the first-use walkthrough.
  • After making usability improvements, the team retested the live flow using the same structure.
  • The QXscore improved to 81, reflecting stronger task success and more positive perceptions of the experience.
  • These results helped the team demonstrate measurable UX gains and secure continued investment in onboarding improvements.
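
For teams that report this kind of round-over-round tracking, a few lines of Python can turn raw scores into the deltas that go into a readout. This is a hypothetical sketch: the round labels and scores below simply restate the illustrative numbers from this example, and no UserTesting API is implied.

    # Hypothetical round data restating the illustrative scenario above;
    # not real study results, and no official UserTesting API is implied.
    rounds = [
        ("Prototype (pre-launch)", 62),
        ("Live site (post-launch)", 81),
    ]

    baseline_label, baseline_score = rounds[0]
    for label, score in rounds[1:]:
        delta = score - baseline_score
        print(f"{label}: QXscore {score} ({delta:+d} vs. {baseline_label})")
    # -> Live site (post-launch): QXscore 81 (+19 vs. Prototype (pre-launch))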
     

 

Use case: Improving navigation and information architecture

Scenario:

A media company uses QXscore to evaluate how easily users can navigate to key content areas like subscriptions, support, and archived articles. The initial test shows low task success, with participants using inconsistent paths and spending too long searching.

Key Attributes:

  • Live site tested with 35 participants in an Unmoderated Interaction test
  • Tasks reflect real navigation goals: “Find how to cancel a subscription,” “Locate archived stories,” etc.
  • Metrics captured: QXscore, time on task, success rates, attitudinal feedback

Outcome:

The test reveals inefficient navigation paths, poor labeling, and low trust scores. After follow-up research, the team designs a new information architecture (IA) and reruns the same QXscore test post-redesign. The retest shows:

  • Higher task success
  • Reduced time on task
  • A 12-point improvement in QXscore, with boosts in “ease” and “trust” ratings

The team uses this data to validate their IA updates and justify further investment in UX design.

 

 

Related content

Want to learn more? Check out these Knowledge Base articles...

Interested in growing your skills? Check out our University courses...

Need hands-on training?

Can't find your answer?

 
