Hard Data from Think-Aloud Tests

Q: We run think-aloud usability tests at my company, but some people here don’t pay attention to the results because we don’t get much “hard” data. Do we need to run different tests?

Expero Staff

October 9, 2009

A well-executed think-aloud study can yield useful quantitative data (for example, rate of an occurrence or behavior) as well as qualitative data. We recommend that before you invest in more costly testing techniques, you try collecting and analyzing the quantitative data you may be missing from your think-aloud studies.

Start by counting how often important things happen during the study sessions (a user fails to complete a specific task, a user goes to the Help section, etc.). Ask users to rate the usefulness, ease of use, or appeal of particular features, as well as the overall user interface. All of a sudden, you have hard data to report: “Users rated the feature a 3.4 out of 5 on Usefulness”; “70% of the users tried to access Help when trying to create an account”.
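The tallying described above is simple enough to script. A minimal sketch (the session data here is hypothetical, invented for illustration, not from any real study):

```python
from statistics import mean

# Hypothetical per-session observations from a think-aloud study:
# did the user complete the task, did they open Help, and their
# 1-5 Usefulness rating for the feature under test.
sessions = [
    {"completed_task": False, "opened_help": True,  "usefulness": 3},
    {"completed_task": True,  "opened_help": True,  "usefulness": 4},
    {"completed_task": True,  "opened_help": False, "usefulness": 3},
    {"completed_task": False, "opened_help": True,  "usefulness": 4},
    {"completed_task": True,  "opened_help": False, "usefulness": 3},
]

n = len(sessions)
help_rate = sum(s["opened_help"] for s in sessions) / n
avg_usefulness = mean(s["usefulness"] for s in sessions)

print(f"{help_rate:.0%} of users opened Help")          # 60% of users opened Help
print(f"Mean Usefulness rating: {avg_usefulness:.1f}")  # Mean Usefulness rating: 3.4
```

A spreadsheet works just as well; the point is that each observation is recorded per session so rates and averages can be reported.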

A more advanced but very helpful step is to quantify what the users said during the study. This is “content analysis,” a technique from the communication and psychology fields. In content analysis, you categorize users’ comments (for example, as positive or negative), then you count. This yields even more hard data: “78% of users’ comments about creating an online account were negative, and only 22% were positive”; “Overall, 32% of users’ negative comments about the site focused on account creation, which suggests that this area needs a lot of improvement”.
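Once comments have been coded by topic and sentiment, the counting step is mechanical. A sketch with made-up coded comments (the topics, labels, and numbers are illustrative assumptions, not results from the article):

```python
from collections import Counter

# Hypothetical coded comments: (topic, sentiment) pairs produced by
# researchers categorizing transcript excerpts.
coded_comments = [
    ("account_creation", "negative"),
    ("account_creation", "negative"),
    ("account_creation", "positive"),
    ("search", "negative"),
    ("search", "positive"),
    ("navigation", "negative"),
]

# Sentiment split within one topic.
account = [s for t, s in coded_comments if t == "account_creation"]
neg_share = account.count("negative") / len(account)

# How much of all negative feedback concentrates on one topic.
negatives = Counter(t for t, s in coded_comments if s == "negative")
account_share = negatives["account_creation"] / sum(negatives.values())

print(f"{neg_share:.0%} of account-creation comments were negative")
print(f"{account_share:.0%} of negative comments concerned account creation")
```

In practice, content analysis also calls for having more than one person code the comments and checking that their categorizations agree before reporting the counts.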
