
Just What Is Good Usability Testing?

Hi. My name is Corey Shulman, and I am a researcher at User Insight. I feel very fortunate to be involved in such interesting work, and I look forward to sharing insights from it with you (when I’m not out interrogating the end users of products, of course!).

This week, I read a blog post that forced me to ask myself, “What is good usability testing?” The article, titled “The Four Forgotten Principles of Usability Testing” and written by David Travis, lists the four principles as follows: 1) screen for behaviors, not demographics; 2) test the red routes; 3) focus on what people do, not what they say; and 4) don’t ask users to redesign the interface.

Screen for Behaviors, Not Demographics

The first principle, “screen for behaviors, not demographics,” is a good one. Here, Travis notes that trying to balance gender, age, and other demographic factors across such a small sample is impossible. Very true. It’s also unnecessary. What matters most is recruiting based on user behaviors and previous experience with the product or service. We must still be careful when screening: to avoid recruiting people who claim experience they don’t actually have, we ask probing questions during the screening process to ensure participants truly exhibit the user behaviors required for the study.

Test the Red Routes

The second principle, “test the red routes,” states that you should focus solely on the tasks that are critical to both the user and the company. We find, however, that a very narrow study typically delivers very narrow results, which, I’d say, is not a good thing. Often, the cause of a problem is not what the client originally anticipated, so we don’t know what the real problem is until we immerse ourselves in the experience alongside the user.

Focus On What People Do, Not What They Say

The next principle, “focus on what people do, not what they say,” also sounds very appealing. While I agree that usability testing’s primary concern is whether the user can complete critical tasks, I believe that never asking “why?” will prevent you from uncovering vital information. Usability cannot be examined in isolation; it is only one part of the user experience. It is crucial that moderators explore not simply IF users can do something, but all of the questions surrounding it, such as “Do they want to do it?”, “How much would they pay for this service?”, and “How would they change it?” Although Travis argues that there is no place in usability testing for these questions, I strongly disagree. The goal of such questions is not to collect amateur design advice (as Travis suggests in his fourth principle), but to reveal the user’s intentions, needs, and perception of the product. Learning that a user can do something is only part of the story; they might not know what they accomplished, or they might never want to do it in the first place.

I do agree with Travis’ point that user testing should continually strive to create a natural user experience. As he notes, untimely questions can create a flawed experience for a participant during a task-based study. However, I think it is both appropriate and valuable to have a user examine a screen after the task has been completed. I have seen first-hand how much this can reveal: encouraging a participant to take a deeper look at a screen after completing a task can surface issues with terminology, layout, and content that could otherwise go unnoticed if not probed by the researcher.

These principles also apply to contextual research. Contextual research allows the researcher and the strategist to observe users in their own environment and draw insights from their surroundings and the ways in which they behave. It is a great way to home in on what people do, rather than what they say, and to understand who they really are, not just what they tell you. To see some fun examples of observations made during our SoMeTV contextual interviews, click here.

One last thought: when conducting one-on-one interviews, or other types of user research, research and analysis should not be synonymous. As a researcher, I depend heavily on intelligent interpretation of the data by a strategist. Here at User Insight, the strategist’s only mission is to watch all of the research objectively and report user patterns. Each project reminds me of the importance of this; most recently, when our SoMeTV team was in the “war room” analyzing the Social TV Experiment data, the strategists who observed the research and the researchers who conducted the interviews offered different perspectives, providing a “check and balance.” Ultimately, this back and forth allows our team to uncover critical, move-forward data.


1 Comment

  1. Site Usability

    Hello,

    The above principles are very informative for good usability testing. We cannot learn what users need without direct interaction with them. Contextual research allows the researcher and the strategist to observe the user’s thoughts and wants. Thanks a lot.