Let Your Users Teach You: Document Quality and Usability Testing

By Lisa D. Gay

We know that understanding a document’s audience affects countless decisions: structure, order, what to include (and what to exclude), tone, and so on. One way we can approach document quality is by asking, “How well does it meet users’ needs?” To answer this, we are often limited to second-hand information about users from colleagues who interact directly with customers, such as people in product management, marketing, sales, and support.

However, your colleagues are immersed in your company’s products full time. Tasks that were confusing when they first encountered the product are now part of their knowledge about how things should work. They might even forget that they were ever confused. That’s part of building product expertise.

But that also means that your main source of information about your audience has a lot of foundational knowledge that your audience does not. Inaccurate assumptions lead to documentation that doesn’t align well with your readers’ needs—a mark of poor quality.

As a technical writer, I have partnered with user experience (UX) designers and learned about UX design practices through that collaboration. One of those practices, usability testing, has helped me improve document quality by giving me a new understanding of my audience.

Real User Knowledge Makes a Difference

The first time I documented a product while following UX design principles from the start, I was amazed at the difference. The documentation was easier to write, because the features were fully intuitive. The documentation was also easier to use. I could eliminate entire sections that explained counterintuitive concepts or described how to find settings in a series of haphazard dialog boxes.

But more importantly, I could write better documentation, because I had a better understanding of our customers. Until then, I thought I knew my audience. I had a notion of the kind of job they had, what problem our product solved for them, and the specialized terminology of their industry. From that, I determined their priorities, what product features were most important to them, where in the workflow they might struggle, and how they understood the role the product played in their day-to-day life.

One technique in particular, however, revealed that I was making many incorrect assumptions without even realizing it: usability testing. Watching real people use the product and its documentation gave me solid information I could use to create higher-quality documents.

What Is Usability Testing?

Usability testing is easiest to understand in the context of user-centered design (UCD), a product development process that focuses on identifying and understanding users’ needs. Briefly, the UCD process involves learning about your users through interviews, creating an initial product design, testing the design on real users, and then using the results of those tests to iterate on the design until it’s ready to hand off to developers for implementation.

Typically, usability testing comes into play when you test the product design on people who are similar to the expected end users for your product. A technical writer observing product usability testing can learn many things relevant to improving documentation, like context, workflows, and terminology.

If there isn’t a UX team running product usability testing, a technical writer can run testing on the documentation directly. Ideally, you would have access to current users or sales prospects, a person available to do note taking, half a dozen colleagues from different departments to be observers and to help analyze the results, a couple of weeks to run five or six test sessions, and a development sprint to make improvements to the product itself.

Most of us don’t have all of those resources available, but that doesn’t mean usability testing isn’t possible. You just need to make a few compromises.

Testing Document Usability

Let’s take a look at what a formal documentation usability session looks like. In each session, a single test subject attempts to complete a small number of specific tasks with the product using the documentation to help them. Observers write down observations about the user’s actions and words, not attributing any meaning to them. For example, the observation “Alice re-read the second sentence very slowly” is more useful than “Alice was confused.”

Usability testing also requires a facilitator, someone who is familiar with the principles of usability testing and can keep the session on track. Skilled facilitators need to juggle a lot of functions. For example, they must:

  • Keep the test subject at ease. “We’re testing the documentation, not you. You can’t do anything wrong.”
  • Encourage the test subject to talk about their thinking process out loud. “What did you expect to happen when you clicked that?” “What do you think that term means?”
  • Resist the temptation to help the test subject when they get stuck. “If you didn’t have any technical support available to you, what would you do to figure this out?” “Good question! What do you think that button does?”
  • Think on their feet when the test subject says or does something completely unexpected. “Could you tell me more about why you went to that page of the help?”
  • Ask follow-up questions after the subjects have completed the planned tasks. “What did you find most frustrating?” “What questions did the documentation not answer for you?” “If you could rename that button, what would you call it?”

Usually, the number of new observations tapers off after about five test subjects, at which point you can analyze the results to find what patterns emerge. A good analysis occurs in two phases, kept carefully separate.

  • First, list the observations. It’s important not to interpret them yet, so that you can avoid leaping to conclusions shaped by your assumptions. Note whether each observation was unique to a single user or occurred in several sessions.
  • Second, shift your attention to interpretation. Discuss what each behavior means and what you might want to do about it.
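To make the separation between the two phases concrete, the first phase can be as simple as tallying how many sessions each observation appeared in before anyone discusses meaning. This is an illustrative sketch only; the observation wording and session data here are invented:

```python
from collections import Counter

# Phase one: raw observations recorded per session, with no interpretation.
# These example observations are hypothetical.
sessions = [
    ["re-read step 2 slowly", "clicked Back twice"],
    ["re-read step 2 slowly", "searched help for 'key'"],
    ["searched help for 'key'", "re-read step 2 slowly"],
    ["clicked Back twice"],
    ["re-read step 2 slowly"],
]

# Count, for each observation, how many sessions it occurred in.
# set() ensures an observation counts at most once per session.
counts = Counter(obs for session in sessions for obs in set(session))

# Observations seen in several sessions are patterns worth discussing in
# phase two; one-off observations may just be individual quirks.
recurring = {obs: n for obs, n in counts.items() if n > 1}
print(recurring)
```

Even a simple tally like this keeps the team honest: the list of what happened is fixed before the conversation about why it happened begins.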

These results will help you to adjust the documentation so that you can test again with the next revision.

This summary only describes the basics. You can learn more about how to plan and carry out usability testing from the resources listed at the end of this article.

It’s tempting to think that other testing methods can take the place of usability testing. When you complete a document, you probably test it against the product. If your organization’s release process calls for it, a QA tester might validate the procedures you wrote. These approaches are helpful and can find some kinds of flaws, but usability testing has two key differences: you’re observing, not participating; and the tester is someone who wasn’t involved in building the product.

These two factors might seem simple, but they explain why usability testing can make such a big impact on product and document quality. Letting real users’ behaviors guide documentation decisions is the most direct way I’ve found to create documents that meet my audience’s needs.

Making Do with What You Have

The formal usability testing process might sound like more than you can manage, especially if you’re strapped for resources. Or, you might have trouble getting permission to interact with customers or show a draft outside the company. You can take a lighter approach, drawing your pool of test subjects from your colleagues.

Reach out to people in different roles—product management, engineering, customer support, marketing, and sales all have useful perspectives. Most importantly, seek people who are not familiar with the product or, at the very least, haven’t attempted the specific workflow you are testing. If you have trouble finding people willing to spend time on this, seek anyone who has voiced concerns about users struggling with the product. Position yourself as their ally in improving the customer experience. And after testing, ask if they want to recommend any of their colleagues as test subjects or observers.

What You Might Learn

The initial intent of usability testing is to improve products. When my colleagues observe testing sessions (or are participants themselves), I often hear comments like the following.

“I never realized how complicated that configuration process is!”

“I thought everyone understood what ‘key’ meant in this context, but most of the users assumed a completely unrelated meaning.”

“I thought our users needed all of the report data to decide what to do. I’m surprised they didn’t want to analyze all of those metrics. How can we simplify the report view?”

This can inspire product managers and others to push for design changes that make the product easier to use—and to document. But usability testing can also help technical writers with their work. Some gems you can mine immediately:

  • Finding a better word to describe a difficult, abstract concept.
  • Identifying the most confusing tasks so that you can prioritize your work efficiently.
  • Learning a better way to organize content in the online help.

For example, I created a getting started guide for a cloud delivery network product. Many steps applied only if the user first enabled HTTPS delivery. I prefaced each of those steps with “For HTTPS sites only.” One thing I wanted to learn from usability testing was whether that wording was clear. But the test subjects showed me that I was asking the wrong question: “I don’t want to have to think about what protocol I’m using, over and over through the whole process. Why not have two versions of the guide? That way I can see only the steps I need to see.” The next edition of that document had two versions: one for HTTP-only sites and one for sites with HTTPS delivery. For our users, a documentation experience that only asked them whether they used HTTPS once (when picking which process to follow) was better than a single, comprehensive process that covered both cases.

You can also get valuable insights about the context in which your users interact with your product. Consider the delicate balance of password requirements. At first glance, it might seem very secure to require passwords to contain at least 15 characters of four types and to force a password change every month. But usability testing would probably reveal that users won’t even try to memorize a password that complex. They’ll write it down somewhere, creating a security vulnerability.

All of this information can help writers improve document quality.


In my experiences with usability testing, one thing has become clear: there is no substitute for watching users interact with the product. It’s hard to overstate the value of observing someone try to complete a task when they don’t have background knowledge of the product or preconceived notions of what the workflow should be. It’s also very hard to communicate what it’s like to have a user show you an entirely new understanding of a feature unless you have had the experience yourself.

I encourage you to try usability testing on your own products and documents. If you have a UX design team available, team up with them. If not, plan a testing session with the resources you have available. The books in the resources section below can guide you. Steve Krug’s book, Don’t Make Me Think, Revisited: A Common Sense Approach to Web and Mobile Usability, includes a chapter on conducting fast, lightweight usability testing (“Usability Testing on 10 Cents a Day”). His website has additional resources, including a sample script, tips for observers, and a video of a usability testing session. David Platt’s book, The Joy of UX: User Experience and Interaction Design for Developers, provides more detailed instructions and is especially helpful for navigating the process in relation to software products. Don Norman’s book, The Design of Everyday Things, is a foundational text in UX design, with more focus on physical objects than the other two.

Do some usability testing, and really get to know your users. Your documents will be better for it.


Krug, Steve. Advanced Common Sense. Personal website. http://sensible.com/.

Krug, Steve. Don’t Make Me Think, Revisited: A Common Sense Approach to Web and Mobile Usability. New Riders: San Francisco, CA, 2014.

Norman, Don. The Design of Everyday Things. Basic Books: New York, NY, 2013.

Platt, David. The Joy of UX: User Experience and Interaction Design for Developers. Pearson Education: San Francisco, CA, 2016.

LISA GAY, M.A. (lisagay@alumni.uchicago.edu), came to technical writing in 2006 from a background in the humanities. She has written in several fields, including smart grid technology, cloud delivery networks, and pharmaceuticals. Currently, she documents products at RightHand Robotics, Inc., a leading provider of piece-picking robots for supply-chain logistics. Known as a passionate advocate for users, she draws on user experience design principles to create effective, frictionless user assistance.

May/June 2022 Issue
