Talking Usability: A Cost-Saving Tool for Software Development

I met Nicki Davis at the 2013 Summit. She talked to me about the ethnographic research she was doing, and I thought it might be of interest to you. It took a lot of coaxing to persuade her to write about her research, but she finally agreed. I am proud to share Nicki’s research with you.

Field research (also called ethnographic research, contextual inquiry, or job shadowing) is a usability technique in which you observe people using a product at their work, record their actions, and analyze what you’ve observed. Some people call it “Users in the Mist” because it’s the same principle that Dian Fossey used when she studied gorilla behavior: observe them in their natural environment.

It can be hard to justify taking time away from other project work to visit customer sites, not to mention the travel expenses. But what if you could tell management that field research cut your project scope by 50%? And what if you discovered that every day spent on field research saved you three days of work documenting features that weren’t needed? If you’re interested in hearing more, read on.

Case study

A vendor of chemical drawing software needed to replace a legacy product from 1996 with a second-generation product built on more advanced technology. One feature of the legacy product was a palette of two specialized drawing tools, the Sequence Tool and the Shape Tool, that were used by only a small minority of chemists.

Because the market was so small, the company couldn’t justify a large investment in implementing the tools in the new product. To reduce the scope of the release, we needed to find out which features were most important to the chemists so that we could be sure to implement those in the new product. Field studies would provide the answer.

Methodology

Four of us spent a week at the customer site, observing how the scientists used the specialized tools in the legacy product and documenting our findings. As the technical communicator on the team, I not only interviewed users at the customer site but also transcribed our notes from the study into a detailed 20-page report. In fact, for each day I spent at the customer site, I spent one day writing the report. Why? Because most of the developers working on the project had not been able to visit the customer site. They needed detailed information on exactly what customers were trying to do and why they were doing it. The report gave them that level of detail.

Results

Our observations showed that the legacy product was cluttered with features that the chemists never used. These features included the Shape Tool, the second of the two tools in the palette, and 12 of the 13 controls on one of its dialog boxes.

At the same time, we saw that the legacy product lacked direct support for fully one-third of the most common tasks, and three of those missing tasks had been completely unknown before the study. Users had developed workarounds to perform them.

Was it worth it?

Now let’s return to the two metrics I mentioned earlier: the 50% reduction in scope and the 300% return on time invested.

Scope Reduction: Before the study, the marketing department had identified 61 user tasks that they thought the product should support. The study revealed three tasks we hadn’t known about, raising the total to 64. Of those 64 tasks, however, only 27 were performed every day. Limiting the release to those 27 tasks was a scope reduction of nearly 60%.
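
For readers who want to check the arithmetic, here it is as a minimal Python sketch. The task counts come from the study above; the variable names are just illustrative:

```python
# Scope-reduction arithmetic from the field study.
marketing_tasks = 61   # tasks marketing identified before the study
discovered_tasks = 3   # tasks the field study revealed
daily_tasks = 27       # tasks the chemists actually performed every day

total_tasks = marketing_tasks + discovered_tasks   # 64
eliminated_tasks = total_tasks - daily_tasks       # 37
reduction = eliminated_tasks / total_tasks         # 0.578...

print(f"Scope reduction: {reduction:.0%}")  # Scope reduction: 58%
```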

Return on Investment: My records showed that I had spent a total of four weeks on user research and 13 weeks documenting the features. If we hadn’t done the study, I estimated that it would have taken 30 weeks to document all 61 of the original features. The field study therefore saved me 17 weeks of documentation work; after subtracting the four weeks I invested in research, the net savings was 13 weeks, so each week I spent on research saved me roughly three weeks of work. I regard this estimate as conservative, because it doesn’t count the time saved by developers who didn’t need to program, test, or fix bugs in features that weren’t needed.
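
The same back-of-the-envelope check works for the return on investment; again a minimal Python sketch, with the week counts taken from my records and illustrative variable names:

```python
# Return-on-investment arithmetic from the field study.
research_weeks = 4        # weeks spent on user research
doc_weeks_actual = 13     # weeks spent documenting the reduced feature set
doc_weeks_estimated = 30  # estimated weeks to document all 61 original features

doc_weeks_saved = doc_weeks_estimated - doc_weeks_actual  # 17
net_weeks_saved = doc_weeks_saved - research_weeks        # 13
payoff = net_weeks_saved / research_weeks                 # 3.25

print(f"Each week of research saved about {payoff:.2f} weeks of work")
```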

In my career as a technical communicator, carefully crafting explanations of features, I’ve often wondered, “Does anyone really need this?” On this project at least, I knew the answer to that question.

If you have a story about your research or lessons learned in the field, please let me know and I’ll let you be the next guest editor.

I’m David Dick and I’m Talking Usability.
