By Kumar Dhanagopal | STC Senior Member
As technical communicators, we appreciate how important it is to understand the users for whom we develop and publish information. Some of us even plan and undertake formal audience analysis and research before we start developing content for our projects. But access to reliable sources of information to help us understand our users is often limited. Even when we do have access to such sources, the exercise of engaging with our target users can be quite expensive.
In this article, I explore the sources of information that we can use to learn more about our users. I also present a comparative analysis of each source based on two parameters: availability and reliability. For the purposes of this discussion, availability is a measure of how easy and inexpensive it is to access a given source of information, and reliability is the degree to which information from that source can be used, in isolation, to make information planning decisions.
This evaluation of the availability and reliability of various sources of information about our users is based on my personal experience developing technical content for information technology products in a variety of domains over nearly two decades. However, every product and user is unique, so your experience might vary. At a minimum, you can use this analysis as a framework to shortlist the information sources that you can explore for your technical communication projects.
In Figure 1, I’ve plotted the relative position of each source of information in a four-quadrant matrix based on the availability (y-axis) and reliability (x-axis) of the information source.
The upper-right quadrant of the graph consists of the information sources that are relatively more available and more reliable. The sources in the other quadrants are relatively less accessible or less reliable.
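The quadrant placement described above can be sketched as a small classification. In the sketch below, the source names and their 0–10 scores are my own illustrative assumptions, not values taken from Figure 1:

```python
# A minimal sketch of the availability-reliability (A-R) matrix.
# The scores below are illustrative assumptions, not data from Figure 1:
# each source gets a subjective 0-10 rating for availability and reliability.

sources = {
    "Support knowledge base":   (9, 8),  # readily available, very reliable
    "Community forums":         (8, 7),
    "Content analytics":        (8, 3),  # easy to get, often vanity metrics
    "Market intelligence data": (4, 7),
    "Direct observation":       (2, 9),  # expensive, but highly reliable
    "Online surveys":           (3, 3),
}

def quadrant(availability: int, reliability: int, midpoint: int = 5) -> str:
    """Place a source in one of the four quadrants of the A-R matrix."""
    vertical = "more available" if availability >= midpoint else "less available"
    horizontal = "more reliable" if reliability >= midpoint else "less reliable"
    return f"{vertical}, {horizontal}"

for name, (a, r) in sources.items():
    print(f"{name}: {quadrant(a, r)}")
```

Sources that land in the upper-right quadrant ("more available, more reliable") are the ones to shortlist first.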
Most Accessible and Most Reliable Sources
Support Team Knowledge Base
In most organizations, the support team’s experience and knowledge base are a rich source of information about the issues that users run into. This information might reveal patterns that we can use to predict the issues that users are likely to face in the future. This source is both very reliable and readily available, so it takes the top spot on our availability-reliability (A-R) scale.
Community Forums
Most large organizations sponsor a community forum where users can post questions and seek solutions. Often, experts within the organization monitor such forums and post answers that users find very useful. In some forums, full-time or volunteer curators categorize the information exchanged there, using tags and keywords that others can use to search for it. Such forums offer us an easy avenue to feel the pulse of our user base: the discussions help us understand which topics are currently “hot,” and they give us an idea of the range of solutions that experts are providing. This source ranks quite high on the A-R scale; time spent monitoring such forums is time well invested.
Thinking Like Our Users
Most of us try to think like our users when feasible, but it’s not easy to set aside our information-developer hat entirely and think exactly as the real users of our products do. We are typically so familiar with the products that we document that we often miss questions and issues that are obvious to our users. In addition, mimicking the customer environment is not easy. One way to think like our users is to try to break the product; that is, to use its features in ways that are clearly not prescribed or correct. Doing so might reveal opportunities to improve both the product and the documentation. Test cases written by our QA colleagues are also useful proxies for thinking like our users.
Internal Experts
We rely quite heavily on input from our partners in the product management, sales, marketing, and technical support teams. It is reasonably easy to get information about our customers from such internal sources, and as long as we approach the right expert, we can be reasonably confident about the quality of the input we receive.
Most Accessible and Least Reliable Sources
Content Analytics
Content analytics are metrics that tell us how our information products—documentation, tutorials, training, videos, and so on—are used. Many organizations track both content and product analytics, particularly for products and content that are hosted on the Web. Because content analytics often consist of “vanity metrics,” such as hits or views, their reliability depends heavily on the type of data collected.
Research Reports
Research reports sit at a median level of availability. In some cases, they are sponsored by a company or a group of companies, so their conclusions tend to be “moderated” and thus less reliable.
Most Reliable and Least Accessible Sources
Market Intelligence Data
While roughly as accessible as research reports, market intelligence data is often acquired from third-party sources and thus tends to be more reliable than general research reports. This source includes information that is publicly available from competitors as well as information that trade analysts and research groups publish to private subscribers. The privately published information is obviously more difficult to get, and the publicly available information must be used with caution, because its insights may not apply directly to our own products and services.
Focus Groups
This method is quite popular with product designers, particularly UI designers. Essentially, we assemble a carefully selected group of individuals whom we believe represent our user base, and we get this group to use our product and content and share their experience. This method is relatively expensive, particularly when it is conducted in real time and face-to-face, but the data it yields can be more reliable than data collected from most other sources.
Product Analytics
Product analytics help us understand how our users actually use our products. This data is relatively less accessible to technical communicators than content analytics, but it offers more direct and reliable insights into user behavior, user experience, and user preferences.
Customer Interviews
After direct observation of users, interviews offer the most reliable insights into customer behavior and preferences, but they are also expensive. As with surveys, legal restrictions may dictate the questions that we can ask in such interviews.
Direct Observation of Users
Watching our users in action is one of the most rewarding ways to understand them. Though it can be quite expensive, this method is the closest we can get to experiencing our product and content as our users do. When we watch users work with our product, we gain valuable insights into how they navigate it, how they interact with input fields, and when they seek help. Users will always do things that we never expected!
Least Accessible and Least Reliable Sources
Online and Mail Surveys
Surveys help us reach large segments of our users, but the usefulness of survey data is a function of how well the survey is designed, how carefully the participants are selected, and how honestly the participants answer the survey questions. Also, there are often legal constraints around engaging directly with our users, particularly external users.
When we make day-to-day decisions about our audiences, we want to base those decisions on the most reliable information we have. However, we must also be realistic about how available that information is. As we plan our projects, we should strive to collect information from the most reliable sources, starting with those that are most readily available; this is the low-hanging fruit of audience intelligence. This framework is not exhaustive, but you can use it as a starting point for identifying and classifying your own sources of audience information, and then shortlist the sources—those in the upper-right quadrant, for example—that you can explore for your technical communication projects.
KUMAR DHANAGOPAL (firstname.lastname@example.org) works as a Consulting User Assistance Developer at Oracle America, designing and producing technical content, tutorials, and other user-assistance collateral for Oracle Cloud services. He holds a Master of Science degree from Utah State University with a specialization in technical communication. You can reach him on LinkedIn at https://www.linkedin.com/in/kumardhanagopal/.