By Jenn Carlos, Travis Kelly, and Vivianne Costa
Most marketing writers know a lot more about their content’s performance than technical writers do. They know how many times an article has been read. They know what type of content is more important to one prospect versus another. Marketing writers know what articles convert and they can quickly gauge the impact of a single edit.
How do they know all of this? Web analytics.
And what do marketing writers do with all this knowledge they’ve gleaned through analytics? They prioritize their time and content with precision. They don’t have to say yes when they are asked to work on a high-effort, low-value piece. They push back with cold hard facts. And they can prove—P-R-O-V-E—that their content is impacting business goals.
You need to be able to do all of this, too. After all, technical content is more important than marketing content; theirs just looks better.
How Our Team Did It
Our team at PROS owns documentation and the help portal it lives in, called Connect. Eight strong, we support the technical content needs of 15 enterprise software products, and we use analytics on a daily basis, just like marketers.
In the beginning, the team published to a cloud-based documentation portal while Google Analytics (GA) ran passively in the background. When we decided to build our own portal on Drupal, we got dead serious about user experience and content optimization.
Before we provide a walkthrough of our subsequent process, we want to emphasize that the optimal mix of insight is one that includes Web metrics and qualitative data. We conduct regular user studies and usability testing sessions with our in-house UX researchers. We also have “was this helpful” on every page of content with the option to add comments, and a persistent feedback button on our documentation site. This helps us paint the whole picture and provides much needed context to the raw data.
We started with Google Analytics’ default reports and segmented traffic into two groups: customers and employees. It was easy to do this because, like many companies, most of our employees access the portal through the same network. Network domain proved to be a very important dimension for us because we could also see our customers’ individual network domains, like acmeinc.host.com, and further segment down to their level.
Separating Employee from Customer Traffic
Your IT department can tell you what your company’s network domain is called (it’ll look something like companyname.net or internetprovider.net), and there may be a few of them. Your marketing department might already have this on hand as well, as they are likely filtering customer traffic on your public website.
Once we had the network domain, we clicked Add Segment and then New Segment, named it Employees, and configured the following under Advanced Conditions:
- Filter: Sessions, Include
- Dimension: Network Domain
- Condition: exactly matches
- Value: companyname.net (if you have multiple network domains, add them using the OR condition button)

Customer Segment
We filter for customers by excluding our company’s network domains. We followed the same steps above to create a New Segment called Customers, keeping every setting the same except the condition, which we changed to does not exactly match.
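If you ever pull session data out of GA (via an export or the Reporting API), the same employee/customer split can be reproduced in a few lines. This is only a sketch, not our actual pipeline; the field names and domains below are illustrative.

```python
# Split exported session records into employee vs. customer traffic by
# network domain. EMPLOYEE_DOMAINS and the record fields are hypothetical.
EMPLOYEE_DOMAINS = {"companyname.net", "internetprovider.net"}

def segment_sessions(sessions):
    """Return (employee, customer) session lists based on network domain."""
    employees = [s for s in sessions if s["network_domain"] in EMPLOYEE_DOMAINS]
    customers = [s for s in sessions if s["network_domain"] not in EMPLOYEE_DOMAINS]
    return employees, customers

sessions = [
    {"network_domain": "companyname.net", "pageviews": 4},
    {"network_domain": "acmeinc.host.com", "pageviews": 7},
    {"network_domain": "acmeinc.host.com", "pageviews": 2},
]
employees, customers = segment_sessions(sessions)
```

Because the customer segment is just everything that is not an employee domain, the two lists always add up to the total, mirroring the include/exclude pair of segments in GA.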
Once we knew what customers were doing, we started sending periodic emails to the product leaders and executives in our organization—just the month-over-month basics and a bulleted summary. The summary contained data on unique visitors, total visits, pages per visit, visit duration, most active customers, and most popular content. These emails generated interest in what we were doing and made us look like we were on top of things, but they reported vanity metrics: the numbers weren’t tied to any goal, and we were just hoping for uptrends.
If you want to know what your most important key performance indicators (KPIs) are, then start with your business goals. We asked ourselves and our stakeholders, “What is the ultimate goal of our content?” Enable self service. Luckily, there’s a lot of existing research on measuring self service, and we settled on a simple formula: the number of visits that didn’t result in a support ticket. This is our self-service rate and most important metric. After monitoring it for six months, we knew our average was 91%. If this number fluctuates, we drop everything and look into why.
Measuring Self Service
At PROS, Connect is the entry point for submitting support tickets, so it’s easy for us to grab this data ourselves. We installed a tracker on the “Submit a Ticket” button. If you don’t have access to your support system, then just ask your customer support team to send you a monthly report on how many tickets have been opened. If they can strip bugs and feature requests out of it, even better. Your results may not be as precise, but this is about establishing a baseline and then monitoring wild swings. We use the following formula.
Self-Service Rate = 100% of Total Sessions – % of Sessions that resulted in tickets
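The arithmetic is simple enough to script if you want to track it automatically. A minimal sketch in Python (the session and ticket counts are made up for illustration):

```python
# Self-service rate: the share of sessions that did NOT result in a ticket.
def self_service_rate(total_sessions, ticket_sessions):
    """Return the self-service rate as a percentage."""
    if total_sessions <= 0:
        raise ValueError("total_sessions must be positive")
    return 100.0 * (1 - ticket_sessions / total_sessions)

# e.g., 10,000 sessions in a month with 900 of them ending in a ticket
rate = self_service_rate(10_000, 900)  # 91.0, matching our typical average
```

Feeding it monthly totals from GA and your support system gives you the baseline trend line to watch for wild swings.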
We also identified other important goals for our content and portal, and then started working backward to develop our ideal reporting strategy. What did we want to see for ourselves? What did we want to show others? Once our goals were clear, we could dig into Google Analytics data and define the KPIs for measuring them, which can be tricky for a documentation site.
Goal: More customers using our content and site
KPI: Increase in total sessions
KPI: Increase in new user registrations
KPI: Increase in new visitors
KPI: Increase in returning visitors
Goal: Help users find what they need quickly
This is where things diverge from marketing content-strategy KPIs. From research, we know most of our traffic uses the site to solve a problem. That aligns nicely with our goal of self service. So what we want to see is our users getting information quickly, leaving, and coming back when they need something else.
KPI: Decrease in pages/sessions
KPI: Decrease in average session duration
KPI: Increase in returning users
Goal: Increase participation in our community and forums
One less support ticket, right? We want our customers asking questions in the forum, so we have a searchable record of solutions for other users to reference. We built a custom report in Drupal to help us with two of these KPIs.
KPI: Increase in forum posts + comments (Drupal)
KPI: Decrease in unanswered forum posts (Drupal)
KPI: Increase in visits to community pages (Google Analytics)
We also measure activity at a user level in Drupal to see who posts the most often. We call out the top three employee participants in our monthly stakeholder report and treat them like gods.
Beyond those site-wide metrics, we wanted to see our content’s performance at a detailed level so that we could home in on what was truly useful to our audience and gauge our optimization efforts. To do that, we dug deep into Google Analytics’ Content Drilldown.
Our content URLs are structured by site section, product, type, and topic as follows. This structure makes them very easy to segment and analyze.
portal.com/documentation/microwave/user-guide/button-panel
portal.com/knowledgebase/refrigerator/maintenance/coolant
Most Popular Content
We make the assumption that more page views equals more user value, and we sort the path page levels in Google Analytics by total page views. When tasked with two projects, both of equal size and stakeholder priority, we focus on the one that statistically results in more activity.
- Path Page Level 1 tells us what sections of the site are most frequently visited.
- Path Page Level 2 tells us what products are most frequently visited.
- Path Page Level 3 tells us what content type is the most frequently visited. Our content types include guides, help topics, a knowledge base, forums, and release notes.
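The same drilldown can be reproduced on exported page-view data by splitting each URL path into its levels and tallying views at each one. This is an illustrative sketch using the example URLs above, not GA’s own implementation.

```python
from collections import Counter
from urllib.parse import urlparse

# Tally page views at each path level, mirroring GA's Content Drilldown.
# URLs follow the /section/product/type/topic structure described above.
def drilldown(pageviews):
    """pageviews: list of (url, views) pairs. Returns Counters for levels 1-3."""
    levels = [Counter(), Counter(), Counter()]
    for url, views in pageviews:
        parts = urlparse(url).path.strip("/").split("/")
        for i in range(min(3, len(parts))):
            levels[i][parts[i]] += views
    return levels

data = [
    ("https://portal.com/documentation/microwave/user-guide/button-panel", 120),
    ("https://portal.com/documentation/microwave/user-guide/timer", 80),
    ("https://portal.com/knowledgebase/refrigerator/maintenance/coolant", 50),
]
sections, products, types = drilldown(data)
# sections ranks site sections, products ranks products, types ranks content types
```

Sorting each Counter with `most_common()` gives you the same "most frequently visited" ranking at every level.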
We break it down even further at the product level. This is super important data that you’ll most definitely want to share with respective product teams. It’s post-purchase insight they are otherwise blind to. Remember how we are making the assumption that more activity is a good thing? Your product teams may have another explanation. Sometimes we discover that the frequency means there is a bug in that area of the product, or something that needs UX’s attention. Sometimes a customer is struggling to onboard into the new system or multiple customers are searching for a feature that doesn’t exist. Your qualitative data will help you tell the full story as well. We have GA dashboards for each product and monthly reports that include:
- # of Customer Sessions
- # of Employee Sessions
- Most Active Customers (network/hostname, sorted by page views)
- Most Popular Content (what specific content pages are getting the most page views)
Search Activity
Search data is fascinating, as it gives us insight into how our users want to interact with our content. We look at the most popular search terms and compare the search results. Are the results accurate? How many search results pages do they have to click through to get what they want? Are the top results well written? Are users searching for content that doesn’t exist (and that we need to create)? Search terms also tell us how users search so that we can optimize metadata and titles for maximum findability. We monitor search using GA’s preconfigured dashboard widgets and standard report:
- Top Search Terms Widget: For a given segment/date range
- Search Usage Widget: Showing % of search usage and search exit rate
- Search Terms Standard Report: Detailed search activity in a variety of contexts
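If your portal logs raw search queries, a few lines of normalization get you the same top-terms view outside of GA. The log format below (one term per search) is an assumption for illustration.

```python
from collections import Counter

# Rank the most frequent on-site search terms from a raw query log.
def top_search_terms(search_log, n=3):
    """Normalize terms (lowercase, trimmed) and return the n most common."""
    counts = Counter(term.strip().lower() for term in search_log if term.strip())
    return counts.most_common(n)

log = ["Coolant", "coolant ", "button panel", "timer", "coolant"]
top = top_search_terms(log, 2)  # "coolant" ranks first with 3 searches
```

Normalizing case and whitespace before counting matters: "Coolant" and "coolant " are the same user intent, and collapsing them is what makes the ranking useful for optimizing titles and metadata.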
We also passively monitor (meaning, we do not report on) other types of GA data, such as:
- Browser & OS: What are our users running? This data was used to justify end of IE9 support for our company’s products.
- Language & Location: This helps us scale/justify localization and optimal times for system outages.
- Bounce Rate & Exit Pages: This shows us how our users are leaving the site.
Putting It All to Use
The analytics we collect serve two purposes: 1) they inform our content strategy, and 2) they promote the importance of our work.
Informing the Content Strategy
Monitoring content performance and user behavior as a team makes prioritization and brainstorming so much easier. Rather than relying on intuition or engineering mandates, our data shows us what works and what needs improvement. Our strategies typically revolve around increasing the breadth and findability of high-performing content and optimizing low-performing content until we can prove that it needs to be eliminated.
Analytics also takes a lot of the guesswork out of where to start. We were thinking of removing system interface diagrams from Connect; maintaining changes with every release was tiresome and we never got any feedback on them. Our first step was to refer to GA and, lo and behold, we found lots of returning traffic. So we kept the diagrams, and then we made them easier to find for a better user experience.
Conversely, when we’re asked to start work on something that doesn’t statistically perform well—like news articles—we come to the table with data to prove that this effort isn’t worth the time and a few alternative solutions that consistently yield views. Analytics can take you from supporter to knowledgeable collaborator very quickly.
Promoting Our Work
Our Web analytics report is like a monthly newsletter to stakeholders and the people who sign our paychecks, many of whom know very little about metrics (or technical content), so a data dump of activity is wasted effort. We used Piktochart to create an infographic for our monthly report and embed it in an email. Making the report attractive is mandatory; sexy it up or it’ll go unread. It’s critical to show that you are working to meet business goals, so we took our goal of enabling self service and made the self-service rate our focal point. We also distilled the report down to the most essential metrics so as not to dilute our message, included subheadings to explain the data in layman’s terms, and added a footer noting that the reports are available at a product level. Finally, it would be an injustice if we didn’t remind you to blind copy (BCC) your recipient group. After the report is sent, a lot of questions may fly around, some people may get defensive because the data highlights product inefficiencies, and it’s no fun justifying an anomaly on blast to your CTO. Do it or else.
We hope this insight into how we’re doing it at PROS empowers you to explore your content and fine-tune its impact through analytics. Long live the docs, and the writers who create them. We’d love the opportunity to talk more about analytics in documentation and how we are beefing up reporting this year with new data. Reach out at any point during your analytics journey.
Here’s a list of some resources that helped us along the way:
- Digital Marketing and Measurement Model by Avinash Kaushik, www.kaushik.net/avinash/digital-marketing-and-measurement-model/
- Everything a Product Manager Needs to Know About Analytics by Simon Cast, www.mindtheproduct.com/2013/02/everything-a-product-manager-needs-to-know-about-analytics/
- The #1 Mistake Everyone Makes when Creating Infographics by Maria Parra, https://venngage.com/blog/the-1-mistake-everyone-makes-when-creating-infographics/
- Google Analytics Platform Principles, Google Analytics Academy, https://analyticsacademy.withgoogle.com/course/2
JENN CARLOS (jennymcarlos@gmail.com) manages the UX content strategy team at PROS. She brings a progressive user-centered approach to documentation and knowledge management and has worked across content strategy, user experience, and product management for 15 years.
TRAVIS KELLY (tkelly@pros.com) is a content strategist and GAIQ certified resident analytics expert for the PROS UX content strategy team.
VIVIANNE COSTA (vcosta@pros.com) is the Connect product manager; she is a Certified Scrum Product Owner and also holds a GAIQ certification.