By Stephanie Trunzo and Jozef de Vries
Like practitioners in many other roles in this difficult climate, information architects (IAs) are challenged to crystallize their value proposition in a way that resonates at a business level. One industry trend is to measure value through business analytics; however, even with a set of defined metrics, measurements in and of themselves are meaningless. According to a 2008 Towers Perrin Study, “90% of employees are looking to make improvements through increased metrics, but almost all of them lack clear metrics and become confused, frustrated, and emotionally disengaged.” If we think about applying measurement to the information architecture discipline in software or product development, for example, we could say that having a dedicated IA on a team resulted in 25% fewer lines of written documentation. Is that good or bad? What time period was this measured across? Is this an improvement over previous measurement periods, or attached to any stated goal for that product area?
The IBM Rational software development organization consists of several thousand analysts, architects, project managers, developers, and quality professionals distributed over six continents. We create and maintain 57 product families, with hundreds of software releases. For several years, our team of IAs has been thinking about how to drive maximum value to the business by leveraging IBM's notion of measured improvement: a set of theories and concepts that lets companies analyze whether they are really tackling the right set of metrics and attach those metrics to business and operational objectives.
In this article, we review how we are working to apply measured improvement to the IA discipline, and explore how that transformation could take shape across our IBM IA community. Leveraging the work that we did in IBM Rational as a case study, we have extrapolated best practices that are as useful at an enterprise scale as they are at the individual level, for a single IA working in a small firm. Key concepts include:
- What business objectives does information architecture, as a discipline, answer? How do those business objectives break down into operational objectives? For example, we could analyze cost reduction in terms of task analysis and the removal of unneeded content, or market share in terms of enhanced designs that differentiate us from the competition.
- What metrics support those objectives? For example, we could look at translation cost deltas, customer satisfaction ratings, writers' time saved (in dollars), or, better but harder to measure, improved time to task completion in key user scenarios (see the sketch after this list).
- What practices could be defined to enable teams to deliver against those objectives? For example, the IBM IA Council develops standards, templates, and best practices that could be mapped to the business objectives they help achieve. We also work toward building many of those standards into tooling; however, can we become more intentional about this?
- How do we capture improvements to strengthen the business case? We have a compliance process that we will analyze so that we can build in the measured improvement process and show delivered results.
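To make the first two concepts concrete, the following is a minimal sketch, in Python, of one way the chain from business objective to operational objective to supporting metrics could be recorded, so that later reporting can roll measurements up to the objective they serve. The class names and the specific metrics listed are illustrative assumptions for the example, not part of our actual planning tooling.

```python
from dataclasses import dataclass, field


@dataclass
class OperationalObjective:
    """An actionable objective plus the metrics that will evidence progress."""
    name: str
    metrics: list[str] = field(default_factory=list)


@dataclass
class BusinessObjective:
    """A business-level objective broken down into operational objectives."""
    name: str
    operational_objectives: list[OperationalObjective] = field(default_factory=list)


# Illustrative plan: the objective names come from the case study below;
# the metric names are examples only.
plan = [
    BusinessObjective(
        name="Improve time to value",
        operational_objectives=[
            OperationalObjective(
                name="Reduce time to product deployment",
                metrics=[
                    "Installation/configuration defect count",
                    "Time to task completion in key deployment scenarios",
                ],
            )
        ],
    )
]

# Roll the mapping up for reporting.
for business_objective in plan:
    for operational_objective in business_objective.operational_objectives:
        print(f"{business_objective.name} -> {operational_objective.name}: "
              f"{', '.join(operational_objective.metrics)}")
```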
Deriving IA measures from customer satisfaction
One of the clear and obvious objectives to pursue with a product release is customer satisfaction. However, that's a statement more easily included in a chart or emboldened in an email blast than put into practice. What constitutes customer satisfaction? How do we know whether the customer is satisfied if we do not hear from them? If we hear only criticisms from the customer, does that mean they are not satisfied? For each discipline in a development effort, the criteria for satisfaction are different, from identifying what satisfies the customer, to pursuing the goals that achieve that satisfaction, to evaluating whether or not the customer is satisfied in the end. For product documentation efforts, customer satisfaction takes on a particular nuance in that documentation is a means to an end; it is not the product, and it is not always even viewed as part of the product. However, when the “end”—that is, the product experience itself—is viewed as poor, there is a very good chance that documentation will be highlighted (rightly or wrongly) as a contributing cause. It is the responsibility of the information team, and ideally the IA, to head off those issues before they arise. The challenge is to identify the areas where the documentation struggles most, recast those areas as business objectives, and improve on them in a measurable way. Initially, the quantitative metrics involved in measuring improvement are also the basis for identifying the objectives. Ultimately, it is the IA who identifies these objectives, builds strategic solutions to meet them, and executes through to delivery.
In the Rational team at IBM, we set out to determine how we can prioritize areas of the product that need the most attention, justify our focus with quantitative methods, build skills and processes that enable us to execute on those priorities, and prove improvement through the measurements that drove our focus in the first place.
Benchmarking and iteration
Metrics are important at two points in the lifecycle: at the onset of planning a release, and later, once enough time has elapsed prior to the onset of the next release, to capture data that reflects the currently available release. These two lifecycle points allow us to identify focus items that can be measured over a series of product releases; specifically, they allow us to gauge customer satisfaction as it improves (or worsens) at the close of a release and to guide plans for the pending release. For us the effort became perpetual, letting us see trending over a series of releases—in other words, measured improvement. We gathered our metrics with two primary methods, defects and customer surveying; both were structured to yield statistical data.
Defects
Tracking defects or bugs is standard practice in the software development business, but we had been neglecting to gain maximum value from this process for our information architecture activities. Previously, defects were managed one-dimensionally: they'd be submitted, triaged to the appropriate owner, scoped to some stage in the release, maybe have their severity reemphasized, and essentially be tossed over the wall to that owner with the expectation that they'd do their job. That approach works, for what it's worth—defects are addressed, resolved, and onward into the release we go. But there was another dimension to defects that we were not appreciating: a more holistic view of the sum of the parts as input to our overall approach to product documentation. We wanted to identify defect trends that highlighted the parts of the products with the most issues—not necessarily issues with the functionality of the product, but with the overall product experience (as perceived through the experience with the documentation). This required that we identify a supplemental set of properties in our definition of a defect that we could use to make such evaluations. The list we came up with is as follows:
| Property | Values | Usage |
|---|---|---|
| User | Developer, Administrator, Development Lead, Project Manager, Tester, Business Analyst (tailored to the product selected) | Allows us to identify which users struggle most, and ultimately which sets of information they use |
| Task area | Getting started, Configuration, Installation, Task completion, Troubleshooting | Allows us to identify which stage during the product lifecycle they struggle with most |
| Doc type | Installation guide, Tutorial, General help, Reference, Videos | Allows us to identify which type of documentation they are having difficulty with |
We didn't expect that the defect submitter would always provide values for each of these properties. When they did not, we were usually able to extrapolate from the details the appropriate values. Importantly, the reason we wanted to capture this information via property values in our defect tool was so we could query against them.
Now that we were able to attribute values to our defects that held specific meaning for the work of information development, we were able to query the data to identify which combinations of values appeared most often in defects.
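To illustrate the idea, here is a minimal sketch of how defect records tagged with these properties could be tallied to surface the most frequent values and combinations. We ran such analyses as queries in our defect-tracking tool; the Python below, with its hypothetical field names and sample records, only approximates that querying.

```python
from collections import Counter
from itertools import combinations

# Hypothetical defect records exported from a defect-tracking tool,
# each carrying the supplemental properties described above.
defects = [
    {"user": "Administrator", "task_area": "Installation", "doc_type": "Installation guide"},
    {"user": "Administrator", "task_area": "Configuration", "doc_type": "General help"},
    {"user": "Developer", "task_area": "Getting started", "doc_type": "Tutorial"},
    {"user": "Administrator", "task_area": "Installation", "doc_type": "Installation guide"},
]

# Count how often each property value, and each pair of values,
# co-occurs across defects to highlight where users struggle most.
single_counts = Counter()
pair_counts = Counter()
for defect in defects:
    values = sorted(f"{field}={value}" for field, value in defect.items())
    single_counts.update(values)
    pair_counts.update(combinations(values, 2))

print("Most common property values:", single_counts.most_common(3))
print("Most common combinations:", pair_counts.most_common(3))
```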
Survey input
Trying to solicit customer input on documentation via surveys is a challenging task, not to mention the challenge (and potential bias) of asking a customer who has complained about your documentation to then spend more time filling out a survey explaining why they are dissatisfied. However, as challenging as it can be, we knew that whatever results we could obtain would be useful. We already had the material for the survey, based on the trends we were seeing in our defect analysis. Our focus was improving the return rate of the surveys and determining how to parse the feedback in order to produce statistically valid data.
We require the customers who participate in our client programs at IBM (such as beta programs and customer conferences) to complete surveys at various stages throughout the program or event. We wanted to leverage these existing opportunities to incorporate the type of information we were seeking. To be sure we could gather statistical data, each question required the user to select a number on a sliding scale representing their degree of satisfaction. In addition, we asked that they elaborate on the reasoning behind their responses. The number selection allowed us to rate degrees of satisfaction and the elaboration allowed us to derive themes.
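As an illustration of how such responses can be aggregated, here is a small sketch that averages the numeric ratings per question while setting the free-text elaborations aside for theme coding. The question names, the 1-to-5 scale, and the sample comments are assumptions for the example, not our actual survey instrument.

```python
from statistics import mean

# Hypothetical survey responses: a 1-5 satisfaction rating per question,
# plus a free-text elaboration later coded into themes by hand.
responses = [
    {"question": "Installation documentation", "rating": 2, "comment": "Steps out of order"},
    {"question": "Installation documentation", "rating": 3, "comment": "Hard to find prerequisites"},
    {"question": "Getting-started material", "rating": 4, "comment": "Tutorial was helpful"},
]

# Average rating per question gives the quantitative trend;
# the comments feed the qualitative theme analysis.
ratings_by_question = {}
for response in responses:
    ratings_by_question.setdefault(response["question"], []).append(response["rating"])

for question, ratings in ratings_by_question.items():
    print(f"{question}: mean satisfaction {mean(ratings):.1f} ({len(ratings)} responses)")
```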
Identified themes and objectives
From the defect and survey metrics, we were able to detect three major problem areas that we translated into our core business objectives and corresponding actionable operational objectives (at least until the improvement metrics indicate it is time to shift focus as part of the evolution and iteration of measured improvement). Our results are included in the table below.
Table 1. Identified Themes and Objectives
| Metric results | Business objective | Operational objective |
|---|---|---|
| Delays in installation and configuration phase | Improve time to value: Whether your architecture is for a suite of enterprise software or a desktop application with a narrow audience, ensuring that your users realize the value proposition of the content as quickly as possible is a broadly applicable business objective. | Reduce time to product deployment: A majority of our products are heavyweight enterprise products that can take upwards of six months or more to put into production. We cannot avoid the labor-intensive nature of these complicated deployments, but we can minimize the failure rate and maximize efficiency throughout with the appropriate documentation. This is a key measure of how quickly a customer begins to see value from our products, and it correlates directly with how well designed the information architecture is. |
| Lack of understanding and general frustration getting started | Lower ramp-up costs: The faster a customer understands the skills or information needed to leverage the product or service, the lower the cost of adoption. In the end, this could mean lower switching costs, shortened repeat business cycles, and higher customer satisfaction. | Improve user up-skilling to reduce time to normal production mode: Getting users to maximum productivity as soon as possible is very important to our customers, and delays at this stage can be costly. The documentation plays a vital role in making this transition smooth with tailored information, including intro tours, demos, and tutorials, among other architected content designed to achieve learning objectives. |
| Difficulty locating the just-in-time information needed to complete tasks | Optimize efficiency: Users have goals, and they want those goals to be achieved as quickly and effortlessly as possible. If they cannot complete their tasks because they are unable to locate the information they need, or are not guided through the experience, they will assuredly have a negative perception of the product or service quality. | Improve findability and retrievability: With such robust products, even a seasoned user will come across tasks they have not performed before and will need some degree of assistance. The longer it takes them to figure out the new task, the more dissatisfied they will be. Documentation can aid in these situations with effective retrievability and content organization, allowing users to find the information they need faster. |
Scenario-driven information architecture
Once we understood the areas that needed the most improvement, we had to identify the actions to execute against the operational objectives to ensure that we met our business objectives. This meant looking more closely at the information supporting these objectives and understanding why what was already being produced was unsatisfactory (according to our data). To understand where the disconnect was—that is, why the information was not resonating with the users—we looked at the direction that our technical writers were receiving when producing this information. What were the development teams telling our writers to write about, and was it being guided by the information architecture strategies? From what perspective was this information deemed necessary—a technical perspective or an IA perspective? Were the writers writing about something that our users told us they needed? Or were the technical writers taking the “document everything” approach, causing our users to experience information overload?
We held some information-gathering sessions with the technical writers and discovered that the problem was a little bit of a lot of things. In some cases, the development team dictated what the writers needed to cover, but this information was based only on the developers' own familiarity with (and preference for) the product. In other cases, the writers wrote only about what was tested, but that didn't always account for all the options that we support as a business. There were also cases where the writers leveraged the information they had to determine their content prioritization, but they didn't have the appropriate inputs they needed. In any case, whether or not the input from these various sources was accurate and sufficient, we had to standardize the process by which we define information requirements for a release, particularly in response to our newly defined business objectives. Specifically, we needed a process that ensures our users' needs are addressed.
The resulting process centered on scenario-driven information development. The common theme across all the data that led us to our business objectives was that the current content was not framed in a way that helped the user accomplish higher goals, nor did it explain the context and rationale behind those goals. In other words, we needed to present the information in an architecture that complemented their needs, laying out a pathway for them to follow. Once we had established the foundation of our process, we were able to identify actions against our operational objectives:
- Identify prioritized scenarios with key stakeholders. This involves working with development, product management, support, and test teams to reconcile the perspectives we all have on the product and release with the input we receive from customers. We do this to derive a list of prioritized, real-world scenarios that focus on deploying the product, user up-skilling, and task completion. Granted, this can cover a very broad spectrum, but by leveraging the existing data that originally highlighted problem areas, we are able to narrow the scope to focus on what customers are telling us they're trying to do.
- Maximize coverage of scenarios with other information-producing teams. At IBM, the information development team is not the only group producing publicly consumable information for our products. Contributions also come from advanced education, support, community engagement, and other subject matter experts who regularly publish blogs, articles, and other media. It was imperative to the overall success of our effort that the content being produced by these various teams and roles aligned with an accepted scope of scenarios. The goal was not to assign scenarios to specific groups, but rather to make sure that we accounted for the broader ecosystem of content producers to maximize our efficiency.
- Adhere to the process through culture change and governance. The documentation practices employed by our writing teams were deeply ingrained in their day-to-day activities, and it was parts of those activities that needed to change to make sure the scenario-based information was being developed according to the defined architecture. Changing how our writers worked, and ensuring they adhered to the new process, involved both educating the writing teams and introducing a governance mechanism to monitor that the process was being followed.
Results
What executive would not want to see that the quality of their product had improved or that customer satisfaction had increased? By using the measured improvement concepts, we were able to show the benefits of information architecture at a business level. We reduced the defect backlog related to documentation by 80% and increased our customer satisfaction score to 85%, up 30% from the previous release. While our team is working on goals related to other metrics and objectives, we highlight those two metrics in this case study to illustrate the power that measurement can have when approached rigorously.
Since 2006, Rational has been focusing on measuring improvement across a variety of metrics important to our software development business in order to gain development intelligence. This kind of improvement over time against identified metrics allows teams to highlight the value their work is driving, as shown in Table 2.
Table 2. A Sample of Rational Measured Improvement
| Metric | 2006 Measurement | 2008 Measurement | 2009 Measurement |
|---|---|---|---|
| On-Time Delivery | 47% | 82% | 100% |
| Defect Backlog | 9+ months | 4.5 months | 3.5 months |
| Beta Defects Fixed Before GA | 3% | 88% | 94% |
| Customer Calls | ~135,000 | –24% | –26% |
| Customer Defect Arrivals | ~5,900 | –22% | –20% |
| Beta Program Participation | 9 | 26 | 33 |
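The arithmetic behind trend figures like these is straightforward. As a minimal sketch, assuming a benchmark value and a follow-up value for the same metric, the release-over-release change can be computed as below; the backlog numbers are illustrative only, not additional measurements.

```python
def percent_change(previous: float, current: float) -> float:
    """Release-over-release change, expressed as a percentage of the previous value."""
    return (current - previous) / previous * 100

# Illustrative numbers: a documentation defect backlog measured at the
# benchmark point and again before the next release.
benchmark_backlog = 250
current_backlog = 50
print(f"Defect backlog change: {percent_change(benchmark_backlog, current_backlog):+.0f}%")
# -> Defect backlog change: -80%
```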
If you do not inherently understand the value of information architecture, as many managers and executives do not, it can be easy to dismiss the skills and the role as noncritical in a challenged economic climate. To prove value at a business level, you need to use the language of business. Analytics and metrics intelligence has been a growing focus area for companies forced to evaluate every available means of optimizing their business, and it is predicted to keep growing. The International Institute for Analytics predicts “a strong growth for analytics [overall], with a growing competitive edge for companies using analytics.” Given that analytics requires a logical appreciation of often large amounts of data, along with evaluation and assessment against defined criteria, what better match could there be for the work than an information architect, already armed with an analytical and organizational mind?
Leveraging the skills that come naturally to them, information architects should take on the challenge of defining their own value to their business. Analyze what is important to your company and what you can learn from your customers, and define a set of business and operational objectives you can execute against. And, critically, don't stop there! Iterate and evolve those measurements so that you are able to demonstrate positive trends over time through measured improvement. Keep a pulse on exactly which objectives are pivotal for IAs to achieve in order to deliver high-value assets to their businesses.
Stephanie Trunzo, the first information architecture manager at IBM, has used her 12 years in industry to apply information architecture skills to roles in business analytics and software development. Stephanie also continues to pursue her love of academics through her work with her alma maters, North Carolina State University, Chatham University, and Carnegie Mellon University.
Jozef de Vries, a founding member of the first information architecture team at IBM, has spent the past 7 years applying IA concepts and practices to user technology and software development efforts across the IBM Rational portfolio.
References
The Information Architecture Institute, http://iainstitute.org.
International Institute for Analytics, www.enterpriseappstoday.com/business-intelligence/nine-business-analytics-predictions-for-2011.html.
Rational Measured Improvement. IBM, www.ibm.com/software/rational/mcif/.
Trunzo, Stephanie, and Krista Meyer. “Transforming Software Delivery: An IBM Rational Case Study.” IBM White Papers, www-01.ibm.com/software/rational/leadership.