By Scott Abel | STC Senior Member
In the digital age, change happens quickly. This column features interviews with the movers and shakers—the folks behind new ideas, standards, methods, products, and amazing technologies that are changing the way we live and interact in our modern world. Got questions, suggestions, or feedback? Email them to scottabel@mac.com.
Scott Abel: Thanks for agreeing to chat with me today. For those readers who may not know who you are, tell us a little about yourself. Who are you and what do you do?
Rob Hanna: Thanks, Scott. I’ve been a professional technical communicator for almost 20 years. Before that, I didn’t know there was a name—let alone a profession—to describe the work I did. Now I own a company employing a team of technical writers, developers, and information architects serving businesses across North America. You can learn more about us at PrecisionContent.com.
My career in technical communication began in 1997 as a lone technical writer working with close to 100 engineers at a large aerospace company in Montreal, Canada. These folks had assembled an extensive system for topic-based authoring to document software requirements and design specifications using databases, a code management system, and Generalized Markup Language (GML)—a precursor to Standard Generalized Markup Language (SGML). The system was a thing of beauty—process, product, and people in harmony. Elegant as it was, it wasn’t ideal: engineers toiled over green-screen terminals, hand-coding their documentation in an archaic markup language, all to produce binders of printed documentation. I’ve dedicated my career to building the ideal system in a similar image—a balance of people, process, and technology.
I have served the Society for Technical Communication (STC) in several roles at the local and international level—from 2007 to 2009 as a director on the STC board, then several years on the STC Body of Knowledge and Certification Commission. Today I mainly serve the Toronto Chapter as a mentor to new and aspiring technical communicators.
SA: We’ve talked a lot about our shared goals. For the past decade or more, I have been working to convince technical communication professionals that their value isn’t writing, per se. I know you think similarly. As a change agent and thought leader, you’re hoping to effect change as well. What kind of change do you think is necessary to help elevate our profession?
RH: My time in the STC has helped me realize that our profession is destined to assume a much greater role within the enterprise. We have the necessary skills to be the guardians of business information. When the day comes that the real value of information takes its rightful place on corporate balance sheets, the expertise of technical communicators will be called upon to secure the growth of these highly valued assets.
It is our duty as leaders in our profession to advance the science and the state of the art in technical communication. We must strive to push technical communicators beyond our traditional roles within technical publication departments and into supporting roles across the enterprise.
As technical communicators, our greatest benefit to the organizations we serve doesn’t lie solely in serving our customers. It also lies in serving the needs of our colleagues, helping them build better products and provide superior service.
SA: You’re an evangelist for advanced information management solutions. The Darwin Information Typing Architecture (DITA), component content management, and dynamic content delivery are practical solutions to some technical documentation challenges. But, one size seldom fits all. That’s one reason you created a hybrid methodology called Precision Content. What is Precision Content, exactly? Why do we need it?
RH: I’ve been a follower of DITA since first reading Michael Priestley’s article in Technical Communication in 2001. I’ve since become a contributor to the DITA standard through my participation in the OASIS Darwin Information Typing Architecture (DITA) Technical Committee. I am a DITA evangelist not because it is the perfect embodiment of technical communication but because it is a standard, and one with considerable uptake in our profession. That fact alone makes it essential to advancing the science of technical communication.
Having a standard is a great start, but it isn’t enough on its own. To construct an optimal content ecosystem, we need a balance of:
- Utility—content must be in a form that is useful
- Maintainability—content must be managed
- Usability—content must be written to suit its function
With DITA we get exceptional utility for our content. The DITA framework allows us to leverage content in ways we never could before. Our community of tool vendors has made working with semantically rich markup easier than ever. This ease of use extends beyond authoring to every aspect of maintaining the investment in content management. We’ve come a long way from hand-coding markup on a terminal.
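To make “semantically rich markup” concrete, here is a minimal sketch of a DITA task topic. The topic ID and wording are hypothetical, but the elements are standard DITA: the markup says what each piece of content is, not how it should look.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE task PUBLIC "-//OASIS//DTD DITA Task//EN" "task.dtd">
<!-- A minimal DITA task topic. The id and wording are hypothetical;
     the elements (task, shortdesc, steps, cmd) are standard DITA. -->
<task id="replace_filter">
  <title>Replacing the air filter</title>
  <shortdesc>Replace the air filter every six months to maintain airflow.</shortdesc>
  <taskbody>
    <steps>
      <step><cmd>Turn off and unplug the unit.</cmd></step>
      <step><cmd>Slide the old filter out of its housing.</cmd></step>
      <step><cmd>Insert the new filter and restore power.</cmd></step>
    </steps>
  </taskbody>
</task>
```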
The frequently overlooked piece of the equation is usability. Companies are investing heavily in the design and implementation of technology without regard for the prerequisite skills needed to use these systems effectively, and closing that gap takes far more than training writers to use software. Technology only takes us so far. We must look at the essential skills and quality standards necessary to author for these systems.
Teaching writers how to author intelligent content in a topic-based system requires a robust, universal content standard. Content standards inform writers how to author content that aligns with the structures, semantics, and reuse mechanisms we have designed into our solutions.
Creating content to a standard like DITA requires both discipline in writing and rigor in content creation and management processes. Each is needed to ensure content quality.
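As a hedged illustration of one such reuse mechanism, consider DITA’s conref attribute (the file name and IDs below are hypothetical; conref itself is core DITA). A warning is written once in a shared topic:

```xml
<!-- warnings.dita: a library topic holding reusable admonitions -->
<topic id="warnings">
  <title>Shared warnings</title>
  <body>
    <note id="power_warning" type="warning">Disconnect power before servicing the unit.</note>
  </body>
</topic>
```

Any other topic can then pull that note in by reference instead of copying it, so a correction made in one place propagates everywhere:

```xml
<note conref="warnings.dita#warnings/power_warning"/>
```

A content standard tells writers when and how to use mechanisms like this consistently.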
SA: As humans, we can’t help but build new ideas upon previous knowledge and experiences. Everything is a remix. From where do the ideas, concepts, and standards for Precision Content come?
RH: Eight years ago I was working on the Enterprise Business Document DITA Subcommittee led by Ann Rockley and Michael Boses. The Subcommittee was attempting to define a series of standards that would extend DITA beyond technical publications. My role was to gather information about existing content models with which we could work.
I approached Information Mapping Canada to invite them to join the conversation on the subcommittee. Information Mapping had developed, and was teaching, a structured writing methodology, and its sister organizations around the globe had been teaching the standards to businesses for 20 years. Interestingly, I found that the Information Mapping methodology aligned closely with DITA.
One year and several white papers later, the subcommittee went on hiatus without advancing a formal proposal for a set of new standards. I continued the work on my own, blending DITA with Information Mapping content standards. I studied the rules and eventually became a certified expert in Information Mapping. From there, I was able to adapt and modernize the methods with new research and develop new practices for authoring intelligent content. These new standards eventually became what we now call Precision Content.
SA: Why is Information Mapping—or DITA—alone insufficient to tackle today’s content challenges?
RH: I’ve worked with many Fortune 100 companies over the past ten years, helping them transition to structured authoring. I’ve collaborated with leading-edge technology companies—and some very smart people—to design and develop best-of-breed DITA solutions. Invariably, the one critical flaw in these solutions has been a lack of attention to retraining, skills development, and quality assurance processes. DITA is complicated, and authors need training to use the standard to create consistent, high-quality output.
Also, DITA was initially designed for developing support documentation for high-tech products. It does an excellent job of this, providing authors with 80 to 90 percent of the semantic structures they will ever need for that kind of content. But once we move outside of technical publications, we see a significant increase in the types of semantic elements required to support enterprise content, and DITA does not currently provide that support out of the box. DITA is an ideal technology framework, but it is not a content standard.
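DITA’s answer to that gap is specialization, which lets an organization derive new element types from existing ones. The policy elements below are hypothetical, but the class-attribute ancestry is how specialization actually works, so generic DITA tools can still process the new markup:

```xml
<!-- Hypothetical enterprise specialization: a policy topic derived from
     the generic topic type. In practice the class attributes are
     defaulted by the DTD or schema, not typed by authors. -->
<policy id="expense_approval" class="- topic/topic policy/policy ">
  <title class="- topic/title policy/title ">Expense approval policy</title>
  <body class="- topic/body policy/body ">
    <policy-statement class="- topic/section policy/policy-statement ">
      <p>Expenses over $5,000 require director approval.</p>
    </policy-statement>
  </body>
</policy>
```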
Information Mapping, on the other hand, is a trainable content standard that has stood the test of time with organizations around the globe. It is backed by decades of practice and published research. And still, it has failed to attract the attention it deserves from technical publication groups, mainly due to costs and licensing restrictions. It has also failed to evolve, incorporating little recent research or new technology.
We’ve also found that Information Mapping can be difficult to institutionalize within an organization, in part due to technology limitations. Information Mapping is still taught using a Microsoft Word-based application. This approach is suitable for small documents, but it is tough to use when creating larger ones, and it is challenging on projects that require collaboration among content creators.
By streamlining parts of the Information Mapping methodology and adapting the DITA framework, we’ve been able to create an approach that better suits modern content development lifecycles and practices.
Half of the writers on my team are experts in Information Mapping; the other half are DITA pros. When both groups began using our new approach, an interesting thing happened. At first, both teams found that the combination of DITA XML’s constraints and the writing standard’s limitations made their work more onerous. But they soon discovered they were writing fewer words in a more focused manner. Writers from both groups described the new style as more precise. Thus the term Precision Content was coined.
SA: Why is Precision Content valuable outside of pure-play technical communication projects?
RH: Organizations today have complex requirements for their high-value content. Demands for speed, efficiency, collaboration, globalization, personalization, and multichannel dynamic content delivery are placing significant pressure on departments outside of technical publications. But, there is no magic, one-size-fits-all solution. Content creators need simple yet effective tools, new processes and standards, and practical training. Many will also need support and guidance as they navigate the many changes that accompany content transformation projects. Our writers and trainers work alongside the client’s in-house writers to create new quality benchmarks for their content.
New approaches like Precision Content will help us overcome many of these challenges. And, they provide us with greater opportunities to automate processes and improve the utility of content.
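As one hedged sketch of what that automation can look like: because DITA elements carry their type in a class attribute, a short XSLT stylesheet can, for example, harvest every topic’s short description into an HTML summary list for dynamic delivery. The stylesheet below is illustrative, not a production transform:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative sketch: collect every DITA shortdesc into an HTML list.
     Matching on the class attribute is how DITA stylesheets identify
     elements, including specialized ones. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>
  <xsl:template match="/">
    <ul>
      <xsl:for-each select="//*[contains(@class, ' topic/shortdesc ')]">
        <li><xsl:value-of select="normalize-space(.)"/></li>
      </xsl:for-each>
    </ul>
  </xsl:template>
</xsl:stylesheet>
```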
But more than anything else, content creators must buy into the real—and visible—usability improvements these changes can help us make to our content. Improved usability depends on the introduction—and enforcement—of a robust content standard.
SA: Neuroscience and content go hand in hand. Experts like Dr. Carmen Simon of Rexi Media have been making the case for producing useful, memorable content and mapping our ideas to the way the human brain works. What lessons from neuroscience have you adopted for Precision Content?
RH: That’s a big subject area, Scott. Fundamentally, I believe that a universal content standard must be rooted in neuroscience and cognitive research.
There is a broad body of research about how our brains analyze, organize, and retain new ideas. Unfortunately, there is little direct evidence to help us better understand how our brains interact with content. Our Precision Content methodology is based on existing science from Information Mapping as well as new evidence from experts such as Dr. Carmen Simon and Dr. George Gopen. We continue to invest in research and development in an attempt to build better tools and guide the evolution of our content standards.
We recently presented a two-part series on the cognitive science behind intelligent content; it is available on The Content Wrangler channel on BrightTalk.com. I recommend readers check out the free recorded presentations if they want to learn more about this topic.
SA: Unfortunately, all good things come to an end, and we’re out of time for today. Thanks for sharing your knowledge and experience with us, Rob. I appreciate it—and I know our readers do as well.
RH: Thanks for the opportunity, Scott. It’s been a pleasure.