62.2, May 2015

Special Issue Introduction

Technical Communication: How a Few Great Companies Get It Done

Miles A. Kimball

Abstract

Purpose: This article introduces a special issue assessing the attitudes, ideas, and practices of technical communication managers representing several prominent companies on the Society for Technical Communication’s Advisory Council in 2013–2014.

Method: The research team used a modified Delphi method to assess the opinions of this group of experts over several rounds. This article describes and justifies the methodological approach of the entire study.

Results: The results are described in three individual articles following this introduction in the special issue, taking up the topics respectively of Identities and Relationships, Products and Processes, and Training and Education.

Conclusion: Technical communication should pay more attention to the perspective of publications managers.

Keywords: Delphi method, technical communication, managers

Practitioner’s Takeaway

  • Technical communication professionals should pay more attention to the insights of publications managers.
  • The Delphi method can be a helpful way to assess consensus among groups in product development and assessment, as well as in academic research.
  • The method can be modified to value disagreement as well as consensus.

Consulting the Oracles

Most technical communicators would likely agree that although the field constantly changes, it is today in a particularly dramatic state of flux. We still do our work “accommodat[ing] technology to the user,” as David Dobrin (1983) put it, but the techniques we use and the contexts in which we do so have changed significantly. We have gone through a variety of progressive revolutions in thinking and practice in the past three decades. User-centered design, once a novelty, is now an accepted value in the field. The visual and physical design of technical information is now something we all recognize as important. We have drawn ethical awareness to the forefront of professional identity. We have made considerable progress in solidifying our professional status (we are now recognized by our own Standard Occupational Code) and growing our collective professional body of knowledge (STC-BOK) (Hart & Baehr, 2013). And we have enthusiastically explored and grappled with new communication technologies, from desktop publishing to distributed content and mobile delivery. Now a technical communicator’s work is not commonly centered on composing and managing the thick tomes we called “manuals” (though they were anything but handy). Even the electronic version of those tomes – online help files – merely formed a transitional state on the way to what technical communicators increasingly work on today: fully networked systems of live information, pulled from databases on the fly to present users with the information and guidance they need at the moment they need it. Working with distributed technical information in this networked environment differs significantly from working alone to develop a discrete, coherent technical document.

The field has changed so much that even the broader term “technical communicator” has had to stretch to fit the growing number of specialties that have developed in the past decade, including user experience designer, interface designer, usability expert, content developer, information designer, communication engineer, and so forth. What we do, what we call ourselves, how we form and encourage the development of new members of our profession – all have changed so quickly that we must constantly reassess where we stand, what we do, and who we are as technical communicators.

This special issue engages in this kind of reassessment. In this regard, it extends the excellent work done by many researchers, practitioners, and commentators examining the direction of the profession, including Johnson-Eilola (2005), whose Datacloud opened our awareness to the redefinition of technical communication as “symbolic-analytic” work; Kynell-Hunt and Savage (2003, 2004), whose two-volume edited collection, Power and Legitimacy in Technical Communication, gathered many perspectives on these issues; and more recently, Coppola’s 2012 and 2013 special issues in this journal on the Professionalization of Technical Communication.

Technical Communication Managers

This special issue adds to these and similar academic studies by examining what technical communication is today, how it works, and who does it, from the perspective of the people who manage technical communication practitioners in successful, highly prominent companies. Understanding the perceptions and attitudes of these managers is particularly useful for professional technical communication practitioners and researchers because these managers serve as the connecting point between the profession and the companies that professionals work for. In this liminal position, these managers must direct the work of technical communicators while implementing the directions of their own managers in moving toward corporate goals. Accordingly, they have a unique perspective on the professional identity, practices, and values of technical communication as it relates to business purposes.

As Carliner (2004, p. 45) has commented, “systematic studies of management practice in technical communication are scarce.” Similarly, Amidon and Blythe (2008, p. 6) have observed that “the management of communication groups is one of the least addressed research topics in professional and technical communication.” Of course, as Amidon and Blythe point out (2008, pp. 6–7), there is plenty of practical advice available about the management of technical communication (see for example among many others Allen & Deming, 1994; Dicks, 2003; Hackos, 1994; Lips, 2007). This advice is invaluable (and often based on research), but it does not aim to provide research about technical communication management.

Carliner’s (2004) survey-driven study of the quality metrics used by technical communication managers (followed up in Carliner, Qayyum, & Sanchez-Lozano, 2014) and Amidon and Blythe’s (2008) thoughtful analysis of interviews of 11 technical publications managers have helped close this gap. However, even this small body of research has focused primarily on the practices of technical communication managers. We also wanted to find out what these people think about technical communicators and technical communication today. Our focus was less on the experiences and challenges of publications managers themselves than on their perspectives about technical communication and the practitioners they manage. In this regard our study is somewhat more like Whiteside’s (2004) study of the attitudes of publications managers toward the training and education of technical communicators. But we wanted to explore broad questions not only about education, but about identities, relationships, products, and processes in technical communication today.

Despite the good work done thus far, there is much yet to be gained from looking at technical communication from the perspective of publications managers. We have hardly begun to plumb the depths of the understanding that could result from working with these people and examining their perspective on the relationship between technical communicators and the companies they work for. But we hope this study makes a contribution in that direction.

A Unique Special Issue

As special issues go, this one is perhaps more special than most – or at least it’s a bit different. Typically special issue editors send out a call for papers on a particular theme, receive contributions, see them through peer review, and publish them as a collection of articles on that theme.

This special issue, however, presents the results of a single large study. As such, it might provide a model for journals to support research projects that are too large for a single article, yet too focused or too topical to make a book.

This shift from the norm is justified by a unique opportunity: the founding of the Society for Technical Communication’s Advisory Council in 2013. The STC Advisory Council was formed of technical communication managers from industry, STC personnel and officers, and academics (including me and fellow researcher Jim Dubinsky) to function as a sounding board for STC’s leadership. The inaugural Advisory Council included representatives from an impressive list of companies, including Adobe, Boston Scientific, CA Technologies, Google, IBM, Madcap, and Oracle. In other words, in one room, STC had gathered some of the most powerful voices in determining the shape of technical communication as it’s done at some of the most successful, forward-thinking companies in the tech sector.

It seemed natural – indeed, imperative – that we should ask these people some questions about how their companies do and think about technical communication, so as to understand some of the current practices and attitudes and spread that understanding to others in the field. With the aid of STC Executive Director Chris Lyons and Communications Director Liz Pohland, we assembled a team of experienced researchers to do just that.

In particular, the resulting study focused on questions orbiting four themes:

  • Identities and Relationships. How do these companies define “technical communicator” in terms of job titles, personnel structures, and professional activities and responsibilities? Where and how do technical communicators work in the corporate structure?
  • Products and Processes. How do these companies engage in technical communication? What do they produce, and how do they produce it?
  • Education and Training. What kinds of education and training do these companies value in their technical communicators, both in terms of qualifications for hiring and in terms of continuing professional development?
  • On the Horizon. Where and how do these companies anticipate the field of technical communication will develop in the next few years? In what directions are these documentation managers leading their companies toward the technical communication of the future?

These topics are deeply interconnected: who we are is related to what we do and how we learn to do it, and all of these things are changing fast. But dividing our questions into these four primary themes gave shape and direction to our inquiry.

It also helped us organize this special issue. We begin with this introduction, which provides the methodological explanation and justification for the entire study. From here, we branch out into three articles, prepared by my collaborators Craig Baehr (Texas Tech University) and Jim Dubinsky (Virginia Tech), and by me. Each article focuses on one of the first three themes listed above: Baehr on Identities and Relationships, Dubinsky on Products and Processes, and Kimball on Education and Training. The responses about the future of technical communication informed all three of these primary areas.

Methodology

We approached the study with big questions, but a small participant group already defined by the membership of the STC Advisory Council. Accordingly, we settled on a flexible methodology that would allow the participants to engage in a structured, iterative, generative conversation. This conversation involved only a few participants, but it generated a lot of data about what technical publications managers think and feel about technical communication today.

Given that the rationale behind this project was to consult a group of experts, we applied a method developed just for that purpose. The Delphi method was pioneered in the 1950s by the RAND Corporation (Okoli & Pawlowski, 2004, p. 16) as a way to form a consensus on a central issue. In this way, the Delphi method attempts to turn groupthink, typically understood as a danger to be avoided in focus groups, into an asset. The Delphi method differs from other methodologies in two central ways. First, it employs several rounds of surveys, each building on the results of the last. And second, the results of each round are shared with the participants before the subsequent round. The participants then can read and respond to each other’s ideas, helping to shape a consensus view (Geist, 2010; Landeta, 2006; Okoli & Pawlowski, 2004).

The Delphi method’s reiteration can be both a strength and a weakness. It allows a group of experts to think carefully through a problem, topic, or issue. Since its aim is consensus, the method allows researchers to iterate the study as many times as necessary to reach this goal. However, this focus on consensus as the primary goal can extend the time scope of the study significantly and raises the likelihood of fatigue and attrition among the participants. The Delphi method’s focus on consensus also by necessity undervalues dissensus and disagreement, which can be valuable to recognize and which can generate new ideas and understanding.

In response to these limitations, Kendall, Kendall, Smithson, and Angell (1992) proposed the SEER method (Scenario Exploration, Elaboration, and Review), which they described as the “converse of the Delphi technique.” It starts with a group of experts, just like a Delphi study. But rather than focusing on reaching consensus through anonymous rounds of surveying, SEER explores differences among a small group of experts through iterative, structured, face-to-face interaction (Kendall, Kendall, Smithson, & Angell, 1992, p. 125).

In this study, we struck a middle path that took advantage of both approaches, in these ways:

  • Valuing difference. Consensus is great, but disagreements are interesting, too. We wanted to value and explore both, so like SEER, our goal was not necessarily to reach consensus, except insofar as it arose from the participants’ responses and interactions. Differences of opinion among the participants also provided fruitful opportunities for analysis.
  • Limiting scope. Because the goal was not consensus, we designed the study as a discrete series of rounds of inquiry with a definitive beginning and ending.
  • Building knowledge from the ground up. While the Delphi method asks the same questions repeatedly, we allowed the responses from each round to shape the questions for subsequent rounds. This grounded approach helped build a deeper sense of the dynamics surrounding the issues the study addresses. In this way, our methodology could perhaps be described as employing participatory grounded theory, as described by Teram et al. (2005).
  • Valuing community. Because the participants were members of the STC Advisory Council and already knew one another, anonymity was impossible to sustain. So we decided to value community and follow Kendall et al.’s (1992) SEER approach, including not only anonymous surveys, but also face-to-face and virtual focus groups. However, to avoid researcher bias, all results were anonymized before we began to analyze them.

Specifically, the study took place through four iterations, or rounds.

Round 1 employed a survey instrument mixing qualitative (discursive) and quantitative questions to gain a baseline of response. (See appendix A.) We then anonymized the responses and provided them to participants in preparation for Round 2. To foster conversation, some of the quantitative results of Round 1 were visualized for the participants in the form of charts.

Round 2 used a second survey instrument, based on the responses from Round 1. (See appendix B.) Just as with the Round 1 results, the Round 2 results were anonymized and provided to participants for Round 3.

Round 3 was conducted as a focus group at the 2014 STC Summit, centering on the issues raised in Round 1 and Round 2. (Because only three of the participants were able to attend the Summit, these results were not shared more broadly with the participant group.)

Round 4 followed up with a final Web conference focus group, using the audio and chat functions of the common Web conferencing tool GoToMeeting. To prepare for this round, we again provided participants with the results from Round 1 and Round 2, and we asked participants to provide before the Web conference a written response to what we felt was a central unresolved question for each of the four research themes. (See appendix C.) In the Web conference, we took up each question in turn, giving each participant two minutes to respond, then opening each question for general discussion.

The variety of methods we used to gather data differed from the traditional Delphi approach, which relies exclusively on reiterated surveys. However, we felt that shifting the methodology for Round 3 and Round 4 to synchronous conversation (whether face-to-face or online) would lead to greater engagement among the participants and avoid the survey fatigue for which other researchers have criticized the Delphi method.

One of the greatest advantages of this method is its iterativity. How often have researchers prepared a single survey and sent it out, only to realize on receiving the results that they forgot an obvious question or worded a question in a way that confused participants or fostered unusable responses? In a typical research project, such an error can potentially invalidate the entire study. Using the Delphi method, however, we were able to repair any deficiencies in our questioning from round to round. Or put more positively, we could use each subsequent round to tunnel down to greater detail on interesting points, to ask for clarification, or to foster further conversation. For example, in Round 1 we asked participants to rank the relative importance of a variety of skills for technical communicators. One of the skills we included was “research skills,” by which we meant skills in conducting field, usability, or product research. When we received the results, we were surprised to find that participants marked “research skills” as having a very low importance in a technical communicator’s work. In a typical study, this result might remain a mystery – but the Delphi method gave us a second chance at understanding the importance (or lack thereof) of research skills. So in Round 2, we asked participants to speculate on why research skills came out so low in the rankings. We also followed up on this question in Round 3, eventually finding out that when participants saw the term “research skills” in a survey from a group of college professors, they read it as library research skills. This confusion led to a discussion of the wide variety of research skills technical communicators need.

The Delphi method, whether used straight or modified as in this study, can be an effective research method not only for answering academic research questions, but also for answering questions about product development and usability in corporate settings. After all, the Delphi method was pioneered in the defense industry by the RAND Corporation. It could direct the management of extended, reiterative focus groups or participatory design projects, for example.

Participants

Overall, we had eight participants, representatives from seven companies: Adobe, Boston Scientific, CA Technologies, Google, IBM, Madcap, and Oracle. (To maintain privacy, we have anonymized responses and do not link individual participants to specific companies.) However, these were busy people, and even in the truncated version of the Delphi method we used, this was a long study, extending from March to August 2014. Accordingly, their participation was not entirely consistent (see Table 1). Two members of the Advisory Council could not themselves participate fully. One nominated a general company evangelist in his place, assuring us that he would be consulting with this substitute; in any case, the substitute participated only in Round 2. Somewhat more involved was a participant who, rather than participating herself, sent along her assistant manager – a publications manager with considerable experience (see Table 1).

[Table 1: Study participants]

Five of the seven participants can easily be described as technical publications managers in the traditional sense of the term: someone who manages the work of technical communicators. Despite the trends toward flatter organizational structures and distributed work, all of these participants manage a separate publications team. Two of these five have had extensive experience as professional technical writers, while the remaining three are primarily managers, not especially experienced in technical communication practice. Of the remaining two participants, one is the CEO of a small software company, and one is the company evangelist described above.

Data Analysis

The mass of data gathered from multiple methodologies presented a daunting prospect for analysis. For any particular survey question, we were limited by our small N; by itself, any particular question-response set could not be statistically trustworthy. But the strength of the Delphi method lies in asking similar questions repeatedly and sequentially, to dig down into people’s perceptions. So while we may have had only seven respondents, on any particular issue we have many times more than seven comparable responses.

To analyze this mass of data, we began with simple descriptive statistics for the Round 1 and Round 2 surveys. Although these statistics have far too low an N to be reliable, they gave us good cues about where to ask further or more specific questions in subsequent rounds. For the transcripts of the focus groups in Round 3 and Round 4, even a few years ago we might have used a technique such as that employed by Amidon and Blythe (2008): manual content analysis, beginning with the development of a coding scheme and then manually applying the codes to matching utterances. Such a methodology has many good things going for it – but efficiency and ease of use are not among them.
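To give a concrete sense of what this first pass looked like, the sketch below computes simple descriptive statistics for a small-N, Likert-style round of skill ratings. It is illustrative only, not our actual analysis code; the skill names and ratings are invented.

```python
# A minimal, hypothetical sketch: descriptive statistics on a small-N
# Likert-style round (1 = mission-critical ... 5 = not desirable).
# Skill names and ratings are invented for illustration.
import pandas as pd

round1 = pd.DataFrame({
    "writing":             [1, 1, 2, 1, 1, 2, 1],
    "content_development": [2, 1, 2, 2, 1, 2, 3],
    "field_research":      [4, 3, 4, 5, 4, 3, 4],
    "usability_research":  [3, 4, 3, 4, 5, 3, 4],
})

# Lower means indicate skills rated as more important; larger standard
# deviations flag disagreement worth revisiting in the next round.
summary = round1.agg(["mean", "std", "min", "max"]).T.sort_values("mean")
print(summary)
```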

So for Rounds 3 and 4, we recorded and transcribed the conversations, then conducted a statistical analysis on the transcripts through text mining techniques, specifically semantic content analysis, using a well-regarded software package, Leximancer (http://leximancer.com). Leximancer applies a variety of text mining approaches aimed at analyzing the natural language in conversations and qualitative (text-based) survey responses. Leximancer offers a convenient, rigorous, and well-justified suite of tools for text analysis (Smith & Humphreys, 2006). It is specifically designed for analyzing corpora of natural language texts. Leximancer starts with a transcript of a conversation, previously marked for speakers, and builds a thesaurus of terms in the conversation. It then uses cluster analysis techniques – specifically, relative co-occurrence frequency (the number of times one word appears near another, and the distance between them) – to group related terms into concepts. It then groups the concepts into general themes. Leximancer also allows researchers to include or omit specific “seed terms” (Leximancer, 2010) to focus the analysis on particular ideas or to filter out misleading data. For example, we specified that the utterances of the researchers in the focus groups should be excluded from the analysis, as well as functional terms such as “time” and “question” (as in, “It’s time to move on to the next question”).
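For readers unfamiliar with this kind of analysis, the following sketch illustrates the general idea of co-occurrence counting on a speaker-marked transcript. It is a simplified stand-in, not Leximancer’s actual algorithm or interface; the transcript snippets, excluded terms, and window size are all invented for illustration.

```python
# A simplified, hypothetical illustration of co-occurrence counting on a
# speaker-marked transcript; not Leximancer's actual algorithm or API.
from collections import Counter

transcript = [
    ("Researcher", "It's time to move on to the next question."),
    ("P1", "Our writers do usability research early in product design."),
    ("P2", "Design and content strategy now drive our documentation work."),
]
excluded_speakers = {"Researcher"}          # drop researcher utterances
excluded_terms = {"time", "question", "the", "and", "our", "in", "to",
                  "do", "now", "on", "of", "a", "it's", "next", "move"}
window = 5                                  # words this close count as co-occurring

pair_counts = Counter()
for speaker, utterance in transcript:
    if speaker in excluded_speakers:
        continue
    words = [w.strip(".,").lower() for w in utterance.split()]
    words = [w for w in words if w not in excluded_terms]
    for i, word in enumerate(words):
        for other in words[i + 1:i + window]:
            pair_counts[tuple(sorted((word, other)))] += 1

# Frequently co-occurring pairs suggest candidate "concepts" to group together.
for pair, count in pair_counts.most_common(5):
    print(pair, count)
```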

Just as with manual content analysis, in which researchers objectively apply subjectively determined categories to label utterances, the cluster analysis performed by Leximancer inevitably involves a good amount of researcher subjectivity in “reading” the results. Leximancer’s primary approach is K-means cluster analysis. This technique requires the researcher to specify in advance the number of clusters (K) into which the algorithm will sort multivariate data (K = 2, 3, … n). In other words, researchers can choose to sort concepts into fewer groups with looser relationships, or more groups with tighter relationships. At the extreme ends of the spectrum, one could put every item into one group, or sort every individual item into its own group, although neither approach would give much insight into the relationships between the items. So cluster analysis typically requires that the researchers look at the data at several different levels of K until patterns or interesting relationships begin to emerge. This emergence is inevitably influenced by the researchers’ attitudes, opinions, and experience. In other words, cluster analysis is less a measurement technique than an exploratory data analysis technique, as described by John Tukey (1977).
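As a purely illustrative example of this exploratory process, the sketch below runs K-means at several values of K over a stand-in matrix of term profiles and prints the resulting groupings for inspection. It uses scikit-learn rather than Leximancer’s internal implementation, and the data are random placeholders, not results from the study.

```python
# An illustrative sketch of exploratory K-means clustering at several K values.
# The term vectors are random stand-ins, and scikit-learn is used here
# rather than Leximancer's internal implementation.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
terms = [f"term_{i}" for i in range(20)]
term_vectors = rng.random((20, 8))   # each row: one term's co-occurrence profile

for k in (3, 5, 8):                  # fewer/looser groups vs. more/tighter groups
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(term_vectors)
    print(f"K = {k}")
    for cluster_id in range(k):
        members = [t for t, lab in zip(terms, labels) if lab == cluster_id]
        print("  cluster", cluster_id, members)
```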

Given that we were exploring attitudes of the group, the exploratory nature of this analytical technique made sense. The subjectivity involved in this technique does not mean that the results are invalid – they’re simply contingent. However, this contingency is moderated by the reiterative nature of the study, by the intense conversations between the participants, and by the close collaboration of the researchers. In other words, each participant (including the researchers) had multiple voices and viewpoints to provide checks and balances upon our reading of the data.

Visualizing the Data

Our methodological approach naturally shaped our analysis and, ultimately, the structure of the three articles presented in this special issue. Each of these articles traces the development of ideas through the successive rounds of the study, focusing on descriptive statistical analysis for Round 1 and Round 2, then moving to semantic content analysis for Round 3 and Round 4.

In each round, we used data visualizations to guide not only our analysis of that round, but also our research design for subsequent rounds. For Round 1 and Round 2, these visualizations included sparklines and radar plots. Like the raw results, these visualizations are based on a small N, but they gave us good indications about where we should ask more questions in the next round. As with all of the other data in the study, we shared these visualizations with the participants as we went along.
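As one hypothetical example of this kind of chart, the sketch below draws a simple radar (spider) plot of mean skill ratings. The categories and values are invented, and the plotting library (matplotlib) is an assumption for illustration rather than a detail reported in the study.

```python
# A hypothetical sketch of a radar plot for small-N survey means.
# Categories and values are invented; matplotlib is assumed.
import numpy as np
import matplotlib.pyplot as plt

categories = ["Writing", "Content dev.", "Info. design", "Usability", "Field research"]
means = [1.2, 1.5, 2.0, 3.4, 3.9]    # 1 = mission-critical ... 5 = not desirable

angles = np.linspace(0, 2 * np.pi, len(categories), endpoint=False).tolist()
values = means + means[:1]           # repeat the first point to close the polygon
angles = angles + angles[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, values, marker="o")
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(categories)
ax.set_title("Round 1 skill ratings (illustrative data)")
plt.show()
```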

One of the most useful visualization tools was provided by Leximancer itself, which outputs its results in a network diagram (see for example Figure 1). These diagrams gave us the opportunity to see potential themes, consensus, and dissensus among the participants in their responses to various questions and groups of questions. We used these diagrams primarily as analytical tools to understand the final two rounds after the study ended, so we did not share them with participants.

Because these visualizations may be unfamiliar to readers, I will describe them somewhat more fully here. Leximancer’s standard diagram uses “bubbles” to mark each theme, and labels within the bubbles to mark each concept within the theme. The themes are heat mapped from red through blue for cohesion or connectedness, although unfortunately color cannot be represented in these pages. The most connected theme is always rated 100%, and other themes are compared to it in terms of a “connectivity score,” represented as a percentage of the connectivity of the most connected theme. In the Leximancer interface, users can click on any theme or concept and see the links and their relative strength between the item clicked on and the rest of the cluster. Leximancer also allows researchers to tunnel back from themes to concepts to the individual utterances of participants in the transcript.

In reading these visualizations, please note that the circles do not form Venn diagrams – any overlap is simply an artifact of the three-dimensional nature of the graphics, which can be rotated for viewing from various angles. In addition, the size of the circle for each theme carries no significance – the software simply sizes the theme circles big enough to make room for the concepts they contain. The more important visual indication is the distance between concepts. Concepts that are farther away from each other are less tightly connected than those that are close to each other. Concepts with direct linkages are more tightly connected than those with indirect linkages.

As with cluster analysis itself, these diagrams essentially show the relative connectedness of concepts and themes to each other. These diagrams therefore allowed us to find interesting connections and disconnections among concepts and themes in the transcript and then trace those patterns down to individual utterances of the participants for further analysis.

Limitations

By employing a variety of methods, we were able to foster a dynamic but structured conversation that valued both consensus and dissensus – our primary goal in designing the study. However, such a study is bound to be messy, in the manner described by Parkhe (1993). The dynamism we valued necessarily means that the results are not entirely reliable or extendable to other companies or other contexts.

Also, the data represented a small group of voices – and even that group naturally fluctuated from round to round. With such a small number of companies represented, our study was inevitably exploratory. But at the same time, the small group of people generated a mass of data for analysis, as they contributed to multiple rounds of survey and discussion.

Three Articles

The rest of the special issue comprises three articles addressing three significant topics for technical communication.

First, Craig Baehr describes the responses of technical communication managers to questions regarding professional technical communication in terms of its Identities & Relationships. Baehr found that technical communication managers valued adaptable technical communicators who could work across multiple skills and specializations. He notes that while job titles vary from company to company, experienced technical communicators often build on their technical communication skills to advance into management roles throughout the product development process.

The second article of the special issue presents James Dubinsky’s analysis of technical communication managers’ responses to questions about the meat and potatoes of technical communication: Products & Processes. Dubinsky observed in particular the renewed focus on content expressed by the participants, as well as the rapidly shifting technologies and techniques technical communicators must constantly work to master.

In the final article, I report the findings on technical communication managers’ attitudes toward Education & Training. I found that the managers who participated in the study valued relatively traditional educational credentials and skills, while at the same time holding that basic writing skills are no longer sufficient for success in the profession. Practitioners also need strong iterative project management skills, business skills, and interpersonal skills. Participants felt that technical skills and domain knowledge were best gained in the context of work, rather than in school.

Acknowledgments

Our thanks to Technical Communication editor Menno de Jong and the editorial board of this journal for seeing the potential of such an innovative special issue.

Lisa Meloncon (University of Cincinnati) was an original member of the research team and contributed significantly to the research design and data collection. We appreciate her contributions.

Thanks to STC Executive Director Chris Lyons and Communications Director Liz Pohland, who masterminded the formation of the STC Advisory Council and supported our research project.

Finally, thanks to our participants for their time and enthusiasm in completing a long and complex project.

Appendix A: Round 1 Survey

Identities and Relationships

Q1: What do you call your company’s technical communicators – what are their job titles? Please mark all that apply.

  • □ Communications Manager
  • □ Content Developer
  • □ Documentation Manager
  • □ Documentation Writer
  • □ Information Designer
  • □ Information Developer
  • □ Information Services Engineer
  • □ Publications Manager
  • □ Publications Specialist
  • □ Software Engineer
  • □ Technical Author
  • □ Technical Communicator
  • □ Technical Documentation Specialist
  • □ Technical Editor
  • □ Technical Writer
  • □ Other

Q2: For whom do your technical communicators work? Who supervises them? Who evaluates their performance?

Q3: Describe the organizational context in which your technical communicators work. Do they work in a separate publications group? As part of a multi-function team? With whom do they work every day? With whom do they interact less directly?

Q4: What’s the relationship in your company between technical communication and similar fields that deal with the design, development, and distribution of technical information for users?

Q5: Describe the typical pattern for advancement or promotion for technical writers in your company (if there is one). Do they generally stay in the field of technical communication, or branch out into other roles? If the latter, what other roles? How long are they likely to stay at a particular level along this path?

Q6: How much autonomy and authority do your technical communicators experience? In other words, in what areas or aspects of your company’s activities are technical communicators considered the experts?

Q7: In the past five years, how has the role of technical communicators in your organization changed?

Q8: In the past five years, how has the number of technical communicators changed in your organization? In the comment area, please tell us why you think this change occurred.

  • □ Increased more than 50%
  • □ Increased up to 30%
  • □ Increased up to 10%
  • □ Stayed the same
  • □ Decreased up to 10%
  • □ Decreased up to 30%
  • □ Decreased more than 50%

Q9: In your view, where does technical communication typically happen in product development cycles? Where should it happen?

Products and Processes

Q10: What are the primary products of technical communication in your organization – specifically, what kinds of documentation or information products do technical communicators produce?

  • □ Contracts
  • □ Customer service scripts
  • □ Demonstrations
  • □ Design documents
  • □ Documentation plan
  • □ FAQs
  • □ How-to videos
  • □ Instructions
  • □ Knowledge base articles
  • □ Online and embedded help
  • □ Other
  • □ Policy documents
  • □ Process flows
  • □ Product catalogs
  • □ Product packaging
  • □ Project documents
  • □ Proposals
  • □ Release notes
  • □ Requirements specifications
  • □ Simulations
  • □ Training course materials
  • □ User manuals
  • □ Warning labels
  • □ Web-based training
  • □ Websites
  • □ White papers

Q11: What technical information products are most important for your company’s mission? Which ones are less so? Please mark as follows:

  • □ Contracts
  • □ Customer service scripts
  • □ Demonstrations
  • □ Design documents
  • □ Documentation plan
  • □ FAQs
  • □ How-to videos
  • □ Instructions
  • □ Knowledge base articles
  • □ Online and embedded help
  • □ Policy documents
  • □ Process flows
  • □ Product catalogs
  • □ Product packaging
  • □ Project documents
  • □ Proposals
  • □ Release notes
  • □ Requirements specifications
  • □ Simulations
  • □ Training course materials
  • □ User manuals
  • □ Warning labels
  • □ Web-based training
  • □ Websites
  • □ White papers
  • □ Other

Q12: For at least two of your most important products, describe the processes involved. Discuss how an information product would be created from start to finish – Who would initiate the task? How would that task be communicated to the individual(s) or teams involved? Who would be on those teams?

Q13: Of the following analytical approaches, which ones do technical communicators in your organization use?

  • □ Analysis of technical support or customer support data
  • □ Field testing
  • □ Focus groups
  • □ Hands on testing
  • □ Interviews with users or customers
  • □ Search of marketing literature and development specifications
  • □ Usability analysis and testing
  • □ User Surveys
  • □ Other

Q14: Which social media are technical communicators using in the product development cycle, and how? Please comment to the right of any items you check.

  • □ Blogs
  • □ Facebook
  • □ Internal system (such as Microsoft Lync)
  • □ Twitter
  • □ Wikis
  • □ Other

Q15: What standards and regulations drive your process and product development?

  • □ Darwin Information Typing Architecture (DITA)
  • □ ISO IEC 15289, Content of life-cycle information documentation
  • □ ISO/IEC 12207, Software life-cycle processes
  • □ ISO/IEC 15288, System life-cycle processes
  • □ ISO/IEC/IEEE 26511, Requirements for managers of user documentation
  • □ ISO/IEC/IEEE 26512, Requirements for acquirers and suppliers of user documentation
  • □ ISO/IEC/IEEE 26513, Requirements for testers and reviewers of user documentation
  • □ ISO/IEC/IEEE 26514, Requirements for designers and developers of user documentation
  • □ ISO/IEC/IEEE 26515, Developing user documentation in an agile environment
  • □ Rehabilitation Act of 1973, Section 508, Accessibility standards
  • □ W3C Web Content Accessibility Guidelines (WCAG)
  • □ XHTML
  • □ XML
  • □ No standards
  • □ Other

Q16: What is the role of user-generated content, if any, in your documentation cycle?

Q17: How important are the following skills and competencies for technical communicators to learn and practice today? Please mark as follows: 1 = mission-critical; 2 = important; 3 = useful, but not essential; 4 = not necessary; 5 = not desirable.

  • □ Audience analysis
  • □ Communication strategy
  • □ Content development
  • □ Critical thinking
  • □ Document design
  • □ Field research
  • □ Information architecture
  • □ Information design
  • □ Knowledge management
  • □ Managing distributed work
  • □ PHP, C++, etc.
  • □ Usability research
  • □ Visualization
  • □ Working in teams
  • □ Writing
  • □ XML, DITA, etc.
  • □ Other

Training and Education

Q18: Rank the following credentials in terms of which best signifies technical communication skills and competencies.

  • □ Degree in Technical or Professional Communication
  • □ Degree in English, Communication, Journalism
  • □ Technical degree
  • □ Certificate in TPC from college or university
  • □ Other liberal arts degree
  • □ Scientific degree
  • □ Industry certifications on skills/software (Microsoft, Adobe, etc.)
  • □ Training courses from professional organizations
  • □ Certification from professional organizations
  • □ Combination of the above

Q19: What kinds of training or training support does your company provide technical communicators?

  • □ Formal in-house training
  • □ Informal in-house training
  • □ Mentorship program
  • □ Support for external formal training
  • □ Support for external self-paced training
  • □ Support for traditional education (college degrees)
  • □ Other

Q20: What support does your company offer for professional development of technical communicators? Please mark all that apply.

  • □ Support on a case-by-case basis
  • □ Time to contribute to professional organizations and activities
  • □ Support for professional licensing and certification fees
  • □ Support for travel to professional conferences and conventions
  • □ No specific support
  • □ Other

Q21: If your employees need professional development, which of these options would you recommend?

  • □ Advanced degree in TPC
  • □ Academic certificate program
  • □ STC sponsored webinars
  • □ Other trade webinars
  • □ Other professional association certificates or courses
  • □ STC sponsored certificates
  • □ Other

Q22: How are training, education, and professional development for technical communicators valued or recognized in your organization?

Q23: If you could give academic program directors one piece of advice to make sure their programs were meeting the needs of the field, what would it be?

The Future of Technical Communication

Q24: Broadly, where do you see technical communication going as a profession and as an activity in the next ten years?

Q25: Technical communication has had a long-standing discussion about how to solidify the profession – for example, through academic program accreditation, professional certification, professional development, and other activities. What do you think are the biggest impediments to professionalization in technical communication? Please comment on any choices you make.

  • □ Changes in corporate culture
  • □ Changes in technology or society
  • □ Changes in the economy
  • □ Globalization
  • □ Lack of demand for well-educated or certified technical communicators
  • □ Lack of interest in professionalization among employers
  • □ Lack of interest in professionalization among technical communicators
  • □ Other

Q26: What kind of credentials, education, training, and professional development will employers expect of someone entering the field in 2024?

Q27: Technical communication currently can include many career paths (such as content strategist, specialized technical writer, usability professionals, etc.). In what directions will professional technical communicators specialize in the next ten years? What specialties will become less relevant (or even obsolete)?

Q28: What developing technologies will be the next big thing for technical communication in the near future – say the next five years? Conversely, what technologies will we abandon in that time?

Q29: For each of the following items, give us your best sense of whether it will increase in importance to the profession of technical communication, decrease in importance, or stay about the same in the next 10 years.

  • □ Collaboration
  • □ Data Visualization
  • □ Mobile Platforms
  • □ Distributed Content
  • □ Interface Design
  • □ User Metrics
  • □ User-Generated Content
  • □ Multiple Media
  • □ Subject Matter Expertise
  • □ Usability and Usability Testing
  • □ Web Development
  • □ Information Architecture
  • □ Product Documentation
  • □ Professional Development
  • □ Undergraduate Education in TPC
  • □ Writing and Editing
  • □ Graduate Education in TPC
  • □ Professional Certification
  • □ Policy Writing
  • □ Process Documentation

Q30: Will the role of user-generated content increase in importance in the next ten years? What role will professional technical communicators play in relation to that content?

Appendix B: Round 2 Survey

Identities and Relationships

Q1. In what leadership or managerial positions do technical writers currently serve in your organization?

Q2. Usability came out as largely separate from TC on the question about the relationship between fields, and on the analytical approaches question it came out near the bottom in importance. What do you see as the relationship between usability testing and technical communication?

Q3. What factors influence actual job titles of technical communicators in your company? In what ways?

Q4. Please comment on the differences you see in the radar charts showing responses to the question, “What’s the relationship in your company between technical communication and similar fields…?” (For your convenience, the charts are reproduced below.)

Products and Processes

Q1. In Round 1 (see chart below) most of you reported that your technical communicators spend much of their time and energy creating instructional products (e.g., user manuals, online or embedded help). But we found it interesting that the second tier of common products included policy documents and “how-to” videos. While “how-to” videos clearly fall under the larger category of instructional products, the medium is not often one used by technical communicators. If your technical communicators are producing “how-to” videos or you think they might be doing so in the near future, would you discuss who else in your organization, if anyone, works on these products?

Q2. Have you hired technical communicators with video production skills in mind? If not, have you offered additional training or sent your technical communicators to workshops/classes?

Q3. Are “policy documents” and white papers genres that most technical communicators you hire feel comfortable writing? If not, do you hire technical communicators with a specific skill set for these products? If so, what would that skill set include?

Q4. In the question “Of the following analytical approaches, which ones do technical communicators in your organization use?” the range of analytical approaches to data gathering was broad, with every category selected (see the graph below). However, interviews seemed most valued.

Q5. What special training, if any, do you provide, support, or expect of technical communicators using these interview methods?

Q6. Research skills (usability research, field research) came near the bottom of the list of desired skills and competencies (see chart below). Recognizing that this was a relative ranking and that all of the entries might be important in their own way, why do you think research skills came out at the bottom of the list?

Q7. In a number of places, respondents commented that Technical Communication is moving toward a greater involvement in design—interface design, product design, etc. Yet most of the top-ranked products (planning docs, FAQs, instructions) are pretty traditional. Is this a contradiction? If so, what is its significance? If not, how is design playing a larger role in these traditional products?

Q8. Do you see the relationship between technical communicators and social media changing as the influence of cloud technologies grows? How would you describe that change?

Q9. Besides basic XML, two items in the standards and regulations question stood out: W3C and the Rehabilitation Act of 1973, Section 508.

Q10. How are your technical communicators involved in accessibility research or standards compliance?

Q11. Has this involvement increased or decreased in the last five years?

Q12. Only one respondent specifically focused on the Agile/Scrum approach to project management. If you are not this respondent, has your company tried this approach? If it has, but you’re not using it, would you comment on why? What are the pros and cons of this approach, in your view?

Training and Education

Q1. Traditional college degrees in TPC, English, Communication, or Journalism came out as the most desirable credentials. What is it about these traditional college degrees that you value in potential employees?

Q2. What do you see as the differences between a degree in Technical or Professional Communication and a degree in English/Communication/Journalism?

Q3. Responses about professional development were split, with some companies placing a high value on it, and others not so much (see responses below). What do you think causes this difference? And if not through professional development, how do employees stay current with broader trends in the field?

The Future of Technical Communication

Q1. What challenges do you see ahead as technical communicators move to an emphasis on design and visual communication such as interface design, product design, information design, video production, etc.?

Q2. A number of comments from the “next ten years” question in Round 1 (see below), as well as elsewhere, suggest that tech comm will broaden beyond writing to include people who can write, design, curate, and deliver content – currently tasks done by multiple people (writers, designers, information architects, web/social media designers, database designers, etc.). What implications do you think this might have for training, education, employment, and career paths for technical communicators?

Q3. Specializing is a common technique for increasing status and market value (medical GPs are paid less than medical specialists like cardiologists, for example). Yet comments in Round 1 suggest that specializing isn’t an optimal future path for technical communicators:

  • “Tech writers need to expand, not specialize.”
  • “No one will be a ‘technical writer’. Everyone involved with technical content will be expected to author, edit, curate, and manage the content, and interact with customers via social media and other channels.”
  • “Demand will increase for content strategist and usability professionals. Demand will decrease for specialized technical writers.”

What factors do you think are driving that demand for expansion? What impedes opportunities for expansion? How can companies and technical communicators overcome those obstacles? What are the potential dangers of expanding the role of technical communication? Benefits?

Q4. Data visualization, collaboration, mobile development, and distributed content came up at the top of the list for increasing importance in the next 10 years (see graphic below). For each of these, answer the following:

Q5. Where do you see opportunities for technical communicators who want to go into these activities? Particular industries, particular sites, particular kinds of projects, particular kinds of processes?

Q6. What barriers or opportunities do you see for technical communicators who want to go into these activities?

Q7. Specifically what contributions do you think technical communication can make to these activities that other fields and specialties can’t bring to the table as readily?

Q8. For each of the following items, give us your best sense of whether it will increase in importance to the profession of technical communication, decrease in importance, or stay about the same in the next 10 years.

  • Collaboration
  • Data visualization
  • Distributed content
  • Graduate education in TPC
  • Information architecture
  • Interface design
  • Mobile platforms
  • Multiple media
  • Policy writing
  • Process documentation
  • Product documentation
  • Professional certification
  • Professional development
  • Subject-matter expertise
  • Undergraduate education in TPC
  • Usability and usability testing
  • User metrics
  • User-generated content
  • Web development
  • Writing and editing

Appendix C: Round 4 Survey (Protocol: Send out questions before web conference)

Identities and Relationships

Two visions for technical communicators arose in the conversations in Round 3. Some comments suggested that technical communicators must grow in more specialized ways, into roles such as usability expert, information designer, user experience designer, and so forth. Other comments suggested that technical communicators must broaden their skills, so they can contribute flexibly to product teams. At your company, which direction do you think technical communicators are or should be heading?

Products and Processes

Some of the conversation in Round 3 suggests that technical communicators are increasingly involved in product design, as opposed to traditional post-design product documentation. To what extent do you see that happening in your organization? What implications do you think this shift will have in your organization and on the profession of technical communication?

On the Horizon

What do you think are the three biggest problems or issues that the profession of technical communication will face in the next five years? How do you think the profession should respond to these problems or issues?

Training and Education

For each of the three problems or issues you mentioned in the previous question, how do you think educators and training providers should respond?

Open Response

What question should we have asked you that we didn’t? How would you have answered it?

References

Allen, O. J., & Deming, L. H. (1994). Publications management: Essays for professional communicators. Amityville, NY: Baywood.

Amidon, S., & Blythe, S. (2008). Wrestling with Proteus: Tales of communication managers in a changing economy. Journal of Business and Technical Communication, 22(1), 5–37.

Carliner, S. (2004). What do we manage?: A survey of the management portfolios of large technical communication groups. Technical Communication, 51(1), 45–67.

Carliner, S., Qayyum, A., & Sanchez-Lozano, J. C. (2014). What measures of productivity and effectiveness do technical communication managers track and report? Technical Communication, 61(3), 147–172.

Dicks, R. S. (2003). Management principles and practices for technical communicators. New York, NY: Pearson/Longman.

Geist, M. R. (2010). Using the Delphi method to engage stakeholders: A comparison of two studies. Evaluation and Program Planning, 33(2), 147–154.

Hackos, J. T. (1994). Managing your documentation projects. New York, NY: Wiley.

Hart, H., & Baehr, C. (2013). Conceptualizing the technical communication body of knowledge: Context, metaphor, and direction. Technical Communication, 60(1), 260–266.

Johnson-Eilola, J. (2005). Datacloud: Toward a new theory of online work. Cresskill, NJ: Hampton Press.

Kendall, J. E., Kendall, K. E., Smithson, S., & Angell, I. O. (1992). SEER: A divergent methodology applied to forecasting the future roles of the systems analyst. Human Systems Management, 11(3), 123–135.

Kynell-Hunt, T., & Savage, G. J. (2003). Power and legitimacy in technical communication: The historical and contemporary struggle for professional status (Vol. 1). Amityville, NY: Baywood.

Kynell-Hunt, T., & Savage, G. J. (2004). Power and legitimacy in technical communication: Strategies for professional status (Vol. 2). Amityville, NY: Baywood.

Landeta, J. (2006). Current validity of the Delphi method in social sciences. Technological Forecasting and Social Change, 73(5), 467–482.

Leximancer. (2010). Leximancer White Paper. Retrieved from https://www.leximancer.com/lmedia/Leximancer_White_Paper_2010.pdf

Lips, C. C. (2007). Effective publications management: Keeping print communications on time, on budget, on message. New York, NY: Allworth Press.

Okoli, C., & Pawlowski, S. D. (2004). The Delphi method as a research tool: An example, design considerations and applications. Information & Management, 42(1), 15–29.

Parkhe, A. (1993). “Messy” research, methodological predispositions, and theory development in international joint ventures. The Academy of Management Review, 18(2), 227–268.

Smith, A. E., & Humphreys, M. S. (2006). Evaluation of unsupervised semantic mapping of natural language with Leximancer concept mapping. Behavior Research Methods, 38(2), 262–279.

Teram, E., Schachter, C. L., & Stalker, C. A. (2005). The case for integrating grounded theory and participatory action research: Empowering clients to inform professional practice. Qualitative Health Research, 15(8), 1129–1140.

Tukey, J. W. (1977). Exploratory data analysis. Reading, MA: Addison-Wesley.

Whiteside, A. (2004). Skills that technical communicators need: An investigation of technical communication graduates, managers, and curricula. Journal of Technical Writing and Communication, 33(4), 303–318.

About the Author

Miles A. Kimball’s research focuses on the performance and history of technical communication, as well as on visual communication and document design. He is the author of Document Design (Bedford/St. Martin’s 2008) and of The Web Portfolio Guide (Longman 2003). Contact: miles.kimball@gmail.com

Manuscript received 8 April 2015; revised 6 May 2015; accepted 7 May 2015.