Features | March/April 2024

Implications of AI for Program Design in Technical Communication

By Susan M. Lang | Sustaining Member

How AI training can be integrated into either academic programs or professional development for technical communicators.

Picture this—a discussion among employees whose jobs appear threatened by a new technology, panicked that they will become obsolete. Sound familiar? While most readers probably assume I’m referring to the current fear that many technical communication jobs will be taken over by ChatGPT or other generative AI applications, this scenario is only the latest in a sequence that has been imagined repeatedly over the past century in contexts both fictional and real. Since the onset of the COVID-19 pandemic in 2020, mainstream media and trade publications have considered how automation and cutting-edge technologies have either displaced or reinstated jobs in the name of safer and more efficient operations. Daily we hear about new applications of AI in medicine, engineering, or manufacturing and potential misapplications of AI in other areas, such as higher education.

AI is transforming the way some of us work. A 2023 Asana report on “The State of AI at Work,” a survey of over 4,500 employees in the U.S. and U.K., found that while 36% of these employees use AI weekly, nearly 90% want to see transparency and accuracy underlying the AI tools they use. And fewer than half of those surveyed understand the concept of human-centered AI, which integrates artificial intelligence with human values and needs and enriches work rather than dictating it.

The deluge of information regarding all aspects of AI seems overwhelming, especially when a good amount of that information implies that technical communication jobs will be among those taken over by AI systems. But at its core, AI is a tool, whether we are talking about traditional AI (think Siri, Amazon recommendations, or Google) or generative AI such as ChatGPT or Sudowrite. And just as technical communicators have learned to use stand-alone or cloud authoring tools, image editing tools, and publishing tools, they will need to understand and be able to apply the capabilities of AI to facets of their work.

Given the complicated nature of higher education’s evolving response to generative AI, those teaching in academic programs will need to consider their institution’s academic integrity policies toward the use of AI in courses. Most of these policies provide only general guidance that such use be “intentional,” while some prohibit requiring students to use AI in a course. Such contradictions make meaningful integration of instruction regarding generative AI more challenging, but I discuss some possibilities below for an undergraduate curriculum, which may also be appropriate for in-service training/refreshers for professionals.

AI, Student Background, and Core Competencies

Ideally, generative AI should be introduced as a topic of examination in a program’s introductory or service course along with other tools commonly used for writing. Rather than viewing it as a radical departure, we should consider it the next step in generations of tool development and frame it in terms of the knowledge that professional technical communicators need to use it effectively. Students are likely already using basic AI tools such as Microsoft Word’s Editor or Google’s Smart Compose and Smart Reply as they create text, as well as editing tools such as Grammarly, Scribbr, or QuillBot. Basic AI terms, concepts, and principles should also be introduced, such as those from the Organization for Economic Cooperation and Development (OECD) (https://oecd.ai/en/ai-principles).

As tools and terms change, the essential skill sets often evolve as well. To illustrate, when website design and development became part of technical communication, professionals integrated visual design theories and markup languages, such as HTML, into their toolbox. These capabilities, combined with strong writing and editing skills, differentiated them from peers who may have been good writers but relied mostly on tools such as Microsoft FrontPage and Macromedia Dreamweaver to design and code websites. Likewise, when websites became more interactive, user experience became a more significant part of project planning, visual design, and content development. This did not mean, however, that other key skill sets, such as writing and editing, were jettisoned; rather, the methods by which they were employed changed.

A confluence of factors adds additional layers of complexity to programmatic decisions about generative AI. The current generation of college-bound students often enters degree-granting institutions with less writing instruction and experience than prior generations. Because of dual-enrollment programs, as well as a desire to accrue affordable college credit, many students may take their first technical communication course without having enrolled in prior writing courses at the institution where they plan to earn their college degree. A dilemma for academic programs, then, is deciding how they will structure the competencies in writing and editing that are expected of entry-level technical communicators. Experts still debate what best practices entail for assessing proficiency with so-termed lower-order concerns, or sentence-level issues, given the assistance offered by word processing programs. Generative AI, with its apparent capabilities that support the creation, drafting, and editing of text, further complicates instruction and assessment. We must decide what level of proficiency with the grammar, syntax, and mechanics of language students need before they can make the best use of writing tools.

Even more critical, though, will be ensuring that students understand the limits of generative AI applications in drafting and editing critical documents. While AI seems to function well as a drafting partner for some materials, human-centered AI practices must also be followed. When dealing with novel technologies in fields such as aviation or electric vehicles, for example, the human-AI interaction as those documents evolve will be vital. No one wants to be the writer or editor whose overreliance on AI led to a catastrophe on the level of the Challenger explosion or, more recently, the Boeing 737 MAX tragedies.

Other Considerations

Given that each day brings new information and insights about AI systems, academic program directors should be considering how generative AI may or may not alter the trajectory of their undergraduate and graduate programs over the next several years. One obvious challenge will be ensuring that all teaching faculty in technical communication programs have a working understanding of generative AI, from basic terminology to practical application. Those teaching any course will need to decide how AI will be incorporated into assignments: whether students will use AI as they complete assignments, or will compare their own responses to particular assignments with responses generated by AI systems.

Program directors and faculty will also, at some point, need to consider whether AI has a place in the evaluation of student assignments in technical communication programs. While incorporating AI technology into assignment production appears inevitable, using that technology to assist in commenting on student work carries equally pressing educational integrity questions, especially if evaluations are used to grant or deny admission to a program at the conclusion of an introductory course.

While the integration of generative AI into the technical communication curriculum will change the trajectory of programs, that change may mean less (or different) emphasis on writing and editing and more emphasis on other competencies and skills within the discipline. If so, those alterations and the rationale behind them will need to be articulated throughout the field so that both prospective students and employers understand which skill sets programs will emphasize.

Whether training occurs as part of academic programs or continuing education, the field must look forward. As our tools evolve, we must also evolve to understand and advocate for their optimal use in our work. Extending these considerations beyond the academy, professionals who manage technical communication teams will also need to consider which core competencies to emphasize in writing and editing usable content.


Susan Lang is Professor of English and the Director of the Center for the Study and Teaching of Writing at The Ohio State University. She is also the editor of the Journal of Writing Analytics. Lang teaches and researches in the areas of technical and scientific communication, writing program administration, online writing instruction, and hypertext theory. She has published in Technical Communication, Journal of Technical Writing and Communication, Intercom, College English, College Composition and Communication, and Pedagogy, among others.