65.3, August 2018

Identifying Risk Communication Deficiencies: Merging Distributed Usability, Integrated Scope, and Ethics of Care

By Amber Lancaster

Abstract

Purpose: Risk communication research examines how people communicate risk to prevent accidents and fatalities. Past studies have analyzed risk communication effectiveness from one of two frameworks: a textual approach (meaning is in the text) or a socio-cultural approach (meaning is external to the text). Some studies have merged textual and socio-cultural approaches, yet none to date have merged both approaches with an analysis of usability and ethics, specifically. A more useful analysis would combine such approaches to examine the broader system that makes up risk communication, using distributed usability and what Spinuzzi (2003) called “integrated scope” with a lens on ethics.

Method: I examine existing frameworks used to analyze risk communication and offer a merged framework emphasizing distributed usability, integrated scope, and an ethics of care philosophy for identifying communication deficiencies. I use archival research to retrace a historical case of how artifacts were used in a communication system in an industrial setting when a fatal explosion occurred.

Results: I use this historical case analysis to show that a merged framework with distributed usability, integrated scope, and an ethics of care philosophy provides a more comprehensive and concrete approach for identifying risk communication deficiencies and preventing injuries and fatalities.

Conclusion: Examining complex work systems through this new merged framework expands our understanding of the interplay between textual-level components of our information (placement, accuracy, and details) and the social/cultural/political environments that define how workers act. Practitioners and researchers can apply this new merged framework to analyze past and current risk communication systems to identify usability deficiencies and prevent accidents.

Keywords: risk communication, complex work systems, distributed usability, ethics of care

Practitioner’s Takeaway:

  • Existing frameworks for assessing risk communication deficiencies emphasize both textual-level analyses and socio-cultural-level analyses, but not with a combined usability and ethics perspective.
  • A new merged framework based on distributed usability practices can help identify deficiencies at both levels of analyses across a complex workplace information system.
  • As part of the merged framework, an ethics of care philosophy provides practitioners an ethical lens to examine how identified risk communication deficiencies affect target users and the relationship between the various groups of people using technical information.

Introduction

Risk communication is a subfield of technical communication that focuses on the discursive practices involved in the communication of health, safety, and environmental risk. Risk communication motivates an audience to act in a desired way; specifically, it draws attention to risky behaviors to prevent injuries and fatalities. Technical communication practitioners who develop risk communication identify risks in workplace settings and train employees to avoid and respond to dangerous and hazardous situations.

Risk communication has generated scholarly interest for practitioners and researchers of technical communication since the mid-1980s. During the 1980s and 1990s, scholars in technical communication began investigating discursive practices in risk communication, but not with a focus on usability and ethics, specifically. For instance, early research in technical communication primarily addressed laws that mandate risk communication and liability concerns for technical warnings, instructions, and safety information (Bedford & Stearns, 1987; Croft, 1996; Helyar, 1992; Manning, 1982; Smith, 1990; Smith & Shirk, 1996; Strate & Swerdlow, 1987; Velotta, 1987). Research focused primarily on how technical communicators could improve written documentation to meet required laws—a focus that has continued in more recent research on liability and Right to Know laws (Batova, 2013; Hannah, 2010; Moran, 2012; Todd, 2014). In other earlier research, technical communication scholars investigated and published on communication failures of specific incidents, such as the Three Mile Island disaster in 1979 (Herndl, Fennell, & Miller, 1991) and the Challenger accident in 1986 (Dombrowski, 1991, 1992, 1995, 2000; Moore, 1992; Pace, 1988; Winsor, 1988, 1990). Still other scholars researched risk communication within specific industries, such as the transportation industry (Coogan, 2002; Dragga & Voss, 2003; Horsley & Barker, 2002) and the mining industry (Sauer, 1992, 2003). Furthermore, other scholars in technical communication examined risk communication issues in environmental health and public policy, including the earlier works of Katz and Miller (1996), Grabill and Simmons (1998), and Waddell (1996), and the more recent works of Simmons (2008) and Youngblood (2012).

Recent research on risk communication serves to prevent disastrous workplace accidents, but much of the existing research uses one of two frameworks—usability or ethical effectiveness—and never specifically puts the two together for a more comprehensive and productive analysis. I aim to address this gap in scholarly research on risk communication by merging the two frameworks. I first identify and explain existing methodological frameworks for analyzing risk communication; then, I offer a merged methodological framework to analyze the usability and ethical effectiveness of risk communication. I apply this merged framework to the risk communication system at Ford Motor Company's Rouge Steel Complex, where a tragic incident resulted in several serious injuries and fatalities in 1999. Examining this historic case through a distributed usability and ethics framework not only extends our field's existing research on risk communication but also contributes new insights into how we, as technical communicators, might assess the usability and ethical effectiveness of risk communication, together, within a complex work system.

Existing Methodological Frameworks in Risk Communication

Technical communication researchers have studied risk communication in two primary ways: (1) examining the textual level of documents to assist technical writers in creating lawful, ethical, and effective documents—a textual-level approach that sees the "meaning" in the text; and (2) examining the socio-cultural level to understand differences in discourse across several workgroups and the influences on meaning and decision-making processes—an approach that sees the "meaning" as external to the text and as socially constructed by those who use it. The following sections identify past methodological frameworks used to study risk communication.

Textual-level Approaches for Assessing Risk Communication

Textual information within risk communication initially aimed to address U.S. laws and liability issues; thus, content was prescriptive, written to meet legal requirements and prevent liability.

By honing only textual information, however, technical communicators risked oversimplifying technical meaning. As a solution, Manning (1982) offered a framework called "the persuasion matrix" that outlined five primary considerations for risk communication: "destination, source, message, channel, and receiver" (p. 302). Additionally, Manning (1982) suggested that the matrix overlapped with common rhetorical principles: "identification principle," "action principle," "familiarity and trust principle," and "clarity principle" (p. 302). This simplified writing model was designed to assist technical writers in preparing materials for risk communication. Manning's formula of "Action," "Credibility," "Target," "Identification," and "Readability" (ACTIR) (p. 303) served as an assessment tool for creating risk communication. ACTIR foregrounded the audience and focused on a text's messages, rhetorical structures, and their ability to convince readers to act in certain ways.

Other scholars have also examined textual-level content for deficiencies in risk communication. Like Manning, Croft (1996) identified ways that technical communicators can simplify writing and reading processes, improve technical information in risk communication, and offer action-based communication. In the context of Material Safety Data Sheets (MSDS), Croft criticized efforts by the U.S. Occupational Safety and Health Administration (OSHA) to create risk communication standards with no guidance on presenting technical information. According to Croft (1996), "Instead of receiving complaints from workers that they did not receive enough information about the hazards of their workplace, there were complaints of getting too much seemingly irrelevant information" (pp. 172–173), pointing to a need for usability and accessibility assessments.

To help alleviate some of the deficiencies identified, Croft (1996) argued that government agencies must implement clear and explicit writing standards for entire industries. He provided a framework to help technical communicators write consistently across industries that dealt with hazardous materials and risky situations. Croft's framework included four fundamental sections:

  • a description of the material and crucial information that may be needed immediately when an accident occurs,
  • an action plan, should a hazardous situation arise,
  • a prevention plan to avoid hazardous situations from occurring, and
  • additional useful information about the material. (p. 173)

This framework, according to Croft, “assist[ed] the MSDS writer in laying out the information to answer the four fundamental questions,” and “[w]hen readers have the answers to these basic questions, they have the information needed to use, handle, store, and transport the material appropriately” (p. 176). This framework suggested that meaning was in the text and contained in a single textual artifact (i.e., the document provided users with what they needed to complete tasks safely). This framework did not, however, address risk communication situated in a larger complex work system; rather, it emphasized an isolated piece of information.

Socio-cultural-Level Approaches for Assessing Risk Communication

Unlike textual-level approaches to risk communication, socio-cultural-level approaches examined several factors that are external to specific documents and that influenced organizational discourse (the role of discourse in social practices and organizational dynamics). This approach was more commonly used to examine risk communication in our field. For instance, several technical communication scholars have looked at organizational discourse and communication failures to analyze risk communication deficiencies. Research through this kind of framework has included discussions on disastrous incidents to illustrate how information is created, disseminated, and perceived by different groups of people within the same organization or among collaborating organizations. Two widely examined incidents include the Three Mile Island disaster of 1979 and the Challenger accident of 1986. These incidents were examined to explain why and how failed communication contributed to fatal accidents.

Existing literature on organizational discourse in risk communication argued that further research would increase our understanding of communication patterns among workgroups, which can be applied to improve communication and prevent future accidents. As a case in point, in their discussion of the Three Mile Island accident and the Challenger disaster, Herndl, Fennell, and Miller (1991) stated,

Both these technological disasters involved failures of communication among ordinary professional people, mistakes committed in the course of routine work on the job, small mishaps with grotesque consequences. […] But disaster makes otherwise routine and invisible communication accessible, and disaster makes the study of it compelling. (pp. 279–280)

Herndl, Fennell, and Miller (1991) justified the study of “discourse behind the disaster” to see if communication patterns differed among “lines of social (organizational) structure” (p. 281). In their study of both the Three Mile Island and the Challenger disasters, they used a four-point framework to conduct organizational, linguistic, pragmatic, and argument analyses. They concluded that two distinct communication failures occur in organizational discourse: miscommunication and misunderstanding. They stated,

Miscommunication is detected through the structural analysis and is due to the lack of common language or to faulty communication procedures within an organization. Misunderstanding is detected through substantive analysis of what people say or write and what they must share to interpret discourse as it was intended. Put simply, miscommunication revolves around the how of communication, while misunderstanding revolves around the what. (Herndl, Fennell, & Miller, 1991, p. 303)

Here, they pointed to two critical areas in organizational discourse that might explain why communicative disconnects can occur between groups of people within workplace settings.

Similarly, Dombrowski (1991, 1992, 1995, 2000) and Winsor (1990) have addressed communicative disconnects in their investigations of the Challenger disaster. Their examinations of the technical reports and correspondence between scientists, engineers, and managers leading up to the event revealed clear disconnects between technical information and how knowledge was applied, disseminated, and perceived. Dombrowski (2000), for instance, stated that the investigations of the Challenger “demonstrate[d] the critical importance of clear communication in highly technical systems such as the shuttle, the powerful role of complex social forces in shaping communications, and the close interplay between values and language in communications” (p. 121).

Dombrowski also added to this argument a real concern for and understanding of how ethics played out in the communicative events between social groups. He stated, “They [the investigations] show, too, how ethical responsibility can be reflected in highly technical documents. Tragically, they also illustrate how differences in organizational power can negate even the most ethically responsible of technical communicators” (Dombrowski, 2000, p. 121).

Like Dombrowski, Winsor (1990) addressed ethics in the Challenger case when she stated, “[F]uture failures could be prevented only by removing unethical or incompetent employees—in effect the action which NASA and MTI both eventually took” (p. 7); however, Winsor further stated that communication about the O-ring failures in the Challenger case extended beyond concrete ethical decisions and demonstrated “the difficulty we [technical communicators] have in bridging our theoretical understanding of the uncertain, socially conditioned nature of discourse to bear on concrete incidents” (p. 8)—a key notion for a more concrete framework to assess risk communication at both the textual level and socio-cultural level with a lens on ethics and usability.

Both Dombrowski’s (2000) and Winsor’s (1990) examinations of organizational discourse central to the Challenger disaster emphasized a socio-cultural theoretical framework and the need for further research about communication patterns, applied ethics, and social structures in the area of risk communication, but neither emphasized usability assessments as part of their frameworks. A more complete framework should also merge distributed usability across complex work systems with applied ethics to examine concrete aspects of our work.

Well-known disasters like the Three Mile Island and the Challenger accidents were focal points for studying risk communication to explain how and why communication failures occur, but scholars have also studied risk communication in the public sphere to examine the what, or the meaning of words constructed by various social groups and how those groups contend for meaning.

Coogan (2002), for instance, researched how organizational discourse in the railway industry affected public safety and how meanings are constructed and construed. In his investigation of the Chicago Transit Authority and railway accidents between 1976 and 1984, Coogan identified failed communication among different social groups as the primary cause of railway accidents that resulted in fatalities (p. 277). Coogan claimed that the fields of rhetoric and technical communication must address communication needs and the public concern. He stated that "the safe operation of mass transit is an issue that not only concerns technical communicators but citizens, politicians, and engineers alike" and that "a rhetorical analysis of accidents should proceed from sources that name the public concern" (p. 280). At the basis of his argument, Coogan called for the study of ideographs to understand both how and why communication failures occur and to understand the what, or the specific meaning of words that social groups develop, use, and compete to own.

In perhaps the most prominent work in risk communication studies, Sauer (1993, 2003) investigated the role of communication in mining industries. Sauer's work is one of the few that more deeply employed both the textual-level approach and the socio-cultural-level approach to study and assess risk communication. In The Rhetoric of Risk: Technical Documentation in Hazardous Environments (2003), Sauer used a rhetorical framework to analyze the content and context of specific documents, like memos between members of the same and different workgroups. She also addressed the cycle of technical documentation in large regulatory industries, drawing attention to the larger systemic context in which information works. She examined how technical communication portrays the following topics:

  • The dynamic uncertainty of hazardous environments
  • The variability and unreliability of human performance
  • The uncertainty of the agency’s notion of “premium data”
  • The uncertainty in social structure and organization
  • The rhetorical incompleteness of any single viewpoint (p. 19)

Though broad in scope, these topics have revealed the need to investigate risk communication with both a textual-level and socio-cultural-level analysis. Sauer’s work also revealed a growing concern for technical communicators to understand the complex interrelations between language use at the textual level, groups of workers at the socio-cultural level, and the contextual contingencies that shape risk communication. Furthermore, Sauer’s adoption of a feminist ethics of care illustrated the need to consider the applied ethical responsibilities of information and the caring of relationships of those impacted by such information. But Sauer has not combined usability assessments, specifically, with her analyses.

In more recent studies, some scholars emphasized moving beyond textual-level approaches to consider the rhetorical context (Cox & Pezzullo, 2015; Youngblood, 2012). For instance, Youngblood (2012) examined ambiguity and avoidance strategies in risk communication produced by community emergency planning committees, where miscommunication, misinformation, and ambiguous information within organizations could lead to failed risk communication involving the public. As she pointed out, however, the complex social rhetorical situation may not always afford technical communicators the ability to fully disclose technical information. She stated:

As advocates for the public, technical communicators’ first responses may be to argue for explicit details in RTK [Right-to-know] information. That approach would be in keeping with a traditional disciplinary focus on completeness. But such a response is complicated by real and perceived security threats and emergency planners’ need for a sense of control over their situations. (2012, p. 59)

To assess what information technical communicators publish, Youngblood (2012) offered a three-tiered framework for decision making—asking questions first “about the organization’s goals;” second “that identify rhetorical tensions;” and third “that explore levels of detail as they relate to civic participation, problem solving, and ethics” (p. 59).

Like Youngblood, Cox and Pezzullo (2015) illustrated cases where the challenges of risk communication go beyond textual meaning, in which technical communicators must produce risk communication that considers cultural values (p. 163). Cox and Pezzullo (2015) offered a framework for analyzing environmental risk communication in the public sphere based on two sub-structures: "the technical model of risk communication" and "the cultural model of risk communication" (pp. 159–165). Their technical model shared technical information with a targeted public audience with three goals in mind—to inform, to change behaviors, and to assure the public of their safety. Cox and Pezzullo argued, though, that this technical model often fails to acknowledge the concerns of individuals (and multiple groups of people) who make up that public (pp. 161–162)—hinting at a need for an ethics of care lens. Their cultural model, however, took into account individuals and involved them in the processes of assessing and communicating risk as they experience a risky situation (pp. 161–162)—hinting at a participatory, or co-creation, role, which is also one premise of usability evaluation methods.

Within each of these existing frameworks are both practical and theoretical approaches for understanding, producing, disseminating, or analyzing risk communication. However, a single, merged framework is needed to bring together the textual and socio-cultural levels of analysis with a more concrete ethical understanding of those who use the technical information. Such a merged framework would not only evaluate usability issues in technical information but also assess risk communication effectiveness at a level distributed across social, cultural, and political lines in a complex work system. This distributed usability assessment would integrate textual-level, socio-cultural-level, and ethical analyses to identify risk communication deficiencies and prevent injuries and fatalities.

A New Merged Methodological Framework for Studying Risk Communication

A new approach for assessing effectiveness of risk communication might be derived from usability evaluation—specifically, looking at how meaning is distributed across multiple artifacts (texts and interfaces) and how meaning exists in the interplay of artifacts, groups of people (users), and the ethical concerns surrounding social, cultural, and political spheres (invisible fields).

Typically, usability evaluation examines the textual level of a document to identify where information fails to provide users with adequate breadth and depth to complete tasks. However, usability evaluation can also account for social, cultural, and political factors that influence the effectiveness of a text. Thus, usability as a method can offer technical communicators a way to analyze both the textual and socio-cultural levels of risk communication effectiveness.

One possible framework includes examining artifacts (texts and interfaces) and their connectedness through organizations. This method is primarily associated with Clay Spinuzzi’s (2003) research and has been applied to assess the design of workplace systems and the effectiveness of artifacts within a system to support workplace tasks. Unlike more traditional usability testing, Spinuzzi’s approach of integrated scope examines usability beyond an artifact in isolation. His approach identifies usability problems at three levels of scope: the micro, meso, and macro levels.

Oftentimes, according to Spinuzzi, usability evaluation takes on the designer-as-hero trope, in which workers are portrayed as victims of usability problems and designers are depicted as heroes who save workers by identifying design problems, fixing them, and improving the worker’s ability to complete job-related tasks. Under this usability framework, as Spinuzzi stated, “Once the crux of the problem is treated via a formal solution, the symptoms of the problem dissolve,” but as he has further questioned, “What if the usability problems cannot be neatly divided into cause and symptoms?” (2003, p. 26). For instance, in a complex workplace system, several factors external to the artifact influence its usability and impact a worker’s ability to complete tasks. Factors about the work system (such as structural, social, cultural, and political characteristics) influence usability within the system. To truly identify usability problems within a complex workplace system and to improve the system based on usability evaluations, designers must look at the scope of use at various levels within the work system—they must look at usability distributed across multiple artifacts and within the system itself. According to Spinuzzi,

Workers’ operations must be examined in their own right, as interactions—often centrifugal, subversive interactions—that coconstitute (reciprocally make up, shape, sustain) the cultural activities and goal-directed actions in which workers engage. At the same time, we [designers] must also examine work activities and goal-directed actions, where workers may also innovate. In short, to examine the centrifugal aspects of workers’ labor, it becomes important for us [designers] to integrate research scope: to examine the three levels [micro, meso, and macro] of activity, actions, and operations so that we can discern how they interact, how they coconstitute each other, and how innovations at any given level affect the others. (2003, p. 27)

In this passage, Spinuzzi identified distributed usability evaluation within the context of a complex workplace system. He suggested that rather than designing simply task-oriented artifacts to support work goals, technical communicators must design task-oriented artifacts situated in the context of various scopes.

As an example, to evaluate procedural documentation that provides step-by-step instructions for mechanical operation, designers cannot assess usability effectiveness in isolation. They must identify how the document’s use is situated in the work system—how it interacts with or relies on other documents, how it operates within the political structure of the workplace, and how it alters or is altered by workarounds and worker innovation.

At a micro-level, designers might look at textual features of the document, such as placement of information, accuracy of information, or level of detail to complete tasks. They might look for information deficiencies where steps are missing or where users get confused.

At a meso-level, designers might look at the actions that workers perform with the use of specific tools. They might look at the user’s reliance on tools to operate mechanical elements of an object (for instance, a boiler). Designers might, then, identify these actions as links to the procedural documentation, noting the ways that users set goals, approach the task, and complete it. They might also observe the kinds of problems users encounter within the action and goal.

At a macro-level, designers might look at the cultural and political activities that shape actions. This level “involves ways workers, work communities, cultures, and societies understand, structure, collaborate on, and execute their evolving cooperative enterprises” (Spinuzzi, 2003, p. 32). Designers, then, might look at how different workgroups use the mechanical operation’s procedural documentation, but also how different groups contribute to the use of that documentation within the larger context of workplace actions.

Though Spinuzzi emphasized designing task-oriented artifacts situated in the context of micro, meso, and macro-levels, ethics was not integrated at each level of analysis. To fully assess the usability effectiveness of risk communication, technical communicators must examine, at each level, the ethical concerns for information use. At the micro-level, designers look at legal information and ethical responsibilities conveyed within the textual information to protect users. At the meso-level, designers examine ethical issues with access to information and barriers to meeting compliance. At the macro-level, designers examine where ethical responsibilities of information exist within the social, cultural, and political relationships of groups of users. At all three levels of scope, concerns for information use are bound to users and relationships across groups of people—a central premise to an ethics of care philosophy.

Adding an Ethics of Care Lens to Risk Communication Analyses

Under an ethics of care philosophy, “The right decision is not about an individual’s own needs or desired outcomes, nor is the good decision about an individual’s obligation or duty to a set standard or governing rule. Rather, the best decision is uniquely tied to relationships that bond two or more individuals” (Lancaster & Tucker Lambert, 2015, pp. 295–296). The bonding relationships between people make especially important the effects of risk communication in risky environments. In risky environments where miscommunication or mishaps can lead to injuries and fatalities, the bonding relationships between workers are crucial to everyone’s safety.

By applying an ethics of care philosophy to risk communication, technical communicators gain an approach that places care for others and relationships between people as central to decision making. As an added layer to distributed usability and integrated scope, then, an ethics of care analysis emphasizes what Noddings defined as “receptivity, relatedness, and responsiveness” (as cited in Dombrowski, 2000, p. 64). An ethics of care analysis of risk communication would examine how workgroups share ideas, offer suggestions, and make decisions about workplace tasks and the information supporting those tasks.

Additionally, according to Willerton, “The concept of care gives technical communicators another way to examine—and reaffirm—the relationships between their companies and their audiences and to consider the many ways in which those relationships manifest” (2015, p. 51). Within a workplace system of many stratified units of workers and information, an ethics of care lens provides a sensible way to analyze ethical responsibility to the groups of users and the relationships that bond them.

Using the New Merged Framework: A Case in Context

Merging distributed usability, integrated scope, and an ethics of care philosophy into a single framework for evaluating risk communication offers a more comprehensive and richer context than past frameworks. A merged framework with these three foci helps technical communicators more fully understand the factors that influence the use of risk communication to prevent injuries and fatalities in the workplace.

To show how this merged framework can be used to better identify risk communication deficiencies, I examine a historical case study: a boiler explosion at Ford Motor Company’s Rouge Steel Complex in Dearborn, Michigan (Detroit metro area). I show how distributed usability, integrated scope, and an ethics of care philosophy not only provide more comprehensive insight into evaluating risk communication effectiveness but also foster understanding of the complexity of workplace communication systems in an industrial organization (a manufacturing plant). I emphasize how usability research and ethics combined contribute new analyses of risk communication. Such analyses assist in improving how organizations design, implement, and use documents to prevent accidents in risky environments.

Ideally, distributed usability research should involve primary research, such as usability testing, ethnographic observations and site visits, and focus-group interviews; however, because of the highly sensitive nature of this case (as is so often true of disastrous accidents), I was limited to reconstructing the events of the accident, the communication system, and the work system through archival documentation and sample artifacts in the Michigan Occupational Safety and Health Administration (MIOSHA) report. I solicited information from MIOSHA officers assigned to the investigation, reporters who covered the accident, and Ford Motor Company, with little response. Although conducting primary research was not possible, the historical artifacts and the narrative of MIOSHA’s investigation (a compilation of more than 1,000 documents) adequately supported identifying deficiencies in the work system, at all three levels of scope, that led to the fatal explosion. I retrace the MIOSHA investigation and classify MIOSHA’s findings in the context of a workplace communication system (something the MIOSHA report did not accomplish).

The following sections cover the background of this case study, provide historical context of the accident and political context of the work system, and devise a picture of risk communication deficiencies using a merged framework of distributed usability, integrated scope, and an ethics of care philosophy.

Case History

On February 1, 1999, a boiler exploded at the Ford Motor Rouge Steel Complex, igniting five floors of the complex, fatally injuring six employees and critically injuring more than a dozen others. Photos taken at the site by MIOSHA (see Figures 1 and 2; distortions appear in the originals) documented the extensive damage resulting from the explosion.

According to the Michigan Department of Consumer & Industry Services (CIS), “[the Michigan Occupational Safety and Health Administration (MIOSHA) and the Bureau of Construction Codes (BCC)] determined that the explosion was caused by a natural gas build-up in Boiler No. 6,” and “BCC inspectors concluded that the cause of the accident was a result of inadequate procedural controls for the safe shut-down of the boiler” (State of Michigan, 1999, Ford)—deficiencies that, during my research process, I classified as textual-level in the technical instructions for shutting down the boiler. Additionally, CIS concluded that “[i]mproper valve line-ups and inadequate work group communication allowed natural gas to flow into the boiler furnace chamber. This is believed to be the source of the gas build-up which caused the explosion” (State of Michigan, 1999, Ford)—deficiencies that I classified as social/cultural/political in the actions people performed and the miscommunication between those people.

Figure 1. Reproduction photo of Ford Motor Rouge Steel Complex building after explosion, courtesy of MIOSHA

Though this accident sadly bears similarities to many other reported workplace accidents, the case marks several historic outcomes. First, it marks what was at the time a historic settlement among an auto corporation, union organizations, regulatory agencies, and Ford Motor Company employees. The CIS stated, “The settlement includes a record $1.5 million penalty, the largest monetary sanction ever levied in Michigan as a result of a MIOSHA investigation” (State of Michigan, 1999, Ford). Furthermore, the settlement marked a historic effort from the involved organizations to extend the scope of the agreement beyond the immediate case, adding five other monetary sanctions that brought the total to $7 million. These included:

  • $1.5 million for establishing programs to achieve lasting improvements in safety
  • $1.0 million for research to increase understanding of industry safety and health
  • $1.5 million for medical research, facilities or equipment in the treatment of burns and other critical care
  • $1.0 million for scholarship funds
  • $0.5 million for potential third-party reimbursement (State of Michigan, 1999, Ford)

Additionally, the MIOSHA investigation became one of the most complex and high-profile cases in the state of Michigan at the time, lasting more than seven months, involving the largest number of agencies and organizations, and requiring the largest review of information (including interviews with employees, material evidence from the physical structure, and in-house documentation) (State of Michigan, 1999, Ford). According to CIS, the Ford investigation involved the largest number of physical documents ever reviewed; it included “689 blueprints; 324 binders of documents containing more than 200,000 pages; 29,000 photos; and 375 boxes of evidence, including material in 10 file cabinets and 20 blueprint file cabinets” (State of Michigan, 1999, Ford). Lastly, this case was a leading story in news coverage and was controversial among several communities in the Detroit metro area, drawing attention to the effects of failed risk communication beyond those immediately involved.

Figure 2. Reproduction photo of MIOSHA inspectors on site at Ford Motor Rouge Steel Complex building after explosion, courtesy of MIOSHA

These outcomes make the study particularly interesting to our field, because they mark a historic effort dedicated to investigating large volumes of technical communication, ranging from instructional manuals and technical descriptions to policies and procedures, training manuals and materials, employee interviews, and other corporate documents. This case highlights a truly complex information system and an especially egregious case of corporate ineptitude where corrective actions to improve technical communication were sanctioned with monetary penalties.

Context of the Accident

The following narrative is based on CIS (State of Michigan, 1999, Ford) and MIOSHA documents from the investigation (State of Michigan, 1999, Job) and offers contextual insight to what happened on the day of the explosion.

In 1999, the Ford Motor Rouge Steel Complex in Dearborn, MI, covered 1,110 acres, accommodated six Ford manufacturing companies and the Rouge Steel Company, and employed approximately 10,000 workers. The powerhouse generated electrical, natural gas, and steam power for manufacturing operations and contained seven high-pressure power boilers used to provide steam at the complex. All boilers were in the same building. At the time of the 1999 explosion, Boilers No. 2, 3, 4, 5, and 7 were operating, and Boilers No. 1 and 6 were being shut down for annual inspection and maintenance. At 8:00 am on the day of the explosion, powerhouse employees began shut-down procedures for Boiler No. 6. At 12:00 pm, workers were completing the shut-down process by blanking (capping off) the natural gas supply. At about 12:45 pm, the natural gas control valves were opened to facilitate purging any remaining natural gas from the supply lines through the boiler. At approximately 1:00 pm, Boiler No. 6 violently exploded from a gas build-up, igniting five floors of the facility. Investigators determined that the operators were inadequately trained and that they failed to properly align the valves because of poor equipment markings, which resulted in the gas leak that led to the explosion.

Six employees were killed in this accident: John Arseneau, 45; Donald Harper, 58; Cody Boatwright, 51; Ken Anderson, 44; Warren Blow, 51; and Ronald Moritz, 46; each leaving behind family and children. Several others were critically injured and suffered severe skin burns and damage to vital organs; among them were Ralph Irvin, 53; Gerald Nyland, 47; John Sklarcyzk, 47; Gerald Moore, 55; Dennis Arrington, 47; Vincent Fodera, 46; Chris Getts, 46; John Kucharski, 40; and Geremia Villatala, 64. All of these individuals were given a 50/50 chance of survival by physicians at the University of Michigan hospital where they were treated (McLaughlin, 1999). The tragic loss of life and life-impacting injuries among so many employees foregrounds the critical role risk communication plays in industrial settings. Usable and effective risk communication might have prevented these tragedies.

Social, Cultural, and Political Context of the Work System

Research from the MIOSHA investigation and from news coverage on this case indicates that the political context of Ford’s work system played a role in communication deficiencies. For instance, though Ford Motor Company developed an ideal system for communicating mechanical failures and work orders, this system proved insufficient across workgroups within the plant: some powerhouse workers were unaware of communication procedures; some were aware of a few procedures but did not know how the system worked; and some used the system properly, but their requests went largely unanswered by supervisors or by employees across departmental lines.

As an example, at the time of the explosion, employees were to complete an operator’s notice slip and receive approval from a maintenance supervisor and service operator prior to carrying out maintenance on any mechanical or operational system. Also, any employee encountering mechanical failures or discovering potential failures was to report those failures for critical repairs. However, several powerhouse workers noted in interviews with MIOSHA that on many occasions their safety notices and complaints were not addressed, and that failures in communication occurred between different workgroups within the work system. Specific to the boiler explosion, Ford employees told Detroit Free Press reporter Niraj Warikoo that three of the six men who died had previously filed health and safety complaints about the conditions of the boiler; however, their complaints were largely left unanswered (White, 1999). In an interview with Warikoo, one employee stated, “The work that was supposed to be done was not. The majority of maintenance people were getting teed off because they weren’t getting overtime to do their jobs. […] It was a big issue. You can’t maintain that type of building with only eight hours of maintenance” (as cited in White, 1999). The picture painted by employees’ testimony illustrated that power struggles existed across political lines within the organization: workers were overworked, under-compensated, and felt ignored. As one employee recalled, an incident occurred in the autumn of 1996 when a maintenance work order was placed to fix a leaking warm-up line that warms up the turbines for the boilers, but the work order was never completed and supervisors never responded: “We put in recs [request for repairs], but they never got filled” (as cited in White, 1999).

The MIOSHA investigation confirmed that communication failures across political lines existed within the organization, with evidence including “Health & Safety Complaint Form: NO. A 29945,” submitted by John Arseneau (deceased) on September 13, 1995, and “Health & Safety Complaint Form: NO. A 46293,” submitted by powerhouse workers Steve Patchuta and Matt Vanderboom on October 25, 1998. Arseneau’s hand-written request stated, “10-inch gas cocks are leaking on 1-3-5-6 Boiler. 2 Boiler. ADJUSTMENT REQUESTED: Needs to be changed” (State of Michigan, 1999, Job). However, examining the signatures of persons involved in the review and approval process, I verified that this communication request was ultimately ignored. The document included signatures from the district committeeperson and from Arseneau’s supervisor (dated September 13, 1995); it also included the request and signature from the superintendent (dated September 18, 1995) and the follow-up signatures from the district committeeperson and the UAW Health & Safety representative (dated September 20, 1995). But my review of the document revealed that the work order slip was never signed and approved by the company safety representative—the last person listed on the approval process. In fact, the MIOSHA investigator noted this deficiency and initialed the document, stating, “Who signed off? RJO 5-1-99” (State of Michigan, 1999, Job). Similarly, Patchuta and Vanderboom’s complaint was never addressed, and documents were missing signatures from the district committeeperson and the company safety representative on the last section of the review and approval process. Perhaps most chillingly, Patchuta and Vanderboom forecast a problem with inadequate training and job assignments among different classes of workers, which also contributed to the cause of the boiler accident.
They stated, “This practice of assigning work which is traditional up-grade work to 2nd Class Opr’s [operators] including feed-pump job and engine room work with little to NO [all caps and underlining in original] training or orientation creates a hazardous situation and make a uncomfortable [sic] situation for the 2nd Class Operators being assigned to 1st Class traditional work” (State of Michigan, 1999, Job). My review of the documents also revealed the superintendent’s disposition and the disagreement between employees and their supervisor; the superintendent stated, “The Power House supervision feels that the aggrieved employees have been properly trained. But if they feel that they need additional training, accommodations will be made to improve their confidence” (State of Michigan, 1999, Job).

The disagreement across political lines within the organization may also indicate communication deficiencies between workgroups and reveal critical areas in which employees and supervisors failed to “collaborate on, and execute their evolving cooperative enterprises” (Spinuzzi, 2003, p. 32). The ways in which different groups at Ford contributed to the use of health and safety documentation within the larger context of workplace actions show little effort to support the “ideal system” that Ford had likely imagined.

Usability and Ethical Deficiencies Based on Integrated Scope

In my review of the MIOSHA report and investigation documentation, I identified six primary means that Ford used at the time of the explosion to communicate about risk and risk prevention within the powerhouse work system. My review of the documentation reconstructed a communication work system, something that MIOSHA did not highlight as a systemic focus. The system largely consisted of written communication to disseminate information, to train employees on standard operating and maintenance procedures, and to regulate workplace activities and actions for the safety of all employees. These included the following:

  • Training on code enforcement, regulations, and safe operating procedures
  • Wall and machine placards and equipment identification numbers
  • Danger tags and information tags
  • Operator’s notice slip
  • Operating and maintenance (O&M) procedures
  • Operating and maintenance (O&M) checklists (State of Michigan, 1999, Job)

Each method of communication contributed to the ideal overall effectiveness of the risk communication system. For instance, training sessions and documents were designed to provide new employees with training on powerhouse procedures, including special operating training on equipment that the employees would handle and maintain. Additionally, continual training sessions for existing employees were designed to inform them about changes in operating procedures and to reinforce safe working habits in the powerhouse. Within the powerhouse, wall placards and equipment identification numbers were designed to identify machines by type, potential danger, and cross-reference to the machine’s manual. Danger tags and information tags were designed to mark equipment failures or equipment that required service. When employees discovered a potential danger or equipment failure, they were to fill out a tag and attach it to the part or machine to notify other workers that the equipment needed repairs and might be dangerous to operate.

Operator’s notice slips were designed to be used in conjunction with the danger and information tags. Once employees tagged the equipment, they were to complete the operator’s notice slip by writing a brief narrative of what they encountered and when, the equipment failure, and the requested repair actions. Employees were then to submit the operator’s notice slip to their supervisor, who then was to approve the request for maintenance and route the slip to the superintendent.
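To make the intended routing concrete, the slip’s lifecycle can be sketched as an ordered chain of required sign-offs, where any role absent from the signature record marks the request as incomplete. This is an illustrative sketch only, not Ford’s actual (paper-based) process; the role names and field names are assumptions drawn from the approval chain described in this article:

```python
from dataclasses import dataclass, field

# Hypothetical sign-off chain; role names are assumptions based on the
# routing described in the article, not Ford's actual forms.
APPROVAL_CHAIN = ["supervisor", "superintendent", "safety_representative"]

@dataclass
class NoticeSlip:
    description: str       # brief narrative of the failure encountered
    requested_repair: str  # corrective action requested
    signatures: dict = field(default_factory=dict)  # role -> date signed

    def sign(self, role: str, date: str) -> None:
        self.signatures[role] = date

    def missing_signoffs(self) -> list:
        """Return every role in the chain that has not yet signed."""
        return [r for r in APPROVAL_CHAIN if r not in self.signatures]

# Mirrors the pattern on Arseneau's 1995 complaint form: supervisor and
# superintendent signed, but the company safety representative never did.
slip = NoticeSlip("10-inch gas cocks leaking on boiler", "Replace gas cocks")
slip.sign("supervisor", "1995-09-13")
slip.sign("superintendent", "1995-09-18")
print(slip.missing_signoffs())  # → ['safety_representative']
```

A check of this kind, simply listing unsigned roles, corresponds to the deficiency the MIOSHA investigator later flagged by hand (“Who signed off?”).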

O&M procedures documented step-by-step procedures for carrying out specific actions. O&M procedures were designed to be presented to employees in training sessions along with O&M checklists that offered employees ways to ensure that they completed each step in the specified order. The O&M procedures and checklist were kept on site in the powerhouse for employee use.
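The sequencing function of an O&M checklist can likewise be sketched in a few lines: a step is accepted only when it is the next one in the specified order, which is what the powerhouse checklists were meant to guarantee. The step names below are placeholders, not Ford’s actual procedures:

```python
class OrderedChecklist:
    """Minimal sketch of a checklist that enforces step order.

    Illustrative only: the steps are placeholder examples, not the
    procedures used in the Rouge powerhouse.
    """

    def __init__(self, steps):
        self.steps = list(steps)
        self.next_index = 0  # index of the next step allowed

    def complete(self, step: str) -> bool:
        """Accept a step only if it is the next one in sequence."""
        if self.next_index < len(self.steps) and self.steps[self.next_index] == step:
            self.next_index += 1
            return True
        return False  # out-of-order or unknown step

    def done(self) -> bool:
        return self.next_index == len(self.steps)

shutdown = OrderedChecklist(["close supply valves", "blank gas line", "purge lines"])
print(shutdown.complete("blank gas line"))       # → False (valves not yet closed)
print(shutdown.complete("close supply valves"))  # → True
```

Rejecting out-of-order steps is the design point: a checklist that merely lists tasks, without tracking sequence, cannot catch the ordering variations MIOSHA later documented in the boiler shutdown.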

Looking at the overall system of risk communication, it would appear that Ford Motor Company provided employees with adequate training and information to carry out safe working procedures and to prevent accidents. Figure 3 illustrates how the ideal system would operate among workgroups and shows the interconnections and reliability across written artifacts.

As shown in Figure 3, written communication is central to the system’s effectiveness. Through written documentation, employees would have accessed and created critical information to prevent injuries and fatal accidents in the powerhouse. Each form of written communication was connected to another, showing reliability across artifacts, which also suggested that effective usability was distributed among all parts of the whole.

Figure 3. Ford’s ideal risk communication system

For the system to achieve optimal usability effectiveness, artifacts and their use must be understood and consistent at the macro, meso, and micro levels. At the macro level (the social, cultural, and political activities shaping actions), all powerhouse employees would collaborate and routinely and properly use artifacts in conjunction with each other. As an example, the operator’s notice slip would cross reference the danger and information tags, the wall placards, and equipment identification numbers, but would also facilitate communication and collaboration among workers and across political lines within the organization. The operator’s notice slip would trigger communication between workgroups, and its cycle would end with the same person who initiated the action.

At the meso level (the actions that workers perform with the use of specific tools), artifacts would support actions multilaterally, linking goals to communicate and complete tasks and identifying aspects such as the approximate time to complete the initial task, receive a response, and complete corrective action. In this ideal system, the operator’s notice slip would state not only the date that an employee identified the problem and the dates that each person across workgroups received and reviewed the slip but also the approximate response time needed to correct the equipment and restore production.

At the micro level (the textual features of the document, such as placement of information, accuracy of information, or level of detail to complete tasks), artifacts would support step-by-step task completion. It is at this level that information must achieve adequate granularity and detail to carry out tasks properly and safely. The O&M procedures and checklists, as examples, would provide employees with all necessary information to carry out specific tasks without questions about the text’s purpose or meaning, without questions about parts of the text or the larger process, and without questions about interrelated texts and tools.

At the surface level, Ford’s overall ideal system of risk communication would appear to meet usability effectiveness; however, careful scrutiny of this system shows that the ideal system was not effective at all—not because the overall design lacked effectiveness, but because the ideal design was not what workers actually used in the powerhouse. The most striking difference was that employees were relying heavily on oral communication, as noted in MIOSHA’s investigation, to complete tasks and communicate risks. Consequently, the system was deficient in all areas because written communication was not used effectively. Figure 4 illustrates identified deficiencies based on findings in MIOSHA’s investigation.

Repeatedly, statements by investigators and by Ford powerhouse employees revealed communication deficiencies that existed across the overall work system. For instance, one MIOSHA investigator stated this about training: “During the course of the investigation[,] it was found that the training that was conducted in the powerhouse was generic” (State of Michigan, 1999, Job) and supported this claim with the following quotations from employee interviews (names were redacted by MIOSHA): “(Employee) stated in an interview that training was poor and there were few written procedures for the powerhouse;” “(Employee) stated in an interview he had never been trained in blanking procedures for boiler shutdowns;” “(Employee) stated in an interview he did not recall ever seeing a boiler startup or shutdown procedure. He stated there was a total lack of training; one had to watch and ask questions;” “(Employee) stated in an interview that he had training (number) years ago on boilers when he started with Ford. He had no training on boilers since. He stated he has had no classes in code enforcement, regulations, or in safe operating procedures” (State of Michigan, 1999, Job).

Figure 4. Ford’s actual risk communication system with identified breakdowns

Because training on work procedures was inadequate, outdated, or too generic (as identified by MIOSHA), employees were using danger tags and information tags improperly. In some cases, employees knew that they should use tags, but were not aware of which tags to use or when to use them (as noted by MIOSHA). In other cases, employees were unaware that a tag system existed. MIOSHA’s report stated,

According to multiple employees interviewed, it was discovered that some power service operators attached a danger or information tag with a string on the valve handle (Reference Book 3 Citation Documentation Tab D). No procedure was produced by Ford for when information tags were to be filled out or used. The practice of using tags was found to be inconsistent with regards to which tag was used, and if tags were used at all. (State of Michigan, 1999, Job)

The usability problems central to training show deficiencies at the macro level, where the larger work system across workgroups and political lines failed to properly communicate safety procedures. At this macro level, the ethical responsibility to ensure people are equipped with the necessary information to support one another’s safety was largely unaddressed. The problems also show deficiencies at the meso level, where employees were not trained to know the goals and purposes associated with the communication tools provided; thus, ethical and legal responsibilities for safety compliance were largely disregarded.

The use of operator’s notice slips was also found to be inconsistent or incorrect, according to MIOSHA investigators. In some cases, employees completed the operator’s notice slip correctly, but the slip was improperly routed across workgroups or never presented to the appropriate workgroups. The MIOSHA report identified inconsistent use and misuse of the operator’s notice slip and identified this as one of the major communication deficiencies that led to the boiler explosion. It stated,

Ford had a system in place for requesting work to be performed on boilers. This was called an Operator’s Notice Slip (Reference Book 8 Support Documentation Tab C). The slip was to be filled out by the maintenance department requesting operations to isolate the equipment on which they had to work. This system was put in place as a result of a near-miss in approximately 1992, involving two electricians working in a boiler. Natural gas was accidentally introduced into the boiler while employees were still inside. […] The slip had a place for listing the equipment, the person making the request, date, and time of the request. It was then given to operations to complete the work request. Upon completion, it was to be signed by the power service operator(s) and returned to the maintenance personnel authorizing the work to commence. The request for Boiler #6 shutdown made by Ron Moritz, Maintenance Supervisor, was dated 02-01-99 and indicated the time of the request at 10:00 am. During the interviews, it was discovered one of the blast gas valves was closed on January 29, 1999. The other blast gas valve was closed on 02-01-99, at about 7:30 am. This indicated the shutdown was started and partially completed prior to the operator’s notice slip requesting work to be performed. On the notice slip there was no signature of a power service operator indicating that Boiler #6 shutdown was complete. (State of Michigan, 1999, Job)

The MIOSHA investigating officer drew attention to the fact that work was partially completed before the power service operator reviewed and approved the work to be completed, showing that a proper inspection was not completed and that workers were not necessarily accustomed to the correct safety operating procedures. At a macro level, this usability problem reveals deficiencies across workgroups in a lack of communication and collaboration, and demonstrates that ethical responsibility to ensure people’s safety was largely absent in the process. At a meso level, this usability problem shows lack of awareness of appropriate goals and tools used to complete actions, again identifying where safety compliance was unmet.

MIOSHA’s report highlighted the need for adequate and specific operating procedures, training on those procedures, and carrying them out as specified in written communication. In one interview, for instance, Dave Johnson, Deputy Inspector, Boiler Division, stated, “[I]t is prudent practice in the industry to have written startup and shutdown procedures where the boilers are operated; and that operators should be trained in these procedures.” He further stated, “Valves, pilots, and burners on boiler systems throughout the powerhouse are not adequately identified as would be necessary to perform startup and shutdown operations.” Here, the inspector indicated that Ford’s written procedures were not standard in the industry and lacked references to equipment and parts of the equipment specific to boiler shutdown procedures. MIOSHA’s report supported this notion, stating,

MIOSHA found that the written operating procedures provided by Ford Motor CO. were insufficient. The operating procedures from Babcock & Wilcox (Book 6 Support Documents Tab A – Instructions for the Care and Operation of Babcock & Wilcox Equipment) were generic and the employer had not modified the documents to make them site specific for the powerhouse. For example, no attempt had been made to relate equipment identification numbers to specific tasks. In addition, the generic tasks included equipment which had never been installed such as oil burners. (State of Michigan, 1999, Job)

At a micro level, these usability problems point to discrepancies in the text that could confuse users or mislead them to complete a task in the incorrect order or way. From an ethics of care perspective, textual-level inadequacies such as these could cause users to feel less confident in their ability to perform the task (perceiving these as user inadequacies—placing blame on the user). Transferring the textual inadequacy and projecting it as a “user deficiency” was an underlying notion of the superintendent’s comment about employees’ confidence levels when he stated, “accommodations will be made to improve their confidence” (State of Michigan, 1999, Job). A perceived lack of confidence was emphasized by management (projecting user inadequacies) over providing training and domain-knowledge support (an ethical responsibility of management).

According to MIOSHA’s report, inconsistencies in boiler shutdown were apparent, showing communication deficiencies at the micro level. The report stated,

During the course of the interview process, it was found that the lack of safe operating procedures for startup/shutdown of boilers created an array of inconsistencies. Variations were found in the sequential order in which the boiler was shutdown. In addition, it was found that variations existed for which valves were shutoff during the boiler shutdown. […] During employee interviews, both supervision and hourly personnel explained variations as to which valves were closed during Boiler #6 shutdown. For example, some individuals stated natural gas valves on the second and third floor were shut off, while others stated that they only closed the second and third floor natural gas valves. (State of Michigan, 1999, Job)

Furthermore, copies of Ford’s documentation for boiler shutdown not only reveal a lack of depth and granularity in the step-by-step instructions but also show no specific references to equipment associated with the task (lacking the site-specific details that MIOSHA noted and that users need to safely perform their jobs). The 1974 instructional guide, “Procedure to Follow when all Blast Furnaces Go Down,” which was still in use at the time of the explosion, consisted of nine steps, offered no diagrams or equipment identification numbers to help the user identify parts, such as valves involved in the shutdown, and was incomplete, as noted by MIOSHA. The steps listed were as follows (all caps were used in the original document):

  1. CHANGE NEW PITS TO 100% NATURAL GAS.
  2. CHECK NATURAL GAS SPILL IN AT K K BLDG. SET FOR 5 INCHES.
  3. PUT ALL COKE OVEN BATTERIES ON COKE OVEN GAS UNDERFIRING.
  4. CONNECT MONOMETER TO BLAST FCE. MAIN AT SPLIT WIND ROOM. RUN TO DISP. PANEL BOARD.
  5. NOTIFY #1 POWER HOUSE TO CLOSE THREE MAINS, LEAVING ONE MAIN OPEN.
  6. #1 POWER HOUSE CLOSE LAST MAIN DURING CAST OF LAST BLAST FCE. TO GO DOWN.
  7. WATCH B.F. MAIN PRESS. WHEN IT STARTS BUILDING UP, OPEN BLEEDERS ON BLAST FCE.
  8. BLAST FCE. GAS MAIN WILL BE FILLED WITH STEAM SPILL IN FROM FURNACES.
  9. OPEN NAT. GAS SPILL IN (CABLE BY DISP. DESK) TO MAINTAIN 5 INCHES IN B.7. MAIN. (State of Michigan, 1999, Job)

A usability evaluation would likely have revealed problems with these instructions in several ways. As one example, Step 4 could have been clarified by offering a diagram of the “Monometer” and where the connection occurs, or at the very least a cross reference to the part by its identification number on the machine. As another example, Step 7 could have clarified what counts as pressure “building up” by stating a range of pressure measurements, or by showing a diagram of the pressure valve with the pressure range highlighted in red. By providing more detail, more effective document design features, and more specific references to artifacts and parts of equipment, the designers could have created documents specific to the user’s needs, which also could have prevented the misunderstandings and errors that resulted in injuries and fatalities.

Another problem indicated that even if written documentation had achieved effective usability at a micro level, the communication system would likely have failed to prevent the accident: Ford had procedures for shutting down the boiler, but stored these documents in an office where powerhouse employees could not access them. MIOSHA noted,

Upon interviewing numerous employees[,] it was determined that they [the employees] had no knowledge of written Boiler #6 startup/shutdown procedures. Also, Ford had no checklist available for startup/shutdown indicating the proper sequence or listing of which controls and valves were to be operated. These conditions were brought to management’s attention through internal and external audits. Recommendations to develop procedures were made as early as 1987 and up until 1998 (See documentation references section). (State of Michigan, 1999, Job)

Usability effectiveness was compromised because the documents were neither made available to nor used by employees. Although Ford identified both the O&M procedures and the checklist as part of the work system, it did not promote widespread use of these documents among workgroups. Consequently, though some employees were aware that shutdown procedures and documents existed, they were not provided with the documents to support completion of work tasks. One MIOSHA investigating officer supported this finding in his write-up of an employee interview, noting “(Employee) stated in an interview that the boiler shutdown procedures were in the boiler room office, even though he had never seen them” (State of Michigan, 1999, Job). He further wrote that the employee was familiar with some artifacts used in procedural shutdown, such as the operator’s notice slip, but the employee “did not know how the system worked” (State of Michigan, 1999, Job). Limiting access to procedural documentation cannot prevent injuries and fatalities; it demonstrates an egregious case of corporate ineptitude and an absence of an ethics of care for the people affected by such gross negligence.

Conclusion

The MIOSHA investigation concluded that communication deficiencies were one of the primary causes leading to the boiler explosion, but offered no systemic analyses of the entire communication system. Technical communicators who work in areas of risk communication can offer this kind of specialized analysis.

Examining the system of risk communication using a merged framework of distributed usability, integrated scope, and an ethics of care philosophy revealed extensive deficiencies at all three levels of scope—aspects that designers could have examined to prevent the injuries and fatalities of the boiler explosion. At the micro level, documents failed to provide the right amount and right kind of information. At the meso level, employees were unaware of the goals and purposes of communication tools. And at the macro level, employees were not collaborating or sharing information across workgroups. The training sessions on code enforcement, regulations, and safe operating procedures were inadequate, outdated, and generic. The danger tags and information tags were used inconsistently or not at all. The operator’s notice slips were used inconsistently or incorrectly. The wall and machine placards and equipment identification numbers were not referenced in task descriptions. The O&M procedures were generic, were not displayed, or were not readily available. The O&M checklists were not displayed or readily available, which resulted in improper step sequencing. Consequently, the prevention of accidents was seriously compromised within the risk communication system.

Using distributed usability, integrated scope, and an ethics of care philosophy to assess risk communication effectiveness enables technical communicators to identify communication deficiencies within a complex work system, like that at Ford’s Rouge Steel Complex in this historical 1999 case. This merged framework allows scrutiny at a textual level (micro level) and a social/cultural/political level (macro level), but it also accounts for how the two levels intersect by goal-oriented actions supported by tool use in context (meso level) and how all three levels might incorporate an ethical lens.

Though an integrated scope framework has traditionally been applied to assess the design of work systems and to identify problem areas that could be improved, a merged framework, as I have offered, also lends itself to application in risk communication. Such an application accomplishes what Winsor (1990) called for: “bridging our theoretical understandings of uncertain, socially conditioned nature of discourse to bear on concrete incidents” (p. 8).

I have provided guidance on how technical communicators might use this merged framework to successfully examine risk communication in a broader context, as many scholars have suggested assessment of risk communication should (e.g., Dombrowski, 1991, 1992, 1995, 2000; Sauer, 1992, 2003; Winsor, 1988, 1990). Using this new merged framework will provide richer insights about textual features in context, social/cultural/political meaning-making, ethical application, and approaches that technical communicators might take to understand and improve the use of risk communication documentation to prevent injuries and fatalities.

References

Batova, T. (2013). Legal literacy for multilingual technical communication projects. In K. St. Amant & M. Courant Rife (Eds.), Legal issues in global contexts (pp. 83–101). Amityville, NY: Baywood.

Bedford, M. S., & Stearns, F. C. (1987). The technical writer’s responsibility for safety. IEEE Transactions on Professional Communication, 30, 127–132.

Coogan, D. (2002). Public rhetoric and public safety at the Chicago Transit Authority: Three approaches to accident analysis. Journal of Business and Technical Communication, 16, 277–305.

Cox, R., & Pezzullo, P. C. (2015). Environmental communication and the public sphere (4th ed.). Thousand Oaks, CA: Sage.

Croft, S. D. (1996). Writing material safety data sheets using the ANSI standard. Technical Communication, 43, 172–176.

Dombrowski, P. M. (1991). The lessons of the Challenger investigations. IEEE Transactions on Professional Communication, 34, 211–216.

Dombrowski, P. M. (1992). Challenger and the social contingency of meaning: Two lessons for the technical communication classroom. Technical Communication Quarterly, 1, 73–86.

Dombrowski, P. M. (1995). Can ethics be technologized? Lessons from Challenger, philosophy, and rhetoric. IEEE Transactions on Professional Communication, 38, 146–150.

Dombrowski, P. M. (2000). Ethics in technical communication. Boston, MA: Allyn & Bacon.

Dragga, S., & Voss, D. (2003). Hiding humanity: Verbal and visual ethics in accident reports. Technical Communication, 50, 61–82.

Grabill, J. T., & Simmons, W. M. (1998). Toward a critical rhetoric of risk communication: Producing citizens and the role of technical communicators. Technical Communication Quarterly, 7, 415–441.

Hannah, M. A. (2010). Legal literacy: Coproducing the law in technical communication. Technical Communication Quarterly, 20, 5–24.

Helyar, P. S. (1992). Products liability: Meeting legal standards for adequate instructions. Journal of Technical Writing and Communication, 22, 125–147.

Herndl, C. G., Fennell, B. A., & Miller, C. R. (1991). Understanding failures in organizational discourse: The accident at Three Mile Island and the shuttle Challenger disaster. In C. Bazerman & J. Paradis (Eds.), Textual dynamics of the professions: Historical and contemporary studies of writing in professional communities (pp. 279–305). Madison, WI: University of Wisconsin Press.

Horsley, S. J., & Barker, R. T. (2002). Toward a synthesis model for crisis communication in the public sector. Journal of Business and Technical Communication, 16, 406–440.

Katz, S. B., & Miller, C. R. (1996). The low-level radioactive waste siting controversy in North Carolina: Toward a rhetorical model of risk communication. In C. Herndl (Ed.), Green culture: Environmental rhetoric in contemporary America (pp. 111–140). Madison, WI: University of Wisconsin Press.

Lancaster, A. (2006). Rethinking our use of humanistic aspects: Effects of technical information beyond the intended audience. Technical Communication, 53, 212–224.

Lancaster, A., & Tucker Lambert, C. S. (2015). Communication and ethics. In C. S. Tucker Lambert & M. Schlobohm (Eds.), Communication and emerging media: What’s trending now (pp. 289–318). Dubuque, IA: Kendall Hunt.

Manning, D. T. (1982). Writing to promote health and safety behaviors in occupational settings. Journal of Technical Writing and Communication, 12, 301–306.

McLaughlin, M. (1999, February 23). Death toll rises to six in Michigan power plant disaster: Two more workers die from Ford Rouge explosion. World Socialist Web Site. Retrieved from https://www.wsws.org/en/articles/1999/02/ford-f23.html

Mirel, B. (1994). Debating nuclear energy: Theories of risk and purposes of communication. Technical Communication Quarterly, 3, 41–65.

Moore, P. (1992). When politeness is fatal: Technical communication and the Challenger accident. Journal of Business and Technical Communication, 6, 269–292.

Moran, T. (2012, October). Placards and hazard alerts: Adding visual communication to an environmental communication class. In Professional Communication Conference (IPCC), 2012 IEEE International (pp. 1–8). IEEE.

Pace, R. C. (1988). Technical communication, group differentiation, and the decision to launch the space shuttle Challenger. Journal of Technical Writing and Communication, 18, 207–220.

Reamer, D. (2015). “Risk = probability × consequences”: Probability, uncertainty, and the Nuclear Regulatory Commission’s evolving risk communication rhetoric. Technical Communication Quarterly, 24, 349–373.

Sauer, B. A. (1992). The engineer as rational man: The problem of imminent danger in a non-rational environment. IEEE Transactions on Professional Communication, 35, 242–249.

Sauer, B. A. (2003). The rhetoric of risk: Technical documentation in hazardous environments. Mahwah, NJ: Lawrence Erlbaum.

Simmons, W. M. (2008). Participation and power: Civic discourse in environmental policy decisions. Albany, NY: SUNY Press.

Smith, H. (1990). Technical communications and the law: Product liability and safety labels. Journal of Technical Writing and Communication, 20, 307–319.

Smith, H. T., & Shirk, H. N. (1996). The perils of defective documentation: Preparing business and technical communicators to avoid products liability. Journal of Business and Technical Communication, 10, 187–202.

Spinuzzi, C. (2003). Tracing genres through organizations: A sociocultural approach to information design. Cambridge, MA: MIT Press.

State of Michigan. Labor & Economic Growth. (1999, September 2). Ford settlement: State reaches historic settlement agreement with Ford and UAW. MIOSHA. Retrieved from http://www.michigan.gov/documents/CIS_WSH_minwsf99_27783_7.pdf

State of Michigan. Labor & Economic Growth. MIOSHA. (1999). Job No. 127242105: Consumer and Industry Services, Bureau of Safety and Regulations. Lansing, MI: MIOSHA.

Strate, L., & Swerdlow, S. (1987). The maze of the law: How technical writers can research and understand legal matters. IEEE Transactions on Professional Communication, 30, 136–148.

Todd, J. (2014). Avoiding litigation for product instructions and warnings. Journal of Technical Writing and Communication, 44, 401–421.

Velotta, C. (1987). Safety labels: What to put in them, how to write them, and where to place them. IEEE Transactions on Professional Communication, 30, 121–126.

Waddell, C. (1996). Saving the Great Lakes: Public participation in environmental policy. In C. Herndl (Ed.), Green culture: Environmental rhetoric in contemporary America (pp. 141–165). Madison, WI: University of Wisconsin Press.

White, J. (1999, June 8). Safety agency says Ford hinders probe into February explosion at Michigan factory. World Socialist Web Site. Retrieved from https://www.wsws.org/en/articles/1999/06/uaw-j08.html

Willerton, R. (2015). Plain language and ethical action: A dialogic approach to technical content in the 21st century. New York, NY: Routledge.

Winsor, D. A. (1988). Communication failures contributing to the Challenger accident: An example for technical communicators. IEEE Transactions on Professional Communication, 31, 101–107.

Winsor, D. A. (1990). The construction of knowledge in organizations: Asking the right questions about the Challenger. Journal of Business and Technical Communication, 4, 7–20.

Youngblood, S. A. (2012). Balancing the rhetorical tension between right to know and security in risk communication: Ambiguity and avoidance. Journal of Business and Technical Communication, 26, 35–64.

About the Author

Amber Lancaster is an Assistant Professor at Oregon Tech, where she teaches composition and technical communication courses. Previously at Texas Tech University, she taught technical communication, rhetoric, and human relations courses and served as the Assistant Director of Composition and the Assistant Director of Online Graduate Studies. She helped start the Doctoral Support Center for Writing & Research Excellence in the College of Education and served as a Dissertation Specialist and Writing Coach for three years. Her research focuses on the intersections of user-centered design (UCD), ethics, and social issues as well as on technology and writing pedagogy. She is available at Amber.Lancaster@oit.edu.

Manuscript received 27 February 2017, revised 15 June 2017; accepted 31 August 2017.