Purpose: To demonstrate the importance of standardized process modeling notation and its value to technical communicators involved with visualizing business or technical processes. To argue that a standardized process modeling notation can assist with bridging cultural communication gaps brought on by globalized workplaces.
Method: A rhetorical analysis emphasizing how effectively technical communication visualizations address audience, purpose, and documentation conventions. Communication visualizations were modeled first using nonstandard and then standardized modeling techniques. The visualizations were generated as part of a qualitative study to represent the business and communication processes of a senior level employee from a software firm.
Results: Nonstandard or proprietary data models and visualizations are not readily useful to diverse audiences, especially global audiences. Models should be developed with notation software that supports open standards.
Conclusion: Technical communicators should become proficient with process modeling notation and understand the fundamentals of standardized notations such as Unified Modeling Language (UML) and Business Process Modeling Notation (BPMN). Academics teaching and researching technical communication should be wary of creating a divide between industry and the academy by perpetuating the use of nonstandard models.
Keywords: UML, BPMN, process modeling, data visualization, business analyst
- Numerous free and open source process modeling applications are available for download and use on Linux, Mac, and Windows operating systems. Many free references for learning process modeling notation are also available online.
- Process modeling notation is not a skill or a task reserved exclusively for business analysts. As the field of technical communication expands and shifts, practitioners should understand the fundamentals of process modeling.
- Because small companies also cultivate global customers, communicating and collaborating via standardized process models can help bridge cross-cultural communication gaps.
As the ideologies of Business Process Management (BPM) continue to push the public and private sectors toward optimizing their practices, knowledge workers of all kinds, including technical communicators, face both challenges and opportunities (Dumas et al., 2013). Key components of increasing business efficiencies are the abilities to identify and then visually model the complexities of an organization’s varying practices and relationships. The purpose of this article is to demonstrate the importance of understanding recent advancements in standardized process modeling notation so technical communicators may continue to add value to their organizations and succeed in the globalized workplace. The primary claim is that proficiency with modeling languages such as Business Process Modeling Notation (BPMN) and Unified Modeling Language (UML), both discussed here, is now a necessary component of a technical communicator’s evolving skillset. Additionally, this article highlights potential disjunctions between the complex theories underpinning academic research and instruction (activity theory, actor-network theory, genre theory, and combinations thereof) and their utility in the workplace.
The contents of this article were derived from a qualitative case study involving a senior member of a software development firm and her interest in gaining a different perspective on her workflow for new client acquisitions. Details of the case are provided for context, but at the crux of this entry is a rhetorical analysis of the visual models originally created for the study results. Rhetorically, these models were found to have failed on the levels of audience, purpose, and conventions, three key aspects of successful technical communication. The results of that analysis alongside the visualizations redesigned using the standardized modeling language BPMN reveal a significant gap between academic approaches to, and instruction for, modeling business processes and the demands the workplace is making on modern technical communicators. In other words, the article offers a specific instance of what Spilka (2009) described as a real problem for the field: “A disconnect exists between the centrality of research to the work of many, if not most, technical communication practitioners and the growing evidence that academic programs in the field are not providing sufficient research training” (p. 219). Contributing to the disconnect is the fact that the demands of the workplace are shifting. The current president of the Society for Technical Communication, Bernard Aschwanden, noted recently in Intercom (2015) that “the ‘writer’ role is evolving and changing into a career that is linked to business processes, user analysis, sales and marketing, and helping to generate revenue for a company” (p. 15). The overarching goal of this article, to advocate for proficiency and use of standardized modeling languages, is one way to keep pace with these changes.
Additional context from the study
The larger context for this article is a movement within local governments to turn toward procuring new software solutions for improving both performance and transparency. Specifically, governments are implementing new third-party business intelligence (BI) and performance management (PM) software solutions to not just manage, measure, and report on their services, but to enable higher-quality data analyses that provide broader, more holistic views of their operations (Harder & Jordan, 2012). Prominent offerings in this space include products such as IBM’s Cognos suite of applications that promise to change “data into views of your organization’s operations and performance so your decision makers can capitalize on opportunities and minimize risks” (IBM, 2016, p. 1).
The original study worked with a much smaller (~40 employees) yet global software firm that develops PM software for local governments. Although the firm is headquartered in Europe, I worked in cooperation with the company’s U.S. Vice President (VP) of Operations for its North American clients, who in her own words is a “glorified technical writer,” and conducted a 6-month qualitative study involving interviews, participant journaling, and artifact collection. The study’s objective was to trace different genres (i.e., specific kinds of documents) across complex communication networks as the VP responded to a Request for Information (RFI) issued by a city government, with the intention of continuing the study for the subsequent Request for Proposals (RFP). The VP, in turn, would receive business process models visualizing her own workflow so she could gain different perspectives on her processes.
Ultimately, the data models presented to the VP were less than satisfactory; she dubbed them “proprietary” to an academic environment. Again, an analysis of these models shows they fail in their purpose to provide an audience with a new, useful view on client acquisition, largely because of their nonstandard formatting conventions. The VP recommended UML and BPMN for data visualizations because of their advancements in modeling not just software but also more fluid business interactions. She stressed the importance of using a standardized modeling language that would be immediately recognizable and useful to her and her colleagues. Attention to and the adoption of standards emerged as a key outcome. As will be discussed, standards are held up as a means for technical communication practitioners to navigate communication complexities brought on by globalization. As Getto and St. Amant (2014) note, “Just because individuals from different nations and regions can interact does not necessarily mean they will interact effectively or efficiently” (p. 24). And the push for efficiency, especially in terms of increasing revenue, is not going to subside for technical communicators.
To explore the disconnects noted by Spilka (2009) between academic instruction and practitioner needs, particularly in regard to how those disconnects may lead to inefficiency, this article begins with an overview of workplace writing methodologies, including genre and activity theory as well as actor-network theory (ANT). The review highlights the challenges facing researchers studying knowledge workers operating in networked and distributed communication environments. The introduction reinforces the importance of qualitative research to technical communication, and the article attempts to reflect on available resources for conducting qualitative studies while also providing brief context from this small, specific study. The article overviews the collection of the study’s data as well as the attempts to model those data, but the result is closer to what Wilson (2001) described as a “confessional case study” as he discussed strategies for a postmodern technical communication pedagogy (p. 79). Qualitative case studies are time intensive, even for a small-scale endeavor such as this. It was not until I began modeling visualizations of the study’s framework that I received pushback from the VP. The discussion of that pushback and the rhetorical analysis of the models produced are not meant to be an indictment of anyone, any specific text, or a specific method or methodology, but rather a report on attempts to recalibrate my teaching strategies based on feedback from an experienced practitioner.
As with other small-scale qualitative studies, “[r]ather than aiming to generalize about large populations” the goal was to open a “view on the specific situation or phenomenon being studied” (Koerber & McMichael, 2008, p. 462). That is, the challenges encountered here with this convenience sampling of data would not necessarily be replicated elsewhere. Nevertheless, the article’s ultimate objective is to use the study experience, including an analysis of the models produced as part of the study, to advocate for proficiency with standardized modeling languages. As will be discussed, advocating for modeling languages such as UML and BPMN has implications for both teaching and practice. The article offers an overview of the open source UML/BPMN application Modelio as an introduction to the benefits of standards-based modeling and provides a sample diagram.
As is customary with Technical Communication, the article continues by offering suggestions for practitioners, including how they might interact more “effectively and efficiently” in international contexts based on this study’s outcomes (Getto & St. Amant, 2014, p. 24). Current scholarship on process modeling notation suggests that standards-based notations are useful when addressing cross-cultural communication issues.
Finally, the article contends with Rude’s (2009) observation that “[t]he connection between pedagogy and practice is close,” or at least it should be (p. 182). And, it offers both questions and some suggested answers for academics and practitioners related to Spilka’s (2009) concerns about disconnects between teaching and practice: Do the learning objectives for our research methods courses include data visualizations and modeling techniques recognized by industry professionals? How can we teach methods and methodologies in a fashion that balances practical know-how with the field’s theoretical work that remains so crucial to critical and analytical thinking? What can we learn, or borrow, from data visualization standards such as UML and BPMN already in use by business analysts?
Technical Communication and Workplace Writing
Scholars affiliated with professional and technical communication have studied workplace writing within a variety of conditions and settings (Henry, 2000; Johnson-Eilola, 2005; Spinuzzi, 2003, 2008; Van Nostrand, 1994, 1997; Winsor, 1996, 1999, 2001). Van Nostrand’s research (1994, 1997), for example, maps the different genres developed and exchanged between the U.S. Department of Defense and potential vendors bidding for federal research and development funding. The defense procurement system constitutes an “activity system” and by tracking the genres within the system, the “close connection between persuasion and knowledge production” is revealed (Van Nostrand, 1997, p. 141). The procurement process works because the rules governing the process as well as the eventual contractual agreements that bind both parties are captured in a variety of documents. In other words, “the documents represent coded and keyed events in a discourse exchange system; they are conspicuously genres” (1994, p. 111). And, as genres, we can conceptualize them as typified rhetorical actions or responses to a typified social context (Miller, 1984). For Van Nostrand, tracking and analyzing genres are the keys to understanding knowledge production. This situation is one of importance to technical communicators because it demonstrates the power and the range of influence their communications can have within these systems.
This is not to suggest, however, that activity systems such as those documented by Van Nostrand are fixed, rigid structures. While genre theory traditionally understands the social context as giving shape to texts, researchers of workplace writing have introduced activity theory to their methodologies to account for agency and the change brought about by the recursive interactions among different actors, tools, and texts (Russell, 1997; Winsor, 1999). With the introduction of activity theory, we can “theorize about the simultaneous existence of regularity and change and about the influence of systems and the role of agency” (Winsor, 1999, p. 201). Such an approach has much to offer technical communicators because it supplies additional evidence that their work is not merely formulaic or rigidly predetermined. The flexibility within genres allows communicators to exercise a greater degree of agency.
Genre theory and activity theory were combined into a methodology termed genre tracing by Spinuzzi (2003) so he could study traffic workers in Iowa and their use of a database of traffic accidents. This methodology enables researchers to track genres as they morph and intermingle within activity systems in addition to examining how genres may be used to build or tear down those systems. Following Medvedev and Bakhtin (1978), Spinuzzi casts genres as emerging from “cultural-historical activity” or “tradition” (2003, p. 41). That is, “[g]enres are not discrete artifacts, but traditions of producing, using, and interpreting artifacts” (p. 41)—a factor important to technical communicators because it suggests their work can remain relevant well beyond a single moment in time.
Tracing genres through networked communication environments poses new challenges for researchers of workplace writing, and chief among them is the distributed and fragmented quality of computer-mediated communication. In his introduction to the special issue of Technical Communication Quarterly dedicated to distributed work, Spinuzzi (2007) states, “work is becoming more distributed: distributed across time, space, disciplines, fields, and trade; distributed across a multiplicity of stakeholders; distributed through telecommunications and digital technologies” (p. 272). In other words, the VP’s conference calls, emails, texts, proposal drafts, handwritten notes, site visits with the potential client, and meetings at corporate headquarters represent the new distributed work and the expanding roles of a technical communicator.
This distribution has effectively destabilized the modularized work endemic to 20th century business practices. Fixed hierarchies have been replaced with fluid networks and “assemblages that may or may not be stable from one incident to the next and in which work may not follow predictable or circumscribed paths” (Spinuzzi, 2007, p. 268). Researchers in technical communication have embraced Latour’s (2005) actor-network theory (ANT) as a means for understanding agency within webs of human and nonhuman actors (Potts, 2009, 2010; Rice, 2009; Spinuzzi, 2008; Swarts, 2010; Whittemore, 2012). As Swarts (2010) observes, “distributed work lacks inherent order and does not acquire it unless some force stabilizes (at least temporarily) the objects of work and the relationships between people and texts that rely on those objects” (p. 130). Stabilizing forces can come in the form of some unexpected objects. For example, the influence of nonhuman actors is captured in Whittemore’s (2012) case study of “ephemeral texts in design arguments” and his use of Latour’s three-part agonistic model of inscribing, mobilizing, and cascading to make sense of his collected data (p. 416). Ultimately, Whittemore traced the influence of a reminder note written by a technical communicator as a key factor in the communicator’s ability to argue successfully for design changes to a user interface. For technical communicators working in industry, these factors mean that all forms of communication, even the seemingly insignificant, have the potential for profound impact.
Examinations of distributed work usually identify the technical communication labor being performed as knowledge work, or what has been called symbolic-analytic work (Reich, 1991). Symbolic-analytic workers, as described by Johnson-Eilola (2005), face the burden of keeping pace with these changing networked work structures and new required skill sets: “People in this type of work identify, rearrange, circulate, abstract, and broker information” (p. 28). In other words, “symbolic analysts are people we might think of as technical rhetoricians” (p. 19, emphasis in original). They are technical both because they are skilled at working with and adjusting to the technical materials encountered while collaborating with subject matter experts and other colleagues, and because of their proficiency with ubiquitous computer-mediated communication software and networks. Symbolic-analytic workers are rhetoricians in the sense that they, too, are producers of content and must focus on “process, action, and reception” to be successful (Porter, 2013, p. 136).
For technical communicators in industry, these ideas are important because they again reveal that the knowledge work they perform has the potential for long-range influence and improved efficiency—that is, influence beyond immediately perceived audiences and short-term results. The documents the VP prepared during the different phases of the RFI were as much about educating the city, her potential client, about PM software solutions as they were about persuading a wide range of stakeholders. If the city feels confident enough about what it learned at the RFI stage to move forward with an official RFP, the work done at the RFI stage will have been a crucial step toward winning the city’s business down the road.
Observing divides between teaching & practice
In a brief article in Technical Communication Quarterly, Charney (2015) challenges the assumption that research in our field must be driven by or begin with one particular question and must then proceed in a linear fashion in pursuit of an answer. Instead, Charney observes: “In real life, though, the method or the site or some special interest comes first. You realize you have access to a workplace or an archive of records” (p. 105). Her larger point is that research may begin with “an assortment of starting points” (p. 105). Such is the case with this project. A colleague who knew that I had worked in industry as a software developer and technical writer introduced me to the VP. At the time, the VP was interested in recommendations for communication strategies for getting feedback from her existing clients to her company’s development team overseas. As Charney suggests, I suddenly realized I had access to a new research site. And, as a curriculum director for undergraduate and master’s level professional writing programs, working with the VP afforded me the opportunity to explore divides between program curricula and practice.
As noted previously, the field of technical communication has produced a number of exemplary articles and book-length projects using qualitative methods to study workplace research sites. As researchers have learned the benefits of appropriating methods from the social sciences, the field has also begun to address issues arising from the use of qualitative methods (Campbell, 1999, p. 533). For example, Koerber and McMichael (2008) offered their primer for conducting qualitative studies in an effort to speak to what they saw as a gap in the field’s available literature. Their primer certainly succeeds as a good addition to graduate courses introducing students to research methods, and, as instructors, we should continue to reflect on how we teach qualitative methods in relation to how we deploy those methods as researchers and practitioners. More recently, the third edition of Miles, Huberman, and Saldaña’s Qualitative Data Analysis: A Methods Sourcebook (2014) and Saldaña’s The Coding Manual for Qualitative Researchers (2013) were reviewed as “invaluable to technical communication researchers and practitioners, including students and those working in industry, as the field continues to rely more on qualitative methodologies” (Hashimov, 2015, p. 112).
After semi-structured telephone and in-person interviews with the VP, it was agreed that I would first need to spend time learning about her business and the market for PM software. Key features of qualitative sampling include setting effective boundaries for a study as well as creating a conceptual frame to help “uncover, confirm, or qualify the basic processes or constructs that undergird [a] study” (Miles, Huberman, & Saldaña, 2014, p. 31). Toward those ends, I scheduled regular interviews (approximately every one to two weeks) with the VP to better grasp her work practices, including her use of analog and digital tools, her project management style, and the structure of her organization and her key collaborators. She also provided me with corporate marketing literature as well as outside sources to introduce me to the industry so that her communications would have more context. She fielded my additional questions about her company and the market via phone, email, and text. Because the VP works simultaneously supporting existing clients while also striving to acquire new clients via RFI and RFP submissions, it took several months to determine how we would create boundaries and a beginning framework for the study. Primarily due to timing and schedule availability, we agreed that the study would focus on her networks of communicators and the communication artifacts developed as part of the VP’s work responding to a single RFI issued by a city government located in the Pacific Northwest. The RFI issued by the city contained project deadlines and software application requirements that would help establish project boundaries and frames for the data collected. If the city moved forward by issuing an RFP, we would continue the study with the RFP in the same fashion.
The RFI: Boundaries & frames. With the RFI providing boundaries for the study as well as key requirements for a conceptual frame, we proceeded by determining the artifacts and the information that would be collected during the RFI timeline. The VP agreed to track and save the iterations of her RFI submission by file name and file type as well as provide a brief journal entry about the iterations including content additions and deletions. She used a spreadsheet to manage document versions and each version was tagged as a “parent” or “child” to signal relationships between different documents and their content. In between versions, she had email exchanges and conference calls with her home office. In turn, I created a spreadsheet as a data accounting log to track the different artifacts sent by the VP (Miles, Huberman, & Saldaña, 2014, p. 122). I took notes during the interviews I conducted with the VP, saved the notes as text files, and archived the names of those files along with the date and time of the interview in my spreadsheet. Despite recommendations from Miles, Huberman, and Saldaña, I did not use a Computer Assisted Qualitative Data Analysis Software (CAQDAS) application as part of this study (p. 47).
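The parent/child version tracking described above lends itself to a simple data structure. The following is a minimal Python sketch of such an accounting log; all file names, dates, and field names are invented for illustration and are not data from the study:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Artifact:
    """One row in the data accounting log: a single artifact received from the VP."""
    file_name: str
    file_type: str
    received: str                  # date the artifact was collected
    parent: Optional[str] = None   # file_name of the "parent" version, if any
    notes: str = ""                # brief journal entry on additions/deletions

def lineage(log, name):
    """Walk parent links to reconstruct a document's version history, oldest first."""
    chain = []
    while name is not None:
        chain.append(name)
        name = log[name].parent
    return list(reversed(chain))

# Hypothetical log entries, keyed by file name
log = {
    "rfi_v1.docx": Artifact("rfi_v1.docx", "docx", "2014-03-01"),
    "rfi_v2.docx": Artifact("rfi_v2.docx", "docx", "2014-03-10",
                            parent="rfi_v1.docx",
                            notes="added pricing appendix"),
}
print(lineage(log, "rfi_v2.docx"))  # ['rfi_v1.docx', 'rfi_v2.docx']
```

Even a spreadsheet exported into this shape makes the parent/child relationships between document versions queryable rather than implicit.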
All of these artifacts and the tools the VP uses to create and manage them amount to what Spinuzzi has called “mediating artifacts” that have the power to “qualitatively change the entire activity in which workers engage” (2003, p. 38). What was curious from an outsider’s perspective was that even though the city RFI requested an education on PM software, that education was never provided in the abstract. That is, anything PM software could do was illustrated through specific details about the VP’s software. As Van Nostrand observed in his pursuit of transactional genres, “From both an institutional perspective and an individual perspective, virtually all of the interactions in the defense R&D community entail relationships that are essentially rhetorical and insistently pragmatic. Honesty of purpose is normally transparent” (1997, pp. 141–142). While it is no surprise that the RFI submission is a rhetorically charged document, it is surprising that there are seemingly no efforts to obfuscate the rhetoric. The transparent “honesty” of the document is that the VP first and foremost wanted to sell the city her software solution. The VP confirmed this observation in an interview, telling me that the city issues the RFI because it wants to buy a software solution; she responds to the RFI because she wants to sell them her solution.
Important to the integrity of the study, the VP provided informant feedback or member checks to validate the work as reflective of the processes studied. Even though RFIs and RFPs are public documents, I wanted to use member checks with the VP to confirm she felt that her confidentiality and anonymity had been maintained (Miles, Huberman, & Saldaña, 2014, pp. 58, 63).
Visualizing frameworks & beginning data models. Fortunately, technical communication researchers have produced examples and templates for modeling activity systems. In 2009, Hart-Davidson, Spinuzzi, and Zachry presented a workshop at the Rhetoric Society of America conference titled, “Visualizing Patterns of Group Communication in Digital Writing.” The contents of that workshop would later be developed into Spinuzzi’s book-length research guide Topsight: A Guide to Studying, Diagnosing, and Fixing Information Flow in Organizations (2013). As the title suggests, field studies are conducted with the objective of getting the big picture of an organization’s operations. The activity system templates from the workshop and Topsight were the starting point for my visual frameworks. Activity systems are represented as hexagons with the six corners serving as numbered information nodes. The first node requires two different categories and the remaining nodes require one each for a total of seven. They are:
- Objective & Outcome
- Tools
- Actors
- Rules
- Community Stakeholders
- Division of Labor
These seven categories operate around the edges of the activity system’s context. I returned to my notes and the gathered data to begin the visualization of the VP’s activity system. With some adjustments, I believed most of my work could be positioned within these seven recommended portions. I populated two activity systems – one representing a city government perspective and one representing the VP’s software company for the RFI process. They are represented in Figures 1 and 2 below.
In both figures, I populated the Rules section with the deadlines and software application requirements from the city RFI. The RFI identified with whom the VP was corresponding; that information, along with who would be tasked with specific work, completed the categories of actors, community stakeholders, and division of labor. I decided to separate Tools from the different genres the VP produced using those tools. Influenced by Spinuzzi’s (2008) work with both activity theory and ANT, the actors were defined as both human and nonhuman actors imbued with agency to alter the communication network. From a technical communicator’s perspective, the exercise of creating these diagrams was useful because the process was a lesson in organizing and distilling complex systems.
The next stage, according to Topsight, is to begin diagramming connections between activity systems to create activity networks. The networks reveal the various ways that the systems overlap or are chained together (2013, pp. 228–230). For example, the rules governing both systems overlap a great deal. My plan for modeling the study’s data was to create more activity systems and put them in conversation within a larger and growing activity network. I would create a new activity system detailing the VP’s process for generating the list of vendor questions she was allowed to submit to the city and add that to what would have been a growing network. Similarly, different iterations of the RFI submission could be added to the network, with each iteration represented as an activity system. Again, the study would become vastly more complex once the RFP was issued by the city. Before proceeding, I emailed the two activity systems to the VP along with supporting literature on the diagramming process.
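The idea that shared elements chain activity systems into networks can be sketched computationally. In this toy Python example, two systems overlap wherever a category holds common items; all system contents are illustrative stand-ins, not data from the study:

```python
# Two simplified activity systems, each a mapping of category -> set of items.
city = {
    "actors": {"city project lead", "procurement office"},
    "rules": {"RFI deadline", "application requirements", "public-records law"},
    "tools": {"RFI document", "email"},
}
vendor = {
    "actors": {"VP of Operations", "home-office developers"},
    "rules": {"RFI deadline", "application requirements", "internal review"},
    "tools": {"RFI submission drafts", "email", "conference calls"},
}

def overlaps(a, b):
    """Per category, the items shared by both activity systems (empty ones omitted)."""
    return {cat: a[cat] & b[cat]
            for cat in a.keys() & b.keys()
            if a[cat] & b[cat]}

shared = overlaps(city, vendor)
# The shared rules (the RFI deadline and application requirements) are
# what chain the two systems together into an activity network.
print(shared)
```

The design choice mirrors the Topsight procedure: each system stays self-contained, and the network emerges only from computed intersections between them.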
Activity systems diagrams: A brief rhetorical analysis. Visual documents, such as those in Figures 1 and 2, communicate through a hybrid interaction of text and image. A rhetorical analysis of the figures based on audience, purpose, and the conventions employed in the visuals makes a critique of the activity systems possible. Discussing the “slippery topic of audience,” Bosley (1994) reminds us that “actual readers have values, beliefs, perspectives, knowledge, authorities, politics, expectations, and constraints that enable or limit their ability to read and use technical documents” (p. 296). That is, audiences are automatically bound up with social, political, and ethical concerns that technical communicators, including those designing visual documentation, may or may not be aware of. Even though my immediate audience was the VP, I was also aware that the VP’s colleagues, audience members I would never meet, might one day see or attempt to use these visuals.
In a rhetorical analysis, audience and purpose are naturally linked. While I was not attempting to persuade the VP in the more conventional sense to, for example, buy a product or service, the purpose of these figures was to provide an accurate and useful rendering of her processes. Foss (1994) recommended the term “function” instead of “purpose” in her “rhetorical schema” for evaluating visual imagery (p. 215). Once the function of an image has been determined, in this case capturing and rendering the fluidity of the VP’s processes, the “critic’s concern here is with the various stylistic and substantive dimensions in the image” (p. 216). In other words, it was clear what the activity diagrams were supposed to do, and it would be the content in combination with the conventions deployed in their visual display that would determine their success.
The VP did not dispute that the activity diagrams attempted to account for all audience members involved, including herself. She also emphasized the importance of diagramming objectives as well as the desired outcomes from communicating with a potential client. Ultimately, however, the VP was critical of the diagrams for several reasons, but primarily because they required her to learn a new means of visualizing processes when an existing standardized system was already available. This audience’s values and beliefs held open standards in the highest regard. Additionally, the diagrams needed to be more easily editable and shareable, especially when a group of users may be scattered around the world and using different computing platforms. In the VP’s estimation, the diagrams failed because of the limited ways they invited audience members to participate in (re)using them. After a number of emails in which I attempted to explain what were dubbed academic and proprietary research models, I received an email with the following subject line: “Please. Use. UML.”
Process Modeling Notation and Technical Communication
The VP’s email could be dismissed as glib, but I contend that it reflects the level of frustration we reached over my initial attempts at data visualizations as the study framework evolved. I had last used UML in the early 2000s, and it was not uncommon for requirements specifications for our team’s larger software projects to contain a dozen or more UML diagrams capturing system workflows such as trusted IP and digital certificate validations as well as encryption and decryption sequences. But, I was genuinely surprised by the advancements in the standards and the technology available to develop with UML when I returned to it. UML is a standardized modeling notation supported and controlled by the not-for-profit consortium Object Management Group (OMG). The OMG touts UML as the “lingua franca of software development” and cites large government and private organizations that rely on it to visualize the many operations a software system can support (Watson, 2008, p. 2). A major milestone came at the end of 2004 when the UML standard moved from version 1.4 to UML 2.0 (as of this writing, UML is in version 2.5). The 2.0 version included:
13 distinct modeling notations ranging from high-level use case diagrams, which depict the interactions and relationships between (human) actors and major business functions, through to low-level object diagrams which capture instances of individual data objects, their constituent data elements and values, and their relationships with other data objects. (Russell, van der Aalst, ter Hofstede, & Wohed, 2006, p. 1)
In other words, UML was expanding in ways that could account for multiple actors across complex communication networks. A modeler now had the ability to demonstrate the dynamic behavior of an activity system by showing collaborations among objects and actors and changes to the states of those objects. Today, “UML groups together a large number of modeling techniques that were previously scattered among different domains” including BPMN, another standard that the OMG has managed since 2005 (Desfray & Raymond, 2014, p. 99).
Business Process Modeling Notation (BPMN)
Like UML, BPMN was originally conceived on a more limited scale and designed to capture business processes and communications within an organization. Also like UML, BPMN has expanded its modeling abilities so that “the graphical notation will facilitate the understanding of the performance collaborations and business transactions between the organizations” (BPMN, para. 1). BPMN is considered a process modeling notation that includes a grammar “which specif[ies] the syntax and semantics of the graphical elements in a process model and the rules of how to combine the elements” (Recker, 2011, p. 2). Since 2005, research has documented the use of BPMN activity diagrams for modeling these complex transactions (Russell et al., 2006), and the release of BPMN 2.0 in 2010 has stretched the reach of the standard (Recker, 2011). While UML and BPMN are separate standards, because they are both overseen by the OMG, they are managed for compatibility. This means that newer versions of UML software tools will contain the option of using BPMN when creating a new project file. As the VP reminded me, the data models I was preparing needed to be recognizable to a variety of audiences, and industry standards like UML and BPMN were welcome choices. Again, by requiring my audience to adapt to my original activity system diagrams, I ignored an existing standard as well as that audience’s particular beliefs and values about sharing data visualizations.
Technical communication scholarship addressing UML has, by and large, relegated it to be a modeling tool for software development or a means to enhance technical documentation (MacKinnon & Murphy, 2003). At best, it has been referenced as “beneficial when considering the design of what occurs on the screen and within technological systems. These [UML] diagrams are excellent for showing systems, states, and task processes related to these systems and states” (Potts, 2008, p. 2). Across a number of conference proceedings, articles, and finally a book-length project on alternative mapping techniques for studying “social web ecosystems” (2010, p. 1), Potts consistently advises that UML “is not intended to be a way to understand an entire ecosystem of actors participating in ordinary activities” (2008, p. 2). That claim enables Potts to advocate for the use of actor-network theory (ANT) and what she terms ANT diagrams or maps. As described by Potts, the “major tenet of ANT is that all participants, whether they are human or non-human, have equal agency to affect any given situation” (2009, p. 34). The ANT diagrams are developed to trace the distribution of agency across a web of actors assembled around an event. Potts’s point of reference for UML’s inadequacy is Fowler’s UML Distilled: A Brief Guide to the Standard Object Modeling Language (2003), and she cites Fowler consistently across her publications. As a 2003 text, Fowler’s work pre-dates UML 2.0 and the current advancements by the OMG in modeling standards, including OMG’s management of BPMN standards. In fairness, though, the rise of BPMN has been swift. According to Recker (2010), “No other notation has seen such an uptake in such a short time as BPMN has” (p. 182). Certainly, my own initial attempts to visualize the VP’s processes did not make use of the OMG standards.
In my conversations and email exchanges with the VP, I realized that I needed to update my own outlook on UML and now BPMN as potential modeling options as well as research more current sources on UML and BPMN. In a 2010 issue of Technical Communication, Damrau reviewed The Process: Business Process Modeling Using BPMN (2009) and BPMN Method and Style (2009), the latter now in its second edition. Damrau asserts that:
Business process modeling is becoming more prominent for documenting business and system processes. Business systems analysts and technical communicators are the professionals who should be well-versed in the structure of business process modeling (BPM) and the graphical notation or business process modeling notation (BPMN) that accompanies it. (p. 333)
For this project, I acquired Silver’s second edition BPMN Method and Style (2011) as well as the second edition of Podeswa’s UML for the IT Business Analyst (2010). I relied heavily on online BPMN resources to guide my new efforts (Modelio, “Tutorials”; Visual Paradigm, “Introduction to BPMN”; White, 2006). What all of these sources emphasize is that BPMN has emerged as a result of a “clear need for a modeling language for business processes which could be expressive and formal enough but easily understandable also by final users and not only by domain experts” (Chinosi & Trombetta, 2012, p. 124). When it comes to the complex problem solving required of many of today’s technical communicators, BPMN provides a means to visually capture and then easily distribute proposed solutions to the wicked problems the workplace presents. Silver (2011), however, does caution that because BPMN diagrams look like “traditional flowcharts” their “outward familiarity” may mask some of the complexity of diagramming with BPMN (p. 3, emphasis in original). Podeswa (2010) notes that business process diagrams produced using BPMN make use of a “rich symbol set [that] can model complex and subtle workflow requirements” but that a more nuanced use of that symbol set can take time to learn (p. 64). Table 1 catalogs some of the basic elements used in BPMN models.
Even though the VP would not be using the activity system diagrams originally generated to document any of her company’s practices, nor would she be sharing them with her colleagues, the discussion generated by those models spurred a reevaluation of OMG standards. The remainder of the article will illustrate the use of BPMN to model new diagrams as well as reflect on the importance of acquiring proficiency with modeling languages in order to better reduce disconnects between teaching and practice.
Open source UML/BPMN data modeling via Modelio
Advancements in the open source software community have produced stable software for a growing number of markets, and that includes process modeling notation applications. Wikipedia’s entry comparing BPMN tools based on platform, version support, and licensing is a useful resource for reviewing tool options (“Comparison of business process modeling notation tools”). For this study, I turned to the open source application Modelio to revise the VP’s models. Modelio requires Java to run, and it has Linux, Mac, and Windows versions. The version of Modelio used for this project (version 3.2.1) includes the option to build a BPMN diagram when launching a new project. After several hours of working with the application, including viewing and referencing tutorial materials, I was able to put together a new activity network and send it to the VP (see Figure 3).
While the grammar of this initial BPMN model would surely benefit from additional revision, the draft received an enthusiastic response from the VP, and it is shown here to visualize connections among the basic BPMN elements. As she pointed out, revising a process model is much easier when everyone involved is working from a standardized notation and has an (albeit basic) understanding of the grammar. First, the activity network is set up as a pool that contains the original activity systems representing the city’s and the VP’s perspectives. These systems reside within the pool as swim lanes. Modelers use swim lanes to represent the activities of any number of participants, and BPMN diagrams often contain several swim lanes representing actors such as customers, the company, a computer server, and product distributors. The circular elements represent different types of events including start, intermediate, and end events. Events can be assigned different type attributes, and the start and intermediate events shown in Figure 3 are designated as timers to signify the different RFI deadlines. The diamond-shaped elements represent different gateways for regulating process flow. For example, an exclusive gateway indicates a choice with only one information flow continuing on. Other gateways, such as parallel gateways, indicate that multiple processes can be in motion simultaneously. The rounded rectangles in the figure stand in for activities performed during a process. Activities are either a specific task or a type of sub-process activity. Above, the VP’s review of the city’s RFI is represented as a single, discrete task. The other activities in the diagram contain a “+” symbol indicating that there are sub-processes that make up the activity.
Clicking “RFI Draft 001,” for example, would reveal the underlying elements that make up that sub-process, including the VP’s communication flows with her development team and a draft of her RFI submission represented as a data object. Data objects represent different types of data produced or stored as part of a business process. Other data objects in this diagram include the published RFI and the city’s published responses to the vendor questions, both in the city’s swim lane. Future versions of this model that attempt to visualize the more complex RFP process may benefit from moving actors, such as the company owner or a lead developer, into their own swim lanes. As Winsor (1999) reminds us, “complex organizations almost always encompass several subsidiary activity systems with different interests” (p. 201). Sub-processes shown above contain separate business processes, but when actors, such as the owner, become more involved across multiple information flows, they will move to the fore of the diagram.
Part of the power of BPMN diagrams is that they are shareable among users. BPMN 2.0 uses XML to facilitate diagram exchange and collaboration across many notation tools (Silver, 2011, p. 9). Also, the symbols representing the different elements in a process model are standardized and widely recognized by developers, analysts, and other knowledge workers. As academics providing instruction for future technical communicators, we should be cautious of advising students to model systems with proprietary visualizations. For example, reflecting on the construction of her ANT maps, Potts (2014) advises that after the different “nouns,” or actors in a network, have been identified, the “experience architects” developing a map should devise visual “stencils” to stand in for the nouns or the actors (p. 35). The unique icons are needed because “different situations require the use of different kinds of stencils. For example, a situation like a terrorist attack requires a stencil that looks like a bomb, while a situation like a new movie release might require a stencil that looks like a film” (p. 35). Potts goes on to discuss using connecting lines of various thicknesses between the actors to approximate the amount of contact between two actors. A thicker line signifies more contact. Indeed, there may be instances where modeling a new network of communication requires the creation of specialized symbols. With more recent advancements in UML and BPMN 2.0 standards, I would argue for attempting these visualizations with standards-based processes first. Standardization does not necessarily have to mean rigidity—that is, rigidity built into the tool that would prevent modelers from respecting the nuances and fluidity of the type of work knowledge workers perform. It should be noted, too, that most modeling tools have the ability to import custom images and contain freehand drawing and shape tools.
Concerns regarding standards-imposed rigidity, however, are not unfounded. Process modeling has its roots in manufacturing where material flows and production schedules were mapped as a means to increase efficiency and a company’s bottom line (Recker, 2008, pp. 11–13). Models for these processes are akin to what Spinuzzi (2003) critiqued as “formalization methods” that “tend to assume some sort of structure that underlies the work of a range of workers, a structure that can be investigated, modeled, and repaired in such a way as to solve the workers’ general problems” (p. 19). The issue, according to Spinuzzi, is that these methods extend a “victimhood trope” found in fieldwork-to-formalization studies that ultimately devalue a worker’s agency (p. 13). Companies should work to strike a balance between formalizing aspects of their business processes without replacing the “local, idiosyncratic, or contingent solutions” developed by workers in their responses to emergent tasks (p. 21). While the VP and her different, sometimes idiosyncratic, strategies for approaching her work were not in need of rescue of any kind, I wanted to avoid visualizing the very fluid and nuanced portions of her workflow in an overly reductive form. Again, with advancements in the BPMN 2.0 release, there is not only greater flexibility for modeling but also an interest in expanding on the context in which a process occurs. For example, “many process models also include information regarding the involved data, organizational/IT resources and potentially other artifacts such as external stakeholders, performance metrics, context factors and other related information” (Recker, 2011, p. 13).
These external influences and “extrinsic drivers” are the “drivers for flexibility [that] can be found in the context of a process and may include among others time, location, weather, legislation or performance requirements” (Rosemann, Recker, & Flender, 2008, p. 47). Increased flexibility within the UML and BPMN 2.0 standards can mean more nuanced data visualizations that resist mere fieldwork-to-formalization models.
In short, even amateur attempts at using current UML and BPMN tools revealed several reasons technical communicators may wish to explore these tools for their own data visualizations:
- OMG-managed standards are recognized by many knowledge workers, including analysts, engineers, and executives
- XML-based file formats are shareable across teams for collaboration
- Drill-down options for sub-activities account for the greater complexity of systems and networks
- Relationships among data and actors are visualized across systems
- Free/open source UML/BPMN authoring tools are stable enough for industry use
Practitioner results: Business analyst or technical communicator?
From a practitioner standpoint, I felt the need to question the VP about the relationship between the changing roles required of a technical communicator and the job description of a business analyst or those often tasked with analyzing, capturing, and modeling business and/or software processes. This is not, of course, a new conversation. A quick search of the TechWhirl list-serv archives (http://www.techwr-l.com/archives) reveals a number of discussion threads related to business analysts, including “What is a business analyst?” and “Tech Writers Turned Business Analyst.” In another thread titled “Business Analyst vs. Tech Writer,” the author of the post was updating his résumé to include job responsibilities from his most recent positions and those found in his new technical writer position. When he entered some of the key words from his job description into the popular job search website Dice, many of the results that came back were for analyst positions. The author quipped that the companies he had worked for “got a tech writer and an analyst for the price of a tech writer” (Barrow, 2007). Indeed, many analyst responsibilities do not sound foreign to technical communicators.
The International Institute of Business Analysis maintains and publishes the Business Analysis Body of Knowledge (BABOK) now in its third version and available in five languages. In their cataloging of the many responsibilities an analyst may have, they include:
Business analysts must analyze and synthesize information provided by a large number of people who interact with the business, such as customers, staff, IT professionals, and executives. The business analyst is responsible for eliciting the actual needs of stakeholders, not simply their expressed desires. In many cases, the business analyst will also work to facilitate communication between organizational units. (BABOK)
From the perspective of the VP, the proverbial ship has sailed. Technical communicators who are unprepared or unwilling to enter an organization and navigate its complex business processes, as well as work to understand the processes of potential customers and current clients, will find themselves relegated to fewer and smaller tasks. Of course, the person providing this feedback is someone who has herself risen to the position of Vice President. The VP’s job description is vast, and it is presumptuous to assume that all practitioners in our field aspire to VP-level jobs. But, technical communication practitioners should not be surprised to see more demand for business analyst skills. In a recent “My Job” profile written for Intercom, “Jill’s” new technical communication position has a number of new requirements, the first of which is knowledge of “business development and processes” (Woelk, 2015, p. 32). Because our field does lobby frequently for more status among our industry colleagues and has published extensively about our own “power and legitimacy,” these increased requirements can be viewed as a victory (Kynell-Hunt & Savage, 2003, 2004). Our field also claims that technical communicators should be involved earlier in business activities of many kinds, such as the development cycle of a new software product or building a new website. Not only do we want to be involved early, we want to be involved for the duration of the project, including testing and even subsequent product versions. We have made these claims based on our strengths as symbolic analytic workers, and those same analytic skills for complex problem solving are those frequently associated with analysts. This is not a call to clamor for territory among business analysts but to recognize how the boundaries of the fields push against each other and, at points, overlap.
Technical Communication, Globalization and the Importance of Standards
As noted, Damrau suggests that both technical communicators and business analysts “should be well-versed” in BPMN because of its ability to show “the complete end-to-end process-oriented view of a business process” (2010, p. 333). As she summarizes, a BPMN visualization displays “the steps and actors (humans or systems) in a process, describes what information is needed when, and determines where transfers (handovers) take too much time” (p. 333). Because practitioners wrestle with new communication challenges brought on by globalization, standardized models for capturing these complex processes could play a key role in bridging gaps formed from international and cultural communication differences. The VP’s company headquarters are in Europe, and her role was to expand the company’s market share to the U.S. Granted, the communication differences between two Western cultures are relatively small when compared with corporations that have colleagues and customers in China, India, Europe, and the U.S. The VP was adamant in our interviews that standards matter, especially cross-culturally.
Discussing the challenges of developing successful approaches to communication design, Getto and St. Amant suggest that the “trick becomes finding a method that can facilitate communication design practices for global audiences” (2014, p. 25). Standards-based models like BPMN should be among the “tricks” up a technical communicator’s sleeve. The OMG, which maintains BPMN and UML, is an international standards organization, and an international community of researchers publishes much of the scholarship available on BPMN. Writing from the Bucharest University of Economic Studies, Geambaşu (2012) offers her review of BPMN and its abilities to help address “a growing interest of organizations in improving their business processes in order to be more competitive in a globalized economy” (p. 637). Italian colleagues Chinosi and Trombetta (2012) note that to “share a diagram across multiple domains and using many different technologies and softwares is seen as [a] big challenge” (p. 124). In their article on the standard, they conclude that BPMN is now the de facto means “for representing in a very expressive graphical way the processes occurring in virtually every kind of organization” (p. 124). Finally, at the outset of his book, Silver (2011) is adamant about the important role standards play in modeling:
The most important thing about it is that it is a standard. . . . That means it is not owned or controlled by a single tool vendor or consultancy. You pay no fee or royalty to use the intellectual property it represents. Today, virtually every process modeling tool supports BPMN in some fashion. . . . A key benefit of a process modeling standard is that understanding is not limited to users of a particular tool. The semantics are defined by the standard, not by each tool. (p. 3)
Given the emergence of standards-driven process modeling and the possibilities for a specific standard such as BPMN to assist technical communicators in addressing the challenges of international communication, below is a list of suggestions for practitioners interested in exploring BPMN:
- Research and read free documentation on BPMN on the bpmn.org website. The documentation section contains basic overviews and the resources section contains free videos and tutorials of BPMN in action.
- Download and try one of the many free UML/BPMN tools. This project used Modelio from modelio.org and the site’s Quick Start Guide takes the user through download and installation all the way through project creation. As of 2012, Microsoft Visio now supports BPMN 2.0. If technical communicators are already accustomed to working in Visio, this may be a friendlier environment to experiment with BPMN.
- Reevaluate job descriptions in relation to the actual or predominate tasks performed as a technical communicator. As the demands of work evolve, have those demands converted into work typically associated with business analysts? If so, it may benefit a technical communicator to compile those tasks and request an updated job description or change of position in order to better reflect the work performed.
- Using a standards-based process model such as BPMN, attempt to model a business or systems process that technical communicators either participate in or are in the midst of developing. Consider the value the diagrams may bring to all of those involved, especially an international contingent. Circulate the model among those involved with the intent of providing a new perspective on the process.
Conclusions: Implications for Teaching and Practice
Returning to Rude (2009) and her observation that “[t]he connection between pedagogy and practice is close, especially when pedagogy concerns undergraduate or master’s-level preparation for practitioner purposes” (pp. 182, 186), this article closes by reflecting on this (dis)connection. Her article mapping the research questions for the field of technical communication indicates renewed unease for how we prepare students for nonacademic writing, with a specific mention of teaching students “how to present results in ways that are ethical, usable, and appropriate for the need” (p. 195). Among Rude’s maps are overlapping spheres of pedagogy and practice, containing “information design” and “development and management” as the two key subcategories crucial to a practitioner’s success (p. 183). Again, according to Spilka (2009), where we should have an overlap, we instead have a “disconnect” (p. 219). Working with the VP, even in this admittedly small study isolating one RFI, has pushed me to address what may be my own contributions to the disconnect.
My strategy so far has not been to cut theoretical texts exploring genre and activity theory or actor-network theory from research methods reading lists. Because “[q]ualitative sampling is often decidedly theory driven” the texts are important inclusions (Miles, Huberman, & Saldaña, 2014, p. 31, emphasis in original). Students should learn to question, “what work the theory does for us” (Winsor, 1999, p. 201). Much of the scholarship cited in this article represents elegant deployments of a theory—that is, a specific methodology set out to drive a particular method of qualitative sampling. Students benefit from contending with the complex relationship between a theory driving a particular method, but I have found that students often misunderstand the basic difference between a theory or methodology and a method. In other words, before we study them working together, we start by understanding them separately. Citing Sullivan and Porter (1997), Spinuzzi offers an important distinction between method and methodology that I am mindful of sharing with students. Specifically, “they express quite different things. A method is a way of investigating phenomena; a methodology is the theory, philosophy, heuristics, aims, and values that underlie, motivate, and guide method” (2003, pp. 6–7, emphasis in original). The terms are not interchangeable. Graduate students in particular will often begin a class outfitted with their favorite methodology in mind and mistake that methodology as a means in and of itself for conducting a study. If we agree that “[o]ur research questions demand a greater variety of methods” then we will need to be strategic about the instruction of those new methods including qualitative approaches (Rude, 2009, p. 177).
Resources like Koerber and McMichael’s primer on qualitative sampling are good options, as are chapters from Miles, Huberman, and Saldaña (2014) and Saldaña (2013), for showing students study methods as distinct from methodologies. Finally, Smagorinsky (2008), in his often-cited article on the frequently overlooked importance of the role the methods section plays in a qualitative study, is a good addition to a graduate methods course. He advises: “The Method section, then, has evolved to the point where, in order for results to be credible, the methods of collection, reduction, and analysis need to be highly explicit. Further, the methods need to be clearly aligned with the framing theory and the rendering of the results” (p. 392). One of the best ways to reinforce these points is to read work, such as the scholarship cited here, that demonstrates a strong methods section.
Students should also understand that methodologically complex texts with data visualizations that are not recognized as industry standards may not be appropriate models for their work if their next move is to enter industry. The blunt, perhaps even crude, question here is whether or not methodological complexities such as activity theory, genre theory, and actor network theory are worth their efforts. How useful, or even reliable, are they for the field? Those questions are for future projects, but Rude’s (2009) original concern over preparing undergraduate and master’s-level students for the workplace suggests that we should have candid discussions with students about audience for each of the different texts we employ in the classroom. For example, in an otherwise laudatory review of Spinuzzi’s Network: Theorizing Knowledge Work in Telecommunications, Yeats (2010) writes:
Clearly, Spinuzzi’s target audience is not made up of telecommunications executives, but it’s not unreasonable to say that the recommendations for workers and managers in the final chapter of Network could be used to develop strategies and policies that help workers and managers become more effective and efficient in their work. I fear, however, that no VP at Verizon or AT&T will pick up a copy of Network at the airport bookstore. (p. 321, emphasis in original)
And that’s fine; even academics as well published as Spinuzzi are not (generally) writing for traveling executives passing through O’Hare. Spinuzzi’s deft deployment of activity theory and ANT in Network has received widespread acclaim in academic communities, and it challenges graduate students at all levels. But, my mistake with modeling data for the VP in my study was thinking that I was writing for her (an executive audience) when I was writing and modeling with more academic tools (perhaps) better received by an academic audience. The hexagon models representing activity systems offered by Hart-Davidson, Spinuzzi, and Zachry (2009) were also not necessarily designed for the business process modeling I was attempting to use them for. But if that is the case, how do we teach our students standardized data visualizations and modeling in our research methods courses so that they will have value outside of the classroom? Again, current qualitative methods sourcebooks are effective points of departure.
Miles, Huberman, and Saldaña (2014) divide visual displays into matrices and network models. The tabular form of a matrix represents data in standard rows and columns and “collects and arranges data for easy viewing in one place, permits detailed analysis, and sets the stage for later cross-case analysis with other comparable cases or sites” (p. 111). Network displays are visual models that resemble UML and BPMN diagrams. They are a “collection of nodes or points connected by links or lines that display streams of participant actions, events, and processes” (p. 111). The nodes visualized in the many samples offered by Miles, Huberman, and Saldaña are often circles, ovals, and rectangles representing actors or attributes enmeshed in a network. Such networks are recommended for “a case-oriented approach that re-creates the ‘plot’ of events over time, as well as showing complex interrelationships between variables” (p. 111). Again, the authors recommend a number of CAQDAS applications for qualitative projects, most of which have the ability to export visual displays of case data. Microsoft applications such as Word, Excel, and PowerPoint will also suffice for creating matrices and networks.
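A node-link network display of the kind described above can also be generated programmatically. The sketch below, offered as one possible classroom exercise rather than a method from the sourcebook, uses hypothetical actors and relationship labels to emit Graphviz DOT text, a plain-text graph format that many free tools can render.

```python
# Sketch of a simple node-link "network display": nodes are actors,
# links are labeled relationships. The actors and labels here are
# hypothetical placeholders, not data from the study. The output is
# Graphviz DOT text, which free DOT-aware tools can render visually.

# (source actor, target actor, relationship label) triples
links = [
    ("VP", "City", "submits RFI response"),
    ("City", "VP", "publishes RFI"),
    ("VP", "Dev team", "requests draft input"),
]

def to_dot(links):
    """Serialize the link triples as a directed graph in DOT syntax."""
    lines = ["digraph network {"]
    for src, tgt, label in links:
        lines.append(f'  "{src}" -> "{tgt}" [label="{label}"];')
    lines.append("}")
    return "\n".join(lines)

dot_text = to_dot(links)
print(dot_text)
```

Because the output is plain text, a display like this shares the property the article values in BPMN: it can be versioned, emailed, and re-rendered by collaborators without requiring a particular proprietary tool.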
However, those modeling larger projects intended to capture business processes should consider selecting a UML/BPMN tool. Selecting a Microsoft product “is fine for initial descriptions, which require easy, flexible usage, but quickly turns out to be counterproductive when managing a structured whole over time” (Desfray & Raymond, 2014, p. 230). Again, the latest version of Microsoft’s Visio does support BPMN 2.0 and, provided that a practitioner or student has access to it, Visio may offer a more familiar interface for exploring process modeling. But, as a free, open source application, Modelio is an easy way to get students producing data models that are scalable and standards-based. Regardless of the specific application selected, “BPMN support is available in most UML tools” (Desfray & Raymond, 2014, p. 230). Technical communication instructors should consider introducing these modeling tools to their students and at least give students the option of modeling their next project using these standards-based notations. With strategic adjustments to our pedagogy, we can better prepare students to visualize complex networks in formats that are easily shared and recognized by a large, professional audience.
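One practical payoff of choosing a standards-based tool is that BPMN 2.0 models serialize to a documented XML interchange format, so a process drawn in one compliant application can, in principle, be opened in another. As a sketch of what that format looks like (the namespace and element names follow the OMG BPMN 2.0 standard; the process itself, a single “Draft report” task, is an invented example), a minimal start–task–end process can be generated like this:

```python
# Sketch: generate a minimal BPMN 2.0 process (start -> task -> end) as XML.
# The namespace and element names come from the OMG BPMN 2.0 standard; the
# process content ("Draft report") is an invented example.
import xml.etree.ElementTree as ET

BPMN = "http://www.omg.org/spec/BPMN/20100524/MODEL"
ET.register_namespace("", BPMN)  # serialize with BPMN as the default namespace

defs = ET.Element(f"{{{BPMN}}}definitions")
proc = ET.SubElement(defs, f"{{{BPMN}}}process", id="reporting")

# Three flow nodes: a start event, one task, and an end event.
ET.SubElement(proc, f"{{{BPMN}}}startEvent", id="start")
ET.SubElement(proc, f"{{{BPMN}}}task", id="draft", name="Draft report")
ET.SubElement(proc, f"{{{BPMN}}}endEvent", id="end")

# Sequence flows connect the nodes in order.
for i, (src, dst) in enumerate([("start", "draft"), ("draft", "end")]):
    ET.SubElement(proc, f"{{{BPMN}}}sequenceFlow",
                  id=f"flow{i}", sourceRef=src, targetRef=dst)

xml = ET.tostring(defs, encoding="unicode")
print(xml)
```

Students rarely need to write this XML by hand; tools such as Modelio produce it for them. Seeing the serialization, however, makes concrete why a standards-based model travels between tools while a proprietary drawing does not.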
Aschwanden, B. (2015). Technical communication: State of the industry. Intercom, November/December, 15–17.
BABOK. (n.d.). What is business analysis? BABOK Guide v2 Online. Retrieved from http://www.iiba.org/babok-guide/babok-guide-v2/babok-guide-online/chapter-one-introduction/1-2-what-is-business-analysis.aspx
Barrow, J. (2007, March 11) Business analyst v. tech writer. TechWhir-L. Retrieved from http://www.techwr-l.com/archives/0703/techwhirl-0703-00505.html#.VctcHZ1VhBc
Barton, E. (2001). More methodological matters: Against negative argumentation. College Composition and Communication, 51, 399–416.
Bosley, D. S. (1994). Feminist theory, audience analysis, and verbal and visual representation in a technical communication writing task. Technical Communication Quarterly, 3, 293–307.
BPMN – Business Process Model and Notation. (n.d.). Charter. Retrieved from: http://www.bpmn.org/
Campbell, K. (1999). Collecting information: Qualitative research methods for solving workplace problems. Technical Communication, 46, 532–545.
Charney, D. (2015). Getting to “How do you know?” Rather than “So what?” From “What’s new?” Technical Communication Quarterly, 24, 105–108.
Chinosi, M. & Trombetta, A. (2012). BPMN: An introduction to the standard. Computer Standards & Interfaces, 34, 124–134.
Comparison of business process modeling notation tools. (n.d.). In Wikipedia. Retrieved from https://en.wikipedia.org/wiki/Comparison_of_Business_Process_Modeling_Notation_tools
Damrau, J. (2010). [Book review of The process: Business process modeling using BPMN by A. Grosskopf, G. Decker, & M. Weske and BPMN method and style by B. Silver]. Technical Communication, 57, 333–334.
Desfray, P. & Raymond, G. (2014). Modeling enterprise architecture with TOGAF®: A practical guide using UML and BPMN. Waltham, MA: Morgan Kaufmann.
Dumas, M., La Rosa, M., Mendling, J., & Reijers, H. A. (2013). Fundamentals of business process management. New York, NY: Springer.
Foss, S. K. (1994). A rhetorical schema for the evaluation of visual imagery. Communication Studies, 45, 213–224.
Fowler, M. (2003). UML distilled: A brief guide to the standard object modeling language (3rd ed.). Reading, MA: Addison-Wesley.
Geambaşu, C. V. (2012). BPMN vs. UML activity diagram for business process modeling. Accounting and Management Information Systems, 11(4), 637–651.
Getto, G. & St. Amant, K. (2014). Designing globally, working locally: Using personas to develop online communications products for international users. Communication Design Quarterly, 3(1), 24–46.
Harder, C. & Jordan, M. (2013). The transparency of county websites: A content analysis. Public Administration Quarterly, 37, 103–128.
Hart-Davidson, W., Spinuzzi, C., & Zachry, M. (2009). Visualizing patterns of group communication in digital writing. RSA 2009 Workshop. June 26–28.
Hashimov, E. (2015). [Book reviews of Qualitative data analysis: A methods sourcebook by M. B. Miles, M. Huberman, & J. Saldaña and The coding manual for qualitative researchers by J. Saldaña]. Technical Communication Quarterly, 24, 109–112.
Henry, J. (2000). Writing workplace cultures: An archeology of professional writing. Carbondale, IL: Southern Illinois University Press.
IBM. (2016). IBM Cognos analytics on cloud. Retrieved from http://public.dhe.ibm.com/common/ssi/ecm/yt/en/ytb03039caen/YTB03039CAEN.PDF
Johnson-Eilola, J. (2005). Datacloud: Toward a new theory of online work. New York, NY: Hampton Press.
Koerber, A. & McMichael, L. (2008). Qualitative sampling methods: A primer for technical communicators. Journal of Business and Technical Communication, 22, 454–473.
Kynell-Hunt, T. & Savage, G. J. (Eds.). (2003). Power and legitimacy in technical communication, volume I: The historical and contemporary struggle for status in technical communication. Amityville, NY: Baywood.
Kynell-Hunt, T. & Savage, G. J. (Eds.). (2004). Power and legitimacy in technical communication, volume II: Strategies for professional status. Amityville, NY: Baywood.
Latour, B. (2005). Reassembling the social. New York, NY: Oxford University Press.
MacKinnon, N. & Murphy, S. (2003). Designing UML diagrams for technical documentation. SIGDOC ’03, October 12–15, pp. 105–112.
MacNealy, M. S. (1999). Strategies for empirical research in writing. New York, NY: Addison Wesley Longman.
Medvedev, P. N. & Bakhtin, M. M. (1978). The formal method in literary scholarship: A critical introduction to sociological poetics. Baltimore, MD: Johns Hopkins U.P.
Miles, M. B., Huberman, M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook. Thousand Oaks, CA: SAGE.
Miller, C. R. (1984). Genre as social action. Quarterly Journal of Speech, 70, 157–178.
Modelio. (n.d.). Tutorials. Retrieved from: https://www.modelio.org/documentation/tutorials.html
Podeswa, H. (2010). UML for the IT Business Analyst: A practical guide to requirements gathering using the unified modeling language. (2nd ed.). Boston, MA: Cengage.
Porter, J. (2013). Chapter 5: How can rhetoric theory inform the practice of technical communication? In J. Johnson-Eilola & S. Selber (Eds.), Solving problems in technical communication (pp. 125–145). Chicago: University of Chicago Press.
Potts, L. (2008). Designing with actor network theory: A new method for modeling holistic experience. Proceedings of the IEEE International Professional Communication Conference. 1–6.
Potts, L. (2009). Designing for disaster: Social software use in times of crisis. International Journal of Sociotechnology and Knowledge Development, 1(2), 33–46.
Potts, L. (2010). Consuming digital rights: Mapping the artifacts of entertainment. Technical Communication, 57, 300–318.
Potts, L. (2014). Social media in disaster response: How experience architects can build for participation. New York, NY: Routledge.
Recker, J. (2010). Opportunities and constraints: the current struggle with BPMN. Business Process Management Journal, 16, 181–201.
Recker, J. (2011). Evaluations of process modeling grammars: Ontological, qualitative and quantitative analyses using the example of BPMN. New York, NY: Springer.
Reich, R. (1991). The work of nations: Preparing ourselves for 21st-century capitalism. New York, NY: Knopf.
Rice, J. (2009). Networked exchanges, identity, writing. Journal of Business and Technical Communication, 23, 294–317.
Rosemann, M., Recker, J., & Flender, C. (2008). Contextualization of business processes. International Journal of Business Process Integration and Management, 3(1), 47–60.
Rude, C. (2009). Mapping the research questions in technical communication. Journal of Business and Technical Communication, 23, 174–215.
Russell, D. (1997). Rethinking genre in school and society: An activity theory analysis. Written Communication, 14, 504–554.
Russell, N., van der Aalst, W., ter Hofstede, A., & Wohed, P. (2006). On the suitability of UML 2.0 activity diagrams for business process modelling. Third Asia-Pacific Conference on Conceptual Modelling. Hobart, Australia, 1–10.
Saldaña, J. (2013). The coding manual for qualitative researchers. Thousand Oaks, CA: SAGE.
Silver, B. (2011). BPMN method and style (2nd ed.). Aptos, CA: Cody-Cassidy Press.
Smagorinsky, P. (2008). The method section as conceptual epicenter in constructing social science research reports. Written Communication, 25, 389–411.
Spilka, R. (2009). Practitioner research instruction: A neglected curricular area in technical communication undergraduate programs. Journal of Business and Technical Communication. 23, 216–237.
Spinuzzi, C. (2003). Tracing genres through organizations: A sociocultural approach to information design. Cambridge, MA: MIT Press.
Spinuzzi, C. (2007). Guest editor’s introduction: Technical communication in the age of distributed work. Technical Communication Quarterly, 16, 265–277.
Spinuzzi, C. (2008). Network: Theorizing knowledge work in telecommunications. New York, NY: Cambridge University Press.
Spinuzzi, C. (2013). Topsight: A guide to studying, diagnosing, and fixing information flow in organizations. CreateSpace Independent Publishing Platform.
Sullivan, P. & Porter, J. (1997). Opening spaces: Writing technologies and critical research practices. Greenwich, CT: Ablex.
Swarts, J. (2010). Recycled writing: Assembling actor networks from reusable content. Journal of Business and Technical Communication, 24, 127–163.
Van Nostrand, A.D. (1994). Chapter 8: A genre map of R&D knowledge production for the US Department of Defense. In A. Freedman and P. Medway (Eds.) Genre in the new rhetoric. (pp. 111–121). Bristol, PA: Taylor & Francis.
Van Nostrand, A.D. (1997). Fundable knowledge: The marketing of defense technology. Mahwah, NJ: Lawrence Erlbaum Associates.
Visual Paradigm. (n.d.). Introduction to BPMN. Retrieved from: http://www.visual-paradigm.com/tutorials/bpmn1.jsp
Watson, A. (2008). Visual modelling: Past, present, future. Retrieved from: http://www.uml.org/Visual_Modeling.pdf
White, S. (2006). Introduction to BPMN. Retrieved from: http://www.omg.org/news/meetings/workshops/soa-bpm-mda-2006/00-T4_White.pdf
Whittemore, S. (2012). Immutable mobiles revisited: A framework for evaluating the function of ephemeral texts in design arguments. Journal of Technical Writing and Communication, 42, 413–430.
Wilson, G. (2001). Technical communication and late capitalism: Considering a postmodern technical communication pedagogy. Journal of Business and Technical Communication, 15, 72–99.
Winsor, D. (1996). Writing like an engineer: A rhetorical education. Mahwah, NJ: Erlbaum.
Winsor, D. (1999). Genre and activity systems: The role of documentation in maintaining and changing activity systems. Written Communication, 16, 200–224.
Winsor, D. (2001). Learning to do knowledge work in systems of distributed cognition. Journal of Business and Technical Communication, 15, 5–28.
Woelk, M. (2015). How “Jill” landed her dream techcomm career. Intercom, November/December, 32.
Yeats, D. (2010). [Book review of Network: Theorizing knowledge work in telecommunications by C. Spinuzzi]. IEEE Transactions on Professional Communication, 53, 320–321.
About the Author
Brian D. Ballentine is an associate professor and the associate chair for the Department of English at West Virginia University. Previously, he was a senior software engineer for a major medical corporation designing user interfaces for Web-based radiology applications. His publications have appeared in journals such as Technical Communication, IEEE Transactions on Professional Communication, Technical Communication Quarterly, Computers & Composition, the Journal of Technical Writing and Communication, and Across the Disciplines as well as in several collected editions. He is available at Brian.Ballentine@mail.wvu.edu.
Manuscript received 13 August 2015, revised 6 April 2016; accepted 15 April 2016.