60.2, May 2013


Abstract

Purpose: In this article I propose an audience analysis instrument designed to assess the underlying predispositions of representative members of a desired target population: the sources of information they privilege, their motivations toward environment-related action, and the commonplaces that impact their perceptions of environment-related communication. The goal of this method is to offer a time- and cost-effective instrument that enables organizations to easily classify an audience’s interest in environmentalism, assess their willingness to listen to and accept environment-related messaging, and pinpoint the commonplace elements likely to be most useful in constructing environment-related communication.

Method: I developed the interview, coding sheet, instructions for completing the process, and glossary that make up the Deep Audience Analysis instrument from existing data presented elsewhere (Ross 2013, 2012, 2008), and refined the instrument through both participatory design and usability testing.

Results: The results of my testing suggest that the Deep Audience Analysis tool I propose is both valid and reliable, but training with the instrument would prove beneficial, and triangulation with multiple coders is optimal. Organizations adopting this method of audience analysis would do well to practice with the instrument and have norming sessions before putting the instrument to use in the field.

Conclusion: The audience analysis instrument I propose here includes an interview script, code sheet, instructions for completing the process, and glossary. The method proposed here should serve as a time- and cost-effective paper-based strategy for organizations wishing a deeper understanding of their audience(s).

Keywords: audience analysis, environmental communication, environmental rhetoric, organizational communication

Practitioner's Takeaway

  • Effective audience analysis allows individuals and organizations to communicate more successfully with their audience(s).
  • Audience analysis measures are often expensive, complex, or yield non-target data that hinders application of the analysis results.
  • The instrument I propose is a qualitative, practical, analog, step-by-step approach that does not require training in statistical analysis or software packages and yields an overarching audience profile designed for direct application.
  • The instrument appears both valid and reliable and could be easily modified for use outside of environment-related communication.

Introduction

“To write, to engage in any communication, is to engage in a community: to write well is to understand the conditions of one’s own participation—the concepts, values, traditions, and style which permit identification with that community and determine the success or failure of communication” (Miller, 1979, p. 617).

In the summer of 2011, the Sierra Club released an Internet-based video detailing the problems associated with once-through cooling in coal plants. “Chopper,” animated by Pulitzer Prize-winning cartoonist Mark Fiore (Sierra Club, 2011), features a lilting-voiced narrator and cartooned elements, such as a talking coal plant. The video’s stated intent is to encourage its audience to “send a message to the EPA [Environmental Protection Agency]” to “urge the Agency to protect our fish and waterways by requiring water-saving, fish-protecting cooling towers on power plants!”  According to a representative from the Sierra Club, the primary audience for the video consisted of the general public, and more directly, members of the Sierra Club. The club chose key phrases and images, such as a river surface coated with cartoon blood, fish parts, and bones to trigger action while presenting a large amount of complex information—how once-through cooling in power plants works and its environmental impact—in a brief, yet positive and engaging manner (Sierra Club Representative, personal communication, January 11, 2012).

When the video debuted, I was teaching an environmental rhetoric, ethics, and public policy graduate-level course. We watched the video. My students—for the most part 20-30 year-old females who claimed awareness of Internet use and social networking skills—did not respond as we believed the Sierra Club intended. They laughed, not with the video, but at the narrator, the child-like song, and the talking coal plant. They commented that the video could not be taken seriously and agreed they were not inspired to take action. The Sierra Club representative with whom I spoke, however, explained that the video ultimately did work—it garnered over 30,000 trackable views and facilitated communication with the EPA—and why it worked: careful audience analysis and dissemination. The Club initially included a link to the video in an email to Sierra Club members, targeted partially based on key words and phrases from the Club’s extensive member database.  Ultimately, however, views of the video itself were fewer than some of their other projects (according to the representative, for example, their Beyond Coal Breakdance Mob garnered over a million views), and only one comment from the public-at-large (at the time of this writing) had been posted to the video’s YouTube page: “What a bunch of BS wrapped in a cartoon” (YouTube, 2012).

The “Chopper” video illustrates an interesting case of environmental messaging in relation to audience analysis—environmental messages must be tailored to their audience in order to be effective. Movement outside of an intended primary audience appears to lessen impact or, as the one response on YouTube suggests, even to have a negative effect. Not all organizations have the tools that the Sierra Club uses to analyze its audiences. Not all organizations can afford unintended effects beyond the intended audience (see, for example, Lancaster, 2006). While the Club uses its extensive membership database to determine how best to create and distribute messages to members, and even employs a polling and research strategist (personal communication, January 11, 2012), other organizations may need to use less expensive, more immediately available means to learn how to shape their messages. As noted by Swenson, Constantinides, and Gurak, “audience analysis methodologies [such as developing and maintaining an effective database like that employed by the Sierra Club] are often difficult, time consuming, and economically unfeasible” (2002, p. 350).

My goal with this paper is to propose an audience analysis instrument designed to assess the underlying predispositions of representative members of a desired target population: the sources of information they privilege, their motivations toward environment-related action, and the commonplaces that impact their perceptions of environment-related communication. The goal of this method is to offer a time- and cost-effective instrument that enables organizations to easily classify an audience’s interest in environmentalism, assess willingness to listen to and accept environment-related messaging, and pinpoint the commonplace elements likely to be most useful in constructing environment-related communication. To create this instrument, I developed an interview script, code sheet, instructions for completing the process, and glossary based on existing interview data. I refined the coding package through both participatory design and usability testing.

Evolving Audience Analysis 

In 1975, Walter Ong suggested that audience is a “collectivity” (1975, p. 11) where members read individually, but those with shared interests can function as a cohesive whole. Ong argued that the writer must “construct in his imagination, clearly or vaguely, an audience cast in some sort of role […]” (p. 12). In doing so, he notes, the writer imagines an audience to use as a tool for shaping creative work. Such a view creates an imagined intimacy with the reader that allows complex and emotional communication and interaction. As Simons (1976) points out, different types of audience interaction contribute to decision-making, as do the rhetorical strategies employed by writers. Ong’s construction, then, is a persistent argument and so serves as a fitting beginning toward a consideration of audience.

Ong’s construction of the audience as fictional, however useful a tool, raises problems for technical communicators. The impact of our writing is often direct and measurable: our audiences, as Tomlinson (1990) and Warren (1993) note, are not imaginary. They are “flesh-and-blood individuals who buy, open, and read any printed materials” (Goodwin, 1991, p. 100) who have “values, beliefs, perspectives, knowledge, authorities, politics, expectations, and constraints that enable or limit their ability to read and use technical documents” (Bosley, 1994, p. 296). Even if we must design work for large, somewhat ambiguous collectives, we still need to pinpoint actual users to serve as representatives so that we can effectively design information for the rest of the imagined construct (see, for example, Swenson, Constantinides, & Gurak, 2002). Thus, audience becomes an increasingly complex problem in technical communication—technical communicators are tasked with learning how to define their audience, deciding how best to analyze them, and then determining how to design information for them.

Defining Audience

In “Audience Addressed/Audience Invoked,” Ede and Lunsford (1984) address the ways writers consider audience via the differences between an “addressed” and “invoked” audience. An “addressed” audience is emphasized by writers who value real-world writing and believe that understanding of audience may be achieved through observation and analysis, while an “invoked” audience is used by those who believe that audience complexity precludes ever truly knowing how an audience will respond to a piece of writing. Both views are valuable, particularly to technical communicators—and advances in theory, analysis techniques, and technology now allow us to understand audiences more completely than ever before (see, for example, Van Velsen, Van der Geest, & Steehouder, 2010).

Hovde (2000) notes that technical communicators often struggle when they lack access to, or awareness of, their audience (pp. 398-399). While several studies note that many organizations do help technical communicators gain an understanding of their target audience (Bist, Dixon, & Chadwick, 1993; Floreak, 1989; MacKinnon, 1993), other studies (Simpson, 1989) note that direct access may be difficult or impossible. However, it cannot be denied that direct and timely access to real members of an audience/population is most useful (Hovde, 2000, p. 429; Johnson, 1997, p. 363; Schriver, 1997, p. 161; Spilka, 1990, pp. 45-49).

Perhaps one reason why audience analysis has at times been attacked (Cohen, 1990) is because it is easy to make the mistake of viewing audience as a passive recipient in the communication process, rather than a user, doer, or participant (Slack, Miller, & Doak, 1993). Once we accept that our audience members actively use the information we design and that they participate in popular culture while contributing to knowledge creation (Hermes, 2009), we can make an effort to pinpoint representative users and begin to assess needs and expectations more effectively.

Analyzing Audience

Alberty (1997) notes that Ede and Lunsford’s approach to audience analysis (1984) involves two essential elements: audience analysis and audience awareness. The analysis component seeks concrete information such as demographics and cultural associations, while the awareness component, as described by Caricato (2000), draws inferences from analyses and utilizes direct feedback from representative members of a target audience. Both modes are deeply entwined and are vital for the complex understanding of users needed by effective communicators.

Both of Ede and Lunsford’s approaches rely on a demographic component for an initial snapshot of audience, and it is a key component of most audience analysis systems. An analytical approach that draws inferences about an audience from demographic information, however, risks stereotyping (Black, 1989; Long, 1990) or can entirely overwhelm a writer with nonessential information. It becomes the author’s role to determine what information is needed for effective design. As Dragga and Gong (1989) note:

The modern ability to gather seemingly infinite amounts of demographic information […] requires that the gathering of data be limited to only that information which has relevance to the speaker’s specific communicative aim. This purpose-oriented analysis is the initial modification of the Aristotelian formulation: instead of asking all possible questions about a particular audience, the speaker asks only those questions that deal with the rhetor’s specific rhetorical purpose. (p. 20)

Thus, as Hovde (2000) writes, “simply collecting facts about an audience may be insufficient” (p. 398). While we know that demographic variables are important and can significantly impact the way information is designed and published (see Lippincott’s 2004 discussion of age-related variables, for example), determining which variables to focus on, and why, should be an initial component of the analysis process.

Houp and Pearsall (1988) offer questions related to knowledge, experience, relationship, and persona (pp. 20-21), which form the framework of more complex modes of audience analysis—methods which consider factors such as prior knowledge, projected responses to visual layout, reading level, and comfort with the knowledge being presented (for example, Allen, 1989). These increasingly complex methods of analysis can be sorted into types. Warren (1993), for example, notes three “increasingly complex” (p. 83) classes: demographic analyses, which ask questions about people to infer group characteristics; organizational analyses, which work to determine a reader’s organizational role and needs; and psychological analyses, which ask what readers need to know, what they need help understanding, and what the writer wants them to do with the information.

Similar to Warren’s organizational and psychological labels are cognitive-based or intuition-driven approaches. As described by Bocchi (1991), a cognitive-based approach “centers on the individual writer’s creation of a text for the rhetorical situation, essentially a problem-solving process” (p. 153). Such an approach is illustrated in Pearsall’s (1997) suggestion that authors document key elements related to audience, such as: “Reader’s concerns and characteristics;” “Reader’s education and experience in the subject area;” and “Reader’s attitudes toward my purpose.” He suggests that once you have completed this analysis, “you are ready to choose your content and to organize it in the way that best suits your purpose and audience” (p. 11). Similar approaches are often espoused in technical communication textbooks and accompanied by forms to aid the author (see, for example, Markel, 2010, p. 102; and Burnett, 2001, p. 69).

An intuition-driven approach (Bocchi, 1991; Schriver, 1997) uses conversations with members of a community to help form models of the rest of the community. As Schriver notes, intuition-driven audience analysis involves the creation of a “mental construct of imagined readers” (p. 157), which allows communicators to visualize the audience to whom they are writing, then shape communication to address that imagined audience’s needs and expectations. The intuition-driven approach is similar to Longo’s (1993, 1995) value-driven approach, which suggests that an author can determine “how the reader’s community values the subject” through “articles in journals and newspapers, interviews with community members, and your interpretations of graphic material” (1993, p. 168). Referring to Young, Becker, and Pike (1970), Longo notes that some respondent/community values will be more important than others, so the designer must give those socially-important elements priority.

To the intuition-driven approach, Karen Schriver (1997) adds classification- and feedback-driven approaches. In the classification-driven approach, “communicators begin their analysis by brainstorming about the audience and by cataloging audience demographics (for example, age, sex, income, educational level) or psychographics (for example, values, lifestyles, attitudes, personality traits, work habits)” (p. 155). A feedback-driven approach is one in which the analysis is based on an examination of “real readers interpreting real texts” (p. 160). Such an approach is akin to participatory design or usability testing. In this model of audience analysis, readers are given copies of a text, then observed and questioned as they engage with the content. Such an approach helps a communicator understand how a reader will approach the communication artifact, and what they will likely take away from the encounter.

In Multidimensional Audience Analysis for Dynamic Information, Albers (2003) notes that “current methods [of audience analysis] are good at defining what data to collect, but are weak at approaches to analyzing and interpreting that collected data” (pp. 264-265). He describes most techniques as effective but lacking because they leave out the important element of helping the writer determine relationships among answers. Albers suggests that “the difficulty of both collecting the data and performing that analysis prevents many writers from using an audience analysis beyond a superficial level,” and notes that a multidimensional approach to audience analysis that accounts for an audience’s needs and expectations in relation to levels of knowledge, detail, and cognitive abilities (p. 266) could be more effective. Dimensions operate independently, so based on audience analysis a writer may need to structure information to, for example, account for an audience’s low knowledge about a subject while considering their high cognitive abilities. Albers’s model, in many ways an updated and expanded version of Hart’s (1996) reconsideration of the “five W’s” (“who?,” “what?,” “when?,” “where?,” and “why?”) as a path toward user-centered design, supports user-centered design of information and also lends mobility to information post-design. After analysis, information is structured to enable readers to creatively utilize information as needed. The dimensionality allows for more complex views of audience than many prior methods.

In terms of dimensionality, audience can be considered on many different levels. Albers notes the value of considering “three distinct dimensions […]: knowledge level, detail level, and cognitive abilities.” He notes that “depending on the situation, other dimensions may also come into play, with social or cultural factors being a common one” (2003, p. 266). Similarly, Turns and Wagner (2004) note that audience can be considered in relation to:

  • Role (of user)
  • Goals (what a user wants/needs to accomplish)
  • Knowledge (what a user brings to the table)
  • Human factors (physical/cognitive limitations)
  • Circumstances of use (environmental factors, including technological limitations)
  • Culture (beliefs, language, traditions, and values—see, for example, Hoft, 1999, p. 69, p. 72)

In her presentation on Key Attribute Audience Modeling at the 2010 Conference on College Composition and Communication, Karen Schriver described a three-dimensional model that incorporated audience expertise, motivation, and anxiety.  She suggested that audience-modeling tools should limit their considerations to just three attributes and argued that it is difficult for an information designer to hold more than three or four audience aspects in ready memory at any given time.

Multidimensional methods of audience analysis are wonderfully complex ways of gaining an understanding of audience—they allow technical communicators to comprehend audience needs and expectations, potentially facilitating highly effective communication. The same complexity that makes them so effective, however, also puts them at a disadvantage: an author working under a strict timeline or a strict budget, or without the means to collect or interpret that level of data, cannot fully use these techniques.

The Deep Audience Analysis method I propose here acknowledges the limits of human cognition by keeping to a profile which includes only one overarching category, three commonplace elements to invoke, and a favored mode of communication.  It strives to account for the myriad approaches to audience analysis which the aforementioned authors—and more—have outlined. In doing so, I propose a method that is designed to be both time and cost effective while maximizing actionable content.

Deep Audience Analysis: A Brief Overview

The Deep Audience Analysis instrument I propose (Appendix 1) includes an information sheet, interview protocol, coding worksheet, classification worksheet, and a glossary accompanied by descriptive appendices. It is designed to provide an end-classification audience profile which an information designer may use as a heuristic when constructing audience-specific environment-related communication.

The package begins with an explanatory sheet that offers context for the systematic audience analysis process, which proceeds as follows:

  1. Conduct and transcribe interview. See “General Interview Protocol” for steps.
  2. Code and mark up interview transcript. See “Coding Process” for steps.
  3. Complete worksheets A and B based on transcription coding and markup.
  4. Assign a Final Profile based on your findings.
  5. Use your final profile to shape, or re-shape, your messaging to emphasize elements indicated as positive (+) and avoid elements indicated as negative (-).
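
For readers who think in data structures, the profile the steps above yield—one overarching category, commonplace elements marked positive (+) or negative (-), and a favored mode of communication (as described later in this article)—can be sketched in code. This is an illustrative sketch only; the class and field names are hypothetical and the instrument itself is paper-based, not software.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the audience profile the DAA process yields.
# Names here are illustrative; they are not part of the instrument.

@dataclass
class DAAProfile:
    overarching_category: str                          # interest level in environmentalism
    commonplaces: dict = field(default_factory=dict)   # commonplace -> "+" or "-"
    favored_mode: str = ""                             # preferred mode of communication

    def emphasize(self):
        """Commonplaces flagged positive (+): emphasize these in messaging."""
        return [c for c, v in self.commonplaces.items() if v == "+"]

    def avoid(self):
        """Commonplaces flagged negative (-): avoid these in messaging."""
        return [c for c, v in self.commonplaces.items() if v == "-"]

# A hypothetical respondent profile built from worksheet results
profile = DAAProfile(
    overarching_category="engaged",
    commonplaces={"Experience": "+", "Balance": "+", "Extremism": "-"},
    favored_mode="print brochure",
)
print(profile.emphasize())  # ['Experience', 'Balance']
print(profile.avoid())      # ['Extremism']
```

The point of the sketch is step 5 of the process: the profile is not an end in itself but a lookup structure that tells the author which elements to foreground and which to suppress when shaping a message.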

Users are guided through a thirteen-point interview—intended to be conducted with members of the target audience population—and then given information on how to effectively transcribe the interview. Following transcription, users are instructed to code the transcript and are given coding guidance in the form of explanations of commonplaces and sub-topics, suggestions on what to look for when coding, and examples showing marking and color-coding techniques for marking up the transcript. Users are then instructed on how to fill out two worksheets based on their coding findings. These worksheets lead the user to the final profile sheets, and instructions and examples are provided to help them fill out these sheets. The Deep Audience Analysis coding package concludes with two glossaries to help users more fully understand the worksheets and process.

This normative, systematic process offers an analysis method which, while requiring some practice, results in replicable, reliable audience profiles without the need for training beyond the coding system itself, in statistical analysis, or in software packages. The process yields an overarching audience profile designed for direct application. One of the benefits of the method I propose is that it not only results in information about the audience, but also directs the author toward potential uses of that information.

Justifications for the Deep Audience Analysis Method

In 1982, Douglas Park argued:

To learn how to systematically analyze audience in discourse, […] it seems best to avoid the metaphor, to replace the question, ‘who is the audience?’ with a set of more precise questions as to how the piece in question establishes or possesses the contexts that make it meaningful for readers. (p. 252)

The Deep Audience Analysis method (DAA)—“deep,” because of the levels of specificity resulting from the end-product of analysis—adds to methods of analysis designed to establish meaningful context by encouraging a writer to engage with his or her intended audience and ask how they perceive the environment-related information that shapes—and is shaped by—their worldviews. In doing so, DAA helps communicators more effectively design information for their audiences.

The method proposed here falls largely into the intuition- and value-driven approaches, and eschews most demographic variables in favor of a social-contextual approach that ultimately establishes the environmental commonplaces (see Ross, 2013, 2012, 2008) which motivate a reader to action. Such an approach fosters audience awareness through analysis (see Alberty, 1997; Caricato, 2000), thereby addressing Hovde’s (2000) concerns that technical communicators lack such awareness. The focused nature of DAA means that information designers wanting to acquire more information about audiences for environment-related information production can follow the process as given without having to address the “where do we begin” aspects of audience analysis.

The few demographic variables collected through DAA interviews are collected and coded in such a way as to address Black’s and Long’s (1989, 1990, respectively) concerns about stereotyping and Lippincott’s (2004) concerns about age-related variables. Information coded in the DAA process does not yield a profile based on a person’s physique or claimed identity, but rather on the words and phrases they use and the patterns that evolve as they respond to structured interview questions. The approach considers Dragga and Gong’s (1989) concern about overwhelming amounts of data, and Warren’s demographic, organizational, and psychological variables (1993) through directed, organized, context-specific data collection and analysis. In addition, DAA acknowledges cognitive-based approaches by asking writers to engage with members of the potential audience in order to document audience expectations for environment-related communication.

Development of the Deep Audience Analysis Instrument

I developed the interview, coding sheet, instructions for completing the process, and glossary that make up the DAA instrument from existing data, and refined the instrument through both participatory design and usability testing. All aspects of this study were conducted in full compliance with the guidelines established by the institutional review board (IRB) for human subject research at the institutions where this research and writing took place.

During the summer of 2007, I collected 261 three-to-five minute interviews with visitors to, and employees of, the Glen Canyon and Hoover dams. The data from which the coding package initially derives came from the transcription and coding of 125 interviews with American citizens collected at the Glen Canyon Dam in May of 2007. The package was later usability tested on five transcripts from the Hoover Dam data, also collected in May of 2007, and two transcripts from interviews with foreign visitors to the Glen Canyon Dam not included in the original data.

Each interview was conducted according to the protocol in Appendix 2, which was designed to answer the research question “what are commonplaces of environmental rhetoric?” Briefly put, a commonplace is a word or phrase which brings an audience to a place of shared understanding through “applicable in common” (Aristotle, 2007, p. 45) elements from which a rhetor may develop argument (Miller, 2000). The interview protocol was developed through testing with graduate students and faculty at a large, research-oriented institution, and further refined through field testing. Data were coded using a grounded theory approach (Creswell, 2003; Glaser & Strauss, 1967; Lewis & Whitely, 1992; Richards, 2006) and using NVivo 8 as a data-management tool.

Coding, as described by Creswell, who cites Rossman and Rallis (1998, p. 171), is:

The process of organizing […] material into “chunks” before bringing meaning to those “chunks.” It involves taking text data or pictures, segmenting sentences (or paragraphs) or images into categories, and labeling those categories with a term, often a term based in the actual language of the participant (called an in vivo term). (p. 192)

The iterative coding process I used resulted in the description of twelve commonplaces of environmental rhetoric: “Al Gore;” “Balance;” “Common Sense;” “Environment as Setting;” “Experience;” “Extremism;” “Man’s Achievements;” “Pragmatism;” “Proof;” “Religion;” “Recycling;” and “Seeing is Believing,” which I describe elsewhere (Ross, 2013, 2012, 2008). These commonplaces, which emerged from coding, suggested potential use in audience analysis, so I decided to pursue development of an audience analysis coding package for public use.

Development and Usability Testing of Coding Worksheet

I developed, tested, and revised the Deep Audience Analysis coding package using participatory design (for example, Spinuzzi, 2005), which included co-interpretation of design by designer-researchers (p. 164) and testing for inter-rater reliability. Usability testing of the DAA Coding Package was designed using a structured task, participant observation, think-aloud protocol, and a post-task questionnaire (for example, Barnum, 2011). Three 20-25 year-old females tested the coding package; two had experience designing information for non-profit organizations and so required little explanation of the value of understanding audience needs and expectations when designing informational brochures, flyers, posters, and so forth. All three were Master’s-level students in the English department at the university where I developed the tool. I further refined the instrument through consultation with technical and professional communication specialists in the same department.

Table 1. Transcript Coding and Comparison of Versions One and Two of Coding Worksheet

Case # | Total coding instances | Coding agreement† | Coding disagreement‡ | Agreement to change• | Initial agreement (%) | Total end-percent agreement (%)
1      | 7                      | 6                 | 0                    | 1*                   | 85.71                 | 100
2      | 15                     | 10                | 0                    | 5*                   | 67                    | 100
3      | 17                     | 12                | 1                    | 4*                   | 70.58                 | 94.12
4      | 19                     | 14                | 5                    | 3                    | 73.68                 | 89.47
5+     | 20                     | 12                | 0                    | 8                    | 60                    | 100
6+     | 11                     | 6                 | 1                    | 4                    | 54.5                  | 90.9

†Coders initially coded the same lines the same way
‡Coders coded the same lines differently and did not agree to change
• Coders initially coded the same lines differently, but agreed to change to a different classification, bringing coding into agreement
* Category changes here are from initially un-coded lines or phrases, not from disagreement
+ First use of version 2.0 of coding sheet.

The first step in the development of a financially viable, readily accessible coding package was moving coding from NVivo 8 to a paper-based worksheet with an eye toward transparency and accessibility. To refine this worksheet for the project at hand, I first created a table of commonplace categories and subcategories as previously documented elsewhere (Ross, 2013, 2012, 2008). This table was then refined to account for the “shelf life” of commonplaces: because commonplaces are representative of popular discourse at particular places and times (Killingsworth, 2005), they may shift with changing aspects of popular culture. Thus, commonplaces specific to a particular place and time, such as one initially described as “Al Gore,” were revised (in this case to “influential persons”) in order to account for the certainty that new identifiers (for example, Leonardo DiCaprio or Barack Obama) will become topics of common conversation with regard to environmental communication.

For the first version of the package, two coders marked and coded four transcripts. The coders compared their results and discussed discrepancies, resolving to agree, disagree, or agree to change topic/sub-topic descriptions. Following these discussions, additions of affective attitude (positive, negative, or ambivalent) were added, commonplace descriptions were edited, and Worksheet B (Information Acquisition and Audience Classification) was extensively refined.

After the initial participatory design process, two more transcripts were coded with the same imperatives in order to test changes to the instrument. The average end-percent agreement for these six coding instances was 95.78% (Table 1). The end-classification scheme in these cases was unanimous (100% agreement), though how the coders documented results was not, prompting a restructuring of the instructions on Worksheet B to clarify the final presentation of results.

This process of participatory design and usability testing resulted in a coding package that appeared to be usable by an audience largely unfamiliar with both coding and environment-related rhetoric, though minor changes in layout were suggested. After refining the appearance of the coding package for increased ease of use, usability testing with a different tester began.

Usability Testing of the DAA Coding Package

In the second round of usability testing, a participant was given only a brief overview of expectations (consisting largely of the explanation that coding is performed in order to pull out interesting and/or unique elements from information) and then asked to familiarize herself with the package and code a single new transcript from the Hoover data. The participant, by her own admission, had no previous experience with coding, and only limited experience with the environment-related language used in the coding package.

Results of Usability Test 

The goal of DAA is to develop an overall profile of an interview respondent that categorizes their level of interest in environmentalism, indicates the commonplaces to which they would likely respond most positively and negatively in communication, and assesses their preferred mode of communication.
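
The respondent profile described here can be thought of as a small record. A minimal, hypothetical sketch of such a profile as a data structure (field names and values are illustrative, not part of the published DAA package):

```python
# Hypothetical data structure for a DAA respondent profile.
# Field names and sample values are illustrative only.
from dataclasses import dataclass


@dataclass
class DAAProfile:
    interest_level: str   # e.g., "interested", "casually interested"
    commonplaces: dict    # commonplace -> "+" (positive) or "-" (negative) affect
    preferred_mode: str   # e.g., "documentaries", "broadcast media"


profile = DAAProfile(
    interest_level="interested",
    commonplaces={"proof": "+", "experience": "+", "extremism": "-"},
    preferred_mode="documentaries",
)
print(profile.interest_level)  # → interested
```

A profile of this shape captures the three outputs of the coding process: overall classification, affect-tagged commonplaces, and preferred medium.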

The second tester and I had 100% agreement on the end-classification scheme and the top two commonplaces. We did not agree on the third commonplace, though it should be noted that the third commonplace had to be assessed from a series of commonplaces mentioned only once each, or twice over the course of a single phrase. This instance was judged a successful test in that the coder successfully navigated the package and achieved a viable, useful result.

While the second tester successfully navigated the package, she expressed general confusion at many of the words and phrases and took roughly twice as long as expected (46 minutes and 31 seconds to code a 77-line transcript). While this time was largely spent reading through the transcript and coding package multiple times, her thoughts and a post-test questionnaire (Appendix 3) informed several changes:

  • The initial overview was expanded
  • A section describing effective coding was added
  • Examples were added
  • The sub-topics glossary was added
  • Explanations of transcription practices were added
  • Language throughout the document was refined

Following revisions, the second tester and I each coded two more transcripts using the new version of the coding package. She noted that the additions and changes to the coding package facilitated an understanding of both the instructions and the commonplace definitions. Inter-rater reliability on the overall category for the two new transcripts was 100% on the first transcript as written, and within one deviation on the second (she chose “casually interested,” I chose “interested”). Discussion yielded agreement on “interested.” We shared two of three commonplaces. Examination of coding and notations revealed that we shared more commonplace and subcategory agreements than first appeared, but inconsistencies in describing the final profiles led to initial confusion. We both agreed that the respondents largely obtained their environment-related information from DVDs and broadcast media.

A third round of usability testing was performed with a third tester using the same transcripts as before to allow comparison, and her answers and experiences were compared to our previous results. On the first transcript she answered within one deviation on the subjective overall characterization scale (choosing “casually interested” where the previous tester and I had chosen “interested”), and chose the same commonplaces to invoke as tester two (100% agreement), though she and I had only one in common (33% agreement). On the second transcript all three of us agreed on the overall category of “interested” (100%) and shared 1 of 3 commonplaces (33%, “experience”). The two new testers shared two commonplaces (67%, “experience” and “influential persons”), though they disagreed on how the respondent felt toward “balance.” Tester three also noted “video” as a preferred mode of communication for the first transcript and “documentaries” for the second, both in keeping with the previous tester’s (and my own) results.
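
The pairwise figures reported above (33%, 67%, 100%) are simply the share of the top three commonplaces two coders have in common. A minimal sketch, with illustrative commonplace sets rather than the actual coding data:

```python
# Sketch of the pairwise commonplace-overlap percentages reported in testing.
# The commonplace sets below are illustrative, not the actual study data.

def overlap_percent(coder_a, coder_b):
    """Percent of the top-three commonplaces two coders share."""
    shared = set(coder_a) & set(coder_b)
    return round(100 * len(shared) / 3)

author = {"proof", "pragmatism", "influential persons"}
tester3 = {"proof", "experience", "human achievement"}
print(overlap_percent(author, tester3))  # → 33
```

One shared commonplace yields 33%, two yield 67%, and a full match yields 100%, matching the scale used throughout this section.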

The results obtained through testing suggest that the DAA instrument is both valid and reliable, though training with the instrument would prove beneficial, and triangulation with multiple coders is optimal. Testing regularly yields a consistent overall environmental characterization and at least one, if not more, shared commonplaces. The medium through which respondents accessed environment-related information came through clearly. Though it is difficult to quantify the similarity of the commonplaces chosen, examination shows that they are close: for the first transcript, for example, I rated the respondent as positive toward proof, pragmatism, and influential persons; tester two rated them as positive toward proof, experience, and human achievements; and tester three also rated them as positive toward proof, experience, and human achievement. Discussion suggested that while I viewed “influential persons” as highly specific (using names from the media, for example), the two new testers read “human achievement” as doing similar work in the transcript. Our analyses, then, would yield a usable outcome with the same end-intent: we might create a message that draws upon human resourcefulness (though I would add celebrity to the mix). Notably, none of us chose commonplaces I would view as characteristically unrelated: action, religion, or scene.

In every testing instance with the last two testers, discussion of results led to 100% agreement. Discussion with tester three offered insight into potential confusion of results, or lack of willingness to commit to a single answer: she felt it would not be possible even to achieve a common perception of an overarching classification, much less agreement on any commonplaces. When she saw how her results compared to mine and the other tester’s, however, she immediately noted that she would be able to proceed through the process with more confidence. Her concern is well taken, and leads me to believe that DAA as presented is a viable tool, but that training and norming with others would lead to more effective use in the workplace.

Conclusions

Deep Audience Analysis is ultimately an audience analysis tool grounded deeply within the theories and approaches that comprise the study of technical communication. The approach is informed by work in other fields, including Kassing, Johnson, Kloeber, and Wentzel’s (2010) Environmental Communication Scale; Cordano, Welcomer, and Scherer’s (2003) work on environmental beliefs and behavior; and Pelletier, Tuson, Green-Demers, Noels, and Beaton’s (1998) work on motive-based environmental action. These tools, which enable communicators to quantitatively assess aspects of environmental communication, concern, and motivation, are important instruments for generating knowledge. They are largely statistics-based packages, however, designed for specialists collecting longitudinal, trend-based data. Where DAA differs greatly is in its qualitative, practical approach—it offers an overarching profile of environmental predisposition designed for direct application, and does not require statistical analysis. DAA thus leans more heavily toward a tool for assessing audience than environmentalism itself.

DAA generates a profile of representative audience members’ underlying rhetorical predispositions (their attitudes toward extremism, celebrity, etc.) and the rhetorical elements to which they might respond, and from that profile creates a writing heuristic. Because of the heuristic nature of the instrument, small sample sizes should still allow for useful feedback. I view this tool as useful for information designers, and hope that DAA, when placed in concert with existing measurement strategies, can fill in additional pieces of the complex social puzzle with which we all struggle.

Input from professionals in both technical communication and composition and rhetoric also suggests numerous alternative uses for DAA. With minor restructuring of the interview questions, the tool could be used by non-environment-related nonprofits, such as breast-cancer awareness groups, to determine a target audience’s predispositions toward both the cause and their likely response to overarching commonplace categories (for example, action, balance, common sense, experience, and so forth). The interview itself would need only minor restructuring. Notably, question four sets the tone for all that follows; questions five, seven, and nine would then need to be reworded to match that tone. For example, rewording question four from “when I say the word ‘environmental,’ what is the first thing that comes to mind?” to “when I say the words ‘breast cancer’” or “when I say the words ‘medical research’” (or any number of other permutations), then restructuring the following questions (replacing “environmental issues” in five, reframing seven to ask about action in relation to the topic, and replacing “environmental” in nine), could result in usable data.

In terms of coding data obtained from a modified interview set, I suspect the commonplaces themselves would still prove viable, as would many of the subtopics. Some sub-topics, of course, such as the “Pollution” sub-topic, under the “Action” commonplace, or the “Being in Nature” sub-topic, under the “Experience” commonplace, would likely no longer prove applicable, should the overarching theme of the interview change. An organization could choose to either ignore those isolated elements, or conduct a series of interviews, code the transcripts, and ultimately discover sub-topics relevant to their particular interests.

Although testing to date demonstrates the promise of DAA, more research is needed regarding the application of this system. This should include experimentation using control and test groups to further establish effectiveness. The coding package needs to be used in the field by both academic and nonacademic audiences because these audiences will have different approaches, exigencies for use, and difficulties during use. As users assess DAA’s reception—the “meeting between a medium and its audience” (Jensen, 1987, p. 24)—it should be adapted for local needs and expectations. DAA will likely be most useful if it is refined using participatory design and usability testing for specific groups or organizations.

One of the problems with an audience analysis system of this type is that the coders must share an understanding of both the coding and interpretation processes, and these processes must accord with the research conducted on how audiences discuss their environment-related values and associated topics. This coding package appears to work at the academic level when tested by researchers with varying degrees of familiarity with the research and with coding in general. For it to be of full value to information designers, however, training and testing (coding a series of transcripts with pre-established values, then assessing end-agreement, refining coding skills, and building a full understanding of the associated commonplaces) would offer a more robust and valuable skill set for this area of environment-related audience analysis. As shown in Table 1, when researchers first begin to use the coding package, missed coding opportunities and misjudgments of similar phrases are more likely than outright inflexibility or disagreement. Two options then exist for organizations wishing to use this tool “out of the box”: have two or more members conduct the analysis and compare results, or have a single member train and become intimately familiar with the system prior to use.

In the end, it is my hope that both academic and nonacademic information designers find use in the Deep Audience Analysis system, as it offers a response to Martin and Sander’s vital argument:

In [a climate where citizens affected by environment related decisions come from diverse cultural, financial, and ethnic backgrounds] good decisions regarding public policy issues almost certainly require careful negotiation of the meaning of technical data, with full awareness of the interests of all the parties, including the technical experts and the communicators who represent their expertise to the public. Technical communicators working in this dangerous climate, this nevertheless commonplace climate, need to practice a careful and extended audience analysis that on the surface may look like the traditional practice of identifying potential readers and their concerns, but which is, in fact, more pervasive and fundamental. Audience analysis of the sort we describe here is an inherent part of the writing process that leads to the production of texts that evince an ethical rhetoric, a rhetoric that is as willing to change the self as it is to influence the other. (p. 148)

I propose the Deep Audience Analysis instrument for practitioners who need to understand their audience’s underlying expectations and motivations so that they may communicate with them more effectively. This project is motivated by the belief that more useful information can be gained from representative members of an audience through human interaction than through impersonal surveys. By talking with people, probing the reasoning behind initial answers, and asking about their hesitancy to answer, their apparent anger at a thought, or their unintentional laughter, we may uncover truer beliefs and motivations than by attempting to draw inferences about imagined audiences from dehumanized data. The Deep Audience Analysis tool reduces the complexity of interviewing and coding by providing a ready method for interview analysis and comparison, accompanied by glossaries and indexes that further simplify the process. The method proposed here should serve as a time- and cost-effective strategy for organizations wishing a deeper understanding of their audience.

Acknowledgments

I thank the numerous agencies that allowed me to carry out this research, especially the Bureau of Reclamation and the National Park Service. Specifically, thanks to the many people at the Carl Hayden Visitor Center, without whom this work would have been impossible. Thanks as well to my colleagues for their feedback, and my research assistants Katie Mullinax, Kristina Litchford, and Sarah Stude, who contributed many long hours of work to this project. Special thanks to Jen Ross for all of her help and support.

References

Albers, M. J. (2003). Multidimensional audience analysis for dynamic information. Journal of Technical Writing and Communication, 33, 263-279.

Alberty, C. (1987). A step beyond audience analysis: A writer’s awareness of audience while composing. Society for Technical Communication Annual Conference Proceedings (pp. RET-26–RET-29). Washington, DC: Society for Technical Communication.

Allen, J. (1989). Breaking with a tradition: New directions in audience analysis. In B. E. Fearing & W. K. Sparrow (Eds.), Technical writing: Theory and practice (pp. 53-62). New York, NY: Modern Language Association.

Aristotle. (2007). On rhetoric: A theory of civic discourse. (G. A. Kennedy, Trans.). New York, NY: Oxford University Press.

Barnum, C. M. (2011). Usability testing essentials: Ready, set…test! New York, NY: Elsevier.

Bist, G., Dixon, K., & Chadwick, G.  (1993). Setting up a customer network to review documentation. Technical Communication, 40, 715-719.

Black, K. (1989). Audience analysis and persuasive writing at the college level. Research in the Teaching of English, 23, 231-253.

Bosley, D. S. (1994). Feminist theory, audience analysis, and verbal and visual representation in a technical communication writing task. Technical Communication Quarterly, 3, 293-307.

Burnett, R. E. (2001). Technical communication (5th ed.). New York, NY: Harcourt College Publishers.

Caricato, J. A. (2000). An analysis of the presenter’s perspective of audience as a partner in visual design. Technical Communication, 47, 496-514.

Cohen, G. (1990). Defining your audience: Let the games begin. Technical Communication, 37, 274-275.

Cordano, M., Welcomer, S. A., & Scherer, R. (2003). An analysis of the predictive validity of the new ecological paradigm scale. Journal of Environmental Education, 34(3), 22-28.

Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage.

Dragga, S., & Gong, G. (1989). Editing: The design of rhetoric. Amityville, NY: Baywood.

Ede, L., & Lunsford, A. (1984). Audience addressed/audience invoked: The role of audience in composition theory and pedagogy. College Composition and Communication, 35, 155-171.

Floreak, M. J. (1989). Designing for the real world: Using research to turn a ‘target audience’ into real people. Technical Communication, 36, 373-381.

Goodwin, D. (1991). Emplotting the reader: Motivation and technical documentation. Journal of Technical Writing and Communication, 21(2), 99-115.

Hart, G. (1996). The five w’s: An old tool for the new task of audience analysis. Technical Communication, 43, 139-145.

Hermes, J. (2009). Audience studies 2.0. On the theory, politics and method of qualitative audience research. Interactions: Studies in Communication and Culture, 1(1), 111-127.

Hoft, N. (1999). Global issues, local concerns. Technical Communication, 46, 145-206.

Houp, K. W., & Pearsall, T. E. (1988). Reporting technical information (6th ed.). New York, NY: Macmillan.

Hovde, M. R. (2000). Tactics for building images of audience in organizational contexts: An ethnographic study of technical communicators. Journal of Business and Technical Communication, 14, 395-444.

Johnson, R. R. (1997). Audience involved: Toward a participatory model of writing. Computers and Composition, 14, 361-376.

Kassing, J. W., Johnson, H. S., Kloeber, D. N., & Wentzel, B. R. (2010). Development and validation of the environmental communication scale. Environmental Communication, 4(1), 1-21.

Kempton, W., Boster, J. S., & Hartley, J. A. (1995). Environmental values in American culture. Cambridge, MA: MIT Press.

Killingsworth, M. J. (2005). Appeals in modern rhetoric: An ordinary language approach. Carbondale, IL: Southern Illinois University Press.

Lancaster, A. (2006). Rethinking our use of humanistic aspects: Effects of technical information beyond the intended audience. Technical Communication, 53, 212-224.

Lippincott, G. (2004). Gray matters: Where are the technical communicators in research and design for aging audiences? IEEE Transactions on Professional Communication, 47, 157-170.

Long, R. (1990). The writer’s audience: Fact or fiction? In G. E. Kirsch & D. Roen (Eds.), A sense of audience in written communication (pp. 73-84). Newbury Park, CA: Sage Publications.

MacKinnon, J. (1993). Becoming a rhetor: Developing writing ability in a mature, writing-intensive organization. In R. Spilka (Ed.), Writing in the workplace: New research perspectives (pp. 41-55). Carbondale, IL: Southern Illinois University Press.

Markel, M. (2010). Technical communication (9th ed.). New York, NY: Bedford/St. Martin’s.

Miller, C. R. (1979). A humanistic rationale for technical writing. College English, 40, 610-617.

Miller, C. R. (2000). The Aristotelian topos: Hunting for novelty. In A. G. Gross & A. E. Waltzer (Eds.), Rereading Aristotle’s Rhetoric (pp. 130-144). Carbondale, IL: Southern Illinois University Press.

Ong, W. J. (1975). The writer’s audience is always a fiction. PMLA, 90(1), 9-21.

Park, D. B. (1982). The meanings of ‘audience’. College English, 44, 247-257.

Pearsall, T. E. (1997). The elements of technical writing. Boston, MA: Allyn & Bacon.

Pelletier, L. G., Tuson, K. M., Green-Demers, I., Noels, K., & Beaton, A. M. (1998). Why are you doing things for the environment? The motivation toward the environment scale (MTES). Journal of Applied Social Psychology, 28, 437-468.

Ross, D. G. (2008). The commonplaces of environmental rhetoric: Resonation and perception of environmentalism in American tourists. Unpublished doctoral dissertation, Texas Tech University.

Ross, D. G. (2012). Ambiguous weighting and nonsensical sense: The problems of ‘balance’ and ‘common sense’ as commonplace concepts and decision-making heuristics in environmental rhetoric. Social Epistemology, 26(1), 115-144.

Ross, D. G. (2013). Common topics and commonplaces of environmental rhetoric. Written Communication, 30, 91-131.

Rossman, G., & Rallis, S. F. (1998). Learning in the field: An introduction to qualitative research. Thousand Oaks, CA: Sage.

Schriver, K. (1997). Dynamics in document design: Creating texts for readers. New York, NY: John Wiley.

Schriver, K. (2010). Document design in transition: Evolving conceptions of audiences as readers. Paper presented at the meeting of the Conference on College Composition and Communication, Louisville, KY.

Sierra Club. (2011). Giant fish blenders: Power plants are destroying our waters [Video]. Retrieved from http://www.sierraclub.org/coal/fishchopper/.

Simons, H. W. (1976). Persuasion: Understanding, practice, and analysis. Reading, MA: Addison-Wesley.

Simpson, M. (1989). Shaping computer documentation for multiple audiences: An ethnographic study. Unpublished doctoral dissertation, Purdue University.

Slack, J. D., Miller, D. J., & Doak, J. (1993). The technical communicator as author: Meaning, power, authority. Journal of Business and Technical Communication, 7, 12-36.

Spilka, R. (1990). Orality and literacy in the workplace: Process- and text-based strategies for multiple audience adaptation. Journal of Business and Technical Communication, 4, 44-67.

Swenson, J., Constantinides, H., & Gurak, L. (2010). Audience-driven web design: An application to medical web sites. Technical Communication, 49, 340-352.

Tomlinson, B. (1990). Ong may be wrong: Negotiating with nonfictional readers. In G. E. Kirsch & D. Roen (Eds.), A sense of audience in written communication (pp. 216-230). Newbury Park, CA: Sage Publications.

Turns, J., & Wagner, T. S. (2004). Characterizing audience for informational web site design. Technical Communication, 51, 68-85.

Van Velsen, L., Van der Geest, T., & Steehouder, M. (2010). The contribution of technical communicators to the user-centered design process of personalized systems. Technical Communication, 57, 182-196.

Warren, T. L. (1993). Three approaches to reader analysis. Technical Communication, 40, 81-88.

Young, R. E., Becker, A. L., & Pike, K. L. (1970). Rhetoric: Discovery and change. San Diego, CA: Harcourt Brace Jovanovich.

YouTube. (2012). Fish chopper: Power plants chop fish and waste water [Video and comments].  Retrieved from http://www.youtube.com/watch?v=gmUm6U01PPM.

About the Author

Derek G. Ross is an assistant professor in the Master of Technical and Professional Communication Program (MTPC) at Auburn University, where he teaches courses in technical communication, document design, environmental rhetoric, and policy writing. His research interests include perceptions of environment-related rhetoric, document design, and audience analysis. His work has appeared in Written Communication, Social Epistemology, Present Tense, and the Journal of Technical Writing and Communication, among others. He is the Ethics Editor/Columnist for Intercom: The Magazine of the Society for Technical Communication. Contact: derek.ross@auburn.edu.

Manuscript received 28 March 2012; revised 28 March 2013; accepted 5 April 2013.

 

Appendix 1: Deep Audience Analysis (DAA) Coding Package

Coder Case #

This tool codes for environmental commonplaces, the stories-within-a-story which motivate a reader to action. It is designed to be used to assess an audience’s underlying predispositions in terms of what sources of information they privilege, what rhetorical elements motivate them to action, and what rhetorical elements positively or negatively impact their perceptions of the environment.

Purpose

The purpose of this tool is to provide an end-classification scheme of representative members of a desired target population in relation to their willingness to listen to/accept environment-related messaging and argumentation. After using this coding tool, your organization should have a deeper understanding of your target audience, and will be able to structure your environment-related communication to achieve maximum desired impact by:

  • Utilizing or avoiding commonplace narratives indicated by the sample population
  • Structuring the overall theme of your communication toward levels of pro- or anti-environmentalism based on the end-classification scheme

Process

  1. Conduct and transcribe interview. See “General Interview Protocol” for steps.
  2. Code and mark up interview transcript. See “Coding Process” for steps.
  3. Complete worksheets A and B based on transcription coding and markup.
  4. Assign a Final Profile based on your findings.
  5. Use your final profile to shape, or re-shape, your messaging to emphasize elements indicated as positive (+) and avoid elements indicated as negative (-).

Use of End Classification Scheme

The end-product of this interview and coding process is a classification scheme that rates the speaker on a scale of their overall interest in, and attitude toward, environmentalism, as well as the top three commonplaces (common narratives) to which they will likely strongly respond in environment-related communication, and subtopics associated with those commonplaces. By being aware of your audience’s attitudes towards environmentalism in general, and the narrative elements to which they will most likely positively or negatively respond, you should be able to shape your environment-related communication to account for your audience’s attitudes and perceptions, thereby increasing the overall potential effectiveness of your messaging.

General Interview Protocol

This interview protocol is designed to lead an audience into a brief discussion of their perceptions of environment-related messaging, argumentation, and rhetoric. The interview is designed to be conducted rapidly (approximately 5 minutes), and can be easily adapted to work with couples or groups of people. Make sure you have the appropriate permissions from your institution to record human subjects, and record the interviews for later transcription.

The suggested probes (the indented follow-up questions) may help to clarify ambiguous answers, or further allow respondent(s) to work through issue(s) they are attempting to articulate. Whenever possible, the interviewer should facilitate the interview through verbal and non-verbal cues that do not cast aspersions or hint at judgment. Avoid leading the respondent(s) to a desired conclusion.

Interview

  1. Where are you from?
  2. What brought you here today?
  3. How did you hear about (place)? (optional—use only when interviewing at a tourist-based location)
  4. Now, just to change up a bit, when I say the word “environmental,” what is the first thing that comes to mind?
    1. Why?
  5. Where do you stand on environmental issues?
  6. Based on that, where do you feel that you get your values from?
  7. Do you think of yourself as an “environmentalist,” or “environmentally active?”
  8. Could you explain why or why not?
  9. Can you think of any environmental arguments that you have heard that are particularly effective?
    1. Where did you hear this argument?
    2. What would make the argument more effective?
  10. Based on what you were just saying, what kind of arguments would be most effective, at least for you personally?
    1. How would this work?
    2. What would it take to convince you of something that you did not already believe?
  11. Looking out over all of this, how would you describe this to someone else? (optional—use only when interviewing at a place where the respondent would have to engage with/describe the environment)
  12. May I ask how old you are?
  13. If you don’t mind me asking, can you tell me what level of education you have?

Transcription

Transcribe the interview, making sure to use line-numbering. Work slowly, and pay particular attention to odd phrasing and difficult-to-understand language—these may be important elements. You may use any transcription software, such as NCH Express Scribe (http://www.nch.com.au/scribe/index.html). For a list of transcription in-line notations, see the Transcription Notation section.

Coding Process

Code the transcript by indicating on the right-hand side of the transcript which commonplace (see Worksheet A and Glossary 1) is addressed in any given individual statement. Mark the section both in the associated color (see Color Coding Sheet for color coding using a twelve-pack of readily available Rose Art brand colored pencils) and with the commonplace abbreviation. If a particular subtopic is apparent (see Glossary 2), note that as well. If, for example, your audience mentions that they try to conserve water by keeping showers to only 5 minutes, you would underline or circle that statement in yellow-green (indicating the “Action” commonplace), check the (+) line on Worksheet A (because they view such action in a positive light), and note the line number in the transcript on which that statement occurred.

Transcription Notation 

Bold and Underlined Indicates an interview question
Bold Indicates a probe—solicitation of additional information from the participant
/ Indicates interruption.
. Indicates a period of approximately one second. Several of these together, separated by one space each, visually show passage of time.
(#) Indicates elapsed time, where # is a value
(( )) Enclosed information is the transcriber’s voice, i.e. ((laughter))
[ ] Indicates an interjection

Commonplaces

The commonplaces indicated in Worksheet A are the overarching narrative elements which, for your audience, tell a story-within-a-story. The commonplace “balance,” for example, implies that arguments and counterarguments are equal and that information is presented fairly and objectively. Invoked, “balance” asks an audience to weigh information objectively, even if the informational components are not of equal value. In a transcript, balance might be indicated by a respondent saying, “Balance is a good thing, you know, you got people that destroy the environment and just trash the place, you got people that overprotect. I think that a better balance is a better way to go.” Note that the word used to indicate the commonplace, “balance” in this case, may not necessarily appear in a response, though the meaning should come through. A respondent stating that they are “obviously in favor of, you know, having a clean environment, recognizing the practicalities of modern life, that we’re not going to just walk everywhere, and give up automobiles, airplanes, everything else” is an example of how this might happen. Note also that this phrase might be dually coded as “Pragmatism.” See Glossary 1 for definitions of commonplaces.

Sub-Topics

The listed sub-topics are designed to help you determine which commonplace is most relevant in the transcript you are reading. The sub-topic “Extremism vs. Hyperconservatism,” for example, a sub-topic of the “Balance” commonplace, is apparent in the first of the two examples above, when the respondent mentions “people that […] just trash the place [and] people that overprotect.” Sub-topics should serve both as guides for commonplace recognition and as ways for you to further refine the eventual presentation of your information. Note that some commonplaces, such as “Pragmatism,” do not currently have listed sub-topics. You may record unique occurrences on the “other” line. See Glossary 2 for brief definitions of the commonplace sub-topics.

What to Look for When Coding

In coding, you want to look for interesting or unique statements that represent your respondent’s attitudes and beliefs about the topic. You may do this either by moving through the code sheet line by line and looking for statements that represent each commonplace and/or sub-topic, or by familiarizing yourself with the coding sheet (Worksheet A) and then reading through the transcript multiple times, marking statements for their relation to the commonplaces and sub-topics. The second approach is more effective but requires some familiarity with the coding process. Note that not all commonplaces and sub-topics are likely to appear in any single transcript. If no statement corresponds to a commonplace or sub-topic, simply leave that section of the worksheet blank.

If you believe a statement to be of interest but are not sure how to code it, simply mark the statement and return to it for review after finishing the rest of the transcript.

Marking and Color-Coding

The document should be marked up in a way that makes sense to you and can easily be translated to Worksheets A and B. In this example the coder has used her colored pencils to underline statements in their associated colors (orange for “Experience,” red for “Extremism,” and blue for “Balance”), also noting on the right-hand side of the page which commonplace was meant by the underlining.

Coding Example 1


Indicating which sub-topic helped you identify the commonplace at play may also help with your final classification of the speaker. In the following example, the coder does not underline but instead brackets the content and notes which sub-topic (Growing Up) led him to the overall commonplace (Experience).


However you choose to mark up your transcript, make sure that both you and your colleagues can interpret your markings. This will allow you to revisit your data should you wish to confirm your findings against the original transcript.

Filling out the Final Profile Sheet: Information Acquisition and Audience Classification

Use the information recorded on Worksheet A to answer the questions on Worksheet B. These answers will then allow you to fill out the Final Profile sheet.

Most question responses are self-explanatory. The following clarifies Question 3 on Worksheet B.

For Question 3, which asks you to list the commonplaces, count the number of unique line-number sections you recorded under each commonplace on Worksheet A.

In the following example, the coder has noted six unique instances of “Action”: one positive reference to the sub-topic “Going Green,” two negative references regarding “Pollution,” and three positive references regarding “Recycling.”
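For organizations that transcribe and code digitally rather than on paper, the counting rule for Question 3 can be sketched in a few lines of code. This is a hypothetical digital analogue, not part of the instrument itself; the names `CodedReference` and `tally_commonplaces`, and the sample line numbers, are illustrative assumptions.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical digital analogue of a Worksheet A entry: each coded
# reference records the commonplace, sub-topic, polarity (+ or -),
# and the unique line-number section of the transcript it came from.
@dataclass(frozen=True)
class CodedReference:
    commonplace: str
    sub_topic: str
    polarity: str          # "+" (positive) or "-" (negative)
    lines: tuple           # (start_line, end_line) in the transcript

def tally_commonplaces(refs):
    """Worksheet B, Question 3: count unique line-number sections
    recorded under each commonplace."""
    return Counter(r.commonplace for r in refs)

# The "Action" example from the text: one positive "Going Green"
# reference, two negative "Pollution" references, and three positive
# "Recycling" references, each from a distinct line-number section.
refs = [
    CodedReference("Action", "Going Green", "+", (12, 14)),
    CodedReference("Action", "Pollution", "-", (22, 23)),
    CodedReference("Action", "Pollution", "-", (40, 41)),
    CodedReference("Action", "Recycling", "+", (55, 56)),
    CodedReference("Action", "Recycling", "+", (60, 60)),
    CodedReference("Action", "Recycling", "+", (72, 75)),
]
print(tally_commonplaces(refs)["Action"])  # 6
```

Because each reference carries its line span, the tally can always be traced back to the original transcript, which supports the confirmation step described above.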


In the following example, the coder has noted two unique instances of “Nurture” (both positive references to the sub-topic “Action Mindset”) and one negative reference to “Pragmatism.”


The Final Profile sheet asks you to write the profile out in longhand so that it is readily understood by other members of your organization, listing both the overall characterization and the top three commonplaces, along with the respondent’s attitude toward each. A complete profile might appear as “Passionate Environmentalist: negative towards extremism, positive towards balance, positive towards influential persons.” The sub-topics will be used to inform your information design: draw on positive (+) sub-topic references within your top commonplaces, and avoid negative (-) references.
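The longhand profile above follows a fixed pattern (characterization, then the top three commonplaces with attitudes), so a digital workflow could compose it mechanically from tallied results. The following is a minimal sketch under that assumption; `final_profile` is a hypothetical helper, and the characterization label and attitude list are the illustrative values from the example profile, not instrument outputs.

```python
# Hypothetical helper: compose the longhand Final Profile line from an
# overall characterization and a list of (commonplace, attitude) pairs
# already ordered from most to least frequent.
def final_profile(characterization, attitudes):
    """Return the profile string using only the top three commonplaces."""
    top_three = attitudes[:3]
    parts = ", ".join(f"{attitude} towards {commonplace}"
                      for commonplace, attitude in top_three)
    return f"{characterization}: {parts}"

profile = final_profile(
    "Passionate Environmentalist",
    [("extremism", "negative"),
     ("balance", "positive"),
     ("influential persons", "positive"),
     ("pragmatism", "negative")],   # fourth entry is dropped: top three only
)
print(profile)
```

Keeping the profile as a single plain sentence, rather than a table, preserves the instrument's goal of a result that any member of the organization can read without training.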

Color Coding Sheet

The following colors are available in a twelve-pack of Rose Art brand colored pencils. If you wish to use other colors or brands, please make sure you record the colors to be associated with each commonplace before beginning the coding process.