Books Reviewed in This Issue
Avon J. Murphy, Editor
Andrew J. Friedland and Carol L. Folt. 2009. Writing Successful Science Proposals. 2nd ed. New Haven, CT: Yale University Press. [ISBN 978-0-300-11939-8. 204 pages, including index. US$18.00 (softcover).]
The advanced writers who buy this book because of the title will be pleased to find exactly what the authors promise. Science and engineering researchers who seek federal funding will find practical writing strategies from these experts.
Friedland and Folt are life scientists who teach proposal writing regularly, so they explain their guidelines throughout this text as if introducing the genre to graduate students. The well-developed book provides 10 short exercises that will help readers who simultaneously manage staff, conduct research projects, teach courses, and lead grant proposal teams. The authors introduce the specialized vocabulary terms that federal agencies require for paragraph-level content, the workings of large multidisciplinary teams, and the role of proposal reviewers in the competitive federal funding process. Readers also learn details about authorship and ethics, including appropriate guidelines for principal investigators.
A new chapter in this edition about foundation funding stresses the value of preexisting relationships between the researcher’s institution and foundation staff, and warns readers about the potential impact of foundation-funded work on a researcher’s tenure case.
The book needs two updates. Federal agency Web sites and search tools have changed recently, so any reader who wants to offer academic consulting services needs to know the new search strategies for finding grant awards databases and the links for electronic proposal submissions. First, the National Science Foundation, NASA, the defense research agencies, and the United States Department of Agriculture/Cooperative State Research, Education, and Extension Service (CSREES) have designed a consolidated Web site, Research.gov (www.research.gov), for electronic submissions and access to the awards database. More agencies will be added as partners, so Research.gov does not yet replace the home pages of the agencies that provide the portal. Second, on the National Institutes of Health Web site, writers will have to use the Research Portfolio Online Reporting Tools Expenditures and Results (RePORTER) search tool to find awards, because the NIH CRISP tool was replaced in September 2009.
In addition, the authors do not directly address the greatest challenges for a new writing consultant or junior researcher who seeks multiyear funding: strict adherence to the request for proposals, proposal document control, and online editing with investigators in other states and countries.
Writers on collaborative teams of scholars and joint industry-university projects will profit from reading and rereading this book.
Karen S. Griggs is a communication consultant to academic, business, engineering, and environmental organizations. She belongs to STC, the Association for Business Communication, and the Association of Professional Communication Consultants.
Margaret Cargill and Patrick O’Connor. 2009. Writing Scientific Research Articles: Strategy and Steps. Oxford, UK: Wiley-Blackwell. [ISBN 978-1-4051-8619-3. 173 pages, including index. US$29.95 (softcover).]
For scientists who are just entering what Margaret Cargill and Patrick O’Connor call “the international research conversation,” writing research articles can be quite daunting. To be published, research articles need to tell the scientific story quickly, completely, and carefully—and that task is much harder than it looks.
Fortunately, Cargill and O’Connor—a scientist and a research communication teacher, respectively—have produced a wonderful guide to this difficult process. Writing Scientific Research Articles is a standalone guide or workshop book that lays out the basic procedure for writing a research article, starting with the critical predrafting process and moving on through the writing of each section of the article. To guide this writing, the book includes two published research articles in the appendix, one from ecology and one from plant biology. (The book focuses more on physical science than social science research, although you can work the book’s examples using a sample article from your own field.) Each short chapter explores a different section of the scientific article, primarily through examples and exercises. (Fortunately, the appendix includes full, descriptive answers to the exercises.) The book also includes an in-depth look at what journal editors look for in research articles and ends with a very useful chapter, “Developing Discipline-Specific English Skills,” for readers speaking English as another language.
The authors have spent years teaching scientists in China and Australia to write articles, and the first piece of advice they give is time-honored: “Identify . . . a clearly connected story which leads to one or more take-home messages” (p. 21). In the sciences, the story is in the data, and the data are in the figures and tables—and that is where Cargill and O’Connor begin: “Our aim in this section is not to provide a concrete set of rules for data presentation but rather to help you optimize the presentation of your data to support the story of your article” (p. 23). Subsequent chapters explore the sections of the typical research paper in the order they are to be written: results, methods, introduction, discussion, title, and abstract.
Although the book offers specific advice for each of these sections, the chapter on the introduction showcases some of the book’s best features. Citing research drawn from applied linguistics, the authors divide introductions into five main stages and provide annotated examples of each, making it easy to see writing that works and writing that doesn’t. They also use these examples to illustrate some nuggets of good technical writing: writing topic sentences that transition from the previous paragraph, starting a sentence with old information and ending it with new, putting the subject and verb into the first nine words of a sentence, and moving from general to specific information.
Writing Scientific Research Articles also includes quite useful sections on getting articles into print: preparing manuscripts, doing pre-edits with colleagues, and knowing what editors and referees want. (For example, editors want “good science” over good writing.) These pages include a list of the typical questions on referees’ scorecards and an entire chapter devoted to how to respond to the (sometimes vague) comments of referees.
However, what will probably set this book apart are the techniques it offers for scientists who use English as an additional language (EAL). For example, consider these two techniques: “borrowing” sentence templates from other articles (for their structure, not their content) and using free concordancing software to investigate how words are used (for example, how articles are used with general and specific nouns). These simple, formulaic suggestions could help EAL writers learn good writing skills and dramatically improve their articles.
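Concordancing, one of the techniques recommended for EAL writers, is easy to demystify with a few lines of code. This hypothetical Python sketch (the mini-corpus and the searched word are invented for illustration, not drawn from the book) implements a bare-bones keyword-in-context (KWIC) search of the kind such software performs:

```python
def kwic(text, keyword, width=3):
    """Return keyword-in-context lines: each occurrence of `keyword`
    shown with up to `width` words of context on either side."""
    words = text.split()
    hits = []
    for i, w in enumerate(words):
        if w.lower().strip(".,;:") == keyword.lower():
            left = " ".join(words[max(0, i - width):i])
            right = " ".join(words[i + 1:i + 1 + width])
            hits.append(f"{left} [{words[i]}] {right}".strip())
    return hits

# Invented mini-corpus standing in for a set of published articles
corpus = ("The samples were dried at 60 C. Samples from the control "
          "plots were dried for 48 hours before the analysis began.")

for line in kwic(corpus, "the"):
    print(line)
```

A real concordancer such as AntConc works the same way at heart: it lines up every occurrence of a word with its surrounding context so a writer can infer usage patterns, for instance when a noun takes *the* versus no article at all.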
The one downside to this book is wading through all the examples, which are obviously designed to be used in a workshop. (Cargill and O’Connor even suggest that you pair up with someone else as you go through the book.) But even if you aren’t lucky enough to find a workshop that uses this guide, it is absolutely worth having on your shelf for its wealth of writing advice, whether you are new to scientific writing or are an experienced writer.
Jake Ashcraft has worked as a teacher, manager, and writer in the scientific sector for more than 10 years. He recently graduated from the Technical Communication program at the University of Washington, where he focused on scientific communication. He teaches chemistry at Seattle University.
Alexander Grosskopf, Gero Decker, and Mathias Weske. 2009. The Process: Business Process Modeling Using BPMN. Tampa, FL: Meghan-Kiffer Press. [ISBN 978-0-929652-26-9. 181 pages, including business process modeling notation poster. US$34.95 (softcover).]
Bruce Silver. 2009. BPMN Method and Style. Aptos, CA: Cody-Cassidy Press. [ISBN 978-0-9823681-0-7. 213 pages, including index. US$35.95 (softcover).]
The Process: Business Process Modeling Using BPMN makes an interesting read because it is written as a fast-paced novel, based on a real company, that the authors use to guide you through business process modeling. It aims to show you how to use process thinking and process modeling as the path to business innovation.
Business process modeling is becoming more prominent as a way to document business and system processes. Business systems analysts and technical communicators are the professionals who should be well versed in the structure of business process modeling (BPM) and in its accompanying graphical notation, Business Process Modeling Notation (BPMN).
Business process modeling presents a complete, end-to-end, process-oriented view of a business process by representing it graphically in a flowchart or swimlane view. A process model shows the steps and actors (human or system), describes what information is needed when, and reveals where transfers (handovers) take too much time. The analyst then looks for ways for the business to improve processes, increase quality, reduce waste, and save time. The main challenge is to document the processes effectively using a visual method, analyze the model, decide whether changes are needed, and conclude with full documentation of the processes and the decisions.
Grosskopf and colleagues say, “Process models are visual artifacts to communicate content” (p. 21). Processes start simple and reach a complexity in which each step needs further explanation. Through the visual representation, you need to create and communicate a common understanding of what the symbols mean (the common semantics) that become the process-modeling language.
Process diagrams in BPMN use Start events (open circles), Sequence flows (arrow connectors), Activities/Tasks (squares or rounded rectangles), Decisions (diamonds), and End events (thick-bordered open circles). Many more symbols exist, but these are the primary ones anyone needs to know to read a process model. The authors say, “All models are incomplete. It’s the nature of a model to reflect aspects of the reality. A subset. They can never be complete. But if this subset helps to understand what is done then the model is useful” (p. 32).
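For readers new to process modeling, the handful of primary symbols described above maps naturally onto a small data structure. The following Python sketch (the order-handling process and all element names are invented for illustration, not taken from the book) represents a start event, tasks, a decision, and an end event as a simple flow graph and traces one path through it:

```python
# A minimal process model using only the primary BPMN element types:
# a start event, tasks, a decision (gateway), and an end event.
process = {
    "Start":         {"type": "start_event", "next": ["Receive order"]},
    "Receive order": {"type": "task",        "next": ["In stock?"]},
    "In stock?":     {"type": "decision",    "next": ["Ship order", "Backorder"]},
    "Ship order":    {"type": "task",        "next": ["End"]},
    "Backorder":     {"type": "task",        "next": ["End"]},
    "End":           {"type": "end_event",   "next": []},
}

def walk(model, node="Start", choose=lambda options: options[0]):
    """Follow sequence flows from the start event to an end event,
    letting `choose` pick a branch at each decision."""
    path = [node]
    while model[node]["next"]:
        options = model[node]["next"]
        node = choose(options) if model[node]["type"] == "decision" else options[0]
        path.append(node)
    return path

print(" -> ".join(walk(process)))
```

Following the default branch prints `Start -> Receive order -> In stock? -> Ship order -> End`; passing a different `choose` function walks the Backorder branch instead. A graphical BPMN diagram encodes exactly this information, plus the visual conventions (circles, rectangles, diamonds) that make it readable at a glance.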
People or systems perform certain roles in a process. In BPMN, you identify the tasks/functions/activities that each entity performs by using a swimlane, similar to a lane in a pool used for swimming competitions. Each actor (human or system) performs its own process steps in its own lane, and lanes are grouped into pools that represent the participating organizations or systems. When you combine all the pools into one diagram, you have a swimlane diagram.
As a book has chapters, processes have subprocesses. A subprocess is an activity or task that requires completion before the process continues. These subprocesses may be one-time events or repeatable, recurring, reusable events that occur in more than one process. The authors write, “Subprocesses can be used to place a whole process model inside. Usually a subprocess links to a process diagram described somewhere else. But there is also the concept of embedded subprocesses. They can be expanded to show the process behind the subprocess in the same diagram” (p. 74).
A completed process model is called the AS-IS model. Analysts and business leaders use the AS-IS model to look for areas where the business can reduce costs, time, or the use of multiple applications. During this analysis, you may move parts around or model a what-if scenario that brings in a new process or application to see its effect on the business. Your AS-IS model now becomes your TO-BE model. If the business adopts the changes, the TO-BE becomes your new AS-IS until the next analysis takes place.
The Process makes for an easy read. You learn more about business process modeling and its notation in this book than you might from any other book on the subject. The best part is that the book comes with a glossy poster showing the entire BPMN notation, with explanations of when and how to use it.
BPMN Method and Style teaches you the BPMN elements, shows how to use BPMN to approach problems, and offers guidelines for what constitutes good BPMN style. Moreover, it provides separate methodologies for businesspeople and for technical developers. Unlike Grosskopf and colleagues’ book, Silver’s is a study guide and reference book for BPMN version 2.0.
BPMN is an Object Management Group (OMG) standard that is not owned by a vendor or consulting company. It is free and has spawned many free or affordable tools for documenting and modeling processes. Like flowcharting, BPMN is user-friendly, because it uses a set of shapes and symbols that have “a precisely defined meaning, with rules about what can connect to what and what those connections signify” (p. vii).
Process modeling serves one of two purposes: (1) to model the activity sequences in a process, or (2) to create a blueprint for automating that activity sequence within an execution engine to improve existing business processes. Silver’s method uses a three-level approach, wherein Level 1 is descriptive modeling that uses a simple process map, Level 2 is analytical modeling that includes more precise activity flow definitions with exception paths significant to a business’s key process indicator, and Level 3 is executable modeling that only a few major vendors have activated with their own BPM suite (BPMS) tools. BPMN Method and Style covers Level 2 in depth.
Modeling processes can become quite dense. All models should “flow left to right with sequence flows entering activities from the left and exiting from the right. It takes some rearranging to keep line crossings at a minimum, and sometimes it cannot be avoided” (p. 15). The objective is to keep the diagram neat and consistently organized to aid in shared understanding.
BPMN Method and Style explains the steps involved in documenting process flows at Levels 1 and 2. The Level 1 method includes five steps: (1) define process scope, (2) create the top-level diagram for the happy path, (3) add top-level exception paths, (4) expand subprocesses to show detail at the child level, and, optionally, (5) add intermediate message flows to external pools. Silver discusses the 13 compositional elements and six elementary usage rules of the Level 1 BPMN style.
Level 2 modeling takes the standard Level 1 flowcharting and models to a deeper level to do simulation analysis (projected quantitative performance improvement) and underpin executable processes. Running simulations and executable processes requires precisely defining the flows semantically: “exactly what happens when, under what conditions—as if the model were actually controlling execution by some automated program” (p. 59). This deeper meaning adds complexity to the start and complete events, send and receive events, automated tasks (script, service, asynchronous request), and decisions and rules events.
In BPMN Method and Style, the Level 2 method uses the Level 1 steps and adds four more: (6) refine branch/merge notation, (7) refine for channel-dependent start, (8) refine for iterative behavior, and (9) refine exception handling patterns. Silver discusses the 13 compositional elements of the Level 2 BPMN style. He covers Level 3, executable BPMN, only briefly, noting that it is a brand-new feature of the BPMN 2.0 standard; its main purpose is to extend the XML underneath the diagram into a complete executable design language.
BPMN Method and Style is a good reference book to have available. Bruce Silver is a noted expert in the field of BPMN and offers his own certification course for purchase from his Web site at www.bpmessentials.com.
Jackie Damrau has more than 20 years of technical communication experience. She is a fellow and member of the STC Lone Star community and the Instructional Design & Learning SIG, manager of the Nominating Committee, and member of the Competitions Task Force. She enjoys reading philosophy and psychology and spending time with her grandson.
Philip Kortum, ed. 2008. HCI Beyond the GUI. Burlington, MA: Morgan Kaufmann. [ISBN 978-0-12-374017-5. 462 pages, including index. US$49.95 (softcover).]
HCI Beyond the GUI, edited by Philip Kortum, brings together an international collection of experts from a wide range of backgrounds, who describe their research in alternative human-computer interaction (HCI), that is, HCI using interfaces other than the traditional graphical user interface (GUI). Chapters cover HCI variations for 11 nontraditional interfaces spanning all five senses. The editor selected these interfaces to represent the most important nontraditional interfaces that human factors professionals should be familiar with to appropriately address user needs and technological trends. Some senses are broken into variant applications when their basic use is already commonplace—such as auditory, voice, and interactive voice response for sound interfaces, and small screen, multimode, and multimodal for visual interfaces. And while the book is titled HCI Beyond the GUI, the visual interface chapters do describe GUIs, but with the caveat of nonstandard usage (the small-screen and multimodal interfaces). Because of the bleeding-edge technology involved, the chapters on olfactory and taste interfaces are relatively short and provide more background information than research data. In fact, the chapter on taste discusses feel more than flavor.
Most chapters are amply illustrated with black-and-white images, and the chapter bibliographies provide numerous references if you wish to research particular topics. All chapters follow the same basic format, covering the interface’s nature, technology, current implementations, human factors, testing techniques, design guidelines, case study, and future trends. This repetition can start to feel monotonous despite the variety of contributors but does make it easy to compare the attributes and characteristics of each interface if you’re using the book to determine which approach would be most appropriate for an application.
The book is written to be used as a textbook as well as a how-to guide for developing and testing alternative interfaces, whether in a real-world or virtual reality environment. The chapter on locomotion interfaces in particular—describing interaction in high-end simulators—reads as if the authors were designing the holodeck for Star Trek. However, chapters vary in level of technical detail, ranging from introductory to expert. For example, the chapter on gestures feels fairly simplistic to me, especially as it immediately follows the chapter on haptics, which relies heavily on technical terms. Such high-level chapters use advanced mathematics and terminology to describe the human psychophysics that explain how and why the interaction works. This could very well be extremely useful information, but it is provided inconsistently throughout the book. And although the book reads well, the numerous typographical errors in the text are a distraction.
Case studies provide information on real-world experiences, and more detail is available on the publisher’s accompanying Web site, www.beyondthegui.com. Unfortunately, the Web site fails to live up to its full potential. I appreciate the additional information, but some case studies are very brief, and almost all are exclusively black-and-white PDF documents that look like appendixes that never made it into the book itself. While I understand that it isn’t possible to replicate multisensory interactions in a standard Web site, the companion site would have been an ideal way to provide multiple case studies and full-color graphics for the text, as well as additional sensory touches like the sound files that support the chapter on auditory interfaces and help bring the text to life.
The nature of alternative interfaces makes them obvious candidates for addressing the accessibility issues raised in HCI Beyond the GUI, and I applaud the book for its consistently strong user-centered design considerations, including concern for natural interactions and avoidance of user fatigue. In fact, Kortum summarizes his fundamental design principles, applicable to all types of interfaces, in his introductory chapter, emphasizing that any interface should be effective, efficient, and satisfying for the user.
In all, HCI Beyond the GUI is a good introduction to alternative HCI options, providing ample history, data, and direction for development. If you’re looking for information on user interfaces other than the traditional GUI, this book is a great place to start.
Devor Barton holds a BA in communications from the University of Houston, and a certificate of project management and an MS in technical communication from the University of Washington. He is a member of STC’s Puget Sound Chapter and the Technical Editing SIG, and is an ICIA Certified Technology Specialist.
Barry Allen. 2008. Artifice and Design. Ithaca, NY: Cornell University Press. [ISBN 978-0-8014-4682-5. 213 pages, including notes and index. US$35.00.]
Barry Allen’s Artifice and Design argues that the invention of tools established a common ground between design (technology) and artifice (art). For Allen, only humans can infer the intentions of others (whether expressed or not), and this awareness defines a uniquely human social understanding. The singular dexterity of the human hand can in turn shape this socially shared knowledge into functional objects, tools, and artifacts. Human tools are therefore consciously “designed and made, usually by others, to facilitate action”; they are “manufactured,” literally “made by hand,” and are specifically social in nature (p. 2). In contrast, apes lack both the ability to read intentions and the prehensile facility to manufacture tools. A chimp extracting termites from a hole with a stick employs “a conveniently nonce object manipulated into facilitation,” but such use depends entirely on the co-presence of stick, ape, and termites (p. 2). A chimp will not necessarily use his stick again, even in the same circumstance; nor will he see its usefulness in other situations or share his knowledge of it. Hence a chimp’s tool is really a “proto-tool,” an implement usable only in a particular context, and usually only once.
Tools and artifacts—made objects—therefore embody the social needs and purposes of the group; these include both utility and beauty. Hence artifacts are designed for both function and perception, either singly or as assembled systems. The emergence of artifactual systems combining instrumentality with aesthetics results in what Allen calls “technology” and coincides with the development of art—both involve artifacts designed “in anticipation of perception” (p. 177); both are unified in a “technical economy” of “perceptually expressive works” (p. 157); and since “technical coherence (design) begins with aesthetic coherence (beauty) and never abandons it,” both share a common origin and retain a symbiotic relationship (p. 178).
New York’s Hell Gate Bridge exemplifies this symbiosis. The arch bridge, its ends embedded on each side of the East River, can easily support its load all by itself, yet the designer added massive towers at the abutments. Why include towers that aren’t technically necessary? Because both the engineer’s “trained” eye and the nonengineer’s untrained eye feel “the want of a visual counterpart to the thrust inconspicuously channeled into the foundations” (p. 139). Intellectually, we understand the bridge’s ability to work without the towers; aesthetically, we do not. We need to see the weight at work; and so an aesthetic quality is built into the object in anticipation of its perception. However, though they seem purely ornamental, the towers, by increasing downward thrust, also assume a functional purpose: they permit “the designer to minimize the volume of the foundation while keeping thrusts within the safety zone” (p. 140). Beauty and utility, then, are not mutually exclusive. They reinforce each other, and this can be seen throughout history, in an ancient axe or a modern bridge.
Allen’s argument sometimes seems circular and repetitive, and the style occasionally is obscure, but these are cavils. The book is rich in insight, reveals deep scholarship, and will provoke fruitful reflection in technical communicators concerned about usability as a combination of utility and beauty. Read it!
Donald R. Riccomini is an STC member and a lecturer in English at Santa Clara University, where he specializes in teaching engineering and technical communications. He previously spent 23 years as a technical writer, engineer, and manager in semiconductors, instrumentation, and server development.
Patricia T. O’Conner and Stewart Kellerman. 2009. Origins of the Specious: Myths and Misconceptions of the English Language. New York, NY: Random House, Inc. [ISBN 978-1-4000-6660-5. 266 pages, including index. US$22.00.]
Who hasn’t had a vicious argument about the proper use of a word or whether English is a malleable language? Maybe that’s just me. But if language is your game, Origins of the Specious: Myths and Misconceptions of the English Language is your book. It’s a blooper highlights reel of English, explaining fables from the myth about the number of Eskimo words for snow to the misconception that all double negatives are incorrect.
Origins of the Specious discusses these misuses and myths with a wry sense of humor. It even includes some lightly dirty humor, such as a bit of history on the Yiddish word putz. When discussing how newspapers were “abuzz” about two Oxford dictionaries giving the okay to willy-nilly split your infinitives, the authors comment, “It was a slow news week” (p. 17). In explaining why the word ain’t is no longer considered acceptable, they say it “got too big for its britches” (p. 49). An entire chapter that had me snickering is “Lex education: Cleaning up dirty words.” If you want to bore the curse words out of unruly children (or inform it out of them, depending on their disposition), you might read them this chapter.
The authors use humor to get a basic idea across: English is a liquid language (regardless of how thick we would like that liquid to be). Throughout the book, they say that English is changed by “the people who actually use the language day in and day out” (p. 43), that is, all of us. My favorite example is the hunt for a single “all-purpose pronoun for people that can be masculine or feminine” (p. 141). We all know how frustrating it is to write around “he/she” and “he or she.” But try as we might, no word has successfully taken hold of this empty space. Thon made a valiant effort in 1858 but fell by the wayside. Regardless, it will always hold a special place in my heart.
Occasionally, their humor can get a bit harsh, but in a teasing way. With regard to the literal meaning of “beg the question,” which Aristotle originally used in 350 BC, they say, “It’s time for the purists to get a life—one in the twenty-first century” (p. 182). In another example of harsh but humorous reprimands, they say, “If you think ‘octopi’ is classier than ‘octopuses,’ go stand in the corner” (p. 184). Away I went.
At times, this book had me laughing out loud for the dorky language jokes. And if it doesn’t provide enough information for you, the bibliography in the back provides 30 other resources. I definitely recommend Origins of the Specious for language junkies with a good sense of humor.
Angela Boyle is a technical writer for Tyler Technologies, Inc., where she has worked for four years. She graduated from the University of Washington with a BS in technical communication.
Harry F. Wolcott. 2009. Writing Up Qualitative Research. 3rd ed. Thousand Oaks, CA: Sage. [ISBN 978-1-4129-7011-2. 208 pages, including index. US$41.95 (softcover).]
Ironically, when I began reading Writing Up Qualitative Research, I was in the midst of writing an article with a swiftly approaching deadline. I knew the material, but I just couldn’t seem to put it together in a way that would engage the audience. Few academics haven’t been confronted with a similar scenario, which is why this book is so necessary. Harry F. Wolcott, a professor emeritus at the University of Oregon, has been through the trenches in his 40-plus years as an academic, and his advice is both comforting and valuable for any academic writer who works with qualitative data.
Writing Up Qualitative Research is organized into seven chapters that mirror the writing process. Wolcott begins with a discussion of getting started with your writing project and ends by guiding your project through the publication process. Each chapter offers practical advice for overcoming potential obstacles as well as tips for making the project your own in the context of academic fields that often demand formulaic writing.
Wolcott’s advice is highly individual, and much of it involves strategies for finding the techniques for writing and editing that best fit you as a writer. For example, he discusses the importance of choosing a time and setting for writing, when to write without interruption and when to review your writing, and when to ask for help with revision. This kind of material is rarely spelled out in this manner; it is usually something writers learn on the job through long and sometimes painful experiences. Wolcott explains his points in a friendly and accessible manner, which makes the book feel like a lively conversation with a colleague rather than an academic treatise on writing.
Much of the advice Wolcott offers is entirely new to me. I have read other books on writing and publishing—such as Getting It Published by William Germano (University of Chicago Press, 2008)—but Wolcott’s take on the writing process reformulates much of what I have thought of as “truisms” about academic writing. For instance, the chapter “Linking Up” explains how to choose a theory that fits the project rather than working to make the data fit a particular theory: a pitfall for many graduate students and new professors in their first attempts at research.
Writing Up Qualitative Research is a valuable resource for academic writers at any stage of their careers. Graduate students will find the discussion of breaking out of the prescribed formulas of the dissertation genre very helpful. New academics will enjoy the advice for keeping writing relevant to the audience, as well as the chapters on negotiating the publication process. Even seasoned academic writers will not go away from Wolcott’s book empty-handed, because it contains advice for guiding new writers and graduate students. Writing Up Qualitative Research is an enjoyable, refreshing, and useful monograph, and I know I will be using it for advice and inspiration for years to come.
Nicole St. Germaine-McDaniel is a senior member of STC and head of the Technical and Business Writing Program at Angelo State University. Her research interests include technical communication for a Mexican-American audience and technical communication in the health fields.
Richard Johnson-Sheehan. 2010. Technical Communication Today. 3rd ed. New York, NY: Longman. [ISBN 978-0-205-63244-2. 760 pages, plus appendixes, references, and index. US$100.00 (softcover).]
Technical Communication Today is the latest version of one of the best of the big, comprehensive textbooks designed for undergraduate introductory technical communication courses: courses critical both for developing the skills of workplace professionals and for establishing their attitudes to our field and to us, as specialists within it.
The standout strengths of this book for students and teachers alike are the way current technologies are presented as transformative, the variety of support materials provided, and the encouraging approach to explaining traditional technical communication topics.
With respect to technologies, the book’s premise is that computers are “thinking tools” that help writers work better, so embracing and mastering new communication technologies can be both exciting and critical to professional success. As a result, the book does not isolate its discussion of technology applications (as some textbooks still do) but instead integrates discussions of technology use throughout. Such treatment is likely to whet the appetites of technologically savvy students for studying technical communication; it also may help some teachers better value and welcome technology use.
Augmenting this strength are numerous teaching aids that demonstrate software applications, provide examples and links to current tools and information, and offer how-to tips and advice from practitioners. Other textbook elements are particularly well done, including a thorough companion Web site, practical exercises (such as documents provided as “revision challenges”), and practice-what-you-preach design features.
Complementing all the fresh material are extended, intelligent treatments of the classic principles and methods of technical communication, including those of rhetoric, genre, and process, topics richer for being combined with new content. For example, the book describes technical communication in terms of “how information flows,” leading to a more active definition of the communicator’s role that identifies up-to-date methods for achieving goals (such as using search engines to profile readers). Using a motivating tone and accessible style, the book draws on familiar experiences to explain concepts in a way that should foster practice and innovation.
Teachers who have used previous editions of Technical Communication Today will be pleased to know that this book substantially improves upon the 2007 version, with revisions that go beyond cursory updates to add critical new topics (such as text messaging and translation tools), more helpful examples, and thought-provoking ethical case studies. The instructor’s edition of the text specifically delineates these changes, making it easy to update lesson plans and assignments.
The downsides of this book both mirror its strengths and are endemic to its type. First, however quickly a textbook moves from draft to market, that timing can never be quick enough to keep it fully current with changes and options in tools and workplace practices. Second, comprehensiveness requires covering a lot of material, much of it cursorily, which can lead to a sense of overload and to shallow treatment of important topics. This might especially be a problem for science and engineering students, who may need less general information about professional writing and more specifics about conventions in their own specialized fields. For this reason, future editions might reduce or eliminate coverage of less critical topics (such as studying word origins by means of the Oxford English Dictionary). Another artifact of comprehensiveness is that the many navigational aids, marginal cross-references, screenshots, highlights, callouts, and other graphical features in this book threaten to overwhelm readers with a confusion of colors and elements, undercutting the purposes for which those features were designed.
However, the tweaks that might make this a better book for some might make it a less helpful resource for others. As a wide-ranging text that covers the fundamentals of a broad and progressive field, the newest edition of Technical Communication Today manages quite effectively the tough balancing act of being appropriately technology-centric yet also grounded in the basics that remain as critical as ever for those studying and entering our field.
Lu Rehling is an associate fellow of STC and a professor in the Technical and Professional Writing Program at San Francisco State University. She has worked extensively in industry as a writer, editor, trainer, manager, and consultant. Her PhD is from the University of Michigan.
The Right Graph
Harold Kirkham and Robin C. Dumas. 2009. Hoboken, NJ: John Wiley & Sons, Inc. [ISBN 978-0-470-40547-5. 379 pages, including index. US$69.95 (softcover).]
The back cover of The Right Graph claims that the book “arms you with all you need to know to conceptualize, create, and incorporate the type of quality graphs and graphics that will help get your scientific and technical papers published.” A rather tall order. While I don’t think the book fully delivers on this promise, it does offer numerous guidelines and a great deal of straightforward advice presented engagingly.
You’ll find conceptual information about designing graphs primarily in chapters 1 through 3 (arguably the most valuable in the book) and the chapter “Style Matters.” The first three chapters offer guidelines for determining the most appropriate type of graph and ensuring the effectiveness of a variety of graph types, reinforced by many helpful examples and clear explanations. “Style Matters” includes additional tips for achieving clarity and establishing a consistent and effective style.
A significant portion of the chapter on textual aspects of graphics (labels, captions, callouts, and so on) discusses typefaces. The advice given is for the most part sound, although there is a tendency to oversimplify and even ignore relevant research. For example, contrary to the authors’ claim, readers may, in fact, have trouble comprehending text set in all caps. Additionally, an evaluation of graph design that says “It really doesn’t look ‘right,’ somehow” (p. 105) does not help you understand how to avoid problems in your own graphics. Still, the chapter includes useful advice that supports the goals of the book.
Unfortunately, at this point The Right Graph veers from offering guidance on designing effective graphs to providing tips that are both basic and general, sometimes in chapters that don’t really belong in the book. For example, the chapter “Getting the Most Out of Your Software” includes pointers such as using keyboard shortcuts and copying and pasting, reminding you that “using the mouse to select options from a menu is slooooooooow!” (p. 141). Similarly, the chapter on organizing and giving a presentation offers little detail about designing your slides. Likewise, the chapter on perspective seems an odd inclusion, given the authors’ stance that “perspective is rarely needed in a technical drawing” (p. 297).
Discussions of software include an interesting but nonessential brief history of the spreadsheet, instructions for making graphs in Excel and Quattro Pro behave, and advice on importing and improving those graphs in PowerPoint and Presentations. Users of Excel and Quattro Pro are likely to find the textual repetition between chapters annoying. In contrast, the chapter “Fixes Using Graphics Programs” discusses principles instead of providing details for each software package, allowing the authors to cover more ground in less space, an approach that would also have been effective in the preceding chapters on software. Finally, the chapter on file formats concludes that converting files is generally time-consuming and imperfect, a fact with which I expect most technical communicators are already familiar.
These issues notwithstanding, the book contains a wealth of advice and helpful visual examples (some duplicated on color plates that make the visuals both clearer and more attractive), although, ironically, some of the figures are pixelated or difficult to read. Almost all chapters also include exercises, which, although perhaps more at home in a textbook, are potentially useful if you want to practice applying the material before you work with your own data. Unfortunately, the lack of solutions diminishes the helpfulness of the exercises. Each chapter also includes a bulleted list that gives readers a quick summary of the key points of the chapter; oddly, the 15-page final chapter reiterates these summaries.
In the end, I appreciate Kirkham and Dumas’ conversational style and use of humor, which make their manual considerably more user-friendly than most. At the same time, I find myself wishing that The Right Graph were briefer and more focused, bypassing some of the quirky anecdotes and extraneous material to get to the essentials more quickly.
Eva Brumberger teaches professional communication at Virginia Tech. She has worked as a technical writer/editor on both a full-time and a freelance basis. Her research interests include visual communication, international communication, and pedagogy. She is a member of STC and was president of the Border Network Chapter.
Technical and Professional Communication: Integrating Text and Visuals
Dolores Lehr. 2009. Newburyport, MA: Focus Publishing/R. Pullins Company. [ISBN 978-1-58510-257-0. 212 pages, including index. US$36.95 (softcover).]
As expressed in the title, Dolores Lehr has a specific purpose for her book. Her approach to technical communication is to treat text and graphics as a comprehensive unit rather than let graphics play second fiddle to the text.
Lehr follows through with this concept by being generous with graphics, using them in conjunction with text to explain important aspects of technical communication, features of specific documents, and even the mundane side of writing.
Technical and Professional Communication: Integrating Text and Visuals is divided into four parts: “Planning Documents,” “Composing Text and Generating Graphics,” “Integrating Text and Graphics,” and “Appendices.” Each chapter begins with objectives and ends with a usable checklist and exercises that reinforce the content.
Part One goes from the practical to the creative, from legal and ethical issues to gathering and evaluating information, to drafting and sketching. Drafting and sketching entail brainstorming, free writing, and mapping for idea development; sketching pages and illustrations for page layout; and preparing preliminary drafts.
The other parts cover the more familiar aspects of technical communication: writing and using tables, figures, color, and graphic elements. What makes this book different from most technical writing books is the way Lehr handles examples, both graphical and textual, of various documents that are common to technical communicators: instructions, proposals, reports, correspondence, guides and promotional materials, and oral presentations.
Descriptions of these documents and their requirements are brief but not sparse. Each document is shown in a graphic while Lehr explains its internal components and use, with topical tips given where applicable. This book will not make you an expert on each document type, but it does ensure that you will have more than a passing acquaintance with it. Document layouts are shown with appropriate textual content. In at least one instance—instructions—Lehr includes additional ways an instruction could be worded and advises on the best version.
It is a shame the publisher didn’t use higher quality paper, which would make this book a better reading experience. Faint outlines of text and graphics show through the page, and the contrast of text against the page could be stronger; reading in dim light can be tricky. The paper quality is especially disappointing, because the book includes excellent reference material that you would likely want to note or highlight. Such notes and highlighting will partially obscure the text on the other side of the page.
Overall, the book is an excellent introduction to document planning and creation for novices and a good reference book for more experienced writers.
Sherry Shadday works for Southwest Research Institute in Layton, UT, as a principal instructional specialist creating print, stand-up, and Web-based training. An STC member, she received a technical communication master’s degree from Utah State University. Previously, she served 21 years in the U.S. Air Force, maintaining aircraft electrical systems.
The Restructuring of Scholarly Publishing in the United States 1980–2001: A Resource-Based Analysis of University Presses
Barbara G. Haney Jones. 2009. Lewiston, NY: The Edwin Mellen Press. [ISBN 978-0-7734-4727-1. 424 pages, including index. US$129.95.]
In her meticulous study, Barbara G. Haney Jones interviews more than 30 directors of university presses to identify factors that have influenced their restructuring since 1979 and strategies employed to help the presses stay afloat in spite of financial cuts and a shifting publishing market, including changes in technology such as electronic publishing avenues. Jones hypothesizes that university presses with greater resources would be more apt to experiment with new modes of advertising, printing, and publishing, whereas smaller presses with fewer resources would stick to business-as-usual in the face of change. Although Jones basically finds support for her hypothesis, during the interviews she also discovers how even some of the smaller university presses with limited resources were able to survive because they were willing to take some risks.
Rare is the reader of Technical Communication who is unaware of the current economic crisis and its effects on our profession. We have all received e-mails from STC over the past year about budget constraints and efforts to remedy the shortfall. What Jones has done in her study, though, is to examine some 30 years of change and the several influences that have reshaped university presses. Her interviews and data analysis are based on her dissertation research from the mid-1990s.
So what are the major factors influencing the constant restructuring of university presses? Jones says the number 1 factor is the “publish or perish” pressure on professors in the social sciences and humanities to publish monographs, which the presses often just can’t sell. Although a couple of decades ago the presses were subsidized by the university and such monographs were more viable, today’s market does not lend itself to niche research publishing. Jones cites one press director who remarks that such books “are too long and that the information could just as well be published as an article” (p. 330). The university presses, for the most part, want to move away from the highly specialized research dissertation-turned-monograph at a time when most research institutions are still pressuring their tenure-track faculty to publish books.
The irony, of course, is obvious. Jones’s book itself is a dissertation-turned-book. However, her situation is unique: Before, during, and after her dissertation, Jones worked as the controller for the Edwin Mellen Press. Her position and experience offer valuable insight into why university presses are struggling and what they are doing to adjust to financial pressure and changing markets.
Other influences on university press restructuring are library budgets and acquisitions. As university libraries suffer increased budget cuts, the resources they tend to cut first are books. Why? Because librarians feel pressured to maintain subscriptions to certain journals, and these journals, especially in the sciences, have increased exponentially in price. Since the libraries must carry the established and expensive journals even in a budget crisis, there is less demand for university presses to publish specialized academic books. Besides, online journal issues are easier to store in cyberspace than a bunch of hardcover books collecting dust on library shelves.
To combat the reduction in demand for research monographs, some university presses are exploring electronic publishing. Because interlibrary loan departments have all but eliminated many research professors’ need to buy academic texts, some publishers, including university presses, have turned to on-demand purchasing of electronic texts. Although not a university press, STC offers access to articles through online purchase. For example, you can download a copy of Jo Mackiewicz and Kathy Riley’s “The Technical Editor as Diplomat: Linguistic Strategies for Balancing Clarity and Politeness” from the February 2003 issue of Technical Communication for just $5.95 from Amazon.com. In addition, to offset declining purchases of research monographs, some university presses devote up to a third of their lists to trade books. Custom publishing, particularly of textbooks, also has created increased revenue for some university presses.
In terms of approach, Jones’s book gets off to a slow start. She repeats her hypothesis and research question several times in the first 30 pages. Although this repetition may be an element of the dissertation or study format, I find the loss in momentum almost staggering at times. Some parts of the table of contents are mismatched with the text itself or have erroneous repetition (for example, 3.2.6 and 3.2.7 have the same title). Despite some cosmetic and stylistic issues in the beginning, however, what Jones offers in the remaining 400 pages is indeed a rigorous and eye-opening study that reveals not only exactly what is causing the downturn in scholarly book publishing with university presses but also how university press directors feel about the shift and the steps they have taken, are taking, and plan to take to survive in a changing publishing market.
Nicole Amare is a senior member of STC and an associate professor of technical communication at the University of South Alabama. Her research interests include ethics, editing, and visual rhetoric. She is associate editor of Industry Practices for IEEE Transactions on Professional Communication.
How to Write, Publish, and Present in the Health Sciences: A Guide for Clinicians and Laboratory Researchers
Thomas A. Lang. 2010. Philadelphia, PA: American College of Physicians. [ISBN 978-1-934465-9-4. 383 pages, including index. US$59.95 (softcover).]
Thomas Lang has written the book I would have written.
When I began teaching technical writing in 1971, I couldn’t find a textbook that discussed audience, nor was there one that provided authentic examples from industry. By 1987 I decided that the only solution to providing my students with a textbook that included the information I believed they needed was to write my own book. In fact, I wrote two.
When I began teaching medical writing in 2008, I thought I might have to do it again. Almost all the medical writing texts have been written by health professionals (doctors and scientists) or academicians rather than writers, and they lack a professional writer’s perspective.
Then Thomas Lang published his book—the book I would have written. This was the book I needed to teach my course.
A former technical writer and academician, Lang has taken the research results from a quarter century of communication studies and adapted them to the medical world with which he has become familiar as a consultant over the past 20 years.
The book is aimed at anyone who will be writing and presenting in the health sciences, including nurses, clinicians, medical technicians, biomedical scientists, physicians, and medical writers. It focuses specifically on writing proposals and research articles. Lang divides the book into three parts: (1) writing in the health sciences in general, (2) publishing in the health sciences, and (3) presenting in the health sciences.
The opening chapters provide a framework for anyone engaged in some form of technical communication, not just in the health sciences. The first chapter is unique among medical writing texts: Lang provides a historical overview of the discipline, going back to the earliest known Egyptian medical text in 1700 BCE. He goes on to include the Greek development of ideographic script, Johnson’s publication of the Dictionary of the English Language, the Royal Society’s first scientific journal, the first medical library in Philadelphia, and the introduction of the Journal of the American Medical Association (now JAMA). Bringing the overview up to date, he lists such recent (2009) guidelines as the CONSORT, QUOROM, and TREND statements, and such reporting standards as the Minimum Information for Biological and Biomedical Investigations (MIBBI). In addition, he provides a section on the latest technological innovations, such as e-prints, open-access publishing, and self-archiving. I find this chapter fascinating, as I think others will.
The chapter “How to Write Effectively: Making Reading Easier” describes how the reader relates to a text rather than how the writer relates to a reader, as most technical writing books do. Lang explains what academicians mean by writing reader-based prose and, more importantly, why writers need to write this kind of prose. He delineates four features—comprehensibility, recallability, “referenceability,” and usability—that a medical document must have if readers are to make a decision or follow a procedure effectively, the two major purposes of technical documents. He continues in this and the following chapter to provide recommendations for successfully implementing these features. While he discusses many of the techniques usually suggested in technical writing texts—use familiar words, make sure phrases and clauses in a list are parallel, and don’t nominalize verbs—he adds some clarifications from his own experience as a writer that are supported by linguistic research. While he recommends short sentences in most cases, he says it is not length but syntactic complexity that impairs comprehension and clarity. And while he recommends active rather than passive voice, as most texts do, he adds a caveat (with examples) that the passive is sometimes more appropriate.
Lang devotes three chapters to graphic display. He focuses on the writer’s purpose in displaying data or images and then provides recommendations for using the appropriate visual form, thus invoking Louis Sullivan’s axiom “form follows function.” Providing full color, myriad examples, and appropriate alternatives, he goes into detail on such matters as how to help readers analyze data in tables and graphs, perceive patterns in data, and compare data; how to prepare drawings and photographs with the desired quality; and how to document biomedical images. Some of this discussion is specific to particular specialties, such as MRI scanning and genetic sequencing.
The book provides excellent discussions of abstracts, grant proposals, and research articles. The chapter on abstracts is one of the most detailed I have ever encountered—it delves into descriptive, informative, and structured abstracts. Best of all, Lang suggests ways to reduce word count, an extremely important but frustrating aspect of writing an abstract. The excellent advice on grant proposals includes a list of characteristics of successful and unsuccessful grant proposals, as well as an explanation of how funding agencies and grant/contract offices evaluate proposals. Finally, the chapter on journal articles includes tips on such topics as budget, equations, measurements, references, and statistical methods.
The section on publishing includes discussions of ethics and the process of publishing a journal article, from submission through peer review to production. Only an experienced writer could give neophytes such helpful advice as the following: “FOLLOW THE INSTRUCTIONS FOR AUTHORS EXACTLY!” (p. 269) and “If you do not hear from the journal within 60 days after submitting your manuscript, contact the editor” (p. 275).
The section on presenting looks at creating posters and slides. It offers useful advice on, for example, limiting the amount of text, avoiding three-dimensional images in graphs, using sans serif type, developing tabletop posters, and selecting font size and display orientation. This section, like the first, applies to anyone involved in presenting a technical topic.
Lang offers explanations and information to assist anyone who is interested in writing and presenting medical/scientific information. I am only sorry he doesn’t include patient education materials. Perhaps he could add an addendum in the future.
Carolyn Boiarsky is a professor of professional communication at Purdue University Calumet. She has written two textbooks, Technical Writing: Contexts, Audiences, and Communities and Writings From the Workplace. She formerly consulted with the nuclear power industry for her firm, Effective Communication Associates. She began as a United Press International correspondent.
The Graphic Designer’s Guide to Portfolio Design
Debbie Rose Myers. 2009. 2nd ed. Hoboken, NJ: John Wiley & Sons. [ISBN 978-0-470-18476-9. 262 pages, including index. US$45.00 (softcover).]
Can a book titled The Graphic Designer’s Guide to Portfolio Design be useful to a technical communicator? The answer is “Yes!” This compact, friendly book offers value to a technical communicator who’s starting out or hasn’t put together a portfolio in a while.
Debbie Rose Myers’s style is welcoming, as if she’s leading a small tutorial session in designing a portfolio, overcoming your initial reaction that you don’t need help. I now see that I do need help, or at least tweaks, to update my portfolio. The book focuses mainly on designing an electronic, 21st-century portfolio.
Each chapter gives real-life examples from Myers’s experience or that of her students. In the valuable “Interview” section of each chapter, practicing graphic designers answer honest and informative questions about the interview process, such as “What qualities do you look for in an applicant?” and “What makes a successful interview?” They’re most blunt and practical in their answers to “What are the five best things job candidates say that impress you during an interview?” and “What are the five worst?” You can use chapter exercises (“Designer’s Challenges”) to create a portfolio.
Myers argues well that “a designer will always be judged by the weakest pieces in the portfolio” (p. 10), a theme she emphasizes throughout. The chapter “Planning Your Portfolio” includes a checklist of pieces you might include, with specifics for each category. Obviously, you will adapt the list to your specialties. The chapters on the traditional paper portfolio include a list of action verbs, a great tool to use in writing your résumé. Myers’s examples of display options gave me suggestions for updating my paper portfolio. The summary checklists on interviews and the examples of thank-you notes reflect her experience.
Many details on digital portfolios will appeal mainly to readers with less experience. Graphic design students and experienced technical communicators, for example, will find the material on page layout programs, clip art, and construction of a Web site too elementary. Myers’ section on creating an “artist’s statement,” however, challenged me to work on my own writer’s statement, a new element that I’ll include in my updated portfolio.
One annoyance is a design miscue that is not Myers’s fault. The color bleeding on the edges of the pages has no function, because the edges aren’t staggered to correspond to the chapters.
In all, this is a good reference book for recent graduates as well as experienced professionals.
Beth Lisberg Najberg is an instructional designer based in Chicago. She develops technical training for frontline workers, incorporating graphics and job aids so procedures, processes, and concepts are easy to follow. She is principal of Beginnings, an information design consulting firm.
How to Write Fast Under Pressure
Philip Vassallo. 2010. New York, NY: AMACOM. [ISBN 978-0-8144-1485-9. 195 pages, including index. US$18.95 (softcover).]
In the spirit of How to Write Fast Under Pressure, I procrastinated writing this review until the day it was due to instill a sense of urgency and see if I could put some of Philip Vassallo’s advice to use. The purpose of the book, as the title suggests, is to provide tactics for writers who have difficulty overcoming writer’s block or hitting deadlines. And while Vassallo describes useful writing practices, one would be hard-pressed to find a professional writer who has not discovered them on his or her own. A good audience for the book, then, would be novice writers who are new to the workplace.
Vassallo positions his narrative around a hypothetical set of employees at a hypothetical company working through fairly common writing situations—proposals, requisitions, e-mails. The pressure of looming deadlines is always present. One employee is disciplined and seasoned; the other lacks confidence and is fairly new to his role. As in many business self-help books, the dialogue in this book is flat and the commentary overly didactic.
The author focuses on four areas: direction, acceleration, strength, and health (DASH). Direction involves understanding the purpose for the writing task at hand, being clear on the time parameters, and delineating an efficient process to get the job done. Vassallo reviews basics such as the writing process and the use of document templates.
Acceleration is about maintaining momentum. Here the author provides tips on getting organized and reducing complexity. One tidbit he offers for managing e-mail, which I found myself following the next day at work, is the four Ds—delete, delegate, defer, do. Within seconds of scanning an e-mail, you should be able to determine which “D” is most applicable. My favorite is delete.
In his chapter on strength, Vassallo shares ideas to help the burgeoning writer develop good writing habits and produce quality work. The guidance proffered ranges from establishing a comfortable writing environment—ergonomics, lighting, temperature—to having good resources at hand and mastering different levels of editing.
In the health section, the author discusses writing as a therapeutic activity that can help you “heal physically and emotionally” (p. 135). Although the thesis seems a bit flighty, Vassallo shares some of his better insights here. For instance, he instructs you to develop thick skin in dealing with criticism—the goal is to learn to make constructive use of it. He also suggests you should expect the unexpected and learn to enjoy planning as well as reacting, all of which is good advice.
Altogether, How to Write Fast Under Pressure isn’t a bad book. It could be helpful for a person without a writing background. For a professional or experienced writer, however, it’s much too elementary.
Gary Hernandez is a communications manager for BP. He received an MA in English literature from George Mason University and an MS in technical writing from Utah State University. He belongs to STC and the International Association of Business Communicators.
The Backchannel: How Audiences Are Using Twitter and Social Media and Changing Presentations Forever
Cliff Atkinson. 2010. Berkeley, CA: New Riders. [ISBN 978-0-321-65951-4. 226 pages, including index and photo credits. US$34.99 (softcover).]
Whether you are an experienced participant in the backchannel or unfamiliar with the term and associated activities, this book could be an eye-opener. The Backchannel confirms that expert users of this technique of adding to (or hijacking) conference presentations have diverted the flow of the old river of presentation methods and audience behavior. Atkinson believes that the diverted river is picking up speed, and those who do not use the backchannel may be left behind on the river banks. But those who embrace the backchannel can influence the course of the river to their own benefit.
Yes, Atkinson champions the use of metaphors. He also provides fascinating anecdotes of backchannel incidents, supported by photographs, illustrations, and trenchant quotations. Atkinson defines backchannel as “a line of communication created by people in an audience to connect with others inside or outside the room, with or without the knowledge of the speaker at the front of the room” (p. 17). It is a silent conversation conducted on laptop computers and smartphones using Twitter.
Atkinson writes in a breezy, compact style. Into the river of the backchannel he launches a barge full of convincing true stories of how audiences at technology conferences have either supported or disrupted speakers through on-the-spot communication with the audience members’ own Twitter followers. Conference by conference, speaker name by speaker name, author by author of tweets—all are revealed in a colorful, compelling story of mutiny on the barge by tweeters who want to comment on or influence the presentation.
Although he presents a solid chapter on the drawbacks of the backchannel, Atkinson clearly believes that it is an ineradicable and spreading feature of conference presentations and that presenters should use it to their own advantage.
First, become an expert with Twitter and other social media. Discover whether your upcoming conference will supply meeting rooms with wireless and mobile phone support. If yes, use social media—your Web site, your blog, e-mail, YouTube, SlideShare, Twitter—to build anticipation for your presentation. Next, prepare a presentation (with or without illustrated or text-only slides) that focuses on a few key points worded so briefly that they can be forwarded as part of a 140-character Twitter message. At the start of your presentation, acknowledge the presence of the backchannel, display audience tweets on your own laptop via the conference projector, and make backchannel comments a part of your discussion with the audience.
These activities require considerable preparation and expertise. To help readers, Atkinson provides plenty of advice. Some of it is basic: how to set up a Twitter account and how to condense the main points of your presentation into Twitter-friendly length. Other advice builds on the basics: how to obtain a Twitter hashtag (identifier) for your conference session and an abbreviated link to your blog or Web site for inclusion in Twitter messages. The advice increases in sophistication: Use social media to build buzz before the conference, engage thoroughly with the backchannel during your presentation, learn to handle negative tweets skillfully, and follow up after the conference using social media.
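The tweet-condensation step Atkinson recommends lends itself to a quick sanity check. Below is a minimal Python sketch, not from Atkinson's book, of composing a key point, a session hashtag, and a shortened link into a message that fits the classic 140-character Twitter limit; the hashtag and link shown are invented for illustration.

```python
# Sketch: verify that a presentation key point, session hashtag, and
# shortened link together fit Twitter's classic 140-character limit.
TWEET_LIMIT = 140

def compose_tweet(key_point, hashtag, link):
    """Join a key point with a session hashtag and link, enforcing the limit."""
    tweet = f"{key_point} {hashtag} {link}"
    if len(tweet) > TWEET_LIMIT:
        raise ValueError(f"Tweet is {len(tweet)} characters; limit is {TWEET_LIMIT}")
    return tweet

# Hypothetical session hashtag and shortened link:
tweet = compose_tweet(
    "Audiences now talk back: design slides around a few quotable points.",
    "#stc2010",
    "http://bit.ly/example",
)
print(len(tweet), tweet)
```

Wording each slide's takeaway so it survives this check is what makes it forwardable by the backchannel.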
This quick read so energized me that I immediately went to the Web to learn whether my next conference plans to supply hashtags for all sessions and whether its session rooms will support Internet and smartphone access.
Two messages dominate Atkinson’s book. First, audiences want to participate in presentations rather than be silent observers. Second, the “Law of Two Feet” (p. 82) applies: The audience will walk out if your presentation is not useful, meaningful, well researched, and up to date.
So let’s load that barge and head down the river!
Ann Jennings is a senior member of STC, professor of English, and coordinator of the MS program in Professional Writing and Technical Communication at the University of Houston-Downtown.
Producing for Web 2.0
Jason Whittaker. 2009. 3rd ed. New York, NY: Routledge. [ISBN 978-0-415-48622-4. 272 pages, including index. US$39.95 (softcover).]
If information about planning, setting up, and managing a Web site using Web 2.0 interests you, the third edition of Producing for Web 2.0 could be a good read. You’ll find an eclectic and useful overview of current technologies available for online communication using Web 2.0 tools.
Of special interest are answers to questions such as What is Web 2.0? How can you customize a blog to better use Web 2.0 technologies? How important is it to know Dreamweaver to do Web work? How does a content management system (CMS) fit in? What are the elements and attributes of XHTML?
Even Tim Berners-Lee, the inventor of the World Wide Web, notes about Web 2.0, “Nobody even knows what it means” (p. 2). According to Whittaker, the term, coined by Dale Dougherty of O’Reilly Media and Craig Cline of MediaLive, refers loosely to the platforms and technologies that represent new Web developments. Instead of the static pages common to Web 1.0, Whittaker notes, Web 2.0 uses data more dynamically and includes applications such as video, audio, and Really Simple Syndication (RSS) feeds.
The sections on customizing a blog with Web 2.0 contain useful step-by-step directions for using Feedity.com to work with RSS feeds. The author notes that “WordPress is one example of blogging software that you can install on your own site and fine-tune to match templates and styles used with your other pages” (p. 138).
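For readers who have never looked inside a feed, an RSS document is plain XML. Here is a minimal sketch, with invented feed content, showing the general shape of an RSS 2.0 feed and how it can be read with Python's standard library; tools such as Feedity and WordPress generate documents of roughly this form.

```python
# Sketch: a minimal RSS 2.0 document and how to extract its items.
import xml.etree.ElementTree as ET

RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://example.com/</link>
    <item>
      <title>First post</title>
      <link>http://example.com/first-post</link>
      <description>A short summary of the post.</description>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(RSS)
# Each <item> is one syndicated entry; aggregators poll for new ones.
for item in root.iter("item"):
    print(item.findtext("title"), "->", item.findtext("link"))
```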
Samples and code from the book appear on the companion Web site (http://www.producingforWeb2.com), which also includes information on new developments in Web production and design. The information about CMSs covered on this site and in the book reflects the hundreds of CMS applications available today.
Among the most useful tips for written content are these: select an audience, write as you speak, use the pyramid technique, use links, and become a reader. The pyramid technique of news journalism tells the story at the beginning and expands from there. Originally a way to let editors cut stories from the bottom up to save space, the technique also suits the Web, accommodating readers who may wish to move quickly on to other pages.
The author’s personal viewpoint comes through when he discusses both technology and writing skills. He is on target when he says, “While . . . coding skills are essential for the modern Web producer, the fact remains that the vast majority of content online consists of text. For this reason, the ability to write well to attract your audience’s attention cannot be underestimated” (p. 196).
Jeanette P. Evans holds an MS in technical communication management from Mercer University. She has more than 20 years of experience in the field and has published in Intercom and presented at several STC conferences. An STC associate fellow, Jeanette is active in the Northeast Ohio Chapter.
Introduction to Information Visualization
Ricardo Mazza. 2009. London: Springer. [ISBN 978-1-84800-218-0. 139 pages, including index. US$49.95 (softcover).]
Open any technical writing textbook (even some of the very earliest ones) and you will find a section or a chapter on creating visuals to support the assertions made in the text. As far back as 1970, you will find whole books devoted to visualizing information. Mazza’s Introduction to Information Visualization follows in this tradition.
After an opening chapter in which he describes what he means by visual representations, Mazza moves to chapters on how to create visual representations and how readers perceive them. The next three chapters describe various ways data can be presented, including multivariate analysis, networks and hierarchies, and the World Wide Web.
He then turns to the problem of information overload and describes several situations in which the presentation method affects the way users perceive information. The key to user understanding is whether the user can manipulate either the view of the data or the data source and the mapping process. Both approaches help users analyze what they are seeing and draw insights from the data.
The book adds to stand-alone graphics books that range from the simple (A. J. MacGregor, Graphics Simplified [University of Toronto Press, 1979]) to the encyclopedic (Robert L. Harris, Information Graphics [Management Graphics, 1996; reviewed in the May 1997 issue of Technical Communication]) to the highly theoretical (William S. Cleveland, The Elements of Graphing Data [Hobart Press, 1994; reviewed in the August 1996 issue of Technical Communication]) as well as the seminal work of Edward Tufte. Also included in this collection of works is the special issue of the IBM Research Journal devoted to visualizing massive amounts of data. However, of this group, Mazza mentions only an article by Cleveland.
In eight well-illustrated chapters, Mazza moves you through basic theory and simplified visualization techniques to reader response theory and highly complex techniques. The book’s title tells you exactly what to expect. He shows you what is possible but tells you very little about how to do it. When he describes a particular way to present highly complex data using computer software packages, he provides a footnote containing the URL for finding that software.
Mazza’s examples come from computer presentations of complex data. For example, he shows how to present computer usage by students in 345 online classes on a single screen by conveying information for individual courses at the pixel level and using color to indicate usage within each course. Presumably, one could insert such a visual presentation into a document, but he does not address that issue.
The book addresses what he sees as the main problem in designing a visual representation: “creating visual mapping that, on the one hand, faithfully reproduces the information codified in the data and, on the other, facilitates the user in the predetermined goal” (p. 24). This emphasis on the user culminates in an entire chapter on evaluating the effectiveness of the visual presentation. This chapter will be quite familiar to those who regularly do usability studies. While not specifically adding anything new, its very presence in a book on visual representation is unusual.
Mazza makes clear that the book is meant to be a textbook for a course in information visualization. It certainly would be an effective text for such a course extending over 16 or 17 weeks, but the instructor would need to add material, especially on how to actually create the different visual types.
As mentioned before, the final chapter explores evaluation. Principally, the designer needs to evaluate the presentation correctly so that it “can reveal potential problems and indicate which actions have to be carried out to improve the quality of the visual representation” (p. 132).
I find this book highly useful in understanding how to handle massive quantities of data visually. It is an introduction and, if used as a textbook, would need heavy supplementing. If used as a reference book, it points the way to solving problems related to visual representation of data in real time. I think it a useful book for both academics and practitioners.
Tom Warren is an STC Fellow, a winner of the Jay R. Gould Award for teaching excellence, and professor emeritus of English (technical writing) at Oklahoma State University, where he established the BA, MA, and PhD technical writing programs. Past president of INTECOM, he serves as guest professor at the University of Paderborn, Germany.
Grace Hopper and the Invention of the Information Age
Kurt W. Beyer. 2009. Cambridge, MA: The MIT Press. [ISBN 978-0-262-01310-9. 408 pages, including index. US$27.95.]
In Grace Hopper and the Invention of the Information Age, Beyer weaves U.S. history into Hopper’s life in a book that is difficult to put down. Although I was a ghostwriter on the college textbook published by West Publishing Company in the 1980s under Hopper’s name, I was unaware of the lasting impact that the attack on Pearl Harbor and U.S. involvement in World War II had on the computer industry. Pearl Harbor created career opportunities for women. At about that time, Hopper divorced her husband, left her teaching position at Vassar College, and joined the Navy.
Beyer includes a humorous quote from Hopper regarding her adjustment to the Navy after her intense teaching schedules at both Vassar and Barnard College. In the Navy, she found an environment where she was relieved of all minor decisions: “I just promptly relaxed into it like a featherbed and gained weight and had a perfectly heavenly time” (p. 33).
The early computers were used by the government to simulate rocket trajectories and the movement of ships. Although the Mark I—the first computing machine used by the government—was thought to do calculations quickly, today’s laptop computer has the capacity to process information 333 million times faster. The Mark I (also known as the Automatic Sequence Controlled Calculator) was 51 feet long, weighed 9,445 pounds, and had 530 miles of wiring. Manipulating the hardware of the early computing machines could be dangerous. Beyer notes that operator David Green was nearly strangled when his tie got caught in the sequence mechanism.
Hopper is credited with finding the first computer bug. Since the windows at Harvard University didn’t have screens on them, bugs flew in. When the Mark II stopped running, Hopper found a large moth with a 4-inch wing span beaten to death in one of the relays. The workers used Scotch tape to add the corpse to the log book on September 9, 1945.
Beyer covers many details about Hopper, including the following:
- Was among the first modern programmers.
- Developed a system of documentation within each segment of code.
- Developed COBOL.
- Braved a hurricane in fall 1944 to work in the Harvard Computation Laboratory.
- Used the vanity mirror in her handbag as the preferred tool to inspect the $750,000 Mark I.
- Developed the first compiler.
- Had a clock in her office that ran counterclockwise to illustrate that there are many ways to conceptualize solutions to problems.
- Helped establish the Association for Computing Machinery.
- Was named the first Computer Sciences “Man of the Year” in 1969 by the Data Processing Management Association.
- Retired in 1986 as the oldest active officer in the Navy.
Hopper died in her sleep on January 1, 1992, at age 85 and was buried in Arlington National Cemetery with full honors. Her advice for maintaining a youthful and creative outlook by constantly broadening one’s own knowledge base is good advice for anyone.
Beyer not only writes about Hopper’s life but strings together the history of computers and details of the lives of others who were involved with the information age into a well-researched book. His vivid writing style and the numerous photos from the archives make past events come alive in the reader’s mind.
For those who are interested in the early history of computers and Hopper’s involvement in computing, the Archives Center of the Smithsonian Institution’s National Museum of American History in Washington, DC, contains many items from the early days of computing, including photos, academic articles, technical notes, manuals, and press clippings.
Rhonda Lunemann is a senior technical writer with Siemens PLM Software and a senior member of STC’s Twin Cities Chapter.
Conversation and Community: The Social Web for Documentation
Anne Gentle. 2009. Fort Collins, CO: XML Press. [ISBN 978-0-9822191-1-9. 236 pages, including index. US$29.95 (softcover).]
Anne Gentle’s Conversation and Community: The Social Web for Documentation is a wake-up call for technical communicators who are still not ready to embrace the social Web that takes them beyond their comfort zone of in-house-produced user guides and online help to the realm of blogs, wikis, and forums. Accustomed to a more formal writing environment that allows for rigorous editing and complete content control, writers often shun the lax user-generated content populating the Internet. This, according to Gentle, is a big mistake. Instead, professional writers need to view this communication shift as an opportunity to embrace a new collaboration with their online audience. The results will be better served, happier users and continued relevance of the professional technical communicator.
Gentle’s knowledge of this growing social Web is vast, and she excitedly shares what she knows. She suggests that blogs and wikis encourage a productive dialogue between writer and users. “Writers have more conversation-starting tools at their disposal than at any other time in history,” she explains (p. 14). To help novices understand these social media tools, Gentle dedicates an entire chapter to describing them. She explains in some detail the terminology common in the social Web, terms such as tagging, syndicated content, and community. In addition, she lists online communities writers may want to consider as they figure out how to insert themselves into the ongoing conversation.
The book hits its stride when Gentle actually connects technical communicators with users, offering specific examples on how writers can ease themselves into existing conversations. I am especially impressed by the reasonableness of Gentle’s approach. For example, she does not tell writers to run out and start their own blogs; instead, she emphasizes the responsibilities that come with blogging and recommends serious consideration of all aspects of blogging before entering the fray. For those who are interested in creating blogs, the book offers practical guidance: how to select a platform and how to ensure a supply of fresh posts. It also includes several helpful examples.
In addition to blogs, the book addresses mashup options, wikis, and open documentation systems. Gentle recommends that writers consider mashing user contributions into their formal user assistance, and she provides a list of tools, techniques, and guidelines to help readers get started.
As for wikis, Gentle says, “I believe that wikis encourage crowd-sourcing, or delegating work to a large group of semi-organized individuals (a crowd), and help writers see customers’ view points [sic]” (p. 143). She also recognizes the limitations of a wiki: access controls that anger users, output quality issues, and connectivity requirements. For readers who do want to make the move to wikis, Gentle provides a list of wiki features that should be considered when selecting a tool, such as security, support, and usability. She also includes a detailed explanation of how to start a new wiki or revive an existing one. For readers who decide a wiki is not the right fit for their situation, she suggests an online help alternative, along with a list of considerations for choosing one.
After taking writers through the painstaking steps of selecting and implementing the correct social Web technology for their situation, Gentle brings the process full circle, offering tips on writing for this new audience. Acknowledging that social Web documentation is probably not as formal as standard user assistance, she suggests the creation of a separate style guide for the new content. Her other suggestions: be direct and honest, share personal stories, and create snappy titles.
The book concludes with a wonderful glossary of terms, a list of recommended reading, and a comprehensive index. Gentle’s book is an irreplaceable tool for any technical communicator who wants to stay relevant in the field.
Denise Kadilak is an information architect with Blackbaud Inc., a software company based in Charleston, SC. She holds a BA and an MA in English and has worked as a technical communicator for more than 12 years. Denise is a senior member of STC and immediate past president of the Northeast Ohio Chapter.
Fresher Styles for Web Designers
Curt Cloninger. 2009. Berkeley, CA: New Riders. [ISBN 978-0-321-56269-2. 205 pages, including index. US$39.99 (softcover).]
The classic publications adage “You can’t judge a book by its cover” applies to Fresher Styles for Web Designers. With its green and white diagonal pattern offset only by the black title text, the cover style is, well, not very fresh at all! It hardly calls out “eye candy of Web designs.”
However, open the book, and, wow! Peruse its contents; read its witty, thought-provoking prose; admire its structure; and enjoy its examples. You will be hooked and want this issue on your bookshelf or even your coffee table.
But wait a minute—why should this author be considered an authority on something as subjective as Web design? In the ever-expanding, ever-morphing world of the Web, it is hard to discern whether anyone can be considered an expert: The technology and even the industry of Web design are too new, and change too rapidly, to judge.
However, take into consideration his predictions from Fresh Styles for Web Designers: Eye Candy from the Underground (New Riders, 2001; reviewed in the August 2003 issue of Technical Communication), repeated in this book, that the Web would both “become a minimal, text-centric channel of efficient information delivery (the web as database information model)” and “become a high-bandwidth entertainment channel (the web as interactive TV model)” (p. 3). Either this was a lucky guess or the author was plugged in nine years ago. Also, take a look at the trends in experimental Web design he identified in 2001 as being “fresher”—they can still be considered at least fresh, and perhaps even cutting-edge.
Cloninger makes a static presentation feel dynamic while staying true to his intent to take “the samples and examples approach” to teaching design, working backward to distill the basic principles (p. 15). For each of eight categories of style—cleverly labeled, for example, “No Style,” “Corkboard Sprawl,” and “Hand-Drawn Analog”—he provides a definition, followed by characteristics, influences, examples, and uses. These styles are seen as the beginning of a taxonomy, although there is some blurring between styles.
For the serious Web designer, Cloninger offers serious insights in a humorous way. For the rest of us, he provides delicious food for thought. He describes, for example, three ways to handle unknown browser width: Liquid Layout, Jello Layout, and Ice Layout. He achieves his overriding goal—to get you to think about your own Web design through the styles, examples, and explanations—with flair, as reflected in a quotation from Wolfgang Weingart: “A fresh, exciting, new era was brewing out of authentic creative motivation, and not out of aesthetic formalism” (p. 42).
Traditionalist that I am, I appreciate the author’s subscription to fundamentals of good writing and layout, particularly in using white space and clearly establishing spatial relationships among the visual elements. At the same time, he has forced me to think outside my traditional comfort zone in terms of design and delivery of Web messages. Cover notwithstanding, the author delivers on the promise of exploring fresher styles through delicious eye candy.
Mark Hanigan has more than 25 years of experience as a technical writer, business analyst, instructional designer, trainer, speaker, and project manager. He has his own consulting company, On the Write Track. He has served in various STC roles at the chapter and Society levels, including STC president in 2000–2001. Mark was elected Fellow in 2005.
The Language of Work
Carol Siri Johnson. 2009. Amityville, NY: Baywood Publishing Company. [ISBN 978-0-89503-384-0. 200 pages, including index. US$49.95.]
In The Language of Work, Carol Siri Johnson traces the emergence and development of technical communication within one company, Lukens Steel of Pennsylvania, over a 115-year period. The book is divided into two parts: “Background,” consisting of two chapters, offers a survey of technical communication in the 19th-century American iron and steel industry and presents a history of Lukens Steel; and “Analysis,” consisting of four chapters, examines artifacts of the company’s communication from 1810 to 1925. Using primary sources from the Hagley Museum and Library, Johnson makes a significant contribution to our understanding of the evolution of technical communication in late 19th- and early 20th-century America.
From 1810 to 1870, the workers at Lukens Steel communicated primarily by word of mouth and body language rather than writing. The earliest forms of written communication were entries in daybooks, journals, and ledgers and correspondence stored in letterbooks. From 1870 to 1900, the company began to rely more heavily on paper for record keeping, paving the way for sophisticated forms of written technical communication. After 1900, the Lukens Steel plant “exploded from comparative silence into a multiplicity of voices” (p. 107) as new genres of technical communication proliferated.
The four chapters in “Analysis” follow a similar pattern: They describe the technologies of the period and then analyze the related genres one by one, allowing us to see the connections between the technologies and the forms of communications that emerged. Each chapter is illustrated by photographs and facsimiles of documents. “Lukens 1900–1915: An Explosion of Technical Communication,” for example, reproduces reports, handwritten notes, blueprints, drawings, cards, a chart, and printed pages. These illustrations effectively complement the analysis in the text.
Johnson makes an observation that might deserve closer study: The company’s stenographers and typists were “the mediators of technical communication” (p. 162). These women had a high level of literacy in comparison with others in the company. They were mediators because they transformed oral technical communication into written documents and produced “multiple copies of error-free text” (p. 162). It might be interesting to study their work to see if they can be regarded, in some respects, as the precursors of technical editors in industry.
An interesting argument that Johnson explores in both the introduction and the conclusion is the “literary value” of all texts. Influenced by Elizabeth Tebeaux as well as Foucault, she complains that literature has been defined too narrowly and that technical communication as literature is a prohibited area of inquiry in the academic world. Whereas traditional literature “memorializes an individual consciousness in a state of retrospection,” technical communication is “the literature of a group” (p. 5): collaborative, transactional, and commonplace. A company’s technical documentation is “a different sort of literature than a novel, but it still tells the story of human lives” (p. 10). Johnson makes an interesting case for including everyday documents in literary studies and for using literary theories and methods in technical communication.
The Language of Work represents a type of historical study that is needed in technical communication: studies of the communication practices of individual companies and industries. Especially in its extensive use of primary sources, Johnson’s book should point the way for scholars interested in this kind of historical and rhetorical research.
Edward A. Malone is associate professor and former director of technical communication programs in the Department of English and Technical Communication at Missouri University of Science and Technology. He is a senior member of STC.
How to Write in Psychology
John R. Beech. 2009. Chichester, UK: Wiley-Blackwell. [ISBN 978-1-4051-5694-3. 256 pages, including index. US$30.00 (softcover).]
When I started my bachelor’s degree program in psychology in 1995, I had no definitive guide to how to write and research in psychology. Like many students, I learned by trial and error what constitutes solid writing for the sciences, often guided only by examples of documents (such as questionnaires or informed consent forms) that other students had written or that the institutional review board at my university gave me as models for my research. I can honestly say that I did not understand the full magnitude of what it took to construct a study, write it up, and try to get it published until well into graduate school. I would have welcomed a book like John R. Beech’s How to Write in Psychology.
What is most valuable about the book is the sheer variety of writing it covers. Many books attempt to cover only the academic article or work that can be published, but Beech discusses everything from how to study for and respond to an essay exam in the undergraduate and graduate school years, to lab reports, to questionnaire design, to writing recruitment letters to potential participants for a study, and finally to writing the academic article. Further, Beech goes beyond the prescriptive how-tos and checklists and explains the importance, meaning, and uses of each document. He leaves no stone unturned in giving students a solid basis for understanding how to write in psychology.
The only serious criticism I might level at How to Write in Psychology is that in places the material is clearly dated. For example, Beech demonstrates how to make tables with the 2003 version of Word. This might not seem like a serious matter, but for undergraduate students who try to follow the instructions literally, the section will probably be confusing, since later versions of Word are markedly different. Further, Beech devotes time to discussing writing out exams in longhand, while many students are now used to taking exams on a computer.
Otherwise, Beech shows a good sense of audience. For example, his discussion of the parts of a quantitative academic article (method, results, and so on) provides a handy checklist students can use to make sure that they have addressed all applicable criteria. The chapter “Presenting Numbers, Tables and Figures” boils down the conventions of APA style and includes very helpful examples of naming figures and tables. Other useful topics include using databases such as PsycINFO, finding and citing appropriate sources, and purchasing a field-specific dictionary. Much of this is valuable advice that students might not otherwise learn except through trial and error.
The audience for How to Write in Psychology is clearly the psychology student, who will without a doubt find the information invaluable. However, because so many of the genres covered by Beech are universal in the sciences and even for topics such as usability in technical writing (such as the lab report and the essay exam), this text could easily serve as a writing primer for students in other disciplines. As a faculty member who regularly teaches upper-level writing, I am very glad to have encountered this text because of its clear, helpful advice. I plan to integrate much of Beech’s advice for writing in some of the technical genres I teach.
Online Education and Adult Learning
Terry Kidd, ed. 2010. Hershey, PA: Information Science Reference. [ISBN 978-1-60566-830-7. 352 pages, including index. US$180.00.]
A compilation of research studies and articles on contemporary issues in online education, Terry Kidd’s Online Education and Adult Learning will spark some long-overdue conversations about adult education and online delivery.
The collection of articles covers three main categories: (1) “Introducing New Perspectives in Online Learning,” (2) “New Frontiers for Online Teaching and Adult Learning Practices,” and (3) “Case Studies of Online Learning.” Most of the articles in each category are accessible for even new graduate students, but they also address long-standing issues that the most seasoned online instructors regularly face and may find interesting to revisit.
For instance, many online instructors understand that some adult students are technological newbies and may have trouble navigating an online classroom. Exactly how do we help them overcome their fear and feel comfortable with their learning environment? What can we do on our own, outside the scope of our regular online classroom responsibilities, to help these students become technically savvy? What we know about online learners may not always be addressed fruitfully through our teaching, which is why it is important to read studies like those presented in this anthology, in which authors share their successes and failures, as well as contemporary research on the subject.
While the book offers current research in online education, it is difficult to say that any of the information is “new,” as implied by the titles of the three categories. The research and studies are certainly new, and we need to continue research in online education overall, but the issues under study are not necessarily new to anyone who has taught online for a few years. An example is the use of games for adult learners. The concept itself is not new, but it is important to continue talking about this topic, because technology changes so rapidly.
The collection of contributors is diverse, which is an advantage: The reader gets perspectives and study results from those who work at state universities as well as at more specialized or private colleges around the United States. A few articles by authors in other countries are included; more international pieces would give the collection a more global perspective.
The intended audience is “all major stakeholders” (p. xx), which I interpret to mean administrators and instructors, because the price would most likely prohibit its use as a textbook for graduate students.
Diane Martinez is a writing specialist for Kaplan University’s online Writing Center and a PhD student at Utah State University. Her technical writing experience has been mostly in higher education, engineering, and government contracting. She has been with Kaplan since 2004 and a member of STC since 2005.
Lee Lister. 2009. [Ipswich], UK: Biz Guru Ltd. [ISBN 978-0-9563861-0-6. 101 pages. US$47.55 (softcover).]
Proposal Writing for Smaller Businesses kicks off with three pages of warnings from the author, who tells us that she does not believe in get-rich-quick schemes and does not guarantee any level of profit based on the information in her 101-page book.
In Chapter 1, Lee Lister states the main reason someone at a small business would want to spend time writing proposals: to expand the business. Each subsequent chapter discusses the approach and process she recommends to prepare a winning proposal. Although Lister is writing for a UK audience (with Briticisms such as “keep an eye on the adverts” [p. 46]), the book loses only a bit of utility for American (and presumably Canadian) audiences.
Lister’s process is logical, but the information she provides tends to lack detail. For example, in the chapter “Now You Need to Get Writing,” she says that before starting to write your proposal, you need to “find out exactly what your potential client wants—not always what they ask for” (p. 56). What’s missing are tips on how to do that.
The book needs a good copyedit before the second edition is published. It contains numerous typographical errors, dropped words, incorrect words, and acronyms and abbreviations defined on second reference. Adding one or two examples and using them throughout the book would help explain concepts; by the end of the book, this would provide a detailed—and complete—proposal that the reader could use as a template. Lister’s writing is conversational, which isn’t necessarily bad, but it lacks the polish that will inspire small business owners to do their best when penning their first proposals.
Proposal Writing for Smaller Businesses is a basic text appropriate for the business novice who wants to get a fundamental understanding of the proposal process, but to get the next level of understanding, check out similar titles at the bookstore. You might get more for your dollar (or euro).
Ginny Hudak-David is the associate director in the Office for University Relations, the communications unit of the three-campus University of Illinois system.
Jillana B. Enteen. 2010. New York, NY: Routledge. [ISBN 978-0-415-97724-1. 208 pages, including index. US$125.00.]
Jillana Enteen’s academic text asks an interesting research question: How does nonstandard English play a role in digital communication? Enteen, a professor and director of Gender Studies at Northwestern University, begins with a lengthy, well-annotated essay on creolization as a term that denotes legitimate language rather than the “nonlanguage” designation given by westerners. She then uses this position to state that digital creolization is also legitimate and worthy of study, defining it as “an alternative for describing the strategic deployment of English taking place in digital environments” (p. 42).
Enteen’s interests lie in the nonstandard, the queer, the socially divergent, and the underrepresented, all of which make interesting research material, especially when queer theory and cultural studies are the frames for inquiry. She became interested in the interplay of online technology and English-language terms when she was hired to teach a class on “life skills” for urban youth. What began as a graduate student’s idea developed into Virtual English, a book that promises an interesting analysis: “Understanding English in digital environments as a Creole emphasizes the creative aspect of language use and assumes that non-grammatical deployments are not mistakes, but poetics” (p. 43). Although I expected to read about the ontology of terms such as blog, I found myself most engaged when she looked at all that is implied in a term like boot up, which she traces to everything from the travels of the Baron Münchhausen to Booker T. Washington.
She cursorily examines the idea that catachresis in the digital sphere is more often a deliberate misuse that shows that “meanings can never be fully captured or determined” (p. 40). She addresses inherent heterosexual-normative terms such as male and female for computer cables and the colonialist undertones behind the terms master/slave for a hard disk hierarchy. English is ubiquitous in technology; it’s the default online language, appearing in everything from HTML code to top-level domain names such as .org and .com. When “others” use English, creolization occurs.
Enteen examines how several science fiction novels create technical terms, play with the English language, and combine English and creoles with technology. This might seem like a tangent, but essentially it’s a shift into a queer theory analysis of a plot line and its connections to language and female power.
If all chapters examined digital creolization, the book would be indispensable for academic technical communicators. Unfortunately, it wanders away from virtual English and shifts to three academic case studies of Internet use in marginalized groups, beginning with the Tamil-Eelam, who are using online space as a way to begin building an independent nation-state. Enteen looks at straight Thai women and gay Thai men, paying particular attention to the sexual stereotypes that westerners have propagated.
These chapters occasionally point out a creolized term or comment on the use of English, but the diversions aren’t enough to fulfill the promise of language examination that the title of the book suggests. A language enthusiast who is pursuing a PhD might use this academic source as a starting point for research, but the book better serves those in Internet studies or cultural studies.
Kelly A. Harrison works as a consultant, speaker, and writing instructor in San Jose, CA. For more than 15 years, she’s written technical materials and online content for various software companies. Currently, she teaches writing at San Jose State University and prefers short-term and part-time contracts.
Developing Quality Dissertations in the Sciences: A Graduate Student’s Guide to Achieving Excellence
Barbara E. Lovitts and Ellen L. Wert. 2009. Sterling, VA: Stylus Publishing. [ISBN 978-1-57922-259-8. 40 pages, including appendixes. US$7.95 (softcover).]
Barbara Lovitts and Ellen Wert argue that, contrary to advisors’ assumptions, students do not clearly understand the purpose of a dissertation or how it is evaluated for originality and significance. To address this deficiency, their booklet clearly and precisely defines the dissertation as a vehicle for demonstrating the graduate student’s ability to exhibit both originality and significance in acquiring, interpreting, and communicating “expert knowledge,” and provides specific criteria for determining quality in both form and content.
Originality can apply to either content or approach but must be “something that has not been done, found, proved, or seen before” (p. 4); the level of originality is determined by the student’s advisor and committee. Significance—because it involves the consequences of a contribution over time—is ultimately decided by the disciplinary community. Original and significant contributions are therefore those adjudged to be novel and (eventually) “nontrivial” (p. 5).
To help students achieve these goals, Lovitts and Wert provide tables for science (physics, biology), engineering (electrical, computer), and mathematics students, listing the specific, concrete criteria dissertations in each field must demonstrate to be considered outstanding, very good, acceptable, or unacceptable. Criteria are tailored to each discipline’s research (content) and to the writing of each section of the dissertation (form). For example, an acceptable literature review “cites all the right papers but does not put them in the right context”; an outstanding review “cites only important relevant information” (p. 26). Writing is acceptable if it is “weak” and needs “strong editorial work,” but outstanding if it is “very well written and very well organized,” as well as “elegant” and “compelling” (pp. 9–10). This approach provides students and faculty with relatively objective, specific, and actionable criteria for identifying strengths and addressing weaknesses.
The strong emphasis on writing quality stems from complaints by science and engineering faculty about “the surprising amount of poor writing they see among their graduate students” (p. 22). Though the point is obvious to technical communicators, many science and engineering students do not realize the importance of writing to their careers; this booklet pointedly reminds them that once they are working in their profession, their “writing will be an indicator of the quality” of their thought and their “attention to the details of research.” If they “cannot convey [their] ideas and data clearly, concisely, and coherently, the reader will not be able to appreciate their import” (p. 22).
By offering practical, directly applicable quality metrics for scientific and technical dissertations, including originality and significance, the authors help students and faculty clarify mutual expectations and eliminate the misunderstanding that makes dissertations unnecessarily onerous. The authors especially intend that students use their advice to become more proactive in consciously managing the communication demands of their professions. The responsibility, they emphasize, lies more with the student than the teacher.
Short, inexpensive, and practical, this pamphlet offers usable advice based on Lovitts’ own previously published research, is laid out in distinct modules that are easily identified and assigned, and focuses on representative disciplines whose criteria apply broadly. It can readily supplement in-depth study of dissertation research and writing methods. Recommended.
Lars Heide. 2009. Baltimore, MD: The Johns Hopkins University Press. [ISBN 978-0-8018-9143-4. 370 pages, including index. US$65.00.]
As a college freshman in 1970, I sat down to the tedious task of registering for classes by filling out a stack of punch cards (#2 pencil only!), with little awareness of the rich history contained in those humble pastel cards. Lars Heide tells that story.
The book begins with Herman Hollerith’s invention of a punched-card system to count the 1890 U.S. census. We learn of Hollerith’s struggles to develop the punched-card technology and to find customers beyond the Census Bureau, and of his eventual break with the Bureau. In 1902, the originally ad hoc Census Bureau becomes a permanent institution by congressional decree. Under a new director, Simon Newton Dexter North, the Bureau is now all about controlling costs. North demands price concessions from Hollerith for the rental of his punched-card machines and asks for congressional funds to build his own machines. The rift between North and Hollerith results in the loss of the census contract in 1905 and Hollerith’s search for other markets.
We follow Hollerith as he joins forces with Eugene A. Ford, an experienced shop engineer, who helps him simplify and standardize his machines. Hollerith can now respond to the First World War and the growing demand for his “tabulating machines.” The need to mobilize and then statistically track men and munitions is a boon for Hollerith’s company, the Tabulating Machine Company. Each punch card for the war effort contains a printed “Man Number” that identifies a soldier and facilitates statistical processing.
The use of punched cards to facilitate information storage continues into the Second World War. As Heide puts it, “The punched card was the basis for the most advanced information technology from the 1920s to the Second World War” (p. 5). Along the way, unable to meet demands for his machines and hobbled by an inability to delegate responsibility to employees and effectively grow his company, Hollerith sells his company to a conglomerate that will become IBM in 1924.
Although separate chapters describe punched card use in Europe and Hollerith’s attempt to gain a foothold there, we never really get to know Herman Hollerith as a man. While this book is clearly of interest to anyone who studies the history of technological innovation, I would like to know more about the individual behind the punch card. What was it that caused Herman to press through so many periods of business reversals? What spark drove him to usher in our modern computer age against often overwhelming obstacles? We learn much about the design of the punched card but too little about the design of the man who crafted it.
Victoria Maki is the president of Bitzone, a technical publishing and training company, and coauthor of Documenting APIs: Writing Developer Documentation for Java APIs and SDKs, which is available from the Bitzone Web site at http://www.bitzone.com. She also co-manages the Technical/API Docs Special Interest Group for the Silicon Valley STC Chapter.
Steve Wheeler, ed. 2009. Charlotte, NC: Information Age Publishing, Inc. [ISBN 978-1-60752-015-3. 284 pages, including author bios. US$39.99 (softcover).]
Today, people can access a wide range of online degree programs, certificate-granting courses, and training seminars with the click of a button. Providing successful instruction in such contexts, however, is often not a matter of technology but of people. That is, the group with which students interact in educational contexts can influence the learning process. Educators and trainers, therefore, need to understand how the group dynamics or the culture of an online class affects both instructor and student success. In this context, Connected Minds, Emerging Cultures: Cybercultures in Online Learning offers insights into the role culture and communities can play in online instruction.
Cultures—whether online or face-to-face—are complex and nuanced. Thus, it would be difficult for a single text to effectively explore all aspects affecting their creation and evolution. The editor of Connected Minds, Emerging Cultures does not try to provide readers with a definitive reference resource; rather, his objective is to present ideas and perspectives that will prompt reflection and help readers think more carefully about the topic. This approach allows him to include entries on a wide range of issues and opinions associated with communities, culture, and education in online environments. It also means that the book’s contributors use various methods to explore ideas and different writing styles to convey information. The result is an interesting, highly informative, and very readable work that would be of interest to anyone involved in or considering online education or online training.
Steve Wheeler has organized the book’s 17 chapters into four broad sections that address “Digital Subcultures,” “Roles and Identities,” “Cyber Perspectives,” and “Narratives and Case Studies.” One might expect such divisions to be too broad; on the contrary, each section contains rich and interesting chapters that collectively do an effective job of examining the various aspects of that section.
The first section explores what subcultures are and how they are created in terms of collaboration, mobility and access, visual versus verbal interaction, and pervasiveness. Similarly, “Roles and Identities” reviews previous research on online identity. The entries in this section also examine how different kinds of social interactions (for example, interaction via gaming vs. via formal online classes) lead to the creation of differing group identities, such as virtual clans and digital tribes, which in turn shape perceptions of self in the virtual and real worlds.
The vaguely titled section “Cyber Perspectives” actually contains four focused entries that expand notions of culture and identity. These chapters explore how such ideas affect information creation and sharing in online communities. The concluding section leaves readers with four examples of how the topics covered in the book’s first three sections can affect online learning in different contexts. This section provides readers with examples of how to apply the more abstract concepts covered in the first three sections of the collection.
Through this organizational approach, the editor has created an inverse pyramid that gradually eases readers into the topic. This structure allows them to begin their examination of the subject with broad concepts related to communities, culture, and online education, then gradually move to a more focused exploration of certain topics. The text ends with specific examples that help readers understand how these concepts actually affect learning online. This organization, in combination with well-written chapters on interesting topics, makes Connected Minds, Emerging Cultures an ideal text for anyone interested in online instruction. It also makes the book an effective text for undergraduate or graduate courses on developing online instruction.
Kirk St.Amant teaches technical and professional communication at East Carolina University. His research interests include international and intercultural communication (especially in the online environment) and online communication. He is an STC senior member.