Jackie Damrau, Editor
The reviews provided here are self-selected by the reviewers from a list of titles available over a specific date range. Want to become a book reviewer? Contact Dr. Jackie Damrau at jdamrau3@gmail.com for more information.
Critical Visualization: Rethinking the Representation of Data
Peter A. Hall and Patricio Dávila
Can Science Be Witty? Science Communication Between Critique and Cabaret
Marc-Denis Weitze, Wolfgang Chr. Goede, and Wolfgang M. Heckl, eds.
Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities
David B. Auerbach
User Experience as Innovative Academic Practice
Kate Crane and Kelli Cargile Cook, eds.
The Greatest Invention: A History of the World in Nine Mysterious Scripts
Silvia Ferrara
101 UX Principles: Actionable Solutions for Product Design Success
Will Grant
Strategic Content Design: Tools and Research Techniques for Better UX
Erica Jorgensen
Because Data Can’t Speak for Itself: A Practical Guide to Telling Persuasive Policy Stories
David Chrisinger and Lauren Brodsky
Strangely Rhetorical: Composing Differently With Novelty Devices
Jimmy Butts
Pursuing Teaching Excellence in Higher Education: Towards an Inclusive Perspective
Margaret Wood and Feng Su
Clarity and Coherence in Academic Writing: Using Language as a Resource
David Nunan and Julie Choi
The Quantified Worker: Law and Technology in the Modern Workplace
Ifeoma Ajunwa
The 24-Hour Rule and Other Secrets for Smarter Organizations
Adrienne Bellehumeur
The Invisible Art of Literary Editing
Bryan Furuness and Sarah Layden
Customer Experience Analytics: How Customers Can Better Guide Your Web and App Design Decisions
Akin Arikan
Immersive Content and Usability
Preston So
Critical Visualization: Rethinking the Representation of Data
Peter A. Hall and Patricio Dávila. 2023. Bloomsbury Visual Arts. [ISBN 978-1-3500-7724-9. 256 pages, including index. US$34.95 (softcover).]
Critical Visualization: Rethinking the Representation of Data is an ambitious, sometimes polemical book that may distress some readers. After all, who wants to hear that they’ve been thinking wrong for their entire career?
Hall and Dávila’s central thrust is that we too often blindly assume that all the data inundating us is neutral. They counter: “The veil of neutrality drawn over data visualizations becomes more opaque and impenetrable when the socio-technical infrastructure that produces them is obscured behind the so-called neutral technologies: human decisions sealed behind computational logic (i.e., algorithms)” (pp. 10–11).
The subtitle, “Rethinking the Representation of Data,” suggests their intention to dive deeply. Thus, they recommend asking “Who made the visualization and for whom? When did they make it? Why did they make it? What social/cultural conditions was it made under? What belief systems is it reinforcing or challenging? What processes, or translations, preceded its production? What has been excluded?” (p. 14).
Without posing such questions, say the authors, we often unthinkingly adopt presented data without considering the context, particularly producers’ biases and what they may have erased. Importantly, we thereby fail to see how the data serves to validate such systems of oppression as colonialism, capitalism, and rule by older white men.
Not surprisingly, the many negative examples are sociopolitical. Conditions within war-ravaged Yemen, the unfulfilled promise of smart cities, urban planning designed to protect wealthy New Yorkers, risk profiling along United States borders, measurement of moral qualities of human races, maps of poverty and neighborhood improvement, Statistics Canada’s efforts to identify races (early versions erased many individuals), and much more show mostly the perils of not asking the authors’ incisive questions.
The book also has some positive observations, especially in its descriptions of community mapping projects. (These sections, however, have less bite than the critical passages.) The most successful illustration of the book’s arguments is the maps and visualizations made by W. E. B. Du Bois at the 1900 Paris Exposition to interpret Black life in the United States.
As mind-opening and colorful as Critical Visualization is, it’s not perfect. The authors more than once attack Edward Tufte, a graphics guru well-known to STC members, as overly accepting of data speaking for itself without examining its history. Tufte’s books and seminars refute this charge. Also, the publisher should have tested the unreliable index.
The book includes some wonderful graphics, but many large ones have been squeezed onto their pages so that I find the wording within the pictures impossible to read, even with my 20/20 vision and a magnifying glass.
Hall and Dávila allude to machine learning on several pages (it’s not in the index). I wonder what would have happened had they brought into the book something like ChatGPT, which stormed into our consciousness after the authors finished writing. They would surely have more to say.
Avon J. Murphy
Avon J. Murphy is an STC Fellow who serves the Society as a researcher. A onetime college professor and government writer, he is a technical editing contractor and the principal in Murphy Editing and Writing Services, based in western Washington.
Can Science Be Witty? Science Communication Between Critique and Cabaret
Marc-Denis Weitze, Wolfgang Chr. Goede, and Wolfgang M. Heckl, eds. Springer. [ISBN 978-3-662-65753-9 (eBook). 242 pages. US$34.99 (eBook).]
In the foreword to Can Science Be Witty? Science Communication Between Critique and Cabaret, scientist Wolfgang M. Heckl shares his discovery that funny, easygoing colleagues went stale when they discussed scientific issues in public. For research to be remembered, dry facts must find their characters and go on heroic journeys.
Storytelling is gaining ground in the genre of science cabaret. To create the stories, this book suggests finding amusement in experiments that went wrong and exploring the gray areas between amusement, satire, and criticism. Can Science Be Witty? also shares instructions for injecting humor into science.
One caveat to the reader: the book was machine translated from German to English and human edited, but the sentences may not flow as smoothly as they would in their original language. It’s also helpful to remember that humor doesn’t always translate well; it relies on wordplay, which varies greatly in languages and dialects.
If you’re familiar with poetry slams (where poets perform spoken-word poetry in front of an audience and a panel of judges), science slams are similar, but with scientific content—and maybe a few PowerPoint slides. Factor in humor, satire, and ridiculousness, and you’re on your way to understanding how this book uses the word cabaret.
Chapter 3, Laughter Tears Down Walls, provides excellent examples of combining science and humor. Contributor Vince Ebert writes that “scientific thinking is nothing more than a method of testing conjectures” (p. 27). His scenario starts with “there might be some beer left in the refrigerator” (p. 28). If he checks, he’s engaging in a preliminary form of science, which is different from theology, where conjectures aren’t usually checked. So, if he asserts, “There’s beer in the fridge!”, he’s a theologian. If he checks, he’s a scientist. If he looks, finds nothing, and still claims there’s beer in the fridge, he’s an esotericist. Ebert confirms that science can be witty when he ends this example with “an esotericist can claim more nonsense in five minutes than a scientist can disprove in a lifetime” (p. 28).
In Chapter 8, Searching for Humor in the Deutsches Museum, Wolfgang Chr. Goede writes about Jürgen Teichmann, Germany’s longest-serving museum educator. He tells a story about Otto von Guericke and the proof of air pressure using the Magdeburg hemispheres. Guericke pumped out some of the air to create a vacuum inside, then had eight horses harnessed to the left and right to pull the hemispheres apart. What was the achievement? “Finding sixteen horses so weak that they couldn’t tear the hemispheres apart” (p. 87). Plain, simple humor.
While Can Science Be Witty? won’t make you laugh out loud from cover to cover, there are several gems throughout. Whether or not science-pop is an actual musical genre, you might get a kick out of the silly lyrics in Chapter 5, Date an Engineer. For those who like illusions and magic, Chapter 7, Scientists, Magicians, and Charlatans, might capture your attention. To learn important rules for making your jokes work and how humor triggers laughter, read Chapter 10, When a Dalmatian Comes to the Cash Register. (You might also add some of Eckart von Hirschhausen’s jokes to your repertoire.)
Michelle Gardner
Michelle Gardner is a contract senior writer at Microsoft focused on their cloud portfolio. She has a bachelor’s degree in journalism: public relations from California State University, Long Beach, and a master’s degree in computer resources and information management from Webster University.
Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities
David B. Auerbach. 2023. PublicAffairs. [ISBN 978-1-5417-7444-5. 352 pages, including index. US$30.00 (hardcover).]
The digital linking of much of humanity into huge online data networks has brought great benefits but has also opened a Pandora’s box of intractable problems—floods of disinformation, toxic online discourse, extreme political and social polarization, pervasive corporate and governmental surveillance, online scams, and much more.
Drawing on more than a decade of experience working as a software developer for Microsoft and Google, David Auerbach has produced a perceptive analysis of the problems caused by what he calls meganets and sketches out a first cut at mitigating their harm.
Meganets, as described by Auerbach, are huge, semiautonomous data networks that operate at levels of volume, velocity, and virality that make them too big for humans to know or control. Driven by feedback loops of algorithms and human interactions—visits, likes, purchases—they are continuously evolving, have the capacity to self-organize, and are as ungovernable as the weather.
To develop his case, Auerbach traces the development of meganets from early chat and meeting apps through social media, indexing and search, games and commerce, to AI and the metaverse. Using examples from many fields—online gaming, cryptocurrency, government efforts to leverage technology for social ends (India, China), and others—Auerbach shows that while you can attempt to control code, you can’t control how people use it. As a result, meganets spin out of control, behave in ways well beyond what their creators intended, and cause problems that spill into the offline world.
Meganet chaos has produced numerous calls for reform and regulation. In carefully reasoned arguments, Auerbach shows that the kinds of “fixes” being attempted so far—de-platforming bad actors, tailored attempts to police specific content, demands that corporations control the uncontrollable—are largely ineffective and often counterproductive. But that does not mean we should give up in despair. While the meganets are here to stay and can’t be “fixed,” Auerbach argues they might be tamed and coaxed into retaining their many benefits, while doing less harm.
Instead of resorting to bans and censorship, he argues we should look for systemic ways to apply the brakes to the meganets’ velocity and virality, thus slowing down their aggressive feedback loops and diluting their power. Toward this end, he offers a number of broad suggestions—limiting automatic forwarding, forcing turn taking on discussion threads, randomly injecting new elements into threads and forums, and more—that might be tried. In choosing tactics, our aim should be to preserve as much individual autonomy and liberty as possible, while still effectively restricting the most noxious manifestations of meganet social culture.
Auerbach acknowledges that his suggestions are controversial, untried, and will need refinement and testing, but as other approaches haven’t worked, he believes approaches along the lines he suggests may be the best shot we have.
Those who hope to preserve the online world’s vitality, while still making it a less toxic, more habitable place, should find much in Meganets to clarify their thinking.
Patrick Lufkin
Patrick Lufkin is an STC Fellow with experience in computer documentation, newsletter production, and public relations. He reads widely in science, history, and current affairs, as well as on writing and editing. He chairs the Gordon Scholarship for technical communication and co-chairs the Northern California technical communication competition.
User Experience as Innovative Academic Practice
Kate Crane and Kelli Cargile Cook, eds. 2022. The WAC Clearinghouse and University Press of Colorado. [ISBN 978-1-64642-268-5. 330 pages. US$36.95 (softcover).]
User Experience as Innovative Academic Practice compiles 13 scholarly articles describing user experience activities intended to improve collegiate classes. The chapters explain case studies that focus on applying common user experience (UX) methods to academic learning. Some articles embrace what would be considered traditional improvements to usability; others expand to customer experience (CX) or improvements to the overall experience students had in the course.
Readers working outside academia may find the need for this kind of research surprising. Some case studies focus on basic ideas—not just UX principles, but basic common-sense ideas like surveying your students about what works and doesn’t work for their education. Jennifer Bay, Margaret Becker, Ashlie Clark, Emily Mast, Brendan Robb, and Korbyn Torres explain, “There has been extensive research on user-centered design, usability, and the user experience in professional and technical writing, but we have found less attention to how we are to understand students as ‘users’ in an undergraduate major” (p. 268). Each article in this book tries to lessen that deficiency by applying usability principles to academic courses. For example, Sarah Martin developed user profiles addressing the assumption that all students spoke English (p. 55). Beau Pihlaja consulted students early in the semester to redesign his syllabus in an effort to improve its user-centered design (p. 112). Luke Thominet acknowledged criticisms about inefficiency in design thinking and quantified the hours he spent using a design thinking process to create student learning outcomes for a writing and rhetoric program (p. 179).
User Experience as Innovative Academic Practice shares useful ideas for improving academic courses. Some case studies followed traditional UX methods like journey mapping, user interviews, and surveys. The collection shares techniques that professionals in academia can use to improve their courses. While those ideas might not seem novel for professionals working in usability, incorporating them into courses exposes students to a more collaborative educational experience and provides faculty with methods to make their courses more applicable to the current market demand and the interests of their students.
Stephanie Saylor
Stephanie Saylor is a usability engineer at Yellow Duck Technologies, Inc. She received her master’s degree in digital communication from Johns Hopkins University.
The Greatest Invention: A History of the World in Nine Mysterious Scripts
Silvia Ferrara. 2022 (translated by Todd Portnowitz for Farrar, Straus and Giroux). Picador. [ISBN 978-1-250-86299-0. 304 pages. US$18.00 (softcover).]
The scripts—the writing systems—in Professor Silvia Ferrara’s book have proved mysterious indeed. She describes The Greatest Invention: A History of the World in Nine Mysterious Scripts as a “book [that] recounts the invention of writing” (p. 4). In addition, the book is a history of unraveling the mysteries of the scripts in her story.
Ferrara does approach history as a storyteller, writing “as if speaking to you” (p. 280), making the reading conversational. She discusses the people and places involved as well as the challenges and successes that led to the discoveries of the scripts in The Greatest Invention. Her story begins 4000 years ago in Crete.
The book is divided between scripts that have not been deciphered and scripts whose invention can be traced. Cretan Hieroglyphic, the very first discovery, is in the undeciphered category, as are several other Aegean systems: Linear A, Cypro-Minoan, and the Phaistos Disk. (Another Cretan script, Linear B, has been deciphered.) The story continues with Easter Island’s mellifluous-sounding Rongorongo before turning to scripts that were invented.
While there is much speculation about the purposes of the undeciphered scripts, “writing systems flourish when they’re channeled toward a common aim” (p. 154). Egyptian hieroglyphics were meant to celebrate very important persons (VIPs). The Mesopotamians needed a writing system to keep bureaucratic records. Early Chinese script messages communicated with the dead. Mayan script reanimated both people and objects. These systems all became part of the social framework.
Ferrara examines the qualities new writing systems have in common and what it takes to establish these scripts in a society. She also sets aside several chapters to discuss writing systems invented by individuals, such as Sequoyah, whose Cherokee script is still in use, and others invented by civilizations that didn’t pass the test of time.
To conclude The Greatest Invention, Ferrara brings her expertise to bear with guidance on deciphering texts, with examples from current scientific research she and other scholars have conducted. There are recommended steps to decipherment and a list of what not to do. “Completing the puzzle of an undeciphered script demands…the force of logic, and the force of creativity and flexibility” (p. 248).
The Greatest Invention provides a wealth of information on the development of writing in different cultures throughout the world. Illustrations of historical scripts add to the interest. The conversational nature of this text and references to pop culture make this an easy read, although I found the author’s frequent digressions distracting. The book also seems aimed at an audience familiar with linguistics; linguistic terms are used throughout without explanation. For a reader who is not a linguist, a glossary would be helpful.
Linda M. Davis
Linda M. Davis is an independent communications practitioner in the Los Angeles area. She holds an MA in Communication Management and has specialized in strategic communication planning, publication management, writing, and editing for more than 25 years.
101 UX Principles: Actionable Solutions for Product Design Success
Will Grant. 2022. 2nd ed. Packt Publishing. [ISBN 978-1-80323-488-5. 432 pages, including index. US$39.99 (softcover).]
101 UX Principles: Actionable Solutions for Product Design Success attempts to provide systematic principles for a field that has historically resisted them. There is a saying in the user experience (UX) professional community that the answer to every question is “it depends,” a response grounded in the assumption that every product is so dependent on its context that no systematic rules for product development can be defined. Will Grant argues that there are fixed principles that can be used across products, though he strongly cautions that “[n]othing in this book means anything unless you test with real people” (p. 24).
The principles themselves range from the fairly abstract (Principle 1: Everyone Can Be Great at UX, p. 3) to the very specific (Principle 28: When a User Refreshes a Feed, Move Them to the Last Unread Item, p. 113). And as Grant himself admits, the book “focuses heavily on the User Interface (UI)” because “the pixels on the screen are still a huge part of the customer experience with digital products” (p. xx). Put in this context, the book is less a comprehensive examination of UX principles than it is a series of useful principles for designing digital products.
Given these caveats, perhaps the book should have been entitled 101 Digital Product Design Principles. Regardless, the real value of this book is its near-encyclopedic reference to actual UX research, particularly where it intersects with UI design. Each principle that targets a specific UI element (such as Principle 29, on hamburger menus, p. 117) cites data from a UX expert to ground it. In this way, the principles double as best practices grounded in research with actual users.
What 101 UX Principles does for UX professionals, and anyone involved in any element of product design, then, is to provide a large list of working assumptions for creating user interfaces. There are always limitations to how involved users can be in product design. Having a series of principles to guide the creation of initial prototypes, prototypes that can then be rigorously tested with actual users, will give any professional a huge head start. UX experts may quibble with some of the principles listed, but this reviewer found them to be valid starting points for avoiding some of the worst issues with UIs that continue to plague many digital products even to this day.
Guiseppe Getto
Guiseppe Getto is a faculty member at Mercer University where he directs the Master of Science in Technical Communication Management program.
Strategic Content Design: Tools and Research Techniques for Better UX
Erica Jorgensen. 2023. Rosenfeld Media. [ISBN 978-1-959029-57-1. 320 pages, including index. US$54.99 (softcover).]
Strategic Content Design: Tools and Research Techniques for Better UX is a much-anticipated book for professionals like me who work at the intersections of content strategy and user experience (UX). Content design has been spoken about by practitioners for several years now, but this is the first full-length book on the topic. With newer professions like UX writing emerging, the book is also timely for people trying to launch careers at the intersection of creating digital products and crafting content.
One potential shortcoming of the book, however, is that it doesn’t begin by defining this emerging area. Perhaps this is the academic researcher in me, but I would have loved to start the book with a definition for content design that I could use to fully grasp the tips throughout the book. That definition is certainly present in an appendix that begins on page 262, where the author defines content design as “the art and science of determining what specific content should appear in a user experience, and where and when it should appear” (p. 264). This clear, simple definition is useful to readers, regardless of its placement.
The real focus of Erica Jorgensen’s book, however, is what she calls content research, a practice that “involves asking your customers or audience for focused feedback on your content—for example, what they like, what they don’t like, and why—and then using that feedback to improve your content” (p. 4). The book can thus largely be seen as a primer in how to conduct this kind of research and how to leverage it for the improvement of an organization’s content.
The tips for conducting content research are quite comprehensive and range from how to measure content clarity (p. 35) to how to define content research goals (p. 97) to crafting research questions (p. 147), analyzing research results (p. 167), and applying these results to business goals (p. 235). The real value of Strategic Content Design is this practical, hands-on approach to gathering feedback from users to improve content, which has long been a pain point for many content professionals. The book contains everything a practitioner or researcher may need to assess, analyze, and improve the state of content by soliciting feedback from users.
Seen in this light, any professional seeking to gather user feedback on content will find much to enjoy in Strategic Content Design. This could include practitioners seeking to add content research to their regular workflow or who are struggling with ineffective content, as well as researchers seeking to assess content in various forms and teachers seeking to introduce learners to the emerging content design profession.
Guiseppe Getto
Guiseppe Getto is a faculty member at Mercer University where he directs the Master of Science in Technical Communication Management program.
Because Data Can’t Speak for Itself: A Practical Guide to Telling Persuasive Policy Stories
David Chrisinger and Lauren Brodsky. 2023. Johns Hopkins University Press. [ISBN 978-1-4214-4584-7. 136 pages, including index. US$22.95 (softcover).]
Because Data Can’t Speak for Itself: A Practical Guide to Telling Persuasive Policy Stories by David Chrisinger and Lauren Brodsky is a highly beneficial guide for writers who want to convey data-driven information to their readers effectively. The book emphasizes the importance of writing with data, not just about data, to make a lasting impact.
The book is well-organized into three parts: People, Purpose, and Persistence. Part I, People, highlights the connection between data and people (p. 24), emphasizing that quantitative data alone may not always provide insights into the reasons behind certain phenomena (p. 21). Part II, Purpose, focuses on developing better research questions to create more interesting and accurate stories. It also outlines the three primary objectives of an effective story: understanding and describing the situation, determining the best reforms or interventions, and suggesting the next steps to address challenges (p. 35).
Part III, Persistence, underscores the significance of integrity in data-driven writing. Instead of pretending that data has no limitations, the authors encourage writers to be honest about the data’s scope and offer human interpretations. This approach prevents overstatement of the data’s meaning and maintains credibility with the reader.
Because Data Can’t Speak for Itself contains 32 tips to help writers craft more effective data-driven narratives. Highlights include asking better research questions (p. 36), avoiding unnecessary attribution (p. 87), and using concise language to explain complex concepts (p. 95).
In Tip #17, the COVID-19 data story (p. 61) shows a discrepancy between data and the real stories behind it. As a technical writer, it helped me understand how to deliver data to my readers more effectively and why context matters, not just the data itself.
Overall, I commend the book as a practical handbook for writers in data-driven environments, evidence-based researchers, policy writers, marketers, and storytellers. The inclusion of all 32 tips in one place (pp. 107–113) gives writers a convenient reference for checking and improving their work.
Sam Lee
Sam Lee is an STC member and a Policies & Procedures SIG manager. Sam has a Master of Technology Management, a Master of Electrical Engineering, and a Technical Writing Certificate. He is currently a Senior Electrical and Avionics System Engineer, where he supports avionics systems certification and writes aviation-related documentation.
Strangely Rhetorical: Composing Differently With Novelty Devices
Jimmy Butts. Utah State University Press. [ISBN 978-1-64642-281-4. 221 pages, including index. US$24.95 (softcover).]
Strangely Rhetorical: Composing Differently With Novelty Devices by Jimmy Butts discusses strangeness as a lens for rhetoric and composition. He declares that our culture is constantly hunting for novelty and strangenesses [sic]. In this book, strangeness is the measure of difference or distance between relations. He says that it “offers similarities and differences, helps us perceive when things are more or less alike; it builds tension and creates attraction and repulsion” (p. 9).
In Chapter 3, How Strangeness Works, Butts relates an explanation from Christopher Tindale about rhetorical figures. They are “devices that use words to make some striking effects on an audience.…Whether we are referring to figures in the form of tropes…or schemes…, they grab us and help us take notice by making language…strange” (p. 59).
Of the hundreds of rhetorical figures in existence, Butts chose seven that he feels “can be executed to surprising effect across a number of old and new media” (p. 103). Using the utterly apropos mnemonic STRANGE, they are:
- Shapeshifting. Transforming an entire original form that retains its reference back to the original arrangement (Neptune’s son Proteus).
- Time travel. Speeding up, slowing down, reversing, or skipping a text’s linear chronology. A composition that moves back and forth by rearranging the order of events illustrates time travel.
- Replacement. Putting one component in place of another in a part of a text. Examples are metaphors and euphemisms.
- Addition and subtraction. Adding something where it would normally not be or taking something away. Butts uses the Headless Horseman as an example.
- Negation. Inverting text by using the negative. In an appropriate turn of events, Butts does not provide a theoretical introduction to this topic as he did for the others.
- Glossolalia. Using foreign or unintelligible language to communicate an unrecognizable form (Pythia, the Oracle at Delphi).
- Exponentiation. Amplifying or minimizing something that would typically be within normative measurements, as with hyperbole.
In this reviewer’s opinion, what sells Strangely Rhetorical are the seven strange projects that correspond to the rhetorical figures. For example, in reference to time travel, Butts suggests sharing your personal history (briefly), but changing three details about your past to create an alternate history. For exponentiation, he challenges you to create the ____-est thing in the universe, worthy of Guinness World Records.
If you want to tap into the novel and unusual to create striking compositions, pick up a copy of Strangely Rhetorical to understand why strangeness matters.
Michelle Gardner
Michelle Gardner is a copywriter and content editor in the life sciences industry and the technology sector. She has a bachelor’s degree in Journalism: Public Relations from California State University, Long Beach, and a master’s degree in Computer Resources and Information Management from Webster University.
Pursuing Teaching Excellence in Higher Education: Towards an Inclusive Perspective
Margaret Wood and Feng Su. 2023. Bloomsbury Academic. [ISBN 978-1-350-21669-3. 174 pages, including index. US$39.95 (softcover).]
What do Margaret Wood and Feng Su (who are university teachers in the United Kingdom) mean when they discuss what they call an inclusive perspective? In their understanding, they include the perspective of various stakeholders interested in educational excellence. The stakeholders include students, parents, employers, academics, and educational institutions—each getting a chapter in Pursuing Teaching Excellence in Higher Education: Towards an Inclusive Perspective.
The authors apply this inclusive approach to teaching excellence, which they argue is poorly understood and poorly defined. They attempt to better define excellence in higher education by examining the role and purpose of higher education itself, arguing that attending to inclusive perspectives yields a richer understanding of higher education’s value and a path toward excellence.
To give an idea of what kinds of issues the authors discuss, consider from the academician’s viewpoint the following question as it relates to academic excellence: Is it possible to excel in both teaching and research? (p. 55). From the student’s viewpoint, the authors ask: Does teaching excellence focus too much on teaching and not enough on the role of the learner and learning? (p. 76). The authors pose equally interesting questions for all the perspectives and stakeholders discussed.
The authors also look beyond the United Kingdom at what academic excellence means elsewhere. For example, researchers in Russia indicate that excellence there has a narrow definition (p. 21), with research regarded as more important than teaching. This comparison offers an even more insightful glimpse of how to think about excellence in higher education.
Students, parents, teachers, researchers, policy makers at academic institutions, employers, and anyone else interested in academic excellence could find Pursuing Teaching Excellence in Higher Education: Towards an Inclusive Perspective a valuable resource on what it means to be academically excellent today. As a society, we can also look to this book to help examine the purpose of academic institutions and how to make them reflect the interests of their various stakeholders in an inclusive way.
Jeanette Evans
Jeanette Evans is an STC Associate Fellow; active in the Ohio STC community, currently serving on the newsletter committee; and co-author of an Intercom column on emerging technologies in education. She holds an MS in technical communication management from Mercer University and an undergraduate degree in education.
Clarity and Coherence in Academic Writing: Using Language as a Resource
David Nunan and Julie Choi. 2023. Routledge. [ISBN 978-1-0320-1559-0. 212 pages, including index. US$34.95 (hardcover).]
Consider this way-out suggestion: Every teacher of writing, no matter what kind of writing they’re teaching, should write a novel before they start teaching. Doing so will give them a much better feel for what language can really do…the linguistic mountaintops and the depths of the sea they are able to reach. Think of language as an Archimedes fulcrum to pry into everything, from physics to poetry.
Clarity and Coherence in Academic Writing: Using Language as a Resource provides, in general, an excellent summary of how to “humanize” academic prose: how to make it more interesting and appealing; in other words, how to connect with the reader.
The authors’ analysis is often brilliant and deep: so deep that it sometimes sinks to the bottom of the barrel, or the brain. In other words, so detailed that it becomes hard to apply. And sometimes, just the opposite: giving a long list of details but not expanding on them; for example, not providing useful examples.
A case in point: The book contains an insightful 20-page chapter on Figurative Language. It points out that figures of speech can pose serious problems even for advanced second-language users, since some are not even recognized as such. Take the following example: “Data suggests that the economy will shrink in the second quarter.” Data here is being used figuratively: “Data don’t suggest anything…because data don’t (or doesn’t) speak (or write). Nor does it imply or indicate” (p. 125). The writer is attributing a human quality to a nonhuman entity, which is the definition of personification.
Figures of speech are much more pervasive than we think (p. 123), and they probably come in more forms than we realize. Analogies, for example, are especially good for conveying the feel of statistical information, as when a NASA spokesman explained the amount of information NASA handles in one year (10¹⁵ bits) as equal to 100 million Sears catalogs. Likewise, animation is surprisingly common in biology and chemistry textbooks, as in the following: “The oak embarked on a journey of continued growth. Cells divided repeatedly, grew together, and increased in diameter.” And metaphors run deeper than we think: neuroscientists have found that they actually activate the areas of the brain connected with the sights, sounds, and smells involved.
Sometimes a topic receives too brief a treatment. Punctuation, for instance, gets just one page (p. 23f), whereas my own book, Technique in Nonfiction, devotes eight pages to it. Clarity and Coherence is designed for students who are learning to write academic prose, such as scholarly articles and dissertations; those students include both native and nonnative speakers of English. Regular “newspaper English” is hard enough for the second-language student. Academic English, with its pre-Victorian syntax and phonemics, is a journey without an end. Which, of course, is one of the things that makes the language so fascinating.
Steven Darian
Steven Darian is a Professor Emeritus at Rutgers University. He has worked and studied in several countries and holds a Ph.D. in Applied Linguistics from New York University. Several examples in this review are taken from the second edition of his book, Technique in Nonfiction: The Tools of the Trade.
The Quantified Worker: Law and Technology in the Modern Workplace
Ifeoma Ajunwa. 2023. Cambridge University Press. [ISBN 978-1-316-63695-4. 462 pages, including index. US$34.99 (softcover).]
The Industrial Revolution produced a shift from small, home-based, master-apprentice craft shops to large, impersonal factories that redefined human relationships in purely quantitative and functional terms. The efficiency expert Frederick Taylor formalized this shift into “scientific management” (p. 4) by isolating and redistributing complex, single-worker skills into simplified steps anyone could replicate. The worker no longer built a wheel by himself; now each step was performed by a different individual, with a manager observing and timing the worker with a stopwatch and tallying the number of units produced. The “mission to improve efficiency” soon “developed into an ideology” (p. 61) whose power has exploded through information technology, resulting in a workplace dominated by increasingly intrusive, automated surveillance systems that replace humans with “mechanical managers” (p. 2).
These technologies “erode worker personhood” (p. 30) by turning employees into “data doubles” (p. 265) that managers evaluate instead of the real employee, with the system itself increasingly replacing the manager’s decisions. Automated surveillance has become ubiquitous: it includes logging keystrokes, monitoring email, capturing web page views, requiring “wearable” data collectors, implanting RFID chips, and mandating exoskeletons. It has even “begun to encroach on the home” (p. 172), creating a “digital panopticon” (p. 9) that enables “perpetual and intimate surveillance” (p. 174) across an “unbounded workplace” (p. 244).
Because software reflects the decisions and prejudices of its creators, and “learns” through repetition, systemic bias increases over time. And because it is impossible to write software that completely avoids inherent human biases (p. 80), regulation is necessary to ensure “algorithmic-derived employment decisions are in line with anti-discrimination laws” (p. 77).
A solution would require algorithms to include employees in “the design of the data program” (p. 291); reflect “‘privacy law and property law’ rooted in ongoing employee data and value rights” (p. 291); and designate employers as “‘information fiduciaries’” (p. 291) responsible for ensuring the continuing fairness of the system. Such “concurrent regulation” would unite “industry regulations” with “governmental legislation” (p. 351) and transform the algorithm into a “broker” mediating the explicit responsibilities of both employer and employee to ensure “fairness” as “part of the design from the onset” (p. 380).
With this change, employers would have to exercise their “duty of care” (p. 361) by working with employees to design a fair system; implementing periodic audits to remedy “disparate impact” (p. 372); and recognizing that the legal principle of discrimination per se transfers the burden of proof to employers, who must now prove they did not discriminate. Other remedies include government or third-party certification systems, sequestering “demographic information” (p. 376) from hiring managers, and including mutually agreeable data capture rules in union contracts.
Ifeoma Ajunwa’s practical solution would reduce intrusive, automated surveillance of employees, minimize (if not eliminate) discrimination and bias, re-establish boundaries between work and home, and restore the humanity of workers too long dehumanized by obsessive and excessive quantification.
Donald R. Riccomini
Donald R. Riccomini is an STC member and Emeritus Senior Lecturer in English at Santa Clara University, where he specialized in engineering and technical communications. He previously spent twenty-three years in high technology as a technical writer, engineer, and manager in semiconductors, instrumentation, and server development.
The 24-Hour Rule and Other Secrets for Smarter Organizations
Adrienne Bellehumeur. 2023. Matt Holt Books. [ISBN 978-1-6377-4283-9. 242 pages. US$28.00 (hardcover).]
Do you know that your short-term memory is like having seven Post-it® Notes stored at the front of your brain? Adrienne Bellehumeur, author of The 24-Hour Rule and Other Secrets for Smarter Organizations, cites this finding of noted psychologist George Miller. She states, “Seven items is not a lot of space for all the Post-it Notes you need for life!” (p. 57). Her book teaches you how to get information out of your head and store it effectively.
In Part 1, The Foundation, Bellehumeur explains that we all do documentation, including proposals, memos, spreadsheets, and email. However, we have little understanding of how to apply documentation skills to turbocharge our effectiveness.
As a consultant specializing in business analysis, audit, and internal control programs, she found that doing something with information within 24 hours of hearing it is the foundation to working faster, smarter, and more nimbly. For example, following a team meeting, she suggests writing down three main points that were mentioned. This 24-hour rule is the core concept of her six-step approach to dynamic documentation.
In Part 2, The 6 Steps of Dynamic Documentation, she describes the six steps that allow you to take control of your work to achieve your goals.
- Capturing. Get information out of your head and store it using journaling, mind mapping, blogs, action items, and project lists. She explains what information to capture and what not to capture.
- Structuring. Structure the information by shuffling, moving, pulling together, and looking for patterns. During this step, state the purpose by defining the “why.”
- Presenting. Presenting is what “sells” your information to your team.
- Communicating. Of the various communication vehicles, email is a more effective influencer than social media for selling your ideas.
- Storing and Leveraging. When storing and leveraging your content, you need fewer documents than you think. She recommends you trash documents that are ROT (redundant, outdated, and trivial).
- Leading and Innovating. Looking to the future involves leading and innovating. Bellehumeur states, “Companies now need documentation to train their staff on how systems, processes, and protocols work. Thinking of your staff as permanent fixtures is dangerous” (p. 164).
In Part 3, Dynamic Documentation in Action, she describes how to implement dynamic documentation in your workplace and your home. Her final chapter, Applying Dynamic Documentation at Home, makes her book an even more valuable resource. As a working mother of three small children, Adrienne is aware of the types of household tasks requiring documentation, such as meal planning, exercise, cleaning, bill management, vacation planning, and family appointments.
As a technical writer, I highly recommend Adrienne’s book as a reference for your bookshelf. In fact, I have replaced my other books by well-known productivity authors with her book as my go-to reference. Find out more about Adrienne’s work and her articles at www.bellehumeurco.com.
Rhonda Lunemann
Rhonda Lunemann is an STC senior member and a technical writer with Siemens Digital Industry Software. She serves on the STC Twin Cities chapter’s Program committee and is a member of the MN (Minnesota) Bot Makers.
the invisible art of literary editing
Bryan Furuness and Sarah Layden. 2023. Bloomsbury Academic. [ISBN 978-1-350-29648-0. 136 pages. US$19.95 (softcover).]
How do editors learn their craft? How do instructors train editors? As of a July 2023 update, Poets & Writers lists 169 universities that produce literary journals (https://www.pw.org/content/literary_journals_us_mfa_programs). Faculty who manage these journals have a complex job of teaching students the various types of editing on top of topics ranging from print and online production to social media to sales and marketing—all without a textbook, until now. the invisible art of literary editing fills a crucial part of this void by focusing on the editing of creative prose in literary journals.
The book contains three parts, organized to follow the editing process (p. 1), including a brief series of discussions around the roles and responsibilities of editors, followed by best practices for communicating with authors, and ending with several substantive and interesting case studies from editors at a range of journals.
Part One is a short twenty pages that outlines the typical process for handling submissions, from creating submissions guidelines to determining an aesthetic to reading the slush pile. This part could be expanded to examine the practical side of culling submissions, perhaps by providing ten to twenty first paragraphs as an exercise to see how quickly students can skim and decide which works they might be interested in and why, thus turning the reading process into lessons on craft that tie together the information on aesthetics and acquisitions. Also, the sections “Deeper Consideration” and “All-Staff Discussion” (pp. 16–17) could include activities on developing the journal’s editorial aesthetic. What do students think of second person point of view, for example? What about difficult content and trigger warnings? The authors lost some opportunities here for classroom activities and connecting the practical lessons in Part One.
the invisible art of literary editing excels in providing sample texts with editing prompts, all of which could work well as classroom activities or homework and as exemplars for any course on editing. The publisher should offer digital versions so that students can practice marking texts in tools like Google Docs and Microsoft Word. (Faculty often mistakenly believe students already know how to do this.) And though the book is short, I wish it had an index.
I most appreciate how the authors incorporate humor throughout. In discussing how to write rejection letters, they caution against including sale offers or subscriptions: “When someone gets rejected, they’re not in a mood to buy anything but Häagen-Dazs and Wild Turkey” (p. 25). The section “Editing with a Heavy Hand” aims to help “timid” entry-level editors, those who tend to edit lightly, by inviting them to dominate a practice revision. The authors describe how Gordon Lish took a “heavy hand” to Raymond Carver’s fiction, touching on the controversy surrounding Lish’s drastic approach. Their humorous instructions for the exercise? “Lish the hell out of it” (p. 114).
the invisible art of literary editing is a compact book at a very reasonable price that gives faculty advisors a starting point for teaching the next generation of editors.
Kelly A. Harrison
Kelly A. Harrison, MFA, teaches technical communication at Stanford University. Formerly, she taught a range of writing courses at San José State University and worked in various roles for software companies. Kelly is the Associate Editor for West Trade Review, and during her MFA she was the managing editor for Reed Magazine.
Customer Experience Analytics: How Customers Can Better Guide Your Web and App Design Decisions
Akin Arikan. 2023. Routledge. [ISBN 978-1-0323-7076-7. 326 pages, including index. US$39.95 (softcover).]
Customer Experience Analytics: How Customers Can Better Guide Your Web and App Design Decisions is a detailed introduction to using experience analytics to reduce user frustration and increase engagement. This book would be useful to marketing and User Experience (UX) professionals, as well as anyone who has influence over product design.
Akin Arikan has extensive experience in digital analysis, which shines through in his book. He is clearly passionate about improving user journeys through digital products. The book makes a good case for leveraging customer experience analytics, beyond traditional digital metrics, to improve customer engagement and conversion (sales). Numerous case studies illustrate how experience analytics measurably impacted percentages and dollar amounts that were being lost to customer frustration or gained through innovation.
Part I explores what makes a good user experience: the user’s journey through the product should be inspiring, rewarding, and seamless (p. 16). The digital team should spot problem areas quickly and identify areas for potential innovation. Traditional digital analytics can capture simple facts such as traffic and clicks, and it can evaluate the effect of potential design changes using A/B testing. However, “[d]espite all the bar charts, trend charts, and click behavior numbers, nobody can still hear a customer’s rage on a site” (p. 28). Incorporating experience analytics gives better insight into the estimated 90% of user behavior that happens between clicks and taps (p. 39). The final chapters of Part I go into detail about the types of experience metrics, including heatmaps, struggle and error metrics (“rage-clicks”), and more.
Part II gets into practical case studies of using experience analytics to solve common problems in every area of the user journey funnel, from the initial landing on the home page through transaction completion and beyond. Arikan introduces a quick cheat sheet for interpreting the case studies called the C-SUITE (Challenge, Surface, Understand, Impact, Test, and Evaluate) (p. 108). Following each case study, a chart shows the C-SUITE takeaways, exploring how the company started with a problem area identified by analytics, explored the true frustration, and implemented and tested a solution. Common themes among the case studies include important reminders for all designers: put vital information above the fold so customers do not have to scroll to see it; design mobile sites with mobile-specific contexts in mind; and prioritize accessibility to reach the widest number of customers.
Part III looks to the future, outlining how to use experience analytics to predict customer behavior and exploring potential uses of artificial intelligence for customer service with chatbots, digital assistants, and wearable devices.
Overall, Customer Experience Analytics is neither too broad nor too in-depth for the average businessperson and serves as a guide for incorporating a wide variety of analytics to drive website and app design as well as increase company revenue. I would recommend it to anyone on a company’s digital team, from marketing and UX to management and technical analysts.
Bonnie Winstel
Bonnie Winstel is the Assistant Manager of Software Development for Book Systems, Inc. in Huntsville, Alabama. She received her master’s degree in English and Technical Communication at the University of Alabama-Huntsville in 2013.
Immersive Content and Usability
Preston So. 2023. A Book Apart. [ISBN 978-1-952616-28-0. 132 pages, including index. US$39.00 (softcover).]
Immersive Content and Usability surfaces the considerations that ought to be top of mind for content strategists working with virtual reality (VR). I picked up this book interested to hear the implications for information architecture that come with content immersion; that is, of using language to establish a sense of presence in virtual experiences.
As Preston So explains, content is “immersive” when it exemplifies four key characteristics (pp. 6–7). Firstly, such content is based in the physical world, not trapped behind a screen. This physicality requires content designers to pay special attention to real-world usability testing, to avoid embarrassing mishaps such as placing digital signs in unreadable locations or inundating a user with multiple location-based notifications if they stumble into the intersection of two beacon ranges (p. 51). Secondly, immersive content normally involves signage or labelling, which means it inherits non-immersive content best practices, such as using succinct wording and legible fonts. However, designers must determine whether to deliver “embedded content” or “environmental content,” which So explains well with the simile, “Embedded content in VR is like the speedometer or fuel gauge in a car, while environmental content is the signage and billboards that whoosh by in the windshield” (p. 10). Another key differentiator for immersive content is its response to movement, allowing designers to attune content to positional or orientational changes in space (p. 8). Finally, immersive content operates in three dimensions, necessitating the use of devices such as VR headsets. The detailed graph provided on p. 33 clearly demarcates the various “content zones” for VR, providing specific measurements for how content should be positioned so that it fits into the “Goldilocks zone” for maximum usability. It is in time-saving research like this that the book provides most of its value.
Something I appreciate about Immersive Content and Usability is that it foregrounds accessibility considerations whenever discussing usability. However, So has little empathy for designers who allow usability oversights into their products. He calls one company “blatantly ableist and oppressive” (p. 117) for using biometric data to render a digital avatar of his wheelchair-using colleague that mistakenly portrayed her as a non-wheelchair-user. While I understand So’s frustration, I feel he dishes out such invective a bit too readily. One could make the case that his own book is ableist and oppressive toward blind or dyslexic people for not yet being available in audiobook format, for example. Achieving full accessibility is a laudable yet lofty goal that necessarily takes time, so it is unfortunate that So uses such accusatory and demeaning language toward those whose experience in this subject does not match his own. Most of his writing, however, stays pragmatic and enlightening, imparting the message that virtual reality is an environment brimming with both peril and opportunity for designers.
Josh Anderson
Josh Anderson, CPTC, is an Information Architect at Paligo. Josh was an English teacher in Japan and an SEO Specialist in the Chicagoland area before earning a Master of Information at the University of Toronto.