doi.org/10.55177/tc719324
By Ann Shivers-McNair
ABSTRACT
Purpose: In this review of research, I examine Brock’s (2018) critical technocultural discourse analysis approach and Sano-Franchini’s (2018) critical interface analysis approach as two methods of critical interface analysis that are useful not only for critique but also for community-engaged design work. Specifically, critical interface analysis can expand heuristic evaluation by offering design researchers strategies for (a) engaging community members in critical conversations and (b) including communities and stakeholders in the design research process in the spirit of justice-focused co-design.
Method: I place critical interface analysis in conversation with heuristic evaluation, highlighting similarities, differences, and possibilities for rethinking and expanding each through the connection.
Results: I describe how critical interface analysis heuristics from Brock (2018) and Sano-Franchini (2018) can be applied to support layered community engagement throughout design research processes: specifically, in (1) language setting, (2) research plans, (3) participatory analysis, and (4) research evaluation.
Conclusion: The approaches to critical interface analysis discussed here afford people in traditionally privileged design research roles (in academic, industry, and public sector institutions) a way to honor the experiences and expertise of community members by not only reflecting on the ways they contribute to and are impacted by designs, but also collaborating with them on critical interface analysis.
Keywords: Critical Interface Analysis, Social Justice, Community Engaged Design, Design Research
Practitioner’s Takeaway:
- Critical interface analysis can be applied by justice-focused practitioners across design contexts as a collaboration-oriented approach to heuristic evaluation in order to account for the experiences and contributions of users, communities, and researchers in design work.
- Design researchers can use the principles that guide critical interface analysis as a heuristic to highlight the stakes and value of critical approaches in design research in their engagements with communities and stakeholders.
- Design researchers can use critical interface analysis to frame research plans that account for the connections between functions and ideologies, between uses and values, and between actions and relationships.
- Design researchers can partner with and value the expertise of communities through participatory approaches to critical interface analysis.
- Design researchers can evaluate their own research and hold it accountable by applying reflexive critical interface analysis to their research practices.
This review of research builds on a tradition of technical communication literature enacting critical interface analyses that pushes design research beyond a narrow focus on “use” (as a way of centering industry priorities) to consider designs in relation to the cultural, ideological, affective, contextual, and embodied experiences of people (Green, 2021; Gu, 2016; Homer, 2020; Jones, 2021; Knight et al., 2009; Mckoy et al., 2020; Moses & Katz, 2006; Sano-Franchini, 2018; Sidler & Jones, 2009; Zdenek, 2007). These approaches to critical interface analysis align with the social justice turn in our field that emphasizes the need, as Jones (2016a) puts it, not only to value “the experiences and perspectives of others (especially populations that are often silenced and marginalized)” but also to take “action to redress inequities” (p. 347). Indeed, critical interface analyses are closely related to many strands of research in technical communication that take a critical, contextual, and multimodal approach to analyzing technologies and communication. For example, Agbozo (2022) applies critical multimodal discourse analysis (CMDA) to conduct a historically, materially, and contextually situated and layered analysis of technical documents for GhanaPostGPS, a geolocation tool, that shares many theoretical and methodological commitments with critical interface analysis approaches.
Both critical interface analysis scholarship and social justice scholarship more broadly in our field have highlighted and reimagined collaborative and participatory approaches to research and design that emphasize more meaningful and sustained engagement with users and communities in design processes—centering more holistic engagements with and understandings of people beyond their interactions with a specific interface (e.g., Rose & Cardinal, 2018; Salvo, 2001; Spinuzzi, 2005). But as Green (2021) outlines, even in participatory approaches, there can still be a tendency to “overly rely on ‘use’ as the primary metric for measuring individuals’ participation in technology development” (p. 333) at the expense of attending to the “broader relations of power” that influence some people “to abandon or decline to take up digital technologies” (p. 334). For this reason, and building on scholars who focus on nonuse (Satchell & Dourish, 2009), Green models how queer theories of resistance can help participatory design researchers take a more holistic approach to understanding why people may choose not to engage with technologies.
Relatedly, HCI and design scholars advocating for design justice (Costanza-Chock, 2020) and equity-centered co-design (Harrington et al., 2019) highlight collaborative and participatory approaches that resist industry-centered priorities in design research, including focusing on what communities are doing before and after they interact with design researchers, rather than engaging with communities with a primary goal of validating the design team’s ideas or solutions. As Costanza-Chock (2020) explains,
Design justice does not focus on developing systems to abstract the knowledge, wisdom, and lived experience of community members who are supposed to be the end users of a product. Instead, design justice practitioners focus on trying to ensure that community members are actually included in meaningful ways throughout the design process. Another way to put this is ‘If you’re not at the table, you’re on the menu.’ Design justice practitioners flip the ‘problem’ of how to ensure community participation in a design process on its head to ask instead how design can best be used as a tool to amplify, support, and extend existing community-based processes. (p. 84)
Harrington et al. (2019) urge researchers, furthermore, to center sustainable engagements after and beyond design research engagement points by asking,
- What are the steps that can be taken immediately following a design engagement such that the impact is immediately perceived [in the community]?
- What are the resources that already exist and can be leveraged and supported, such that they are able to be maintained and progressed in the absence of researchers? (p. 216:19, bullet points added)
All of these community-centered strategies—paying attention to resistance and nonuse, prioritizing what communities are already doing, and planning for accountability and sustainability beyond engagement points—are important for justice-focused, holistic engagements with users and communities. In the spirit of these strategies, this article suggests a complementary strategy for applying critical interface analysis as a heuristic tool for engaging communities and stakeholders—as well as design team members—in the process of critically examining interfaces to contextually account for ideologies, values, histories, and experiences. In other words, involving communities and stakeholders, as well as designers and researchers, in critical interface analysis is another opportunity to value expertise beyond privileged design roles and privileged embodiments and to build design coalitions (Jones, 2020; Walton et al., 2019) for redressing injustices and inequities in design.
I focus here on two critical interface analysis approaches in particular—Brock’s (2018) critical technocultural discourse analysis (CTDA) approach and Sano-Franchini’s (2018) critical interface analysis approach—because both offer adaptable, applicable heuristics that can support justice-focused community engagement in design research. And because Brock (2018) and Sano-Franchini (2018) both highlight their own embodiments as part of their analyses, I account for my own positionality as a white woman working at a research university in the US, where I study, teach, and engage with communities in design research. Community engagement spans my research, teaching, and service work as an academic: currently, that includes co-researching with a borderlands justice non-profit, partnering with community activist-artists in course projects, and co-organizing a UX community. The core values of my practice are grounded in what I have learned from these collaborations. In particular, and as we shared in a previous issue of this journal, Clarissa San Diego’s approach to community strategy emphasizes accounting for one’s own positionality in relation to collaborators and communities (Shivers-McNair & San Diego, 2017, p. 101) and building relationships grounded in genuine listening and connection, going above and beyond revenue-centric metrics (p. 109). I aim to practice that accountable, relationship-centered community strategy in my engagements, and this core value guides me to highlight ways that critical interface analysis can be leveraged for community engagement in design research across contexts.
Furthermore, my positionality and values not only inform my desire to highlight justice-focused co-design possibilities for design researchers; they are also inextricably bound up in how I make my case. As Brock (2018) explains, acknowledging that our positionalities inflect our research can strategically and critically “expose that validity and replicability are false constructs of positivism, that each researcher brings their disciplinary, cultural, and social perspectives to the research they conduct” (p. 1027). Therefore, I present the arguments and possibilities I share here not as definitive or exhaustive but as situated, and as invitations for more perspectives and possibilities. In what follows, I contextualize the two critical interface analysis approaches I am focusing on and then connect critical interface analysis with heuristic evaluation in design research. Finally, I offer strategies for layering approaches to critical interface analysis in order to engage with communities and stakeholders throughout the design process, specifically in language setting, research plans, participatory analysis, and research evaluation.
Contextualizing Critical Interface Analysis
Sano-Franchini (2018) and Brock (2018) take up interaction design theory and practice to define what interface means in their approaches and analyses, both of which focus on information communication technologies (ICTs)—specifically social media platforms—even as they both also call for and model more complexity in how we understand interfaces. Brock (2018) explains that digital interfaces are “the medium through which humans primarily interact with ICT algorithms, symbols, and practices” (p. 1020), and therefore, we must attend to “the interface’s symbolic articulation/accretion of meaning. ‘Interface’ is a complex concept with multiple considerations: Is the screen the interface? Is the ‘app’ the interface? Is the material form the interface? CTDA says ‘yes’ to all of these, depending upon context” (p. 1020). Sano-Franchini (2018) follows a similar approach to defining interfaces as material and contextual, suggesting that “critical interface analysis can be used to analyze the ideological and rhetorical function of design” across hardware, software, and other technical interfaces (n.p.). Importantly, both Brock and Sano-Franchini insist that understanding how an interface functions or how people use an interface cannot be separated from culturally situated ideologies, histories, meanings, and contexts. In other words, interfaces are not neutral.
And while “critical” is a key operative word in both Brock’s (2018) and Sano-Franchini’s (2018) work, both scholars also insist that a critical approach should not be separated from a technical approach to interfaces. As Brock (2018) explains, “Even as the Internet has matured to become our cultural communicative infrastructure, social sciences and humanities research continues to address this multifaceted medium from either an instrumental or theoretical approach” (p. 1013). In this either/or norm, a design researcher might take either an instrumental approach that “focuses on specific, characteristic communicative functions of the technology” or a theoretical approach that conceptualizes “‘discourse’ from a disciplinary perspective” (Brock, 2018, p. 1013). Instead, Brock (2018) calls for “a critical cultural approach . . . that interrogates [a technology’s] material and semiotic complexities, framed by the extant offline cultural and social practices its users engage in as they use these digital artifacts” (p. 1013). Sano-Franchini (2018) similarly pushes back on norms that separate the technical from “the cultural and ideological dimensions of UX,” arguing that “while it is of course important to understand the technical aspects of technical communication as well as user’s perspectives about technologies, it is also imperative that we consider how technologies affect users through processes of interpretation and embodied engagement” (n.p.).
Because I am highlighting heuristics in each approach that can support community-engaged design research, the next section places critical interface analysis in conversation with heuristic evaluation in design research. In tracing similarities and distinctions between heuristic evaluation and critical interface analysis, my point is to emphasize the value and applications of critical interface analysis across scholarly, pedagogical, industry, public sector, and community contexts—not only to encourage more uptake across these contexts but also to recognize that their boundaries are fluid and entangled, as many of us engage in and across several of them.
Connecting Heuristic Evaluation and Critical Interface Analysis
I use the term “heuristic evaluation” to refer to the practice of assessing the usability or effectiveness of a design (and often a digital interface) against a set of guidelines or criteria, drawing on the expertise of the evaluator (Nielsen & Molich, 1990). Heuristic evaluation has long been part of professional practice in technical communication (Donker-Kuijer & de Jong, 2010; Friess, 2015; Hart & Portwood, 2009; Kantner et al., 2002; Van der Geest & Spyridakis, 2000). And as Grice et al. (2013) show, newer media and emerging ICTs have necessarily expanded the field’s approaches to heuristic evaluation. Technical communication instructors not only teach heuristic evaluation in evolving ways (Banville & Christensen, 2022), but they also apply heuristic evaluation methods to assess instructional effectiveness (Brown & Chao, 2010) and learning management systems (LMSs) (Chou & Sun, 1996).
But heuristic evaluation has limitations, particularly if it is the only way a design is assessed. As Carliner (2003) notes, heuristic evaluation (or characteristic-based assessment) is often, and should be, triangulated with task-based evaluation (with users) and results-based evaluation (such as key performance indicators [KPIs], analytics, and financial metrics). With the rise of UX and the influences of participatory values on design processes (Agboka, 2013; Cardinal et al., 2020; Salvo, 2001; Spinuzzi, 2005), researchers and practitioners often combine or triangulate heuristic evaluation with methods that engage directly with users—an approach that is vitally important, for example, in evaluating accessibility, where relying on checklists without also engaging and collaborating with disabled users in the design process reifies an ideology of normalcy instead of genuine inclusion (Oswal & Melonçon, 2017).
Even when heuristic evaluation is triangulated with approaches that directly engage users, it can still negatively impact designs (and the people who use them) if it relies on imagined or even amalgamated user “types.” Personas (which highlight user demographics and goals) and scenarios (which highlight use cases, often in tandem with personas) are two common approaches that design teams use to try to center users in heuristic evaluation and design work. Sometimes these personas and scenarios are assumption-based, meaning the design team is relying on its own perceptions and assumptions (and the biases and stereotypes therein) about people using the design, and sometimes they are an aggregation or distillation of user research. But even when it is grounded in user research, persona-based and scenario-based heuristic evaluation can still erase the experiences and contributions of marginalized people.
Jones (2016b) explains that traditional research-based design “scenarios, though they are based on user feedback, still decontextualize and water down user experiences . . . [because] scenarios are usually a conglomerate of experiences culled from a number of different users and then compiled into a scenario that is developed from themes and patterns that designers perceive to be useful or valuable” (p. 484). Jones argues that this conglomerating (and normative) move undercuts the social justice imperative to actively center marginalized voices and experiences in design. Costanza-Chock (2020) similarly criticizes “stand-in” design strategies, including user personas, that create “abstractions about communities that are not really at the table in the design process” (n.p.) for designers and design researchers working in privileged positions (such as academic and industry institutions) who are not meaningfully engaging, collaborating with, or accountable to the communities impacted by their work.
Instead of relying on abstraction and conglomeration to contextualize heuristic evaluation and design, Jones (2016b) recommends that design researchers and teams “use actual user narratives, users’ own stories in their own voices, to replace traditional scenarios, avoiding the recontextualization of user experiences for design purposes and embracing and valuing the knowledge that a user can provide about a design. In other words, give users the opportunity and space to contribute in a participatory process without recontextualizing based on a designer perspective” (p. 486). Jones explains, further, that such an approach “would require that designers not pull-out and focus on themes within a number of narratives to create a single scenario. Instead, researchers would use interviews to create a corpus of narrative inquiry scenarios, which would present full narratives of the participants (void of designer reconceptualization). These re-envisioned scenarios would zero in on each experience as valuable and useful” (p. 486). Thus, instead of relying on abstractions and assumptions about users to inform the personas and scenarios used in heuristic evaluation, design researchers following Jones’ (2016b) approach would commit to engaging directly with and honoring the voices of users, and especially those who have been marginalized or excluded from design research.
As the work of this special issue (and the work it builds upon) demonstrates, critical interface analysis is doing this kind of work by evaluating interfaces in relation to their cultural and material contexts and impacts. Furthermore, the similarities and differences between critical interface analysis and traditional approaches to heuristic evaluation offer possibilities for justice-focused researchers across contexts. As Sano-Franchini (2018) has shown, critical analyses of digital interfaces that apply cultural and critical theories have also long been part of the field’s research and teaching, building on the foundational work of scholars like Selfe and Selfe (1994), Banks (2006, 2010), and Haas (2012). Sano-Franchini (2018) connects this tradition of critical digital interface analyses with the field’s social justice turn (Agboka, 2013; Haas & Eble, 2018; Jones, 2016a; Jones et al., 2016; Williams & Pimentel, 2016), emphasizing the importance of tracing and intervening in the flows and structures of power in and through digital interfaces.
Like heuristic evaluation, a critical analysis of a digital interface focuses on characteristics of an interface, is guided by criteria, and enacts the expertise of the analyst, but there are important differences. In critical interface analysis, the interface characteristics, uses, and functions cannot be universalized because they are inseparable from cultural and material contexts. Accordingly, the evaluative criteria are grounded in theory and cultural reflexivity, and the required expertise is critical, technical, and experiential (Brock, 2012, 2018; Sano-Franchini, 2018). As Sano-Franchini (2022) notes in the call for this special issue, a critical approach to interface analysis necessitates that we “move beyond usability, functionality, and a-political approaches to UX” (with which heuristic evaluation has often been associated) “to consider the ideological, cultural, and political implications of interface design” (n.p.).
There is a productive emphasis on humanistic critique in the technical communication literature enacting critical interface analyses (Green, 2021; Gu, 2016; Homer, 2020; Jones, 2021; Knight et al., 2009; Mckoy et al., 2020; Moses & Katz, 2006; Sano-Franchini, 2018; Sidler & Jones, 2009; Zdenek, 2007). Specifically, these analyses are often conducted by critical, humanistic scholars working in academic institutions (even as many of these scholars have experience in and relationships to industry and community practices). As Sano-Franchini (2018) explains, just as the approach itself “may be seen as outside of typical UX research methods,” the humanistic standpoint of the analyst (even one who is also a user-researcher) is also key to gaining “a more complete understanding of how human beings make meaning from UX design” (n.p.). Brock (2018), writing as a researcher in social informatics, similarly describes his critical research as distinct from research focused on design and use: the intervention is not directly into the technology design process but rather into the cultural discourses about technologies, because “[t]his is in and of itself valuable work, as technocultural attitudes toward ICTs work very hard to obscure the costs incurred by adopting technological solutions to social problems” (p. 1027).
In other words, Brock’s (2018) and Sano-Franchini’s (2018) emphasis on humanistic perspectives celebrates the value of the critical, cultural expertise of stakeholders who might not always be (directly) at the table in design processes—specifically, both academic researchers and communities. And both Brock and Sano-Franchini highlight how academic and community perspectives can inform and temper each other: Sano-Franchini notes that critical academic perspectives can highlight ideologies that communities or individual users may not be aware of (n.p.), and Brock notes that centering communities’ perspectives can help academics avoid deterministic interpretations (p. 1017).
Therefore, while traditional approaches to heuristic evaluation can usefully bolster the value of design expertise, critical interface analysis calls into question where we locate that expertise. Specifically, putting heuristic evaluation and critical interface analysis in conversation helps us problematize the assumption that designers and design researchers (be they in academic, industry, or public sector contexts) have the primary expertise about what is best for designs. Design researcher Harrington refers to this assumption as “design narcissism,” or the belief that “people don’t know what [they] want until you tell them . . . when in reality, people know what they need, because they live this experience every day” (Fonder, 2022, n.p.).
Resisting design narcissism is crucial for justice-focused design research, because as Costanza-Chock (2020) advocates, design researchers should focus on “how design can best be used as a tool to amplify, support, and extend existing community-based processes” (p. 84). Rethinking heuristic evaluation through the insights of critical interface analysis offers design researchers a way not only to foster more reflexivity within the design team, but also to foster more community engagement in design research and design processes. And, likewise, emphasizing “heuristics” and “evaluation” in critical interface analysis offers design researchers a way to recognize more kinds of (situated) expertise across contexts and, thus, more opportunities for intervening and participating in design processes. Therefore, in the following section, I highlight how critical interface analysis heuristics from Brock (2018) and Sano-Franchini (2018) can be applied to support justice-focused community engagement throughout design research processes.
Layering Critical Interface Analysis Heuristics
Whether we are working in a design team, on our own, or in a community, critical interface analysis heuristics can help us zoom out from the specifics of a feature or interface to consider cultural, ideological, and contextual aspects of our design processes and to be accountable for the impacts of our design work. In the spirit of Brock’s (2018) and Sano-Franchini’s (2018) layered approaches, I present layered strategies that facilitate justice-centered engagement with communities and stakeholders throughout the design process; specifically, I highlight heuristic applications for (1) language setting, (2) research plans, (3) participatory analysis, and (4) research evaluation.
Language Setting
The Creative Reaction Lab’s (2018) Equity Centered Community Design Field Guide advocates for the importance of language setting—that is, developing shared understandings, vocabularies, and guiding baselines—in justice-focused work, because “common language is a crucial foundational step in dismantling systemic oppression and designing equity” (p. 10). Stating critical starting points and definitions is a core strategy of critical interface analysis, as exemplified in Brock’s (2018) principles for CTDA:
- “ICTs [information communication technologies] are not neutral artifacts outside of society; they are shaped by the sociocultural context of their design and use.
- Society organizes itself through the artifacts, ideologies, and discourses of ICT-based technoculture.
- Technocultural discourses must be framed from the cultural perspectives of the user AND of the designer.” (p. 1020)
Indeed, Brock’s principles resonate with what I learned from community strategist Clarissa San Diego, which is that definitions—including and especially keywords like “community” or “inclusion”—“should be products of meaningful, localized engagement, not assumptions we start with” (Shivers-McNair & San Diego, 2017, p. 110). Brock’s principles could inspire a language-setting exercise to get design team members invested in a critical, contextual approach, or the principles could frame a participatory design workshop with community members. For example, an academic or practitioner collaborating with community members might start by sharing Brock’s principles as a way of revealing their guiding values and assumptions, and then invite community members to share their guiding values and assumptions so that together they can localize and highlight shared understandings—and thus lay a foundation to revisit and renegotiate those understandings throughout the engagement. Similarly, teachers, trainers, and mentors can use these principles to language-set with learners and to highlight the stakes and value of critical interface analysis approaches in design research. This language-setting heuristic can also be valuable for making the case to stakeholders for why we need to take a more expansive, critical approach to our research processes—including and especially in our research plans, as I discuss below.
Research Plans
Whether our audience is a UX research team manager, a funding agency, or an institutional review board, design researchers invest time and effort in strategically framing our research for key stakeholders in our research plans and proposals. With the language-setting heuristic from Brock’s (2018) approach, we can argue for the stakes of a critical approach, and with a complementary set of questions from Sano-Franchini (2018), we can model what taking that approach looks like. For UX practitioners, specifically, Sano-Franchini (2018) offers a set of questions (below) that we might apply or adapt as a heuristic that pushes what might typically be a feature-focused or fix-focused design evaluation to also consider affective, ideological, and situated experiences and relationships. In other words, beyond focusing on the relatively isolated (and abstracted or universalized) uses and functions of a new feature or product, design researchers could use Sano-Franchini’s questions to frame research plans that account for the connections between (a) functions and ideologies, (b) uses and values, and (c) actions and relationships. Sano-Franchini’s UX practitioner-focused questions for critical interface analysis are as follows:
- “How does a given UI mediate how people interact with one another over time? That is, how does pacing and duration as mediated by the UI impact those relationships? Does political engagement take place in the context of those relationships, and, if so, what does that engagement look like?
- What are the purposes and outcomes of connecting users with one another? Are we only connecting users for profit, or are there other outcomes that might benefit the users themselves? If the latter, what potential barriers exist that may keep one or more users from experiencing those benefits?
- How will this feature make people feel (understanding that it is not necessarily a good thing to feel good all the time)? What are the logics of the interface, and how will it encourage them to interpret and engage with the world around them?
- How and why might we create designs that encourage more active, critical, and deliberate participation among users?
- What kinds of content, values, and logics are rewarded over others? What are the affective, temporal, and political consequences of these priorities?
- What is the relationship and place of these mediated interactions within the larger media ecology?
- How can we keep in mind that when dealing with political issues, people need background information—whether about democracy, institutionalized racism, immigration, healthcare, environmental issues, and/or women’s rights—to make informed decisions?
- Finally, what kind of society do we want to live in? And, how might we design technologies that bring us to that ideal?” (pp. 402–403)
Regardless of the design context, these are not easy questions to ask or easy conversations to have. Pushing heuristic evaluation to engage at this level of ideology, context, and culture requires time, energy, and dispositions that may not be readily supported by our colleagues, clients, deadline structures, or infrastructures. For example, checking a new feature or product against standardized usability and accessibility criteria is likely more expedient and more comfortable than taking time to account for whose values, bodies, relationships, and experiences are privileged or dis-privileged by that feature or product—but that is precisely the kind of engagement we need to make space for in our processes, starting with our research plans. Indeed, when I introduce students to the UX research plan, the template document I share is longer than some models because it includes prompts—inspired by Sano-Franchini’s questions, which students also read—for accounting for researcher positionality in relation to the community/users, as well as researcher and community values, as part of accounting for audiences, stakeholders, and goals (which otherwise might default to design team goals). If we are committed to working toward justice, these are the kinds of questions and conversations we must advocate for and make space for from the beginning of design research.
Participatory Analysis
Justice-focused co-design processes can invite and value the expertise of communities and participants not only in the design of products or content, but also in the analysis of design research. Gonzales (2022) exemplifies this layered approach by modeling participatory data analysis of participatory translation processes. Across three case studies, Gonzales engages community partners both in the act of translating and designing content and in the act of analyzing and reflecting on the process of participatory translation. Making the case for participatory analysis, Gonzales argues for an approach that prioritizes relationships over abstracted data points (2022, p. 57), explaining,
Rather than engage in systematic coding of a limited data set, I engaged in participatory data analysis by tracing multilingual experiences across contexts focused on issues of language, power, and positionality–discussing these experiences with participants both during and after each project was completed and then drafting and getting feedback from participants on how their multilingual experiences are presented in this book. (2022, p. 56)
In other words, Gonzales interpreted data with the people she was engaging, and design researchers can emulate Gonzales’ (2022) justice-focused approach by partnering with communities and participants in the analysis and evaluation of interfaces. Sano-Franchini (2018) provides a set of questions to guide a critical interface analysis that design researchers could apply or adapt to facilitate co-analysis workshops. Sano-Franchini’s questions for performing critical interface analysis are as follows:
- “Who is the target/primary user? Who are the secondary users, unintended users, and other stakeholders?
- What are the tasks, interactions, and relationships (human-computer, human-human) that are facilitated by and through the interface?
- What kinds of content are presented through the interface?
- What are the organizing logics of the interface?
- What are the ideological and cultural values and assumptions imparted through the interface, whether through its content, its organizing logics, or the interactions facilitated by the site?
- In what kinds of environments will these tasks be conducted and these interactions take place?
- What are the various affordances of the interface? Who benefits from its use and how do they benefit? What are the limitations of the interface? What and whom does it leave out?
- What are the range of emotions and embodied responses that are enabled and encouraged by the interface?
- On what memories, literacies, and histories does the interface rely?” (pp. 391–392)
To support the co-analysis process, a design researcher might start with a language-setting discussion (perhaps following Brock’s heuristic in the Language Setting section above) and then move through a co-analysis using Sano-Franchini’s heuristic above—making sure to leave space for participants to contribute their own analytic questions as well. The activity could be structured with common collaborative analysis techniques like affinity diagramming, in which participants generate individual responses on sticky notes and then group and make sense of those notes together. The co-analysis activity could be applied to an existing interface, or it could be used to iteratively develop and refine a prototype or speculative interface.
But it is also important for design researchers to make space for and honor co-analysis that does not take the shape of a more formal method—like affinity diagramming—that practitioners and academics more readily associate with analysis. Just as it is important to be able to share the heuristics and methods we were trained in, it is also important to be able to recognize and support community-led analysis in forms that align with community practices and values—be they conversations, community events, or asynchronous online interactions. What might begin, from an academic or practitioner researcher’s perspective, as a traditional interview could unfold into co-analysis, or, better yet, we can build relationships and understandings with communities that reorient our processes from the start. Either way, we academics and practitioners can support community-led analysis first by recognizing and valuing it ourselves, and then by highlighting the affordances and insights of community-led analysis when we report to stakeholders and colleagues. This is also why it is important for design researchers to appropriately credit our co-analysts. For example, Gonzales (2022) included participants’ authored accounts in her book and ensured that they are credited in the book’s metadata as well as in its pages.
Research Evaluation
Just as Sano-Franchini’s (2018) heuristic invites and models a reflexive, critical approach to evaluating a design, critical interface analysis approaches also invite design researchers to critically evaluate our research as itself an interface that is deeply contextual and that impacts people’s experiences. A critical orientation to research-as-interface is modeled in Cobos et al.’s (2018) definition of interfaces as a way of accounting for concept uptakes in cultural rhetorics research: specifically, the mediated, negotiated interfaces not only of “culture” and “rhetoric,” but also of varying definitions and approaches within work that identifies as “cultural rhetorics” (p. 144). Highlighting the situated, mediated work of scholarship as an interface calls for “[a]rticulating and situating how we use our terms of inquiry,” and thus “interrogating concepts and our uptakes of them so that we do not take these terms nor our agendas for them for granted” (Cobos et al., 2018, p. 141). In other words, treating research as itself an interface is an important reflexive tool for researchers to help us avoid taking our research agendas and terms for granted.
Brock’s (2018) requirements for CTDA (listed below) can be a helpful way for design researchers to evaluate our own research as an interface—even if we are not taking up a formal or full CTDA approach. Brock’s requirements for theoretical application in CTDA are as follows:
- “The theory [used in the analysis] should draw directly from the perspective of the group under examination;
- Critical technoculture should be integrated with the above cultural continuity (Christians, 2007) perspective.” (Christians, 2007, as cited in Brock, 2018, p. 1017)
And Brock’s requirements for multimodal engagement in CTDA are as follows:
- “Multimodal data operationalization;
- Multimodal interpretive research methods;
- Critical cultural framework applied equally to all data modes.” (p. 1023; numbers in original)
Brock’s requirements for theoretical engagement remind design researchers and their stakeholders to think more critically and expansively about our assumptions and interpretations. For design researchers, “theory” can come from many places—from academic scholarship, social movements, and/or from the data, mental models, and experiences in a specific design context—but the point is that it comes from somewhere, whether we acknowledge it or not. And, as Brock argues, we have an opportunity to de-center privileged discourses, knowledges, and experiences in how we approach our design research. Regarding the cultural continuity requirement, Brock (2018), citing Christians’ (2007) article “Cultural Continuity as an Ethical Imperative,” clarifies that “‘cultural continuity’ in technology studies is meant to decenter theories of technological determinism premised upon the beliefs of a dominant culture or modernist technological enterprises” by instead centering “historically and geographically constituted people as the value-laden creators of technological enterprise” (p. 1017). No matter what our approach or context, Brock’s requirements are an important reminder to be careful about and accountable for the assumptions that guide our research and the interpretations we produce, and to take the opportunity to de-center privileged perspectives. For example, if we make the effort to account for community values in our UX research plans (as I described above), we should also hold ourselves accountable to those values and relationships in our reporting and in our debriefing and reflecting with stakeholders and communities.
Furthermore, Brock’s definition of multimodality in research requires the analyst to treat interfaces as both an artifact and a medium, and this approach reminds community-engaged design researchers to account for “technology as a troika of artifact, practice, and belief” (Brock, 2018, p. 1023). In other words, a critical understanding of interfaces requires us to account for material or tangible interactions as inseparable from not only what people do but also what people think—and, as Sano-Franchini’s (2018) approach especially emphasizes, how people feel. Just as digital interfaces are inextricably bound up in ideologies, experiences, and contexts, so, too, are our design research processes. This means we have to account for the interrelations of artifact, practice, and belief not only in the products or content we design, but also in the processes by which we research content, products, and the people who interact with them.
In practice, this can look like making space for community collaborators to share their experiences with the co-research/co-design process in addition to sharing their experiences with the interface or product (perhaps in an anonymous venue, or with an ombudsperson or evaluator other than the researcher)—and then highlighting those experiences in reports and evaluations alongside experiences with the interface. Indeed, inviting colleagues or community members to serve in a recognized role as an evaluator of community experiences in a research and design process, or offering to serve as an evaluator for another researcher’s project, could be a fruitful expansion of heuristic evaluation in our practice. Sano-Franchini (2018) notes the importance of accounting for experiences over time in critical interface analysis, and the same is true in accounting for the interrelations of experiences and values: We can and should trace these experiences with interfaces and with research and design processes over time, building relationships and centering community experiences and values in our measures of success.
Conclusion
A critical, contextual approach to interface analysis is an important tool for community-engaged researchers who want to center people’s lived experiences and insights throughout the design process. As I have discussed, Brock (2018) and Sano-Franchini (2018) challenge mainstream interface analysis approaches to go beyond focusing on universalized uses and functions and, instead, to contextualize experiences as cultural, ideological, affective, relational, and situated in time and context. This critical, contextual orientation aligns with work by scholars like Jones (2016b) and Costanza-Chock (2020) who have revealed the potential for erasures and extractive relations in heuristic evaluation and design work that relies on abstract representations of users’ experiences and voices. Instead, these scholars call for work that is accountable to the contextual, embodied experiences of people who engage with, contribute to, and are impacted by designs—and especially the “communities most targeted by intersectional structural inequality” (Costanza-Chock, 2020, p. 85).
The approaches to critical interface analysis discussed here afford people in traditionally privileged design research roles (in academic, industry, and public sector institutions) a way to honor the experiences and expertise of community members by not only reflecting on the ways they contribute to and are impacted by designs, but also collaborating with them on critical interface analysis. Such collaborative work answers Costanza-Chock’s (2020) calls for design justice work to account for and intervene not only in “the values that we encode in the objects and systems we design,” but also in
- “who gets to participate in and control design processes” and
- “who receives attention and credit for design work, how we frame design problems and challenges, how we scope design solutions, and what stories we tell about how design processes operate.” (p. 109; bullets added)
After all, both critical interface analysis and heuristic evaluation leverage (and thus reinforce) the expertise of the person doing the analysis, even as critical interface analysis also emphasizes the contributions of people who use technologies. Costanza-Chock (2020) praises the work that Brock’s CTDA approach to critical interface analysis does to “unpack how marginalized users often produce technocultural practices that become the core use case for digital tools and platforms, with Black Twitter as a key case study,” and adds that “[m]uch more work remains to be done to mainstream these and other approaches to proper attribution in design” (p. 114). I agree, and in addition to using critical interface analysis to properly attribute historical and existing design practices, academic and industry researchers can also engage communities as partners in critical, analytical design research. Instead of treating the products of that collaborative work as data to be abstracted into themes, researchers can use critical interface analysis processes to properly attribute (and compensate in appropriate ways) community partners’ intellectual and material contributions to design research and design processes, bringing us closer to a more just and equitable society—as Sano-Franchini (2018) puts it, the kind of society we want to live in.
References
Agboka, G. Y. (2013). Participatory localization: A social justice approach to navigating unenfranchised/disenfranchised cultural sites. Technical Communication Quarterly, 22(1), 28–49.
Banks, A. J. (2006). Race, rhetoric, and technology: Searching for higher ground. Routledge.
Banks, A. J. (2010). Digital griots: African American rhetoric in a multimedia age. SIU Press.
Banville, M., & Christensen, K. (2022, July). Ready, set, bake: A heuristic analysis teaching case. In 2022 IEEE International Professional Communication Conference (ProComm) (pp. 339–345). IEEE.
Brock, A. (2012). From the blackhand side: Twitter as a cultural conversation. Journal of Broadcasting & Electronic Media, 56(4), 529–549. https://doi.org/10.1080/08838151.2012.732147
Brock, A. (2018). Critical technocultural discourse analysis. New Media & Society, 20(3), 1012–1030.
Brown, J., & Chao, J. T. (2010). Collaboration of two service-learning courses: Software development and technical communication. Issues in Informing Science and Information Technology, 7, 403–412.
Cardinal, A., Gonzales, L., & Rose, E. J. (2020, October). Language as participation: Multilingual user experience design. In Proceedings of the 38th ACM International Conference on Design of Communication (pp. 1–7).
Carliner, S. (2003). Characteristic-based, task-based, and results-based: Three value systems for assessing professionally produced technical communication products. Technical Communication Quarterly, 12(1), 83–100.
Chou, C., & Sun, C. T. (1996). A computer-network-supported cooperative distance learning system for technical communication education. IEEE Transactions on Professional Communication, 39(4), 205–214.
Christians, C. G. (2007). Cultural continuity as an ethical imperative. Qualitative Inquiry, 13(3), 437–444.
Cobos, C., Raquel Ríos, G., Johnson Sackey, D., Sano-Franchini, J., & Haas, A. M. (2018). Interfacing cultural rhetorics: A history and a call. Rhetoric Review, 37(2), 139–154.
Costanza-Chock, S. (2020). Design justice: Community-led practices to build the worlds we need. MIT Press.
Donker-Kuijer, M. W., & de Jong, M. D. (2010). Heuristic Web site evaluation: Exploring the effects of guidelines on experts’ detection of usability problems. In Qualitative research in technical communication (pp. 269–289). Routledge.
Friess, E. (2015). Personas in heuristic evaluation: An exploratory study. IEEE Transactions on Professional Communication, 58(2), 176–191.
Gonzales, L. (2022). Designing multilingual experiences in technical communication. University Press of Colorado.
Green, M. (2021). Resistance as participation: Queer theory’s applications for HIV health technology design. Technical Communication Quarterly, 30(4), 331–344.
Grice, R. A., Bennett, A. G., Fernheimer, J. W., Geisler, C., Krull, R., Lutzky, R. A., & Zappen, J. P. (2013). Heuristics for broader assessment of effectiveness and usability in technology-mediated technical communication. Technical Communication, 60(1), 3–27.
Gu, B. (2016). East meets west on flat design: Convergence and divergence in Chinese and American user interface design. Technical Communication, 63(3), 231–247.
Haas, A. M. (2012). Race, rhetoric, and technology: A case study of decolonial technical communication theory, methodology, and pedagogy. Journal of Business and Technical Communication, 26(3), 277–310.
Haas, A. M., & Eble, M. F. (Eds.). (2018). Key theoretical frameworks: Teaching technical communication in the twenty-first century. University Press of Colorado.
Harrington, C., Erete, S., & Piper, A. M. (2019). Deconstructing community-based collaborative design: Towards more equitable participatory design engagements. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–25.
Hart, D., & Portwood, D. M. (2009, July). Usability testing of web sites designed for communities of practice: Tests of the IEEE Professional Communication Society (PCS) web site combining specialized heuristic evaluation and task-based user testing. In 2009 IEEE International Professional Communication Conference (pp. 1–17). IEEE.
Homer, M. (2020). Sovereignty and algorithms: Indigenous land disputes in digital democracy. In Platforms, protests, and the challenge of networked democracy (pp. 329–343). Palgrave Macmillan.
Jones, L. C. (2021, October). Online advocacy work: “Palatable” platforms and privilege in GUI features on Twitter and Instagram. In The 39th ACM International Conference on Design of Communication (pp. 157–164).
Jones, N. N. (2016a). The technical communicator as advocate: Integrating a social justice approach in technical communication. Journal of Technical Writing and Communication, 46(3), 342–361.
Jones, N. N. (2016b). Narrative inquiry in human-centered design: Examining silence and voice to promote social justice in design scenarios. Journal of Technical Writing and Communication, 46(4), 471–492.
Jones, N. N. (2020). Coalitional learning in the contact zones: Inclusion and narrative inquiry in technical communication and composition studies. College English, 82(5), 515–526.
Jones, N. N., Moore, K. R., & Walton, R. (2016). Disrupting the past to disrupt the future: An antenarrative of technical communication. Technical Communication Quarterly, 25(4), 211–229.
Kantner, L., Shroyer, R., & Rosenbaum, S. (2002, September). Structured heuristic evaluation of online documentation. In Proceedings. IEEE International Professional Communication Conference (pp. 331–342). IEEE.
Knight, A., Rife, M. C., Alexander, P., Loncharich, L., & DeVoss, D. N. (2009). About face: Mapping our institutional presence. Computers and Composition, 26(3), 190–202.
Mckoy, T., Shelton, C. D., Sackey, D., Jones, N. N., Haywood, C., Wourman, J. L., & Harper, K. C. (2020). CCCC Black Technical and Professional Communication Position Statement with Resource Guide. Position Statement. National Council of Teachers of English.
Moses, M. G., & Katz, S. B. (2006). The phantom machine: The invisible ideology of email (a cultural critique). In J. B. Scott, B. Longo, & K. V. Wills (Eds.), Critical power tools: Technical communication and cultural studies (pp. 71–105). SUNY Press.
Nielsen, J., & Molich, R. (1990, March). Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 249–256).
Oswal, S. K., & Melonçon, L. (2017). Saying no to the checklist: Shifting from an ideology of normalcy to an ideology of inclusion in online writing instruction. WPA: Writing Program Administration, 40(3).
Richter, J. D. (2021). Writing with reddiquette: Networked agonism and structured deliberation in networked communities. Computers and Composition, 59, 102627.
Rose, E., & Cardinal, A. (2018). Participatory video methods in UX: Sharing power with users to gain insights into everyday life. Communication Design Quarterly Review, 6(2), 9–20.
Salvo, M. J. (2001). Ethics of engagement: User-centered design and rhetorical methodology. Technical Communication Quarterly, 10(3), 273–290.
Sano-Franchini, J. (2018). Designing outrage, programming discord: A critical interface analysis of Facebook as a campaign technology. Technical Communication, 65(4), 387–410.
Sano-Franchini, J. (2022). Call for papers: Special issue of Technical Communication on “Digital interface analysis and social justice.” https://www.stc.org/notebook/2022/08/03/call-for-papers-special-issue-of-technical-communication-on-digital-interface-analysis-and-social-justice/
Satchell, C., & Dourish, P. (2009). Beyond the user: Use and non-use in HCI. Proceedings of the 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group on Design: Open 24/7 – OZCHI ’09 (p. 9).
Selfe, C. L., & Selfe, R. J. (1994). The politics of the interface: Power and its exercise in electronic contact zones. College Composition and Communication, 45, 480–504.
Shivers-McNair, A., & San Diego, C. (2017). Localizing communities, goals, communication, and inclusion: A collaborative approach. Technical Communication, 64(2), 97–112.
Sidler, M., & Jones, N. (2009). Genetics interfaces: Representing science and enacting public discourse in online spaces. Technical Communication Quarterly, 18(1), 28–48.
Spinuzzi, C. (2005). The methodology of participatory design. Technical Communication, 52(2), 163–174.
Van der Geest, T., & Spyridakis, J. H. (2000). Developing heuristics for Web communication: An introduction to this special issue. Technical Communication, 47(3), 301–310.
Walton, R., Moore, K. R., & Jones, N. N. (2019). Technical communication after the social justice turn: Building coalitions for action. Routledge.
Williams, M., & Pimentel, O. (2016). Communicating race, ethnicity, and identity in technical communication. Routledge.
Zdenek, S. (2007). “Just roll your mouse over me”: Designing virtual women for customer service on the web. Technical Communication Quarterly, 16(4), 397–430.
About the Author
Dr. Ann Shivers-McNair is an associate professor and director of professional and technical writing in the English Department and affiliated faculty in the School of Information at the University of Arizona, on the lands of the Tohono O’odham and Pascua Yaqui. Her research and teaching focus on justice-centered design research and communication practices in educational and community contexts, and current projects include an engineering education collaboration funded by the National Science Foundation. Her 2021 book, Beyond the Makerspace: Making and Relational Rhetorics, was published by the University of Michigan Press, and her research has also appeared in refereed journals, edited collections, and proceedings. She serves as secretary of the Association of Teachers of Technical Writing executive committee and as University of Arizona liaison on the chapter management and advisory council of the Society for Technical Communication Arizona Chapter. She can be reached at shiversmcnair@arizona.edu.