By Elisabeth Kramer-Simpson
Abstract
Purpose: I investigated best practices for teaching content auditing within two graduate classes tasked with auditing website content. I observed the students’ strategies for developing auditing criteria; the students then used the audits to implement website redesigns. Two research questions guided this study: 1) How do students create assessment criteria for website content audits? 2) What additional support could help students better determine assessment or rubric criteria so that they are specific and, most of all, easily measurable? I focused on how and why students made auditing decisions.
Method: I taught two graduate classes in which the students worked with real clients and live websites. Part of the process the students learned and used was content auditing. I used case study methods, interviews, and text analysis to empirically investigate student auditing. Nine participants, including students and clients, shared their perspectives.
Results: Simple categories with binary criteria made both auditing and the resulting assessment easier. Students asked to see more examples of audits. Students in Class 1 misunderstood the audience, which had a ripple effect on the resulting web design. Students from both classes were nonetheless able to make incremental improvements to both client websites.
Conclusion: Additional training in listening to clients is needed with graduate students who do content auditing of websites. Discussion of the impact of web content evaluation may be needed to help students discover how to tailor auditing guidelines to their specific clients. Practitioners may need recursive auditing to fully define criteria.
Keywords: Content Strategy, Content Auditing, Service-Learning, Web Design
Practitioner Takeaways
- Simple audit criteria, especially criteria that have clear standards or binary options, can be effective in content audits to improve websites.
- Students may not look deeply enough to see all audiences of a website and may require recursive evaluation to do so.
- Content audits themselves can be a useful “how” and “why” web design explanation for clients and practitioners justifying revision decisions.
Introduction
Content auditing of communication channels has emerged as a tool to systematically assess existing content and make a redesign more effective (Land, 2023; Rockley & Cooper, 2012). Academia has also employed this approach (Batova, 2021; Gonzales et al., 2016), though few academic articles covered content auditing before 2010 (Sperano, 2017, p. 3). More empirical research is needed on teaching content auditing to determine best practices for the classroom.
Teaching content auditing can be most effective when using a complex, real-world audience (Getto, Labriola, & Ruszkiewicz, 2020; Gonzales et al., 2016; Howard, 2020). Teaching technical communication through a “service-learning” approach involves real clients, and thus a real audience (Bowden & Scott, 2002; Huckin, 1997; Sapp & Crabtree, 2002). One goal of pairing service-learning with website content auditing is to support nonprofits that lack the resources to redesign their websites themselves (Getto et al., 2023). Service-learning has been used in teaching content auditing (Batova, 2021; Getto & Labriola, 2016; Steiner, 2020) but is very labor intensive.
In order to learn more about service-learning in the content auditing context, I investigated two graduate classes that audited nonprofit websites and implemented redesigns. Two clients and seven students were interviewed retrospectively. Qualitative analysis of student audits, strategy documents, teacher reflections, and revised webpages was synthesized with the interview findings. Findings indicate that students had trouble understanding the multiple audiences for the websites, and this affected the website redesigns. Further, students needed substantial scaffolding, recursive practice, and simple categories for the audit assessment criteria.
Literature Review
Content auditing provides a strategic way to look at what already exists within an organization’s communication channels and to re-tool the content to focus on achieving organizational goals and priorities. In this review, I first define content auditing, and then I briefly discuss how these tools are used, particularly in university systems. I then discuss service-learning as well as case studies of service-learning and content auditing. I conclude by calling for additional case study research specifically on content auditing.
Defining Content Strategy and Content Auditing
The field trajectory has been to move from single sourcing to content management to content strategy, particularly within the last decade (Gonzales et al., 2016). Rockley and Cooper (2012) define content strategy for industry as “support[ing] both organizational goals and customer needs” (p. 13). Andersen (2014) describes content strategy as the “life cycle” of “intelligent content,” which includes revisions, use in multiple contexts, and storage (p. 133). Content strategy looks at the big picture of the effectiveness of the content rather than the structure or document format (Albers, 2020). Content strategy is focused on the efficiency of communicating effective information to audiences (Bailie, 2024).
Content auditing is a technique and tool within content strategy. Rockley and Cooper (2012) provide a chapter dedicated to content auditing: “the purpose of a content audit is to analyze how content is written, organized, used, reused and delivered to its various audiences” (p. 102). They emphasize finding overlapping elements of content and unifying the content with a focus on reuse. Rockley and Cooper (2012) focus on the tasks and decisions of the audience to evaluate the content. Land (2023) wrote a handbook for content auditing and inventorying that includes many of these concepts and further clarifies criteria that can be used, including rating scales that may be helpful in auditing (pp. 60–84). In particular, Land (2023) suggests that user value for a website can be measured by categories such as “current,” “accurate,” and “easy to read.” Getto et al. (2023) suggest similar categories like “readability,” and accuracy in terms of “authoritativeness” (pp. 87–88). The part of content auditing I am most focused on in this article is the assessment or evaluation of pieces of content. Getto et al. (2023) define this part: “A content audit is also used to assess content, meaning to measure its overall effectiveness in the context of organizational goals and audience goals” (p. 73, emphasis original).
Tips from Getto et al. (2023) and Land (2023) include spreadsheet samples, ways to prioritize pages (since a website often has an overwhelming amount of content), and ways to develop a simple yes/no category (Getto et al., 2023, p. 83). The audit is a foundational tool for knowing what exists in a content channel, and how that content can be revised and reorganized to become more effective for users.
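To make the shape of such a spreadsheet concrete, the sketch below models a single audit row with a few binary, yes/no criteria and a keep/change/delete action. The column names and example pages are illustrative assumptions rather than Getto et al.’s (2023) or Land’s (2023) exact headings.

```python
from dataclasses import dataclass

# A minimal sketch of one content-audit spreadsheet row; the column names are
# illustrative assumptions, not the textbook's exact headings.
@dataclass
class AuditRow:
    url: str
    page_title: str
    current: bool          # binary: is the information up to date?
    accurate: bool         # binary: confirmed with a subject matter expert?
    mobile_friendly: bool  # binary: does the page display cleanly on a phone?
    action: str            # keep / change / delete
    notes: str = ""

# Hypothetical entries an auditor might record while walking through a site.
audit = [
    AuditRow("https://example.org/about", "About Us", True, True, False,
             "change", "Columns overlap on a phone screen"),
    AuditRow("https://example.org/events-2020", "2020 Events", False, True, True,
             "delete", "Describes a program that no longer runs"),
]
```

Because the yes/no fields have only two possible values, rows like these can later be tallied to show, for example, how much of a site is outdated.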
Industry has used content auditing to conduct case analyses of websites. Altamirano and Stephens (2022) chart the workflow of their content audit and show the assessment process as recursive (see also Getto et al., 2023). Though Altamirano and Stephens (2022) provide sample audit pages, they compared so many different websites in their audit that it is difficult to see particular pieces or lines of the audit. Land (2023) added more cases in her second edition of Content Audits and Inventories, providing additional insight on issues of accessibility, which have become prominent since the publication of website accessibility guidelines in the form of WCAG 2.0 (2008). This set of guidelines was formed to support web reading and usage for people with a variety of disabilities such as visual impairment, hearing loss, and some cognitive disabilities (among others). Getto et al. (2023), Land (2023), and Rayl (2021) suggest using the online guidelines provided by the W3C at https://www.w3.org/TR/WCAG21/.
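Accessibility auditing of this kind remains largely manual, but a small script can flag candidate issues for a human auditor to verify. The sketch below is a minimal example assuming the requests and BeautifulSoup libraries and a hypothetical page URL; it checks only one WCAG concern, images that lack alternative text, and decorative images with intentionally empty alt attributes would still need human judgment.

```python
import requests
from bs4 import BeautifulSoup

def images_missing_alt(url: str) -> list[str]:
    """Return the src of each <img> on the page with missing or empty alt text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Flag images whose alt attribute is absent or empty; a human auditor
    # decides whether any of these are purely decorative.
    return [img.get("src", "(no src)")
            for img in soup.find_all("img")
            if not img.get("alt")]

# Example use against a hypothetical page listed in an audit spreadsheet:
# print(images_missing_alt("https://example.org/faculty"))
```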
Content auditing is used in university settings where there is a large quantity of content. Tang and Ding (2023) audited Chinese university websites and compared their English pages, covering 35–75 pages per website (p. 363). They found that content on the English pages was often “irrelevant” or not “updated” (Tang & Ding, 2023, p. 366). Outdated content is a common finding of content audits.
Another university audit, of Texas Tech’s library website, focused specifically on issues of accessibility (Rayl, 2021). Rayl (2021) reports that her audit covered more than 300 pages and took an average of 2.5 hours per page. This emphasizes the point that comprehensive auditing does in fact take a fair amount of time and can be “tedious” (Batova, 2021; Land, 2023).
Content Auditing and Service Learning
The complex context of a real organization in service-learning gives students real-world challenges that have long been praised in technical communication (Bowden & Scott, 2002; Huckin, 1997; Sapp & Crabtree, 2002; Scott, 2004), yet problems can arise (McEachern, 2001). Even good partnerships can turn sour (Mathieu, 2005; Mathieu, 2012). Facilitating clear communication between the nonprofit client and students can require time and effort on the part of the instructor (Bay, 2022; Brizee, 2015; Bourelle, 2012; Grabill, 2012; Jacoby, 2015; Rivera & Gonzales, 2021). Developing and sustaining relationships with organizations requires trust and time in addition to teaching the instructional content.
Service-learning needs to benefit both students and the nonprofit. Brizee (2015) distinguishes between writing with and writing about the community, with the goal being to work with the community (p. 144). Stoecker and Tryon (2009) prioritize projects that serve both students and the community. Grant (2022) discusses the impact of this mutual benefit in the high-stakes work of providing resources for families of homicide victims, who needed complete and professional deliverables. Dush (2017) employed a graduate student to continue implementing class projects. Nonprofits in rural areas may not have the bandwidth to implement student projects (Jacoby, 2015). Oversight from the instructor and use of existing partnerships within the university or community can help address some of these concerns (Batova, 2021; Bourelle, 2012; Getto & Labriola, 2016; Howard, 2020; Steiner, 2020).
Content strategists call for service-learning to teach content auditing (Getto et al., 2020; Howard, 2020). The “real” work context can provide more depth in audience and purpose for creating content strategy and knowing how to meet organizational goals. Batova (2021) used a content audit (p. 413) focused on “sustainability” that later helped the class set up clickable prototypes of the website as a final deliverable. Focusing on a theme like sustainability gave students a goal to pursue in the content audit. However, auditing receives only brief mention in that article. More work is needed in case studies of content strategy to describe the how-to of content auditing, a first step in website redesign.
Using local organizations may help make audiences for the website content clearer (Gonzales et al., 2016; Steiner, 2020). Steiner (2020) mentions using an internal client of the department because it was “a client with which I was familiar” (p. 188). Steiner used students’ knowledge of the department to address audience concerns.
Content auditing can help flesh out the organization’s values if the instructor has a strong relationship with the organization (Howard, 2020; Steiner, 2020). Howard (2020) chose to work with an organization with which he had a local connection. In his 2019 presentation at STC, Howard was noticeably moved by the mission of the organization. As a result of the connection, Howard had significant access to organizational communication and the graduate students were able to develop personas from a database of email communication within the organization. The content audit of the email database shaped recommendations and helped students identify the tendency of users to send short, “pithy” messages (Howard, 2020, p. 129).
Content audits in service-learning can be necessary first steps to scaffold larger website redesign. Steiner (2020) scaffolds her larger website redesign group project with individual smaller projects like a content audit. She mentions that the students found the audit “tedious” at first, but then used it as “a foundation” for their website recommendations (Steiner, 2020, p. 181). Auditing helped students talk about the website in a “consistent” way, rather than basing recommendations for redesign on their intuition or feeling (Steiner, 2020, p. 181). One of the strengths of auditing is that it gives justifications for evaluative decisions, rather than impulse redesigns.
This study examines content auditing in two graduate classes that then used the audits to implement website redesigns. The focus of the study is on unpacking how and why students made auditing decisions.
Background
Our University and Students
Our university is a small, STEM-focused state institution with a reputation for research located in a rural part of the southwest. Our department houses the humanities and social sciences. We offer several undergraduate degrees in Psychology, Interdisciplinary Sciences, and Technical Communication. Students participating in the website redesigns came mostly from our MS in Public Engagement in Science, Design, and Communication. The program is interdisciplinary and involves science and technology studies, ethics, communication, and design practices. The graduate program emphasizes “community and public engagement” and the practical skills associated with implementing content strategy. Students in this program often hold full-time jobs at other research entities. They juggle classwork with family and work. Our department programs share the university vision to “solve real-world problems” and as a result, we prioritize hands-on, service-learning and client projects. This experience requires substantial planning and relationship building across department and community boundaries. We have worked with several nonprofits over the last few years. I endorse a long-term partnership model, and I had worked with these two clients over the last three years on multiple projects.
Class 1: Documentation
Class 1, a graduate class, emphasized documentation and used Garrett’s (2011) text as a foundation, with chapters from Getto et al. (2020). Learning outcomes for the class involved iterative, agile processes for creating documentation that incorporated user testing. The client, Serena, asked for help restructuring her local health nonprofit’s website. Students collaborated with Serena, who did not have web training. She came to the class several times: once to provide context and issues for the website, and several more times to discuss the audit, the strategy document, the website structure, and page mock-ups. Students were given a two-thirds-page handout with brief guidelines derived from Getto et al. (2020, p. 10) (see Appendix A). Foundational concepts such as “creating criteria by which content will be assessed that also meshes with the intended goals of the content strategy plan” guided ways that I helped students evaluate content (Getto et al., 2020, p. 10).
Four students divided the website, and each inventoried a part of it. One student with web experience from the workplace set the assessment criteria for the audit, creating categories such as outdated/current, redundant, action/information, and keep/change/delete. Students created a strategy document from the audit that included patterns and findings from looking at all the pages of the organization’s website. One of the findings was, “Mixing info from different years for recurring events made the website hard to navigate.” Another key finding was “Redundant info/videos,” though part of this may reflect the students not understanding the full audience for the information. Students were given two weeks to complete the audit. Six weeks were spent on user testing and website mock-up development. The tree structure for the website redesign had a new page for volunteers but did not have a central place for a data dashboard.
Class 2: Media, Communication, and Public Engagement
Class 2, also a graduate class, emphasized media, communication, and public engagement. Learning outcomes for the class included audience and branding connected to organizational goals, systematic identification of recurring issues in websites, and communicating those patterns to a variety of stakeholders. The textbook used was Getto et al.’s (2023) Content Strategy: A How-to Guide.
Six students divided up the webpages for our department and audited the existing website before implementing changes. Our client was primarily the department chair, Gloria, who did not have web experience. Gloria came to class in early September to answer student questions about the website’s design and purpose, and later in September to listen to and read students’ findings from the audit. Gloria remained available for consultation throughout the implementation part of the project but did not return to class. Two students worked together on the department home page (though each was also responsible for separate program pages), and two students worked together on the faculty pages. Two students worked independently on separate program pages: Technical Communication and Education.
Specifically, Getto et al.’s (2023) Chapter 4, “Identifying Content Types and Channels,” and Chapter 5, titled simply “Content Auditing,” proved foundational. I broke the “Content Auditing” chapter into three parts that I taught over the course of three weeks. First, we as a class identified what Getto et al. (2023) term “MAST goals” for an organization (pp. 74–77), an acronym focused on measurable, specific goals similar to SMART goals. The revised goal Getto et al. (2023) present as an example that further defines “mobile friendliness” was a concrete example I returned to often in our discussions (p. 77). We discussed the goals and audience needs for our department website as a large group over two days of class, with an extra handout on how to inventory and identify client goals. We also discussed the assignment sheet (see Appendix B). In the second part of the three weeks, I asked students to focus on creating assessable, measurable criteria for the goals, as discussed in Getto et al. (2023, pp. 83–93). This was the most recursive process and where students struggled the most, which I explain with student and client interviews in the results of this article. Finally, I spent several days discussing how to abstract patterns and create a strategy document that students could share with the client. The client returned to the class at the end of September to discuss plans for the website. Five weeks were then spent on webpage implementation, with one of those weeks dedicated to accessibility according to WCAG 2.0 guidelines.
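As a hypothetical illustration of the move from a vague goal to measurable criteria, the sketch below names a handful of checkable conditions standing in for “mobile friendliness” (one echoing Getto et al.’s “calls to action in the top third of the page” example) and summarizes an auditor’s yes/no observations for a single page. The specific conditions are assumptions for demonstration, not the criteria Class 2 actually used.

```python
# Hypothetical, checkable conditions standing in for the vague goal "mobile friendly."
MOBILE_FRIENDLY_CRITERIA = {
    "no_horizontal_scroll": "Page reads without side-scrolling on a phone",
    "readable_text": "Body text is legible without zooming",
    "call_to_action_top_third": "A call to action appears in the top third of the page",
}

def score_page(observations: dict[str, bool]) -> str:
    """Summarize an auditor's yes/no observations against the named criteria."""
    met = sum(bool(observations.get(name)) for name in MOBILE_FRIENDLY_CRITERIA)
    return f"{met}/{len(MOBILE_FRIENDLY_CRITERIA)} mobile-friendliness criteria met"

# An auditor's notes for one page, recorded while viewing it on a phone.
print(score_page({"no_horizontal_scroll": True,
                  "readable_text": True,
                  "call_to_action_top_third": False}))  # 2/3 criteria met
```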
Methods
To examine how students developed content audits of the nonprofit organizations’ websites, I chose a qualitative approach. I was interested to see how and why students made the rubric or criteria to assess the content on the website. I focused on content auditing specifically because for both of these websites, there was already a substantial amount of information in existence, but it was outdated and incomplete. Before we could recommend changes to the website, or plan a content strategy for the organization, we needed to know what was there. Two central research questions guided this empirical, reflective study:
- How do students create assessment criteria for website content audits?
- What additional support could help students better determine assessment or rubric criteria to make them specific, and most of all, easily measurable?
IRB Approval
I obtained Institutional Review Board (IRB) approval to retrospectively collect interviews and audits created for the class from students and clients. I did not mention student grades or my comments on the assignments as per the request from the IRB. Both the clients and students had copies of the documents I would be referencing before the interviews and understood (and signed) the consent to participate before the interviews. The IRB approved the study in November of 2023 as exempt. The IRB database number was 2023-11-001.
Participants
Two of the four students from Class 1’s website redesign of the nonprofit’s website agreed to participate in the study. Five of the six students from Class 2’s website redesign of the university department’s website agreed to participate in the study. Both clients agreed to participate in the retrospective interviews. In order to acknowledge the power differential between teacher and students, and even clients and teacher, I “adopt[ed] the role of a learner as much as an expert and engage[d] with humility” (Rose & Cardinal, 2021, p. 83). I focused on listening carefully to participants, asking follow-up questions, and paraphrasing their answers to confirm. Participants were encouraged to select their own pseudonyms and as a researcher, I tried to “slow down, to listen and to share power” (Rose & Cardinal, 2021, p. 87). As with much qualitative, case study research, “sample sizes are usually much too small to warrant random selection,” so I drew a “purposive sample, building in variety” (Stake, 2003, p. 152). In this case, variety included multiple classes, in-person and distance students, and a variety of background training in web design and content auditing.
Data Collection
I triangulated collection of rich, detailed descriptions of participants’ actions through observation, text collection and analysis, and retrospective interviews from both students and clients. “Seeing data from multiple perspectives—for example using multiple researchers or multiple data collection techniques—increases rigor” (Hughes & Hayhoe, 2008, p. 81). In this case, the multiple perspectives involved both the students creating the audits and the organizational clients who asked for student support in redesigning their websites.
I documented students’ struggles during classes in post-class notes. I collected students’ content audit spreadsheets, their strategy synthesis documents based on the audits, and the to-do lists/agendas from Class 2. I also collected screenshots of the webpage redesigns. These texts served as a foundation for the retrospective interviews. Preparation for interviews involved 1–2 hours of analysis of the texts and included student-specific follow-up questions.
Each interview lasted 30 minutes and followed the protocol (see Appendix C). Different questions were asked of clients and students, but both protocols averaged eight questions. Hughes and Hayhoe (2008) comment that for qualitative research, “if you want to know why they do it, or how they feel about it, then interviews and focus groups can be credible methods” (p. 79, emphasis original). I wanted to see the process behind the product that was created for the class. Questions in my protocol like “What information from class, handouts, textbook, etc. was helpful in setting up your content audit?” helped me understand scaffolding. More specific questions about the assessment criteria, such as “How did you name the categories?”, shed light on the rubric elements used to evaluate the pieces of content. I asked more closed questions of both students and clients about who the audiences for these websites were. Interviews were audio recorded and transcribed.
Data Analysis
I looked for recurring themes across the classes. Hughes and Hayhoe (2008) write, “Furthermore, for a qualitative study to have rigor, it must employ a formal, systematic technique for examining the data and finding patterns or common themes across the data” (p. 82). I first analyzed my observation notes and students’ texts before the interviews, with a focus on the rubrics used and the naming of the criteria for the audits. I used Huckin’s (1992) context-sensitive textual analysis to limit the number of possible interpretations of student texts, in that “the number of plausible interpretations is constrained by various linguistic conventions that are manifested in the text” (p. 86). Huckin (1992) describes how the writer of these texts, in this case students authoring the content audit and strategy documents, had something in particular to communicate, and that message helped shape the direction of the interviews. I traced the assessment of the content through to the mock-ups or actual redesign of the webpages, noting changes deemed “successful” by the client in interviews and in-class commentary.
Recursively analyzing transcripts for emergent themes, I began by pulling quotes from transcripts on themes such as “use during implementation,” “goals,” or “naming criteria.” I made separate documents with repeating categories for each class. Initial categories included awareness of the client’s goals, conversations with stakeholders in class, and information helpful in setting up the audit. Within those categories, specific examples such as “audience,” “seeing,” and “mobile friendly” arose as I analyzed the data a second and third time. I gathered responses on these themes within subsections of the transcript and refined the categories. I further refined “mobile friendly” and added “scaffolding” to “seeing.” “Yes/no” became “binary categories” on the spreadsheet, and I measured the ease and consistency with which students used criteria.
Results
Research Question 1: How Do Students Create Assessment Criteria for Website Content Audits?
To answer the first research question, “How do students create assessment criteria for website content audits?” findings were traced through analysis of student audits and retrospective interviews. Particular protocol questions that helped shed light on the development of student audits were “How did you learn about the organization’s goals, values and purpose for the website?” and its follow-up question, “What did you perceive as the purpose and audience for the website?” Another question from the student protocol that was critical in answering this research question was: “How did you turn goals and values from the organization into assessable criteria for evaluating the inventoried pieces of content on the pages of the website? How did you name the categories?” In particular, I was interested in how students named the categories given the suggestions from Getto et al. (2023, pp. 85–93) (the students’ textbook). My protocol questions and follow-up questions revealed how and why students created the content audits for the websites.
Close analysis of the students’ transcripts and audits yielded several themes related to their naming and use of audit criteria. Recursively analyzing the students’ issues and process through the transcripts and the audits, four themes emerged:
- Students found binary assessment categories easier to apply and define.
- Students had trouble identifying the audiences of the nonprofit.
- Students missed key elements of the website in their redesigns because of lack of audience awareness.
- Class 2 was able to use a vague assessment category such as “mobile friendly” effectively and still achieve a website redesign that satisfied the client.
Students found binary assessment categories easier to apply and define
Categories students named for assessment fell into two groups based on how consistently students defined and used them; the paragraphs that follow describe these categories by class.
Class 1. Some categories like current or outdated were easier for students to assess or determine as binary. For Class 1, elements like the distribution of face shields were relevant only during the Covid lockdown and were clearly not applicable anymore. Catherine marked these as outdated and correspondingly noted that this information should be deleted from the website. Catherine commented, “So I thought it wouldn’t be a good idea if people went on the site and saw that they could get a face shield, but then that’s a program that doesn’t exist anymore.” It is common for many websites, even banking websites, to have outdated content if there is no governance plan (Kenyon, 2024).
Class 2. Another category that was easy for students to check was spelling/grammar. In particular, Renee and Claudia mentioned the importance of checking faculty name spellings, including in publications written in another language. Students found this category easy to check, with clear answers to names being spelled correctly or not. Judge Brown and Sammy also used this category.
Mobile display was a category that all five students checked on their phones. Claudia in particular took several tries to edit the faculty main page to be “mobile friendly” and had to adjust the proportions of the pictures, but she succeeded in implementing the changes on the website for better mobile display. Cassandra also mentioned checking the department’s webpages on her mobile phone, as did Judge Brown, Sammy, and Renee. Mobile display was also a kind of binary category, with the display either reading clearly or showing overlapping columns and text.
Accuracy/Authoritative proved a difficult category for many of the students to use consistently. Claudia knew to check with faculty (the subject matter experts for the department website), yet only eight of the 150+ comments in her audit fell into this category, four of them having to do with who was currently Chair of the department. Cassandra and Judge Brown used the accuracy category in name only. Cassandra’s comments ranged from “irrelevant” (four instances) to “not accurate” (two instances) and were used in the context of comments about content being outdated. Judge Brown’s category was named accuracy, but in reality it was spelling and grammar. It was hard for students to maintain focus within this category.
Two students used “usability” in their audits but struggled to define this category. Cassandra pictured herself as the user and determined how many clicks were needed to get to important information. Sammy had the most defined rubric for evaluating content. Sammy defined usability as “aligned to audience.” She tailored website content to three different primary audiences in her website edits. The faculty member responsible for these pages was very happy with Sammy’s work on the website. See Sammy’s rubric below, followed by a page of her audit:
Figure 1: Sammy’s Rubric and Auditing Criteria
Figure 2: Sammy’s Page of Auditing Using Criteria
Even Sammy, however, had trouble defining usability and her explanation of content that failed the usability category was “the information is not readily obvious or is not obvious at all.” Asking Sammy “How do you define obvious?” could have further helped her in naming criteria for assessment.
Two categories mentioned in particular by Getto et al. (2023) and discussed in class were not used by the students: Relevance and Credibility. Students in Class 1 struggled to understand what “relevant” was in initial website discussions, so I discouraged the category. I more strongly discouraged Class 2 from using the category of relevant because it was less tangible to the students, and only Cassandra mixed that into her category on accuracy.
Practitioners defining auditing criteria may find that binary categories like spelling/grammar, mobile display, and current/outdated are easier to use when evaluating pieces of content because the standards are more clearly defined. This follows Getto et al.’s (2023) advice in the content audit chapter to use a yes/no category (p. 84). Usability may take additional definition, and accuracy may be a more difficult category that requires checks with subject matter experts.
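One practical payoff of binary categories is that they can be tallied mechanically once the audit spreadsheet is complete. The sketch below assumes an audit exported as a CSV file with a hypothetical binary column; counting its yes/no cells is one way the patterns reported in a strategy document can surface.

```python
import csv
from collections import Counter

def summarize_binary_column(path: str, column: str) -> Counter:
    """Tally the yes/no cells in one binary column of an audit spreadsheet (CSV)."""
    with open(path, newline="", encoding="utf-8") as f:
        return Counter(row[column].strip().lower() for row in csv.DictReader(f))

# Example use, assuming the audit was exported as "department_audit.csv" with a
# binary "current" column (both the file name and heading are hypothetical):
# print(summarize_binary_column("department_audit.csv", "current"))
# e.g., Counter({'yes': 32, 'no': 11})
```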
Students had trouble identifying the audiences of the nonprofit
Class 1. Class 1 had trouble identifying the audiences for the nonprofit website. In order to help students better understand the audiences and goals of the nonprofit, I shared a document with notes about Serena, the website, and the organization. Serena attended class within the first three weeks of the semester. The students asked questions after Serena presented some additional context about the organization’s goals and values. Serena commented that students’ questions helped her self-reflect: “That really did make us sit there and go, ‘Okay, what are we trying to do?’ Because especially since we were growing so fast, it is really hard to lose sight of those things.” The organization had grown from a 2018 budget of $20,000 to a current 2024 budget of $600,000. The website had grown exponentially with the organization; there was no plan for the website. Due to this, it is possible that Serena had a hard time articulating some of the goals and audiences of the website.
Conversations with the client were helpful in teaching the students about the organization. In reflective interviews, students Catherine and Ken mentioned these conversations as teaching them about the organization and its audience. Ken mentioned that he also read the document I provided from an additional conversation I had with Serena. Both of the students interviewed from Class 1 had also worked with Serena in a previous class, so they had some exposure to the types of priorities and projects of the nonprofit. However, disconnects arose.
Serena, our client, in her retrospective interview, named “other nonprofits and providers” as a key audience for her nonprofit “because then if they [other nonprofits] needed something for other community members, or to write a grant or something like that” they could use this website as a resource.
The students were not aware of other local organizations using the website. When asked the question from the protocol “How did you learn about the organization’s goals, values and purpose for the website?” and specifically what the audience was for the website, in the retrospective interview, Ken answered, “To get more people to either volunteer or get more funders, types of funders.” He saw volunteers as the audience and had a vague idea of other funders looking at the website. Ken did not identify other local nonprofits as potential users. Catherine named the core goals for the nonprofit as “providing information to website visitors and having calls to action” for volunteering. The students clearly saw the audience as community members supporting the organization rather than other organizations who would need to use regional facts and information to leverage grant opportunities.
Class 2. Class 2 had fewer audience disconnects, but I was surprised that two students did not gather more information from the client visits. I asked Cassandra directly if she learned something from the visits from the faculty/clients beyond what she gleaned from the website and she said “no.” I asked Judge Brown if he learned something from the client visits about audience and purpose of the website, and he mentioned “restructuring the website.” Gloria noted students missed some of the secondary audiences, like “alumni and potential donors and the grading agencies who want to know expertise of various people [faculty] who might be on panels.” Judge Brown and Cassandra focused more on the website itself and did not learn as much from the in-class conversations. Judge Brown did identify one of the goals of the department correctly as “one of their goals is to enroll students.” Similarly, Cassandra understood in part the goal of the department as “[the department] wants to be clear on what classes are available in that degree field, what professors are available to reach out to if you have an interest in something specific in that department.” It was easier for these students to identify goals for the department since they knew the organization well, even without input from the Chair or other faculty.
Students missed key elements of the website in their redesigns because of lack of audience awareness
Class 1. For Class 1, understanding the “data dashboard,” a central feature of the website, was a key miss in the students’ audit of the nonprofit website. Catherine mentioned she would have liked to know some of the sources of the data on the website, as there was a lot of information, but the organization and origin of the information was not always clear. This information could have helped her make a less arbitrary decision on what to keep, change, or delete, which she acknowledged in the retrospective interview as a “relatively subjective” decision. Catherine wasn’t clear on why there was so much information on the website and had a hard time knowing how to treat this information in the content audit.
Catherine only used two audit rows for the data dashboard, one for the text blurb describing what it was and one for the link. She treated this as an external link and did not have extensive notes about the information included in this part of the site. She did mention in her comments on the audit that she felt the blurb was unclear and that was a signal that she was not understanding the purpose of this element of the website. Catherine was unclear about the role of the data dashboard, and thought it was for external funders to reference.
What may have been confusing to the students was in fact a key function of the client website. Serena commented in a follow up interview that “the data dashboard and certain things that we put a really, really high priority on being on our website, that was a little unclear with the students” and she felt that it was not reflected in the subsequent website tree structure or webpage redesign mock-up. Catherine’s audit treated the data dashboard as another textual resource, equal to the PDF of the parent resource guide in the subsequent line of the audit. The relative importance of this part of the website was opaque to students.
Figure 3: Catherine’s Content Audit
A vague category like “mobile friendly” proved sufficient for website redesign
Class 2. A finding from content audit analyses, webpages, and student follow-up interviews was that students often used more generic constructs like “mobile friendly” but were still able to implement the changes to the website and make improvements.
Mobile display was only one of the issues addressed in each group. For example, Claudia and Renee audited the 40+ webpages associated with the department’s faculty and found that nine faculty who had left the university were still on the website, while six currently employed faculty were not listed at all. Claudia commented, “I definitely went back [to the content audit] when I made my list for later as to my checklist, what I needed to do and stuff like that. I went back to the main page to make sure I didn’t miss anybody.” A few faculty pages showed up well on a phone, but most, and particularly the faculty homepage, did not display well on mobile devices.
In any assessment process, it is important to develop specific, clear criteria in order to consistently assess the content. In content auditing for website redesign, Getto et al. (2023) give a good example of how to take the term “mobile friendly” and give it concrete criteria like “calls to action in the top third of the page” (p. 76). However, in the context of Class 2, some students were able to have a simple category of “mobile” without it being further defined. They edited the pages to view on mobile devices and made improvements to the website. The issue of mobile display was referenced and prioritized by most of the students interviewed from Class 2: Claudia, Renee, Cassandra, and Judge Brown. All the students checked the university webpages on their phones when gauging the mobile display. However, even when I pointed to the book’s suggestions on how to make the criteria more specific, the students’ content audit criteria remained simple: “mobility, readability” or “mobile friendliness and layout.”
Renee commented that the simplicity helped her: “We need something that’s very direct to know what it is.” Making the assessment criteria more specific seemed to be a burden, and students were able to implement changes even with the vague criteria. Despite the lack of specificity, the students’ attention to mobile issues translated to effective mobile display on the revised pages. Initially, the faculty names read as overlapping columns.
Figure 4: Mobile Display Problems
Resolving the mobile display issues may have been a “happy accident.” When asked how she made the page read well on a mobile device, Claudia at first couldn’t remember. When I reminded her that the first publication of the new page had proportion distortions in the faculty pictures, Claudia recalled, “There was a couple of the faculty images that actually worked so I went in and I looked to see and there was just this one little click that I had to do to each one of them and then it fixed it.” Through trial and error, and with help from the OMNI content management system for the university website, Claudia was able to fix mobile issues on the faculty pages. See below for the faculty index page:
Figure 5: Mobile Issues Addressed
As Claudia mentioned, “it was just kind of a happy accident that it worked out.” She dismissed this as a coincidence, but I believe the simplicity helped students focus on what to change and had a concrete measure (display on their phones) to compare revisions of the website.
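The students’ concrete measure was viewing each page on a phone, and no script replaces that judgment. As a rough, automatable proxy only, the sketch below (assuming the requests and BeautifulSoup libraries and a hypothetical URL) flags pages that lack a responsive viewport meta tag, one common reason a page renders as overlapping or side-scrolling columns on mobile devices.

```python
import requests
from bs4 import BeautifulSoup

def has_responsive_viewport(url: str) -> bool:
    """Rough proxy: does the page declare a device-width viewport meta tag?"""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "viewport"})
    return tag is not None and "width=device-width" in tag.get("content", "")

# A page returning False here is a candidate for the "not mobile friendly" column
# of the audit, pending a check on an actual phone.
# print(has_responsive_viewport("https://example.org/faculty"))
```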
Research Question 2: What Additional Support Could Help Students Better Determine Assessment Criteria?
To determine the kinds of additional support that would help students better determine assessment criteria for content audits, I triangulated findings from my teaching journals, my assignment sheets, retrospective interviews from the students, and recursive analysis of the students’ content audit spreadsheets. Two themes emerged:
- Students preferred to see examples of the audit spreadsheets when developing their audits.
- Pre-setting audit criteria could help scaffold students in the auditing process.
Students preferred to see examples of audit spreadsheets when developing their audits
Several students in Class 2 mentioned how important it was to see content audit examples in order to create their own auditing spreadsheets and criteria. Cassandra mentioned in the retrospective interview that she wanted a list of resources with examples of content audits. This points to the idea that seeing examples of audits is helpful in setting up content audits for websites. Claudia mentioned that it was key that other students shared their audits with the rest of the class. She liked “seeing Cassandra’s and Judge Brown’s, [as] I believe, he showed us his as well. Ours was definitely different than what they were doing, but it helped a lot, because we got to see what they did and it made sense.” Cassandra mentioned that it was helpful to see a template in our textbook (Getto et al., 2023, p. 81). She remembers, “The book for the class did have a sample table layout of some of their headings, and so I grabbed some of those, and I ended up building that table very similar to that.”
However, Cassandra notes that she made her own adaptations: “But then as I started going through the site and trying to apply them in the content audit, I was noticing that not all of them directly correlated to what we have on our website.” Renee also thought that seeing Getto et al.’s (2023) layout was helpful, in addition to other students’ content audits. Claudia said simply, “I need a visual.” Screenshots of content audits in case studies can be helpful for students learning the process. See a part of Claudia’s content audit below.
Figure 6: Excerpt from Claudia’s Content Audit
Pre-setting particular criteria may scaffold student content auditing
Two students mentioned needing more scaffolding in setting the criteria in the auditing process. Judge Brown would have liked a little more guidance on what needed to be in assessment criteria and mentioned, “The professor could also match it to maybe his kind of criteria in a rubric form, just match it to what he or she has, and compare if we do it this way.”
Judge Brown wanted a way for students to check their criteria against an instructor version. Some of his plans to change the website were just not feasible with the particular OMNI system, and he would have liked to know more of those restrictions in advance. Ken also felt that it would be a good idea to have base criteria from the instructor in the content audit. He further clarified that “some [categories] that they [students] have to have just so that they cover the basics, but then leave some of it up to them [students].” In this way, students would have chosen more scaffolded guidance in setting up the audit. Both of these students wanted the instructor to pre-set at least a couple criteria for the audit assessment so that they had a base to work from.
Students also varied on how much discussion was needed for them to understand the content auditing process, but Claudia, Renee, and Cassandra all appreciated the option to determine their own assessment criteria. Sammy, however, felt that she was “handed” the criteria for the audit, and would have liked more specific assignment guidelines instead of pre-set auditing criteria. Renee commented on the class discussion: “I feel like we still had some questions a week or two into this [content auditing], and then we were like, oh, that’s what you meant, or this is what we were talking about. And there was some miscommunication or misunderstandings, and discussing it all together as a class was really helpful.” Renee noted that initially she had a different opinion about what information was needed on the faculty pages (from her analysis of other university websites), but she learned to listen to faculty input and adopt some of their priorities in the development of a template for the faculty pages.
Discussion
Teach Students to Listen to the Client!
The audiences in these two service-learning projects, as with many service-learning projects (Bowden & Scott, 2002; Gonzales et al., 2016; Grabill, 2012; Huckin, 1997; Mathieu, 2005; Sapp & Crabtree, 2002; Scott, 2004), were real and complex. As Howard (2020) mentions, having a real audience is good training for students who may not know how to evaluate website content according to organizational goals. In both classes, students saw part of the audience but not the whole. The students who saw the audience most clearly (particularly Sammy, Claudia, and Renee) spent more time listening to the clients and realized that their own values and priorities for the website differed from the clients’ perspectives. Class 2 was more familiar with the audience for the department website but still had trouble seeing peripheral audiences. Judge Brown and Cassandra had a tendency to default to their own web preferences and experiences. Instructors of content auditing may need to train students to listen more actively to client and user needs in order to have students fully grasp the multiple audiences of websites.
Scaffold and “See” Evaluation
Another takeaway for instructors teaching content auditing is that recursively checking a draft of the audited website and evaluated content inventory is an important step before students move on to give the client strategy recommendations or even implement changes to the website. Scaffolding is an important part of teaching content auditing (Batova, 2021; Getto & Labriola, 2016; Steiner, 2020). I checked the inventory stage, but I did not ask enough questions about students’ criteria for evaluating the effectiveness of the website. More feedback at this stage would, I believe, address Judge Brown and Ken’s desire to have instructors predetermine criteria. Best practices (Altamirano & Stephens, 2022; Getto et al., 2023) and findings from this study indicate that instructors need to check the auditing process frequently and recursively.
Seeing audit samples helps students and practitioners alike. Land (2023) shows excerpts of content audit spreadsheets from cases. In particular, page 73 shares a sample criteria table with definitions of what gets rated 1, 2, or 3. Further, the case study by Jen Boland and TJ Peeler in Land’s auditing book shows a table using web analytics to prioritize particular pages (2023, pp. 82–83). More examples of audits help us understand how to adapt criteria to our organizations. However, many companies are sensitive about their audits, and the information can be proprietary (Rayl, 2021).
A possible additional support for graduate students doing a content audit is to teach more on “why” they are evaluating the effectiveness of information. Sammy wanted a rubric for the assignment; Judge Brown and Ken wanted the instructor to predetermine criteria. Part of a service-learning project is the authentic context. The “why” which I could have communicated to students more clearly was that they were discovering with me how to evaluate the websites. In this way, I was not handing students something that I predetermined, but I offered an opportunity for students to practice industry standards in real time.
Simplicity over Specificity for Students
Simplicity in the form of binary categories like spelling/grammar, outdated/current, or mobile friendly helped students evaluate content more clearly than other categories such as usability or authoritativeness. More examples of audit criteria, like Land’s (2023) examples with screenshots of the actual website, may better demonstrate the connection between audit and implementation. Getto et al. (2023) critiqued “mobile-friendly” as a nonspecific website assessment criterion. However, simply determining whether website pages displayed readably on mobile phones was clear enough for students to implement changes that improved the website. Renee commented that the simplicity of “mobile friendly” made it easier for her to check the pages systematically, and the resulting implementation was mobile-readable faculty pages. Gloria mentioned: “We ended up with a better website; we did.”
Conclusion
Students found Getto et al. (2023) a useful guide for conducting content audits, but they wanted more samples of completed audits. Even after recursive practice, students’ audit criteria were not always clearly defined. The effectiveness of pieces of website content needs to be demonstrated, with the resulting impact shown in example websites, so that students can see the quality elements and tie them to systematic rubrics. Students often have a tendency to favor their own opinions about what makes a good website rather than listening to the client, and they may need repeated training to unlearn this tendency. Students may find it difficult to prioritize an organization’s goals for a website over their own. Additional scaffolding with drafts, teacher comments, and perhaps a rubric for the activity of content auditing in subsequent iterations of the audit will help students create criteria that are specific for the audits. Practitioners may also find recursive practice helpful for clarifying auditing.
Limitations of this study include the sample size of the classes and the internal reflection of the teacher/researcher. With seven student participants, this is a small sample. It provides a case study snapshot, but more students, a quantitative survey of auditing practices, or more questions focused on how criteria are named may better address questions about content auditing criteria specifically.
One of the potential benefits of this study is the evidence that simple criteria, such as mobile friendly, may be enough to improve design. Even if criteria are not industry-specific, they may be clear enough to get the main idea implemented.
Future work could focus on content auditing specifically as a precursor to website redesign. Content auditing is a helpful check against otherwise subjective web design decisions. Additional research on how to communicate why audits are useful could ensure more buy-in among students. More blending of industry practice into the technical and professional communication classroom could offer additional skills that students could more directly implement in the workplace.
References
Albers, M. (2020). Transformation of complex information to fit a situation. In G. Getto, J.T. Labriola, & S. Ruszkiewicz (Eds.), Content strategy in technical communication. Routledge.
Altamirano, A., & Stephens, S. H. (2022). Streamlining complex website design using a content audit selection heuristic. Communication Design Quarterly, 10(1), 14–23. https://doi.org/10.1145/3507454.3507456
Andersen, R. (2014). Rhetorical work in the age of content management: Implications for the field of technical communication. Journal of Business and Technical Communication, 28(2), 115–157. https://doi.org/10.1177/1050651913513904
Bailie, R. (2024). Defining content operations. In C. Evia (Ed.), Content operations from start to scale: Perspectives from industry experts (pp. 13–24). Virginia Tech Publishing. https://doi.org/10.21061/content_operations_evia_3
Batova, T. (2021). An approach for incorporating community-engaged learning in intensive online classes: Sustainability and lean user experience. Technical Communication Quarterly, 30(4), 410–422. https://doi.org/10.1080/10572252.2020.1860257
Bay, J. (2022). Fostering diversity, equity and inclusion in the technical and professional communication service course. IEEE Transactions on Professional Communication, 65(1), 213–225. https://doi.org/10.1109/TPC.2021.3137708
Bourelle, T. (2012). Bridging the gap between the technical communication classroom and internship: Teaching social consciousness and real-world writing. Journal of Technical Writing and Communication, 42(2), 183–197. https://doi.org/10.2190/TW.42.2.f
Bowden, M., & Scott, J.B. (2002). Service-learning in technical and professional communication. Longman.
Brizee, A. (2015). Using Isocrates to teach technical communication and civic engagement. Journal of Technical Writing and Communication, 45(2), 134–165. https://doi.org/10.1177/0047281615569481
Dush, L. (2017). Nonprofit collections of digital personal experience narratives: An exploratory study. Journal of Business and Technical Communication, 31(2), 188–221. https://doi.org/10.1177/1050651916682287
Garrett, J. J. (2011). The elements of user experience: User-centered design for the web and beyond (2nd ed.). New Riders.
Getto, G., & Labriola, J. T. (2016). iFixit myself: User-generated content strategy in “the free repair guide for everything.” IEEE Transactions on Professional Communication, 59(1), 37–55. https://doi.org/10.1109/TPC.2016.2527259
Getto, G., Labriola, J.T., & Ruszkiewicz, S. (Eds.). (2023). Content strategy: A how-to guide. Routledge.
Getto, G., Labriola, J.T., & Ruszkiewicz, S. (Eds.). (2020). Content strategy in technical communication. Routledge.
Gonzales, L., Potts, L., Hart-Davidson, B., & McLeod, M. (2016). Revising a content-management course for a content strategy world. IEEE Transactions on Professional Communication, 59(1), 56–67. https://doi.org/10.1109/TPC.2016.2537098
Grant, C. (2022). Collaborative tactics for equitable community partnerships toward social justice impact. IEEE Transactions on Professional Communication, 65(1), 151–163. https://doi.org/10.1109/TPC.2022.3141227
Grabill, J. T. (2012). Community-based research and the importance of a research stance. In L. Nikoson & M. P. Sheridan (Eds.), Writing studies research in practice: Methods and methodologies (pp. 210–219). Southern Illinois University Press.
Holmes, A. J. (2016). Public pedagogy in composition studies. NCTE.
Howard, T. (2020). Teaching content strategy to graduate students with real clients. In G. Getto, J.T. Labriola, & S. Ruszkiewicz (Eds.), Content strategy in technical communication (pp. 119–153). Routledge.
Huckin, T. N. (1992). Context-sensitive text analysis: What texts talk about. In G. Kirsch & P. Sullivan (Eds.), Methods and methodology in composition research (pp. 84–104). Southern Illinois University Press.
Huckin, T. N. (1997). Technical writing and community service. Journal of Business and Technical Communication, 11(1), 49–59. https://doi.org/10.1177/1050651997011001003
Hughes, M.A., & Hayhoe, G. F. (2008). A research primer for technical communication: Methods, exemplars and analyses. Routledge. https://doi.org/10.4324/9780203877203
Jacoby, B. (2015). Service-learning essentials: Questions, answers and lessons learned. Jossey-Bass.
Kenyon, K. (2024). Governance for content operations. In C. Evia (Ed.), Content operations from start to scale: Perspectives from industry experts (pp. 33–42). Virginia Tech Publishing. https://doi.org/10.21061/content_operations_evia_5
Land, P. L. (2023). Content audits and inventories: A handbook for content analysis (2nd ed.). XML Press.
Mathieu, P. (2005). Tactics of hope: The public turn in English composition. Boynton/Cook, Heinemann.
Mathieu, P. (2012). Short-lived projects, long-lived value. In L. Cella & J. Restaino (Eds.), Unsustainable: Re-imagining community literacy, public writing, service-learning and the university (pp. 1–13). ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/xxxebooks/detail.action?docID=1117143
McEachern, R. (2001). Problems in service learning and technical/professional writing: Incorporating the perspective of nonprofit management. Technical Communication Quarterly, 10(2), 211–224. https://doi.org/10.1207/s15427625tcq1002_6
Nielsen, J. (1994). How to conduct a heuristic evaluation. Nielsen Norman Group. https://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/
Rayl, R. (2021). How to audit your library website for WCAG 2.1 compliance. WEAVE: Journal of Library User Experience, 4(1). https://doi.org/10.3998/weaveux.218
Rivera, N., & Gonzales, L. (2021). Community engagement in TPC programs during times of crisis: Embracing Chicana and Latina feminist practices. Programmatic Perspectives, 12(1), 39–65.
Rockley, A., & Cooper, C. (2012). Managing enterprise content: A unified content strategy (2nd ed.). New Riders.
Rose, E. J., & Cardinal, A. (2021). Purpose and participation: Heuristics for planning, implementing and reflecting on social justice work. In R. Walton & G. Y. Agboka (Eds.), Equipping technical communicators for social justice work: Theories, methodologies and pedagogies (pp. 75–97). University Press of Colorado.
Sapp, D. A., & Crabtree, R. D. (2002). A laboratory in citizenship: Service learning in the technical communication classroom. Technical Communication Quarterly, 11, 411–431.
Scott, J. B. (2004). Rearticulating civic engagement through cultural studies and service-learning. Technical Communication Quarterly, 17(4), 381–412.
Shah, R. W. (2020). Rewriting partnerships: Community perspectives on community-based learning. Utah State University Press.
Sperano, I. (2017). Content audit for the assessment of digital information space: Definitions and exploratory typology. In Proceedings of the ACM SIGDOC Conference, Halifax, Nova Scotia, Canada.
Stake, R. E. (2003). Case studies. In N. K. Denzin & Y. S. Lincoln (Eds.), Strategies of qualitative inquiry (pp. 134–164). Sage.
Steiner, L. (2020). Teaching content strategy to undergraduate students with real clients. In G. Getto, J. T. Labriola, & S. Ruszkiewicz (Eds.), Content strategy in technical communication (pp. 171–191). Routledge.
Stoecker, R., & Tryon, E. A. (with Hilgendorf). (2009). The unheard voices: Community organizations and service learning. Temple University Press.
Tang, Y., & Ding, H. (2023). Content strategy and intercultural communication: Analysis of international websites of Chinese universities. Journal of Technical Writing and Communication, 53(4), 356–381. https://doi.org/10.1177/00472816231171982
About the Author
Dr. Elisabeth Kramer-Simpson is an Associate Professor of Technical Communication at New Mexico Tech. She directs the Bachelor of Science in Technical Communication and the Master of Science in Science, Design and Communication in Public Engagement. She teaches courses in documentation, content strategy, grant writing, and social justice. Elisabeth researches service learning in a variety of teaching contexts for undergraduate and graduate students and uses qualitative and ethnographic methods. Elisabeth has published articles in the Journal of Technical Writing and Communication as well as IEEE Transactions on Professional Communication.
Appendix A: Class 1 Content Audit Handout
Content Audit (from the content strategy textbook, p. 10)
Interview users regarding content needs (goals).
Create criteria by which content will be assessed that also mesh with the intended goals and objectives of the content strategy plan. (What makes it a success? A rating system? A rubric?) (Criterion-referenced vs. norm-referenced grading.)
Inventory all related content within a spreadsheet or other storage system that can easily be visualized and compared across documents (this will be important for documentation consistency; see the spreadsheet sketch following this handout).
Assess the related content against the criteria developed at the beginning of the audit (noting additions or deletions) and make other decisions related to scope.
Analyze/identify patterns within the content assessed. What is effective and ineffective?
Consider best practices for content by comparing it to other industry examples. Have a strategy and a plan.
Consider requirements for technology, readability, and access (and justice). Think about the communication channels through which the content will be delivered.
Report the findings from the audit to stakeholders succinctly and with visuals, in a way that accounts for and prioritizes the company's or organization's needs, values, and brand.
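The inventory step above lends itself to a simple tabular structure. The sketch below is purely illustrative and was not part of the class handout: it builds a minimal content inventory spreadsheet in Python, with one row per piece of content and one assessment cell per criterion. The criteria names, URLs, and file name are placeholders for illustration, not the criteria the students actually developed.

```python
import csv

# Hypothetical assessment criteria; in practice these come from the client's
# goals and values, not from this sketch.
CRITERIA = ["current", "on_brand", "accessible"]

# One row per piece of content: its URL, its content type, and one cell per
# criterion recording the assessment.
header = ["url", "content_type"] + CRITERIA

# Placeholder inventory rows with example judgments.
rows = [
    ["https://example.org/about", "landing page",
     "yes", "no: logo outdated", "yes"],
    ["https://example.org/faculty", "directory",
     "no: two listings obsolete", "yes", "no: images lack alt text"],
]

with open("content_inventory.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerows(rows)
```

Opening the resulting CSV in Excel or Google Sheets gives the kind of visual, comparable inventory the handout calls for.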
Appendix B: Class 2 Content Audit Assignment Sheet
Limited Content Audit of the CLASS Department Website and Strategy Report
You will need to select part of the CLASS website for this limited content audit. You will use information gleaned from interviews with Taffeta Elliot, Matt Johnson, and Steve Simpson to create a deliverable spreadsheet that analyzes the existing content and assesses it with a rubric tied to the branding and strategy determined from the interviews, and a report that highlights themes and actions to implement the strategy in revising the website. Please use Excel or Google Sheets for your spreadsheet; include columns that identify the type of content as well as assess it (for the assessment, use two columns, which we will go over in class). The report should be a brief 2 single-spaced pages, 3 at most. Use the guidelines analyzed in class from a previous class sample, Chapters 4 and 5 in our textbook, and the other resources provided. Remember that you must make these findings easily digestible and understandable for your stakeholders, who have limited time and many other responsibilities.
First, develop with the class a set of MAST goals for this section of the website and for the website as a whole. Remember: the goals are the organization's goals intersecting with the audience's goals. These goals need to be specific and quantifiable against standards. In other words, your goal can't just be “aesthetic,” because that invites bias; use someone else's specific design standards, such as Google's or the WCAG 2.0 standards, for measurement.
Create a spreadsheet, give links to the content, define the content type with specific metadata, and leave room for three to four assessment criteria. Make two columns for each assessment criterion: one that is yes/no and one that leaves room for specifics.
Consider a numerical rating scale keyed to the criteria. We will discuss developing a specific rubric on Thursday, September 14, 2023. Use pp. 83–93 in the textbook for ideas on the rubric and on how to make the assessment specific and measurable. You'll need to know when you've reached your goal, with a yes/no on whether each piece of content meets the assessment criteria. Some of the assessment criteria should help tailor that part of the website to its specific audience.
You will need to write up your findings in an easy-to-read way for Taffeta, Steve, Matt, and others. Please limit your strategy report to 2 single-spaced pages, and use specific headings and specific facts from your content audit to support the claims you make about the CLASS department website (see the tally sketch following this assignment sheet). Prioritize themes into 3–4 achievable plans for improving the website to match the MAST goals discussed by the class and the assessment criteria you developed.
On a 3rd page, profile 2 personas for this section of the website, giving thorough information, including pain points and how the CLASS website will help these people meet their goals.
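As a companion to the assignment above, the sketch below (again illustrative, not part of the original assignment sheet) shows one way to turn the yes/no assessment columns into the specific facts the strategy report asks for: it tallies how many audited pages pass each criterion. It assumes an audit spreadsheet saved as content_audit.csv in which each criterion has a column named with the suffix "_yes_no" alongside its notes column; the file and column naming are assumptions made for this example.

```python
import csv
from collections import Counter

# Tally how many audited pages pass each criterion so the strategy report can
# back its themes with concrete figures rather than general impressions.
passes, totals = Counter(), Counter()

with open("content_audit.csv", newline="") as f:
    for row in csv.DictReader(f):
        for column, value in row.items():
            # Only the yes/no columns are counted; notes columns are skipped.
            if column.endswith("_yes_no") and value:
                criterion = column[: -len("_yes_no")]
                totals[criterion] += 1
                if value.strip().lower() == "yes":
                    passes[criterion] += 1

for criterion in sorted(totals):
    share = passes[criterion] / totals[criterion]
    print(f"{criterion}: {passes[criterion]}/{totals[criterion]} pages pass ({share:.0%})")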
Appendix C: Interview protocol for students and clients
Content Audit Guiding Questions Protocol for students:
What experience do you have using a content audit approach to evaluating existing web content?
How did you learn about the organization’s goals, values, and purpose for the website? What was helpful information in this stage? What would have been helpful to have in addition?
How did you turn goals and values from the organization into assessable criteria for evaluating the inventoried pieces of content on the pages of the website? How did you name the categories?
What information from class, handouts, textbook, etc. was helpful in setting up your content audit?
Would you have preferred it if I, as the instructor, had set the assessment criteria and rubric?
Why/why not?
Did the direction from the textbook to have a yes/no column help in assessing pieces of content?
What was difficult about moving from the organization’s goals and values for the website to the assessment criteria?
Did you reference the content audit in later stages of the implementation of website changes?
Why/Why not?
Client Questions:
Why did you agree to work with students to redesign the website?
What goals did you have for the website redesign?
What do you remember of their presentations of content audit findings? Did you look at the content audit spreadsheets?
Did you read the strategy documents from the students, or did you rely on an oral presentation of the proposed changes and issues in the website?
How could we have better communicated this information, particularly the main issues from the audit or the patterns we saw?
What ideas proposed by the students helped you re-see the website?
What do you wish the students had done differently?
What do you think the students didn’t understand about your organization and its website?