Kevin Garrison
Abstract
Purpose: To share advice on creating a low-cost usability testing lab, based on a year-long process of designing and acquiring funding for a lab that supports Angelo State University’s faculty, staff, and students.
Methods: Literature review of usability testing labs, trends, and theories; a case study of designing and implementing a usability lab.
Results: While there are numerous descriptions and justifications for labs in usability textbooks (Barnum, 2011; Rubin & Chisnell, 2008), Web sites (Nielsen, 1994), and articles (Potosnak, 1990; Blatt et al., 1994), changes in usability technology and decreasing prices allow for the development of new, cost-efficient laboratories in locations that have historically not contained labs, such as small university campuses. This article shares how Angelo State University created an up-to-date, personalized, on-campus lab based on past and contemporary trends in usability testing.
Conclusions: A usability lab can be built on a university campus, even with limited funding. Moreover, a step-by-step guide shows that there are 1) strong theoretical reasons for constructing a usability lab and 2) practical solutions for implementing a design.
Keywords: usability testing, usability lab, costs, lab design, technical communication programs
Practitioner’s Takeaway
- Designing and implementing a usability testing lab is practical and achievable for university members and practitioners.
- A lab can potentially be a valuable space for supporting usability research.
- The costs of usability labs have decreased immensely due to digital components, high-tech usability software, and low-cost eye trackers.
Introduction
The prognosis for careers in technical communication has been positive in recent years, with US News naming “technical writing” one of the top 50 careers of 2011 (Grant, 2010) and the U.S. Bureau of Labor Statistics (2013) predicting 8% growth in jobs for technical writers and editors from 2008 to 2018. The field of user experience—with careers for usability engineers, usability specialists, usability analysts, usability testers, usability researchers, and heuristic evaluators—has seen a similarly positive outlook, with US News naming “usability experience specialists” one of the top 30 jobs of 2009 (U.S. News Staff, 2008).
While usability and technical communication have not always been considered directly related, largely because “usability” is an interdisciplinary concept, Redish (2010) has argued that their pasts and futures are intertwined, and numerous textbooks introducing students to technical communication include chapters on usability (Anderson, 2007; Johnson-Sheehan, 2010). The merging of technical communication and usability can also be seen in the number of institutions attempting to prepare undergraduate students for potential careers in usability. In Harner and Rich’s (2005) study, nine of the 80 programs explored offer courses in usability, and in Yeats and Thompson’s (2010) study, several of the 143 programs surveyed focus directly on human-centered design, usability, and user experience design. Individually, a number of institutions, such as Texas Tech University (2013), the University of Washington (2013), and Minnesota State University Mankato (2013), offer undergraduate technical communication courses in usability, usability testing, and usability design.
Responding to these trends, the Technical and Business Writing program at Angelo State University (a Division II university with under 7,000 students) created a senior-level course (ENG 4365: Usability Testing in Technical and Business Writing) as one of the degree requirements for students working toward a B.A. in English with a specialization in technical and business writing. Offered for the first time in the spring of 2011 (Angelo State University, 2013), the course description states that the course provides:
An overview of usability testing (testing of products, product documentation, and web sites) procedures in technical and business writing, including the construction of a usability testing lab, practice at conducting usability tests through a service-learning project, and methods for reporting usability findings to clients. (Garrison, 2011)
During its first offering, the class attracted sixteen students from a variety of majors, such as Marketing, Computer Science, and English, who took the course from January to May of 2011. The capstone project of the course was for students to divide into groups of three or four, contact a client from either the university or the local community, and conduct a discount usability test on a small-scale project that, ideally, was in an early stage of development.
To ensure the highest-quality instruction, the author, in conjunction with the English Department, spent more than a year, beginning in the summer semester of 2010, researching, designing, and acquiring funding for a usability testing lab that would ensure the success of the course while also allowing the lab to assist faculty in their research, staff in their support of university materials, and the local community in their projects. In researching the lab, the author drew on several published accounts of designing and implementing usability labs. Sources were consulted that focus on industrial laboratories—for companies (Sazegari, 1994), for computer software testing (Potosnak, 1990), and for a user-centered “feel” (Blatt, Jacobson, & Miller, 1994). The most helpful sources were Barnum’s (2011) chapter on where to conduct usability tests and Rubin and Chisnell’s (2008) description of several different layouts for one-room, multi-room, and remote testing. As well, a number of online sources were used to inspire a design for Angelo State University’s context (Koyani, 2006; Scanlon, 1999; STC, 2013). While most of these descriptions provide context-specific ideas or general layout suggestions, most were not written for university labs at mid-sized institutions; many were written between five and fifteen years ago; and oftentimes the theories, trends, and costs they describe are outdated.
This article shares a case study of one institution’s attempt to keep pace with usability trends and develop a lab of its own. The rest of the article describes Angelo State University’s four-part approach to designing and implementing a usability testing laboratory: first, situating the lab as an important component of the twenty-first-century university; second, designing the lab and explaining the logistics behind the choices; third, populating the lab with technologies and analyzing costs; and finally, preparing the lab for course instruction and university availability.
Step 1: Why a Lab? Arguing for a “Space”
Some trends in usability testing suggest a movement away from laboratories, as some research has shown that it is more efficient to conduct remote tests for target audiences who are not always local (Hartson et al., 1996), more helpful to incorporate usability testing throughout the life-cycle of a product outside of the confines of a lab (Palmiter, Lynch, Lewis, & Stempski, 1994), and more user-friendly to conduct onsite testing (Andrzejczak & Liu, 2010). In this intellectual landscape, the obvious concern becomes whether a laboratory, which can often be expensive, is necessary and financially justifiable. The author maintains that it is for several theoretical and practical reasons.
First, in the context of the university, several schools have adopted usability laboratories as a way to recruit students, support user experience research, and offer technologies for public, private, and educational research endeavors. Of course, labs do not have to be housed in any particular department or built to any particular design. Currently, labs of all types are located in a variety of university locations, such as a traditional lab in a technical communication and rhetoric program (Texas Tech University, 2013), a video gaming lab sponsored by gaming giant THQ (Radd, 2010), a one-room laboratory in a library (University of Utah, 2013), and a lab that supports a worldwide research team in a department that offers degrees in human factors (Bentley University, 2013). User experience research—with its ties to the cognitive sciences, technical communication, computer science, information and library sciences, and business and marketing—supports numerous interdisciplinary studies and academic programs. As such, any lab on campus is potentially better than no lab on campus, since it provides faculty, staff, students, and community members with the ability to conduct tests. And at universities with no labs, technical communication programs remain in a strong intellectual tradition that makes advocating for labs easier, largely because technical communication is one of the few academic fields that focuses on the user (Carter, 2005).
Second, also in the context of the university, a lab can help prepare students for workplace trends. Numerous companies have laboratories for testing, including the 13 companies in Nielsen’s (1994) now-dated survey as well as technology giants such as Microsoft (2013) and Google (2013). Even a quick glance at any major company Web site, such as Sony (2013) or Apple (2013), reveals numerous jobs in user experience research—oftentimes requiring working knowledge of usability lab technologies and several years of experience in a laboratory environment. Of course, not all areas need new labs. Larger cities such as Austin, Texas, already have several independent usability laboratories available for contracts and testing, such as Human Interfaces (2013). Moreover, labs in other cities can be found via a search at Quirks.com (2013), which lists numerous usability labs available nationwide, though often within the context of market research. However, for isolated populations, such as in San Angelo, Texas, the closest industrial and university labs are over three hours away. Building a small, low-cost laboratory remains a more viable option for exposing students to workplace trends in user experience research than traveling to a larger city and renting an existing lab.
A third argument, and perhaps the most important, is that the presence of a laboratory, as Rubin and Chisnell (2008) suggest, helps make the abstract concept of “usability” into a tangible and implementable idea. As many technical communicators can attest, an initial mention of the word “usability” tends to garner a confused look from individuals unfamiliar with the concept until the idea is put into more practical terms, the most famous being the oft-used example of “everyone has experienced frustration while programming a VCR/DVR; hence, a need for usability.” Describing usability as the “absence of frustration” (Rubin & Chisnell, 2008, p. 4) or through the five-part MEELS acronym (memorability, errors, efficiency, learnability, satisfaction) (Texas Tech University, 2013) quickly excites anyone interested in working with technology, business, or psychology, since the elephant in the room for studies of technology has been, for years, that technocracy and capitalism have largely created a world that is not designed with people in mind (Feenberg, 1999). Usability testing introduces a human element into discourse about technology, as it allows individuals to be in dialogue with the engineers, programmers, and other producers of technology rather than being relegated to the fringes of technological discourse (Johnson, 1998). As such, a lab invites a metaphorical and literal “space” in which individuals can envision the testing process through the use of physical labs and lab technologies.
In the context of the university, advocates of usability have the difficult task of making the user relevant to faculty, staff, students, and community members when they lack access to a space for testing; without one, usability can largely exist only as an ideal with no practical outlet. As a typical example, consider instructors teaching usability in a Web publishing course that uses service-learning to accomplish its goals. As Schriver (1992) has argued, implementing usability testing in a curriculum can help students write better through a stronger awareness of their target audience; however, instructors in Web publishing courses have to be relatively creative with usability instruction because merely sharing the most common trends in usability does not make usability a tangible concept. One common instructional method for Web design is to share tips or heuristics to follow, such as the “guiding principles” (chapters 1-5) found in Krug (2006) or the advice offered by Nielsen and Pernice (2010) from their eye tracking studies, but these are limited largely because advice and heuristics do not directly access actual users of a new or revised site. Another common solution is to require workshops where peers serve as potential members of the Web audience and help bridge the gap between designer and user, but the findings are limited by whether the peers actually match the user profile. Other options suffer similar shortcomings: requiring students to do onsite or remote testing of early or final drafts of their Web projects rarely succeeds because students lack access to technological resources. At best, teachers can require students to conduct site visits, interviews, or focus groups, which provide user feedback but are limited in scope, depth, and data. As such, teachers who prioritize usability in the classroom can mostly prioritize it in theory, not practice, without ever exposing students to the potential power of usability testing. A physical lab, with its fixed location and available resources, allows students to conduct, at minimum, formative tests (Rubin & Chisnell, 2008) on early versions of their Web projects and, at best, summative and validative tests (Rubin & Chisnell, 2008) to ensure that their projects are fully functional for their target audiences.
Likewise, staff and faculty members face a similar problem. As universities continue to upgrade to new content management systems, redesign Web sites, create promotional materials, develop handheld applications, and generally adapt to a twenty-first-century world of technology, usability becomes an ever-increasing need to ensure that current students, prospective students, alumni, and other university participants can engage with a world increasingly centered on information. Without a lab environment, usability can largely serve only as an academic theory that is rarely realized, primarily because of a lack of resources and access to information about usability. The lab thus functions as a “hub” for encouraging university members to adopt usability as a practical and implementable goal for making their projects accessible to users.
A final reason for designing and implementing a usability lab is simply that university participants need access to usability technologies, since many lack the office space, laptops, software, quiet spaces, and other resources necessary to do testing. On the course level, the adoption of ENG 4365: Usability Testing in Technical and Business Writing at Angelo State University can work only if students can actually do the tests—a question of utility that a laboratory space answers. While Angelo State, like most campuses, has numerous computer laboratories used for classroom instruction, student meeting places, and extracurricular workshops, none of these computer labs was sufficient for the needs of the course, for several reasons. First, most computer labs contain up to 25 computers in a large room arranged to focus on the instructor and a projector screen or whiteboard; as such, the rooms are largely unusable for small-group testing. Second, the labs are mostly used for course instruction and are frequently occupied from 8:00 a.m. to 5:00 p.m., leaving no conducive time for testing with real-world participants. Finally, even if students were to access the labs before or after hours, the labs’ technologies are not conducive to usability testing: they largely contain only desktop computers with basic software, such as MS Office 2010 and Adobe CS 3, and they lack camera/microphone technologies, screen-capturing software, usability software, and video editing tools. A lab responds to these three problems by providing a quiet meeting space for students as they collaborate on projects, design tests, conduct tests, analyze data, and prepare to present findings.
While not all campuses or programs have an explicit need to develop labs for specific courses in usability, a lab is not limited to functioning as a course tool; it can also support faculty and staff as they conduct research on their local projects. Purdue University, for example, tested its famous Online Writing Lab (OWL) through online tests conducted in the Writing Lab (Salvo et al., 2006). Harvard University (Pierce, 2005) and the University of Alberta (2003) similarly conducted extensive testing of their new university Web sites before launch. As well, the University of Washington (McDavid et al., 2008) tested its intranet service “Educational Outreach Network.” Several university library Web sites have also been analyzed, such as Northern Illinois University’s (VandeCreek, 2005) and Georgia Tech’s (King & Jannik, 2005). In the UK, the Jennie Lee Research Laboratories (2013) support individuals interested in university research.
While universities have striven to make their materials more usable, numerous academic and non-academic departments do not have access to the technologies usability labs are known for—usability software, video cameras and tripods, eye tracking hardware and software, screen capturing software, video editing software, software that allows for remote viewing of tests, accessibility software, and more. As such, while faculty and staff might have computers and office space for testing, these spaces are not guaranteed to be quiet, nor are individuals guaranteed access to the correct technologies and appropriate resources. Financially, universities also cannot justify purchasing numerous identical technologies for each department so that faculty and staff can conduct tests sporadically as projects emerge. A lab minimizes university costs by consolidating a one-time purchase with few recurring costs while still encouraging usability throughout departments.
Step 2: What Type of Lab? Designing the Layout of the Lab
Assuming that a usability testing lab is appropriate, as argued in the previous section, the task becomes defining what one means, specifically, by a “lab,” as well as defining a context for a lab in a twenty-first-century university. Numerous designs exist for labs—each meant to address specific contexts with different users and purposes. The challenge for Angelo State was to create a design that served its four primary audiences and purposes: 1) to ensure that the lab was appropriate to support ENG 4365 and its students while they worked on individual and small-group projects, 2) to allow faculty to conduct usability research on their projects, 3) to give staff access to usability technologies as they explored problems with university materials, and 4) to encourage individuals from the local community to engage with questions of usability. This section provides an overview of the different types of lab designs found in the research, explains the limitations of each design as it relates to Angelo State’s context, and then presents the solution the author devised.
The Portable Lab
A usability “lab” can be defined by either access to a testing space or access to testing technology. In terms of technology, the cheapest lab can be created in real time with nothing more than pencil and paper to draw out designs, test paper prototypes, and take notes. More complex tasks, however, such as Web usability testing, game usability testing, and document usability testing, require more complex technology, such as a laptop with a built-in Webcam and microphone, a screen-capturing program, and office software. As such, one potential design is the portable “lab.” Portable labs can either be pre-fabricated, such as Noldus’ (2013) portable usability lab (see Figure 1), or designed and constructed for individual needs, such as the University of Georgia’s “Luggage Lab 2000” (1995).
While a portable lab was considered for Angelo State, the primary concern came down to one problem: it does not meet the needs of ENG 4365. Students conducting projects could, in theory, each check out a portable laptop at the beginning of the semester, take the laptop onsite, conduct tests, analyze data, and then create a report and presentation of their findings; however, this would require a number of portable labs—at least four per semester—for each group to be able to conduct all of its tests. Moreover, several other irreconcilable problems led to rejecting the idea of a portable lab, including the following:
- High risk for damage of technologies,
- Redundancy of costs by having several identical “labs” that students would need to check out at the same time when group projects were in place,
- Lack of a physical meeting space for student groups,
- Need for group members to use the laptop simultaneously,
- Problems with infringing on participant and tester spaces,
- Lack of student ethos in convincing clients of the validity of findings and participants in the validity of the tests,
- Difficulties with recruitment when a physical space is not present (Seffah & Habieb-Mammar, 2009).
The Traditional Lab
The traditional lab was also considered. As described by Seffah and Habieb-Mammar (2009), the traditional lab separates two rooms with a one-way mirror and sound-proofs the walls (see Figure 2). One room is designated as the “observation area,” where clients and testers observe the tests, while the other room is designated the “testing area,” where test participants and moderators gather to conduct the tests. In the most expensive labs, oftentimes a third room (Rubin & Chisnell, 2008) serves as a “lobby area,” where test participants gather to sign forms, wait for the tests to begin, and consume food and drink, and sometimes a fourth room serves as a “control area” where technicians control the technology.
A traditional lab was considered for Angelo State’s needs, but was rejected for several reasons, including the following:
- A potential for high anxiety level of test participants from being observed through a one-way mirror which creates an “impersonal environment” (Rubin & Chisnell, 2008, p. 110),
- Communication difficulties that emerge from separating the moderator from observers and other testers,
- The cost of the mirror and the difficulty of sound-proofing it,
- Large amounts of “unused” space that is difficult to justify financially for small tests.
The Remote-Room Lab
In another version of the “traditional lab,” the remote-room lab separates the observation room from the testing room and connects them via a network—a digital solution to separating the two rooms. A remote-room set-up allows tests to be projected onto a screen in the observation area so that dozens of observers can “watch” the tests remotely without being confined behind a one-way mirror. By using digital software, such as Morae Observer (2013), testers, clients, and other individuals with a vested interest in the results can observe the tests without actually being in the same space.
The remote-room lab has similar disadvantages to the traditional lab, but without the added expense of a one-way mirror. The most problematic aspect of the remote-room design, though, is that few tests, especially in a course environment for Angelo State, will ever need to have more than just a few observers at a time.
The One-Room Lab
A one-room lab was eventually adopted for Angelo State, though elements of a remote and a portable lab were incorporated. Based on Rubin and Chisnell’s (2008) description, in a one-room lab the clients, testers, and participants are all placed in close quarters, with the moderator of the test situated either next to the participant or at a distance at a second computer. Angelo State’s lab (see Figure 3) was an 8-foot-by-18-foot storage closet transformed into an office space and divided into three components: the testing area, with a desk, a desktop computer, and camera technologies for filming; an observation area, with a table and multiple chairs for individuals wanting to observe the tests and annotate them via the whiteboard; and a work/storage area where individuals could process video files on a second desktop and also observe tests via a LAN connection to the testing computer.
The one-room lab set-up allows for the cheapest (and most flexible) way to conduct numerous types of tests while still meeting all the requirements of the four possible test scenarios and more:
- The lab serves as a meeting, testing, and work space for campus testers—especially students in ENG 4365, who conduct mostly individual tests or small-group (2-4 members) tests on digital media. For group projects, students typically break into three roles—a moderator, one or two note-takers, and a technician. Combined with the participant, the maximum number of people in the room at any one time is five. The 8-foot-by-18-foot space was ideally suited for this small-group testing.
- The space allows for greater ethos for student testers when inviting participants to conduct tests.
- The design maximizes the use of space by not dedicating entire rooms to observation, while also allowing testers and moderators a close connection to participants, which encourages more think-aloud commentary.
- If individuals want to conduct onsite testing, the filing cabinet houses a laptop with a built-in Webcam and microphone that allows for basic testing away from the lab.
- Both desktop computers were connected to the LAN, which allows any of the current computer labs on campus, with their projector set-ups, to serve as a remote observation room when using Morae Observer (2013). This allows large numbers of individuals to observe a test simultaneously.
- The whiteboard and round table allow for paper-prototyping (Still & Morris, 2010).
- Both desktops (as well as the laptop, if set up on the observation table) can be used simultaneously, allowing for up to three synchronous tests. All three computers are also connected to the internet to allow for remote testing.
- The lab serves as a hub and storage space for technology for 1) local tests, 2) networked tests, 3) checking out technologies for onsite tests, and 4) conducting paper-prototyping/whiteboarding.
While this laboratory set-up has its own limitations, such as the lack of storage space and the tendency for participants and observers to feel overcrowded at times, the end result maximized efficiency for all four potential test scenarios—student testing, faculty testing, staff testing, and community member testing.
Step 3: What Are the Costs of a Lab? Populating the Lab with Technologies
A number of the technologies have already been mentioned in the previous section, including the physical components and some of the basic computer set-ups. This section, therefore, focuses on three aspects related to cost: (1) deciding on the physical components of the lab, (2) purchasing usability components, and (3) acquiring funding. Because the costs of technologies are constantly in flux, the following sections mostly share the total cost of each category rather than the cost of individual pieces.
Room Renovations
The department made an 8-foot-by-18-foot storage closet available for lab use, and it was renovated in several phases. First, we removed the room’s previous contents, transferring physical technologies and boxes of books to another storage area (and, in some cases, eliminating unneeded “stuff” that had accumulated over time). Second, we submitted a request to Angelo State’s Facilities Management Office to overhaul the room and bring it up to code. Facilities Management workers spent several months completing several fundamental requests—repainting the room, repairing a hole in the sheetrock, laying new carpet over the older tile, repairing ceiling tiles, adding LAN connections, hanging a clock, and redirecting air flow so the room could be cooled and heated.
Sound-proofing is an important part of a usability lab, perhaps the most important part (Ovo Studios, 2013). Without the ability to control loud and unpredictable outside noises during recording, much of the test data could be questioned or invalidated. While sound-proofing material inside the walls would be ideal, Ovo Studios (2013), a company that designs labs, argues that a “sound-resistant” lab is more realistic than a “soundproofed” room, since ambient noises, such as conversations from adjoining rooms, are not likely to interfere with the test or the test results. Because Angelo State’s room had a thick wooden door, no windows, and walls adjacent to three areas that were not noise-heavy, we took no additional steps to reduce noise levels in the room.
Renovating the room cost approximately $5,800.00.
Physical Lab Components
Once the room was prepared, we acquired the following items to populate the room:
- Five ergonomic chairs (one for the participant, one for the moderator, and three for potential observers)
- Two six-foot desks (one for the observation computer and one for the participant’s computer)
- One small table (a piece that serves as a space for signing forms, doing paper-prototyping, and as a meeting area for observers)
- One whiteboard (hung above the table to serve as a place for generating ideas, sharing instructions with users, or doing paper-prototyping)
- One poster board (hung in the hallway for announcements and advertising the lab)
- One filing cabinet (acquired from a storage closet and used to store physical technologies)
- One plant (donated from a lab assistant)
- One picture frame (for a “Policies and Procedures” document to be hung on the wall)
The cost for all of these components was approximately $5,000.00.
Usability Components
After the room was prepared and the physical technologies acquired, we purchased the following usability technologies with the corresponding justifications:
Computers. Because digital technologies are cheaper and more flexible than analog technology, the Angelo State Usability Lab is entirely digital; there are no televisions, switchboxes, or analog video recorders. Instead, all video is recorded using two standard desktop computers. We purchased both computers through a university contract with Dell, and both were identical in their specifications, which included multi-core processors, large hard drives, several gigabytes of RAM, memory card readers, and rewritable DVD drives. Few computers on campus are Macs or use Linux-based operating systems, so for consistency, as well as compatibility with the usability software (see below), we installed Windows 7. Both computers are connected to Angelo State’s local area network (LAN).
While most contemporary desktop computers should work for lab usage, the most important requirement was a dual-monitor set-up, which also required purchasing video cards. The dual monitors allow for advanced video editing, more collaboration from students as they multi-task on projects, and the use of the eye tracking software (described below).
As well, we ordered a third computer—a laptop—with similar specifications so that another computer would be available for note-taking and data analysis. The laptop is more flexible in terms of placement, and it has also been used for demonstrations (with a projector and VGA cable) and conference presentations. Because it has a microphone and a built-in Webcam, the laptop can also serve as a portable lab, though check-out is limited to faculty and staff members.
The computers and printer cost approximately $3,750.00.
Audio/Video Equipment. We purchased four cameras—two Webcams with 720p recording capability for each desktop computer and two Sony (2013) Handycams with 480p recording capability. The Webcams are useful primarily for Web site usability, software usability, and other computer-related tasks, largely because they film the individual’s face as he or she navigates a digital environment. The two Handycams allow for more portable video recording and, when combined with tripods, can record more complex task scenarios, such as filming participants’ hands, legs, or body movements. The Handycams are standard definition (SD) cameras and film at a maximum resolution of 720x480; this resolution was chosen largely because it allows for widescreen filming without the extreme processing demands of high definition video (720p or 1080p). While cameras built into the walls would have been ideal, as they are in many high-tech usability labs, the ease of movement that comes with Webcams and portable cameras dictated purchasing these cheaper options. Finally, we purchased 16-gigabyte memory cards for use with the cameras.
For audio, all four cameras have a built-in microphone, but we also purchased three headsets that include microphones. These headsets allow for both silent listening as well as direct recording through a microphone placed in front of the participant’s mouth.
The audio and video equipment cost approximately $1,000.00.
Eye Tracking. We purchased an S2 eye tracker from Mirametrix (2013) to gain insight into the visual attention of users working on computer technologies. The S2 model connects to the computer through a USB port and can be placed underneath the computer monitor, much like a Nintendo Wii sensor bar. The eye tracker works by using infrared cameras that are calibrated to the user’s eyes via a nine-point check on one of the screens. Once calibrated, the user has a relatively free range of movement, as the cameras are unobtrusive and record an AVI file of what the fovea of the eye focuses on during a given test scenario. As well, the Mirametrix S2 comes with software that allows a second monitor to reveal what the user is observing on the first monitor, or the video can be streamed over the LAN to a second computer. Because of this capability, the dual-monitor set-up allows a test user to accomplish task scenarios while the moderator tilts the second monitor toward himself or herself to observe the user’s eye movements in real time.
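Beyond watching gaze in real time, most eye tracking analysis reduces the raw gaze stream to fixations, the moments when the eye rests on one region of the screen. The following is a minimal sketch of that reduction using a simplified dispersion-threshold (I-DT) approach; it assumes only that the tracker can export timestamped (x, y) screen coordinates and does not reflect Mirametrix’s actual export format or API.

```python
# Simplified dispersion-threshold (I-DT) fixation detection.
# Input format is hypothetical: a time-ordered list of (t_seconds, x_px, y_px).

def detect_fixations(samples, max_dispersion=50, min_duration=0.1):
    """Return fixations as (start_t, end_t, centroid_x, centroid_y) tuples."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Grow a window from sample i until its dispersion exceeds the threshold.
        j = i
        while j + 1 < n:
            window = samples[i:j + 2]
            xs = [s[1] for s in window]
            ys = [s[2] for s in window]
            dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
            if dispersion > max_dispersion:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration:
            xs = [s[1] for s in samples[i:j + 1]]
            ys = [s[2] for s in samples[i:j + 1]]
            fixations.append((samples[i][0], samples[j][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1  # skip past the detected fixation
        else:
            i += 1  # window too short to be a fixation; advance one sample
    return fixations
```

Fixation centroids and durations produced this way are the raw material for the heat maps and gaze plots common in eye tracking reports.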
We wanted an eye tracker for the lab primarily as a way for students and the author to conduct exploratory research. Students in ENG 4365 in the spring semester of 2013, for instance, conducted their first project of the semester using the eye tracker on different media (e.g., video games, Web sites, advertising, and digital literature). The author required each student to conduct a test of “how does an Angelo State student _______,” with each student filling in the blank with activities such as “find Waldo,” “play online Bingo,” “read a poem,” or “read subtitles in a foreign film.” These studies allowed students to get exposure to the lab, practice moderating tests, learn about data analysis, and present their findings. The students shared their findings on standardized posters, which were placed in the hall surrounding the lab as a way of advertising the course and sharing insights into human-computer interaction.
The eye tracker cost $4,000.00.
Usability Software. Through a campus license agreement, all three computers received the standard Microsoft Office (2013) suite, which includes MS Word, MS Excel, and MS PowerPoint. The computers were also able to run IBM’s SPSS (2013) without any extra expense.
Most importantly, the lab obtained a licensed copy of Morae (2013), purchased for its ability to collect large amounts of quantitative data. Morae allows team members to construct test scenarios, run unmoderated tests, capture video streams from both the monitor and the Webcam, create highlight videos, and mine large amounts of data, including mouse clicks, mouse distance moved, Web sites viewed, and more. For digital tests, such as Web site usability and software usability, Morae is one of the most comprehensive software programs available. Because we were limited to PCs, we did not consider Silverback 2.0, usability software for Macs.
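To illustrate the kind of derived measure Morae reports, the sketch below computes one of them, total mouse distance moved, from a list of pointer positions. The input format is hypothetical and is not Morae’s actual log schema; Morae collects such metrics automatically.

```python
# A minimal sketch of one metric usability software reports: total mouse
# distance moved. The pointer log format here is hypothetical.
import math

def mouse_distance(positions):
    """positions: time-ordered list of (x_px, y_px) pointer samples.
    Returns total Euclidean distance traveled, in pixels."""
    total = 0.0
    for (x1, y1), (x2, y2) in zip(positions, positions[1:]):
        total += math.hypot(x2 - x1, y2 - y1)
    return total

# A participant who moves the pointer efficiently covers less distance;
# long, wandering paths can signal confusion about the interface.
print(mouse_distance([(0, 0), (300, 400), (300, 500)]))  # 600.0
```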
Because Morae is a relatively complicated software package and might offer more options than a simple test requires, each computer also runs a screen-capturing program called Camtasia (2013). Camtasia 7 records both the screen and Webcam footage while the user performs task scenarios, and it also allows for editing both video streams into a picture-in-picture highlights video.
We also purchased video editing software. For ENG 4365 group projects, students are required to capture video from one of four sources: 1) the Web cameras, 2) the video cameras, 3) Morae/Camtasia, or 4) the eye tracker. The students are then required to produce a highlights video (Yeats & Carter, 2005). Because the four sources save files in a variety of formats (e.g., the video cameras save MOV files while the eye tracker saves WMV files), we purchased Adobe Premiere Elements (2013) for video editing, since Premiere Elements works with most file types without overloading the students with more extensive software, such as the full Adobe Premiere package. Windows Movie Maker Live (2013) was also downloaded free from Microsoft’s Web site to give students an alternative, more basic program for video editing, though Movie Maker is compatible with fewer file types.
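For labs without a budget for commercial editing software, one possible free route for reconciling mixed clip formats is batch conversion with the open-source ffmpeg tool. ffmpeg was not part of our purchased toolset; the sketch below is illustrative only, and the clips/ directory is hypothetical.

```python
# Batch-convert mixed-format clips (MOV, WMV, AVI) to MP4 before editing,
# by calling the open-source ffmpeg tool. Assumes ffmpeg is installed and
# on the system PATH; the clips/ directory is hypothetical.
import pathlib
import subprocess

for clip in pathlib.Path("clips").iterdir():
    if clip.suffix.lower() in {".mov", ".wmv", ".avi"}:
        out = clip.with_suffix(".mp4")
        # H.264 video and AAC audio are widely supported by editing software.
        subprocess.run(
            ["ffmpeg", "-i", str(clip), "-c:v", "libx264", "-c:a", "aac", str(out)],
            check=True,
        )
```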
For accessibility, we purchased Natural Reader (2013), a text-to-speech program that reads written text aloud for people with vision impairments. We also purchased Dragon NaturallySpeaking (Nuance, 2013), a speech-to-text program, as an alternative method of inputting computer text. We plan to buy other accessibility software as necessary; for instance, one visually impaired student recently required the use of ZoomText (2013), which we installed on both computers.
The software cost approximately $3,000.00.
Total Costs. Altogether, the English Department at Angelo State spent under $23,000.00 on the entire usability lab (see Figure 4). While on one level this amount seems relatively high, there are several ways to cut costs. By purchasing fewer cameras and computers, buying only one licensed copy of each software package rather than multiple copies, and forgoing the more expensive items, such as the eye tracker and new furniture, a department could fund an entire lab for under $10,000 if a sound-resistant space were available. Moreover, the cost of digital technologies is continually decreasing. Texas Tech University (2013), for instance, has recently developed an eye tracker for under $1,500.
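For reference, the approximate costs reported in the preceding sections break down as follows:
- Room renovations: $5,800
- Physical lab components: $5,000
- Computers and printer: $3,750
- Audio/video equipment: $1,000
- Eye tracker: $4,000
- Software: $3,000
These figures sum to roughly $22,550, consistent with the under-$23,000 total.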
In hindsight, since the construction and use of Angelo State’s lab, the most unnecessary expenditures have proven to be (1) the second set of cameras, since most tests require only a Webcam and one back-up video camera; (2) the laptop, since its use has been limited to faculty and staff; (3) Camtasia, since most testers have favored Morae; and (4) the accessibility software, since none of the testers have required speech-to-text or text-to-speech. Unsurprisingly, the most inexpensive items have also proven to be the least necessary. Most of the largest costs were directly related to renovating the room and purchasing furniture. If an office space and second-hand furniture were available, then simply purchasing a few computers, cameras, and software packages for a few thousand dollars would be more than sufficient for conducting low-key tests.
Acquiring Funding
Acquiring funding at Angelo State was possible due to the support of the dean of the college. Because Angelo State has recently committed more funding to programs that are growing and developing desirable courses and laboratories, funding was secured through a faculty development grant proposal, which the dean of the college read and then funded from the college’s budget.
There are, however, other ways to raise funds for designing a lab, such as:
- Contacting the university about faculty development grant opportunities
- Contacting department heads and college deans about potential funding available
- Identifying external grant opportunities through university resources. At Angelo State, the Office of Sponsored Projects provided information about such opportunities
- Contacting the university’s warehouse or IT support for information about furniture, carpet, computers, and other items that could be obtained for free or at a low cost
Contrary to intuition, once a space is available, populating a laboratory with technologies is NOT as expensive as one might expect, provided that the context is similar to Angelo State’s and a “state of the art” laboratory is not needed. The primary cost for the author was more in terms of time than money—time for conducting the research, writing the proposal, designing a functional laboratory to fit multiple university contexts, and finalizing all the details for making the lab functional, as described in the following section.
Step 4: What Is Needed to Finalize the Lab? Preparing the Lab for Use
Once funding was acquired, the author spent several months preparing the lab for use. The overall process can be broken down into five parts.
Setting up the Lab
During the summer and fall of 2010 (June through December), the author researched justifications for a lab, wrote a grant proposal, met with the department head, sketched several possible lab configurations, met with Facilities Management, identified the core users and purposes of the lab, requested quotes, and drafted a preliminary budget. Once the proposal was accepted in September of 2010, work began on renovating the room.
Once the room was prepped, the English Department was able to have Angelo State’s Information Technology Office set up all the computers. After the hardware was set up and the computers were connected to the LAN, the primary obligations became installing all the security measures, giving testers access to the hard drives, and installing the software. Several hours were then spent testing each software package and ensuring that any “bugs” were eliminated from the system.
Setting up the lab was possibly the most difficult step of the entire project. We were able to renovate the room during the latter part of the fall semester of 2010, and the room was ready for use during the early part of the spring semester of 2011; however, purchasing the furniture proved to be a months-long process. Because students in ENG 4365 needed to conduct tests for their clients while we were still setting up the lab, we borrowed furniture from several storage closets on campus as a temporary measure so that we could set up and use the computers and cameras to perform tests. The permanent furniture arrived while students were working on group projects, and we had to shut down the lab for a day to remove the temporary furniture and set up the permanent pieces.
Overall, the set-up of the lab took a complete year from conception to final implementation.
Advertising the Lab
Because usability is a field that is not always well known outside of technical communication and certain business areas, use of the lab depends largely on marketing done by the lab’s director. The author has found the following ideas helpful in spreading the word about the lab:
- Develop a Web site—The Web site at Angelo State not only advertises the lab; it also advertises the class, contains forms for download, and provides tutorials for people unfamiliar with usability and usability testing. See http://www.angelo.edu/usability for the current Web site.
- Make connections—Contact individuals on campus connected to fields closely related to usability, such as Psychology, Computer Science, or Business, in order to alert individuals to the presence of a lab.
- Contact the local newspaper and the university journalism office—By contacting university or local newspapers, it is possible to get “free” advertising for the lab while alerting faculty, staff, students, and the community to the capabilities of usability testing. At Angelo State, one article in the local newspaper, written by a local journalist, provided several contacts and potential clients from the community.
- Ensure that university and faculty advisers are aware of the presence of the lab—When advisers know a lab is present on campus, they can direct students to the possibilities of taking courses or doing research using the lab’s technologies.
Advertising the lab is an ongoing effort. Developing content for the Web site took over a week, and other advertising opportunities (such as interviews and presentations) take several hours each semester to continue educating new students, administrators, and faculty/staff.
Developing Forms for the Lab
In addition to advertising materials, several other forms might need to be created. Many of these forms can be downloaded from Angelo State’s usability lab Web site: http://www.angelo.edu/usability.
- Covenant not to Compete—In a university setting, consider drafting a legal document, preferably in connection with the legal offices at the university, that establishes the lab and/or the class as serving primarily university faculty, staff, and students—not as a competitor for local usability labs.
- Client FAQ—Consider creating a Client FAQ, especially if the lab will be used in conjunction with a course. A Client FAQ should provide ample information about what types of projects are applicable for usability testing as well as expectations for both students and clients.
- Consent Form—Draft a consent form that establishes the rights of the testers and the rights of the participants in order to alleviate any potential legal conflicts and protect all parties involved. Also, establish where the forms will be stored. At Angelo State, we keep electronic forms on the Web site and paper forms in the lab, and we store all signed forms in the filing cabinet to maintain the privacy of the participants.
- Video/Audio Release—Draft a form that allows video and audio recordings to be used for data collection and that describes the limits on how the video can be used by testers. We store these release agreements in the same places as the consent forms.
- Lab Policies and Procedures—Consider drafting a document that establishes acceptable behaviors and what is and is not allowed in the lab, in order to prevent potential breakage, loss, or theft of equipment.
Researching and creating forms for the lab took several weeks. The author was fortunate to borrow drafts of these forms from different books (Rubin & Chisnell, 2008) and from existing resources on campus.
Applying for an IRB
As discussed by Rubin and Chisnell (2008), each university has different institutional review board (IRB) requirements for research done with human participants. At Angelo State, the IRB officer deemed that research conducted in the usability lab did not need IRB approval because the research was conducted for purposes of changing Web sites, software programs, brochures, and other written discourse; as such, the research is conducted not on “human participants” so much as on the technologies themselves. At Texas Tech University, however, the director of the usability lab has applied for a “blanket IRB,” which covers all research conducted in the lab under the IRB process. Each university considering a lab should contact its IRB officer to determine the right process.
Researching and talking with the IRB officer took approximately one week.
Acquiring a Student Assistant
If funds are available (and even if they aren’t), having a student assistant work in the lab frees the director to focus on larger issues related to lab use rather than on tasks such as scheduling, locking and unlocking the lab, and equipment maintenance—tasks a student worker can handle. With a paid student worker taking care of the lab, the student gains valuable workplace experience and the director saves time.
The author was fortunate to be able to hire a student assistant for the spring semester of 2011. This student was able to prepare forms, manage scheduling of the lab, test the software, tutor users of the lab, and generally be the primary “go-to” person for lab use.
Conclusion
Usability, as a field and as a workplace trend, is only likely to grow as our world becomes increasingly centered on the production and consumption of information. To use Richard Lanham’s (2006) concept, twenty-first-century people bring limited attention spans to the ever-growing information networks, and as they interact with information in increasingly digital ways, via tablet PCs, cell phones, computer monitors, television screens, projectors, and more, the user experience expert and usability lab become increasingly important entities on each university’s campus. Moreover, the practical benefits of a lab are equally obvious, such as providing insight into recruitment opportunities, saving money on larger projects, and eliminating frustration for individuals involved in university processes.
The lab has now been in use at Angelo State for over two years. The course has filled each spring semester, and students have conducted tests on everything from paper prototypes of campus Web sites to validation tests for non-profit organizations in the region. One student was recently offered a job in quality assurance, and he credits his experience in ENG 4365 with helping him get it. As well, faculty and staff on campus have conducted tests on portions of the university Web site, revised forms based on feedback from the lab, and conducted exploratory research on pedagogical tools, such as asynchronous software in a course management system. The author has continued to find ways of encouraging campus-wide awareness of the importance of the user: being interviewed several times, conducting eye tracking studies, working on an undergraduate research project, presenting his research at several conferences, using the course as part of a university-wide push for service-learning, and giving formal presentations across campus to raise awareness of usability issues.
More than anything else, this article has attempted to show that building a physical “space” for usability is not as costly as it once was—largely due to the switch from analog to digital technologies, the dropping costs of eye trackers, and the recognition that state-of-the-art labs are not always necessary. A space can be designed and created in roughly a year and, after a decent amount of start-up work, can serve university participants for years afterward. While a usability lab is not a necessary requirement for advocating for the user, its presence does create, for people unfamiliar with the field, a memorable encounter with one of the primary goals of technical communicators—to place the user at the center of the design process and not at the periphery.
Acknowledgments
The author would like to thank Dr. Laurence Musgrove for his help in acquiring funding and purchasing technologies, Diane Spraggins for making all the purchase requests, Dr. Kevin Lambert for providing funding, Katherine Garrison for allowing herself to be consulted about her knowledge of usability, Erwin Loyd for his work as the first student assistant in the lab, and the ENG 4365 students for their help and patience in “testing the usability lab.”
References
Adobe Photoshop CS5 (2013). Retrieved from http://www.adobe.com/products/photoshop.html
Adobe Premiere Elements 10.0 (2013). Retrieved from http://www.adobe.com/products/premiere-elements.html
Anderson, P. V. (2007). Technical communication: A reader-centered approach. Boston, MA: Thomson Wadsworth.
Andrzejczak, C., & Liu, D. (2010). The effect of testing location on usability testing performance, participant stress levels, and subjective testing experience. Journal of Systems and Software, 83, 1258-1266.
Angelo State University (2013). Courses and faculty. Retrieved from http://www.angelo.edu/courses/?201120
Apple. (2013). Retrieved from http://www.apple.com/
Barnum, C. M. (2011). Usability testing essentials: Ready, set…test! Burlington, MA: Elsevier.
Bentley University. (2013). User Experience Center. Retrieved from http://usability.bentley.edu/about-us.
Blatt, L., Jacobson, M., & Miller, S. (1994). Designing and equipping a usability laboratory. Behaviour & Information Technology, 13(1-2), 81-93.
Camtasia (2013). Camtasia Studio 7. Retrieved from http://www.techsmith.com/tutorial-camtasia-current.html
Carter, L. (2005). Market matters: Applied rhetoric studies and free market competition. Cresskill, NJ: Hampton Press.
Feenberg, A. (1999). Questioning technology. New York, NY: Routledge.
Garrison, K. (2011). Angelo State University ENG 4365 Syllabus. Retrieved from http://www.angelo.edu/courses/syllabi/201120/22704.pdf
Grant, A. (2010). The 50 best careers of 2011. US News. Retrieved from http://money.usnews.com/money/careers/articles/2010/12/06/the-50-best-careers-of-2011.
Harner, S., & Rich, A. (2005). Trends in undergraduate curriculum in scientific and technical communication programs. Technical Communication, 52, 209-220.
Hartson, H. R., Castillo, J. C., Kelso, J., & Neale, W. C. (1996). Remote evaluation: The network as an extension of the usability laboratory. In CHI ’96 Conference Proceedings. Vancouver, BC Canada: ACM.
Human Interfaces Inc. (2013). Retrieved from http://humaninterfaces.net/index.shtml
IBM SPSS. (2013). Retrieved from http://www-01.ibm.com/software/analytics/spss/
Isbister, K., & Schaffer, N. (2008). Game usability: Advice from the experts for advancing the player experience. Burlington, MA: Elsevier.
Jennie Lee Research Laboratories (2013). Retrieved from http://www.open.ac.uk/about/campus/jennie-lee-research-labs/
Johnson, R. (1998). User-centered technology: A rhetorical theory for computers and other mundane artifacts. Albany, NY: SUNY.
Johnson-Sheehan, R. (2010). Technical communication today. New York, NY: Pearson.
King, H. J., & Jannik, C. M. (2005). Redesigning for usability: Information architecture and usability testing for Georgia Tech Library’s website. OCLC Systems and Services, 21, 235-243.
Koyani, S. (2006). Usability labs: Portable versus fixed. Retrieved from http://www.usability.gov/articles/newsletter/pubs/062006news.html
Krug, S. (2006). Don’t make me think: A common sense approach to web usability. Berkeley, CA: New Riders.
Lanham, R. (2006). The economics of attention. Chicago, IL: University of Chicago Press.
McDavid, J., Outlaw, B., & Wachai, J. (2008). Educational Outreach Network (EON) usability test report. Retrieved from http://justinmcdavid.com/Files/UWEO_Usability_Test_Report.pdf
Microsoft. (2013). Getting started with Microsoft Office 2007. Retrieved from http://office.microsoft.com/en-us/support/getting-started-with-microsoft-office-2007-FX101839657.aspx
Minnesota State University Mankato. (2013). English, Technical Communication courses. Retrieved from http://english.mnsu.edu/techcomm/tccourses.html
Mirametrix. (2013). S2 Eyetracker. Retrieved from http://mirametrix.com/products/eye-tracker/
Morae Observer (2013). Retrieved from http://www.techsmith.com/morae.html
Natural Reader 10.0. (2013). Retrieved from http://www.naturalreaders.com/index.htm
Nielsen, J., & Pernice, K. (2010). Eyetracking Web usability. Berkeley, CA: New Riders.
Nielsen, J. (1994). Usability labs: A 1994 survey. In Useit.com. Retrieved from http://www.useit.com/papers/uselabs.html
Noldus. (2013). Portable usability lab. Retrieved from http://www.noldus.com/human-behavior-research/solutions/portable-usability-lab
Nuance. (2013). Dragon Speech recognition software. You talk. It types. Retrieved from http://www.nuance.com/dragon/index.htm
Ovo Studios. (2013). Lab design services. Retrieved from http://www.ovostudios.com/labdesign.asp
Palmiter, S., Lynch, G., Lewis, S., & Stempski, M. (1994). Breaking away from the conventional ‘usability lab’: The customer-centered design group at Tektronix, Inc. Behaviour & Information Technology, 13, 128-131.
Pierce, K. R. (2005). Website usability report for Harvard University. Retrieved from http://digitalcommons.utep.edu/cgi/viewcontent.cgi?article=1002&context=kenneth_pierce&sei-redir=1#search=”usability+university+report
Potosnak, K. (1990). How to build a usability lab. IEEE Software, 7(2), 96-97.
Quirks.com. (2013). Quirk’s marketing research media. Retrieved from http://www.quirks.com/
Radd, D. (2010). THQ opening usability lab at Guildhall. In Industry Gamers. Retrieved from http://www.industrygamers.com/news/thq-opening-usability-lab-at-the-guildhall/
Redish, J. (2010). Technical communication and usability: Intertwined strands and mutual influences. IEEE Transactions on Professional Communication, 53, 191-201.
Rubin, J., & Chisnell, D. (2008). Handbook of usability testing: How to plan, design, and conduct effective tests. Indianapolis, IN: John Wiley.
Salvo, M., Brizee, H. A., Driscoll, D. L., & Sousa, M. (2006). Purdue Online Writing Lab (OWL) usability report. In Online Writing Lab at Purdue University. Retrieved from http://owl.english.purdue.edu/research/OWLreport.pdf
Sazegari, S. (1994). Designing a usability lab: A case study from Taligent. Behaviour & Information Technology, 13(1-2), 20-24.
Scanlon, T. (1999). Usability labs: Our take. In User Interface Engineering. Retrieved from http://www.uie.com/articles/usability_labs/
Schriver, K. A. (1992). Teaching writers to anticipate readers’ needs: What can document designers learn from usability testing? In H. Pander Maat & M. Steehouder (Eds.), Studies of functional text quality (pp. 141-158). Atlanta, GA: Rodopi.
Seffah, A., & Habieb-Mammar, H. (2009). Usability engineering laboratories: Limitations and challenges toward a unifying tools/practice environment. Behaviour & Information Technology, 28, 281-291.
Sony. (2013). Retrieved from http://www.sony.com/index.php
STC. (2013). Usability and User Experience: An STC Community: Usability labs. Retrieved from http://www.stcsig.org/usability/topics/usabilty-labs.html
Still, B., & Morris, J. (2010). The blank-page technique: Reinvigorating paper prototyping in usability testing. IEEE Transactions on Professional Communication, 53, 144-157.
Texas Tech University. (2013). Department of English, Technical Communication and Rhetoric, Undergraduate courses. Retrieved from http://www.english.ttu.edu/tcr/BATC/undergraduate_courses.asp
University of Alberta. (2003). Usability reports. Retrieved from http://www.uofaweb.ualberta.ca/usability/
University of Utah. (2013). Usability lab. Retrieved from http://www.lib.utah.edu/services/usability-lab.php
University of Washington. (2013). Human Centered Design & Engineering, Courses. Retrieved from http://www.hcde.washington.edu/courses
U.S. Bureau of Labor Statistics. (2013). Occupational outlook handbook, 2010-11 edition. Retrieved from http://www.bls.gov/oco/ocos320.htm
U.S. News Staff. (2008). The report card. The 30 best careers, 2009’s grades on the selection criteria. In US News. Retrieved from http://money.usnews.com/money/careers/articles/2008/12/11/the-report-card
VandeCreek, L. M. (2005). Usability analysis of northern Illinois university libraries’ website: A case study. OCLC Systems and Services, 21, 181-192.
Windows Movie Maker Live. (2013). Retrieved from http://explore.live.com/windows-live-essentials-movie-maker-get-started
Yeats, D., & Carter, L. (2005). The role of the highlights video in usability testing: Rhetorical and generic conventions. Technical Communication, 52, 156-162.
Yeats, D. & Thompson, I. (2010). Mapping technical and professional communication: A summary and survey of academic locations for programs. Technical Communication Quarterly, 19, 225-261.
ZoomText. (2013). Retrieved from http://www.aisquared.com/zoomtext
About the Author
Kevin Garrison is a tenure-track assistant professor of English at Angelo State University. He also serves as the director of the Angelo State University Usability Testing Lab and teaches a senior-level course in usability testing. He graduated from Texas Tech University (TTU) in May 2009 with a PhD in Technical Communication and Rhetoric. Contact: kevin.garrison@angelo.edu.
Manuscript received 10 February 2012; revised 26 April 2013; accepted 25 June 2013.