By Kathryn Summers, Michael Summers, and Amy Pointer
Plain language, clear visual hierarchy, and plain interaction are important whenever you’re trying to communicate with clients or customers. They’re particularly important when the stakes are high for readers, when you’re trying to reach audiences who may have trouble interpreting texts, and when you’re trying to get readers to do something—not just understand something—based on what they read. All of these conditions come together when low-income patients try to complete forms to qualify for free prescription medicine.
Nowadays, we often think about user interfaces in the context of online, electronic formats. However, paper forms continue to play an important role in many everyday contexts. Such forms involve issues similar to those faced by online designers, but with a different suite of available tools for resolving those issues. Our experience showed that revising such forms for plain language, clear visual hierarchy, and plain interaction can both substantially increase user success rates and reduce processing costs for the sponsoring company.
The Problem
Like many pharmaceutical companies, AstraZeneca (AZ) has a patient assistance program that makes medicines available for free to low-income patients. In fact, AZ's program is unusually generous, with higher income limits than those of many other pharmaceutical companies. But a complicated form made the program costly to administer, and applicants who were English language learners or senior citizens, or who had low literacy skills, struggled to apply for the program successfully. These groups constitute a large proportion of potential program participants.
How Our Project Began
Stakeholders within AZ wanted to make their program more successful by simplifying and clarifying the application process. These stakeholders also wanted to identify and eliminate any aspects of the form that were inhibiting immigrants from applying to the program. To justify the cost of the research and redesign, AZ stakeholders wanted to measure improvements in participant success rate, satisfaction, time on task, and any reductions in internal processing costs.
The original form had a number of known problems. Applicants failed to attach required documentation, such as the original prescription from their doctor and proof of financial need (for example, a tax return). Information about income, insurance, and prescription drug coverage tended to be incomplete or inaccurate. Information from the doctor, such as the doctor's DEA (Drug Enforcement Administration) number, was missing or incorrect. Applicants sometimes failed to include social security or green card information, as well as information about other current prescriptions and allergies.

What We Did
We started by revising the form to correct known problems, using known principles of plain language and plain interaction design (Doak, Doak, & Root, 1996; Redish, 2012; Summers & Summers, 2005; Summers et al., 2006). Then we performed cycles of iterative testing to (a) confirm that what we had done fixed the previously identified problems, (b) identify new possible problem areas, and (c) guide further changes to the design.
A growing body of evidence confirms that designing for universal usability improves usability for most audiences. In this case, three demographic groups were particularly likely to need the form and to face challenges in filling out the form: readers with low literacy skills, English language learners, and senior citizens. By focusing our iterative testing and redesign on these groups, we felt confident that we would improve the usability of the form for all groups.
The revised form was developed in seven rounds of iterative qualitative testing, with a total of 48 participants: 24 participants with low literacy skills, 13 English language learners, and 11 seniors. The form went through 40 iterations.
Most interaction designers know that the best results come from tackling the entire process, not just the form or the final interaction. Our experience bears this out. Clarifying the form was like shining a flashlight on the complexities of the underlying business process and form requirements. In response to our initial proposed changes, the client team started to simplify and improve their expectations about what the program and the form were trying to accomplish. Increasingly enthusiastic internal advocates at AZ were able to achieve notable simplifications to these underlying complexities during the course of the project. They were even able to get the legal team to agree to plain language revisions to the consent information! In short, the form design changed not only in response to feedback from test participants, but also in response to changing business requirements from AZ—a typical experience for many online developers.

The changing business requirements meant that designing the form was like trying to hit a moving target, but the frustration was worth it. While our initial efforts at revision may have seemed like wasted effort, they were in fact the necessary catalyst for getting the client to reconsider the business model.
Instructions and Physician Information
One problem in the original form was the long, dense page of instructions, which mixed information aimed at doctors with information aimed at patients. Most applicants skipped this page entirely. We knew from previous research with low literacy users that making the text more visually inviting would help: larger text, shorter line lengths, and more white space. Information that was essential to filling out the form needed to move out of the front-end instructions and into the form itself, right at the moment of use. All revised text needed to use familiar words and simple syntax. See Figures 1 and 2 for a comparison.
The original version of the instructions is dense and uninviting. The syntax and vocabulary are complex, and information for patients and doctors is mixed. Information about what to do with the form after it is filled out is buried at the bottom of the instructions, where most applicants missed it.
The revised version uses plain language and focuses on explaining and introducing the program. Information that is essential for filling out the form has been moved to the form itself. An income table lets patients know if they are likely to qualify before they go to the effort of filling out the rest of the form. The information for the doctor is combined with the form elements that should be filled out by the doctor.
Much of the information about the doctor can be obtained from the prescription itself, provided that applicants actually attach it. Thus, the team focused its redesign efforts on making sure that patients would include their prescription, and kept the physician information section usable but unobtrusive. The biggest win here was the reduction in erroneous doctor information: with the original design, applicants sometimes entered their own name rather than the doctor's, or their own driver's license number rather than the doctor's DEA number.

Personal, Insurance, and Income Information
The known issues with the original form included areas where applicants entered incorrect information and areas where required information was not entered at all.
The original layout of the personal information section (see Figure 3) led some applicants to skip fields, such as social security number and date of birth, without meaning to. (When we talked to participants, we found that a few had skipped these fields deliberately, but most skipped them accidentally.) The small text was hard for applicants to read, especially seniors, and the small spaces were hard to write in. Patients taking many medications couldn't fit them all into the space provided, even though complete medication information is crucial for preventing dangerous drug interactions.
In the sections about insurance and income, applicants made many errors on the original form. The crucial piece of information about insurance coverage was whether or not applicants had prescription coverage. AZ also wanted to know additional details about applicants’ health insurance for market research purposes, but realized that this information was not really necessary.
In the income section, many applicants provided monthly rather than yearly income, and often filled in personal information rather than household information.
Our solution was to increase the text size and make the path through the form much more vertical to reduce field skipping (see Figure 4). Information on residency status was simplified, more options were made available, and some explanatory text was added to reassure users about providing this sensitive information. Ample additional space was added to the section about medicines and allergies. The end result was that the form expanded from two to four pages. However, the form was also considerably easier for participants to complete, because the appearance was more inviting and the lower information density meant that the cognitive load had been reduced. Research on paper-based information for low health literacy users verifies that when you make information look easier to process, it becomes easier to process (Doak et al., 1996). Shorter is not always better if brevity makes the information too dense.
Questions about health insurance were removed from the form, and questions about prescription coverage were made clearer. A definition of "household" was added to the form ("yourself, your spouse, and dependents"), and the question about income was revised to allow applicants to provide either monthly or yearly income, since low-income applicants mostly thought about their income in monthly terms.
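To make the monthly-versus-yearly point concrete, here is a minimal sketch, in Python, of how an intake system might normalize either kind of income figure before checking it against an income table like the one added to the form. The function name and the income limits are hypothetical illustrations, not AstraZeneca's actual thresholds or code.

```python
# Hypothetical income limits by household size (invented numbers,
# not the program's actual thresholds).
INCOME_LIMITS = {1: 30_000, 2: 40_000, 3: 50_000, 4: 60_000}

def likely_qualifies(income, period, household_size):
    """Normalize monthly or yearly income to a yearly figure, then
    compare it against the income limit for the household size
    ("yourself, your spouse, and dependents")."""
    if period not in ("monthly", "yearly"):
        raise ValueError("period must be 'monthly' or 'yearly'")
    yearly = income * 12 if period == "monthly" else income
    # Households larger than the table fall back to the largest limit.
    limit = INCOME_LIMITS.get(household_size, INCOME_LIMITS[4])
    return yearly <= limit

print(likely_qualifies(2_000, "monthly", 3))  # True: 24,000 <= 50,000
```

Letting applicants report income in whichever unit they actually think in, and normalizing behind the scenes, removes a whole class of unit-conversion errors from the form itself.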

AZ was able to approve a simplified list of acceptable proofs of income, and a phone number was added for those who might need special help, a more efficient way to support these special cases than expensive and time-consuming correspondence. The mid-form signature line, originally included for applicants who needed to request a tax return transcript from the federal government, was eliminated. In the original version of the form, this extra signature line had made some applicants miss the required signature line at the bottom of the form, adding to the expensive need for follow-up with program applicants. People assumed that since they had already signed the form once, they didn't need to sign it again.
Consent and Mailing Instructions
The final challenge was to make sure applicants knew what they were agreeing to (the consent information), what to do with the completed form, and which other documents to include in the envelope. The legalese in the original consent language was intimidating for everyone, but applicants who have lower incomes, are immigrants, or are older may have lower levels of confidence or trust, and may be especially wary of what they sign. The end of the original form also contained no clear call to action, so applicants did not know what to do next.
In the revised form, the consent language is set in larger type (see Figure 5), and the legalese has been virtually eliminated. The signature field has visual emphasis to make it less likely that applicants will miss it. A strong visual signal marks the mailing instructions, along with a checklist focusing on the most common mailing errors.

Outcomes
While it’s gratifying to know that you’ve made things better because you can see people going through the revised process more easily, sometimes there’s quantitative confirmation as well. In this project, improvements were confirmed by a direct comparison of users completing the original and revised forms, and by the first-quarter results from deploying the form.
The final form was compared to the original form in a quantitative test with 59 participants: 30 participants with low literacy skills and 29 seniors (one senior dropped out for health reasons). We used a within-subjects test design, meaning that all participants used both forms. Half of the participants started with the original form; the other half started with the revised form. Participants returned after two weeks to fill out the alternate form. The two-week delay reduced the degree to which participants remembered the prior form, and alternating which form came first compensated for improved performance on the second form due to familiarity with the questions. After the second round of testing, the moderator discussed the questions with participants in an interview, which allowed us to make sure we had correct information about how each question should have been answered.
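For readers who want to see the counterbalancing made explicit, here is a minimal sketch in Python of one way to split participants into the two form orders. It is a hypothetical illustration of the design described above, not the study's actual assignment procedure.

```python
import random

def counterbalance(participant_ids, seed=0):
    """Randomly split participants into two groups: one starts with
    the original form (A), the other with the revised form (B).
    Each group sees the alternate form two weeks later, so every
    participant uses both forms."""
    ids = list(participant_ids)
    random.Random(seed).shuffle(ids)
    half = len(ids) // 2
    return {
        "original_first": ids[:half],   # session 1: form A, session 2: form B
        "revised_first": ids[half:],    # session 1: form B, session 2: form A
    }

groups = counterbalance(range(1, 60))  # 59 participants, as in the study
print(len(groups["original_first"]), len(groups["revised_first"]))  # 29 30
```

Because the group sizes are (nearly) equal, any practice effect from answering the same questions twice is spread evenly across both forms rather than inflating the revised form's results.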
The overall decline in entry errors on the revised form was 48%. This included the following improvements (a short sketch of the arithmetic behind such percentages appears after the list):
- 133% improvement in understanding what proof of income was required
- 124% improvement in entering other medicines correctly
- 83% improvement in providing accurate information about household size
- 76% improvement in providing accurate income information
- 50% improvement in providing accurate information about prescription drug coverage
- 48% improvement in including the required prescription(s)
- 9% improvement in addressing the envelope correctly
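Improvement figures above 100% suggest these are relative gains in the rate of correct responses, measured against the original form's baseline; the exact formula is an assumption here, and the numbers below are invented for illustration. A minimal Python sketch shows that reading:

```python
def relative_improvement(before_pct, after_pct):
    """Relative gain in the rate of correct responses, expressed
    as a percentage of the original form's baseline rate."""
    return (after_pct - before_pct) / before_pct * 100

# Invented illustration: if 30% of participants understood the
# proof-of-income requirement on the original form and 70% did
# on the revised form, the relative improvement is about 133%.
print(round(relative_improvement(30, 70)))  # 133
```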
In addition:
- The number of skipped fields decreased by 54%.
- Time on task actually increased by 15%, possibly as a result of fewer skipped fields.
- Satisfaction (measured using the System Usability Scale, or SUS; Brooke, 1996) increased 7.3%, from 75.1 to 80.6 on the scale's 0–100 range (see the scoring sketch after this list).
- The rate of potentially acceptable applications that would require no further follow-up by program administrators increased from 10% of applications to 59% of applications.
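SUS scoring itself is mechanical, and worth seeing once: each of the ten statements is rated 1–5, odd-numbered (positively worded) items contribute the rating minus 1, even-numbered (negatively worded) items contribute 5 minus the rating, and the sum is multiplied by 2.5 to give a 0–100 score (Brooke, 1996). A sketch in Python:

```python
def sus_score(responses):
    """System Usability Scale score (Brooke, 1996).

    `responses` holds the ten ratings (each 1-5) in statement order.
    Odd-numbered statements are positively worded, so they contribute
    rating - 1; even-numbered statements are negatively worded, so
    they contribute 5 - rating. The summed contributions (0-40) are
    multiplied by 2.5 to give a 0-100 score."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Example: a moderately positive response pattern scores 75.0.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))
```

Note that the reported gain, from 75.1 to 80.6, is 5.5 points on this 0–100 scale; 5.5 / 75.1 ≈ 7.3%, which is the relative increase cited above.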
The revised form was deployed during the last quarter of 2007. During that quarter, the rejection rate for submitted forms dropped from 36% to 23%, a 36% relative improvement ((36 - 23) / 36 ≈ 0.36). This resulted in both a significant cost savings in processing and an increase in the number of applicants who were able to receive their medication quickly.
Kathryn Summers directs graduate programs in interaction design and information architecture at the University of Baltimore. She teaches user research methods, human/computer interaction, and interaction design, and supervises the University’s user research and eye tracking lab. Kathryn’s research and publications focus on usability for audiences with lower literacy skills, who are older, or whose native language is not English.
Michael Summers heads the Global User Experience Research team at PayPal. He has held senior user research roles at USWeb/CKS, Scient, and Nielsen Norman Group. He has conducted UX research throughout Europe and Asia. Michael has solved usability problems for organizations such as Toys “R” Us, Ralph Lauren Media, Mattel, The Sports Authority, Crate and Barrel, and Timberland. He blogs at USERresearch.com.
Amy Pointer directs graduate programs in integrated and communications design at the University of Baltimore. She teaches design theory, the integration of writing and design, design across media, and professional development courses. Amy has worked professionally as a strategic analyst, a branding expert, and a graphic designer. Her current research focuses on market integration on social media platforms.
REFERENCES
Brooke, J. 1996. "SUS: A 'Quick and Dirty' Usability Scale." In Usability Evaluation in Industry, 189–194. London: Taylor & Francis.
Doak, C., L. Doak, and J. Root. 1996. Teaching Patients with Low Literacy Skills, 2d ed. Philadelphia: J. B. Lippincott Company.
Redish, J. 2012. Letting Go of the Words: Writing Web Content that Works, 2d ed. New York: Morgan Kaufmann.
Summers, K., J. Langford, J. Wu, C. Abela, and R. Souza. 2006. Designing Web-based Forms for Users with Lower Literacy Skills. Proceedings of the American Society for Information Science and Technology 43(1): 1–12.
Summers, K., and M. Summers. 2005. Reading and Navigational Strategies of Web Users with Lower Literacy Skills. Proceedings of the American Society for Information Science and Technology 42(1).