Features

Validate Your Content: Quality Assurance Testing for Documentation

By Emily Alfson | Senior Member

Quality assurance testing of documentation validates that the information published with a product is accurate, relevant, and usable, and that it will not cause harm or injury. Just as a product is tested for safety, accuracy, and usability, documentation should be subjected to the same rigorous testing before it reaches users.

To implement documentation quality assurance testing successfully, design a process that meets the needs of your products and organization, and ensure adoption by integrating it into your product development lifecycle.

Components of Quality Assurance Testing for Documentation

Creating a quality assurance testing process for documentation requires four key components. These components form the basis of your testing and can be adapted to fit the needs of your organization.

The four components are:

  1. Professional and dedicated quality assurance analysts
  2. Defined testing method
  3. Defect classification system
  4. Feedback and error-correction loop
Professional and dedicated quality assurance analysts

The most effective tester is a dedicated, trained quality assurance analyst. This role requires a skill set that balances technical know-how with outstanding communication skills. It is not a task to assign to just anyone on the team; it requires a professional trained in quality assurance testing methodology. The quality analyst must understand the product they are testing, how the product is used in the real world, and the audience for whom the product and documentation are intended.

Testing should be the primary focus of the quality analyst. A team member dedicated exclusively to testing best fills this role.

In small companies or teams, having a dedicated quality analyst may not be feasible. In the absence of a dedicated quality analyst for documentation, there are alternative methods that can be used to perform quality assurance testing:

  • Borrow a quality analyst from another team. Ideally, the tester will not have been involved in developing the product or documentation, so they can approach the testing without bias.
  • Utilize an internal resource with a role similar to the intended audience of the documentation. For example, you might use an internal developer from another product development team to test your API documentation.
  • Peer review among the writers. While this method is better than no testing, it can be difficult for writers to review and test documentation without bias. Just as a subject matter expert may be so focused on their area of expertise that they lose sight of the big picture, a technical writer may struggle in the same way when testing a peer’s document.
Defined testing method

The testing method can vary depending on the product, the type of documentation, and the testing requirements. Determine which method will most effectively validate the documentation being tested.

Demonstration

The most effective way to validate documentation is to use it: the tester performs the procedures described in the documentation, following each step exactly as written.

Simulation

In some cases, performing the actual steps or procedures described in the documentation is not practical. For example, testing procedures on hardware that is in production could cause major disruption or put equipment at risk. In this case, validation by simulation requires the tester to review the procedure while examining the equipment in place to identify any possible errors, issues, or concerns. The tester observes the equipment in its operational configuration while studying the task to ensure that it is logical, clearly described, and can be accomplished as written.

Comparison/Engineering Review

In some cases, the accuracy of the documentation can be tested by comparing it to the original source data. When the documentation includes a large amount of descriptive and tabular data, tolerances, measurements, and other specific data elements, comparing it to the engineering or source data validates the accuracy of the technical specifications.
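
A comparison like this can sometimes be partially automated. The following sketch, in Python, assumes the documented values and the engineering source data have both been exported to CSV files keyed by a parameter name; the file and column names are hypothetical, and a real comparison would depend on the formats your organization actually uses.

  # A minimal sketch of an automated specification comparison, assuming
  # hypothetical CSV exports with "parameter" and "value" columns.
  import csv

  def load_specs(path):
      """Load parameter/value pairs from a CSV export."""
      with open(path, newline="") as f:
          return {row["parameter"]: row["value"] for row in csv.DictReader(f)}

  def compare_specs(doc_path, source_path):
      """Report documented values that are missing from, or differ from, the source data."""
      documented = load_specs(doc_path)
      source = load_specs(source_path)
      for parameter, doc_value in documented.items():
          if parameter not in source:
              print(f"{parameter}: not found in the engineering source data")
          elif doc_value != source[parameter]:
              print(f"{parameter}: documentation says {doc_value}, source says {source[parameter]}")

  if __name__ == "__main__":
      compare_specs("documented_specs.csv", "engineering_specs.csv")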

Defect Classification

A system for ranking defects ensures that test results are recorded in a meaningful and actionable way, and it brings consistency to how defects are reported and resolved. The quality of the information returned to the author is vital to resolving the defects.

The following is an example of a five-tier classification system that could apply to hardware or software documentation; a sketch of how the tiers might be represented in code appears after the list.

  1. Critical—Generally reserved for errors that would cause injury or death.
  2. Severe—An error that would result in damage to the system, but would not harm personnel.
  3. Major—The error impairs the user’s ability to perform the task being described. No damage is caused to the system, but the user cannot complete the task using the information at hand.
  4. Minor—The procedure is wrong, but there is sufficient data for the user to continue. This category can also include spelling, punctuation, and grammar errors if they don’t change the meaning of the steps.
  5. Improvement—The procedure is actually correct as written, but the tester recommends clarifying the information.
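
If your team records defects in a tracking tool or script, the five tiers can be represented as a simple enumeration so that every reported defect carries a consistent severity label. The sketch below shows one possible representation in Python; the names mirror the tiers above and are not tied to any particular tool.

  # One possible representation of the five-tier classification in code.
  from enum import IntEnum

  class DefectSeverity(IntEnum):
      CRITICAL = 1     # error could cause injury or death
      SEVERE = 2       # error would damage the system, but not harm personnel
      MAJOR = 3        # user cannot complete the task; no damage to the system
      MINOR = 4        # procedure is wrong, but the user can continue
      IMPROVEMENT = 5  # procedure is correct; clarification is recommended
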
Feedback and Error Correction

Test results are of little use if authors do not receive clear, actionable feedback. The quality assurance analyst must create a report of defects in the documentation, including the specific location of the text or image in question (page number, section number, or a screenshot of an interface) and information on how to resolve the defect. The final report should provide detailed and specific corrective data to the writers.

Feedback methods include:
  1. Entry in a bug-tracking system
  2. PDF file with comments and mark-ups
  3. Email with the test results
  4. Printed copies of the documentation with markup
  5. Wiki-based entries or other collaborative writing methods
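
Whatever feedback method is used, each defect entry should carry the same core information. As an illustration only, the Python sketch below models a single defect report entry with the fields described above; the field names are hypothetical and would map onto whatever tracking tool or markup method your team chooses.

  # A minimal sketch of a single defect report entry. Field names are hypothetical.
  from dataclasses import dataclass

  @dataclass
  class DefectReport:
      location: str        # page number, section number, or screenshot reference
      severity: str        # one of the five tiers, e.g. "Major"
      description: str     # what the tester observed
      suggested_fix: str   # specific corrective data for the writer

  example = DefectReport(
      location="Section 4.2, step 3",
      severity="Major",
      description="The menu path in step 3 does not exist in the current interface.",
      suggested_fix="Replace the path with the one shown in the attached screenshot.",
  )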

The actual method could vary from company to company, based on what works best in each situation. Smaller organizations might function well with an informal feedback method. Larger organizations may require the use of sophisticated defect tracking tools that capture traceability of bugs and resolutions. Detailed reports and records may be required for certifications such as ISO 9001.

Quality Assurance Test Steps

The steps to test the documentation are similar regardless of the subject matter. The following process could be adapted to work for software or hardware documentation testing.

  1. Develop the quality assurance strategy. Define the goal of the quality assurance testing before any testing begins. Ensure all stakeholders have been educated on the quality assurance process and agree to adhere to it. Getting buy-in from stakeholders will help to ensure that quality assurance testing of the documentation is integrated into the product development cycle.
  2. Create a test plan. Before the documentation can be tested, a test plan must be constructed to ensure that the testing method and execution will result in validation of the documentation or product. Using a test plan brings consistency to the process, ensuring that the test is repeatable and the results are actionable.
  3. Test the documentation. After stakeholders have approved the test plan, the quality analyst executes it, performing the tasks described in the documentation exactly as they are written. Testing can take place over hours or days, depending on the type of test being performed, and it can happen in a testing lab, on a dedicated computer, in the field, or wherever the documentation will likely be used in a real-world situation.
  4. Report the results. The output of the quality assurance test is a detailed report of the results. Throughout testing, the analyst takes careful note of any differences between the documentation and the actual hardware or software, any errors in the steps or procedures, error codes, screenshots of software interface bugs, and any issues with the usability of the documentation. In addition to noting the errors, the report should include information on how to resolve them. The report is submitted to the author for resolution of defects.
  5. Rework the documentation. In this step, the technical writer processes the quality assurance report and makes any necessary corrections to the documentation. As the defects are resolved, the writer should include notes for each defect explaining how it was resolved or why it was not. Recording these notes in whatever defect tracking system the company uses provides traceability between failures found in the quality assurance test and their resolution in the documentation (a rough sketch of this check follows the list).
  6. Request approval. In most cases, the quality analyst will review the writer’s response to each defect and verify that the defect was resolved. If the quality analyst is satisfied with the corrections, they will issue approval of the documentation, certifying it as tested and validated through the approved quality assurance process.
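
The traceability described in steps 4 through 6 comes down to pairing each reported defect with a resolution note and verifying that nothing remains open before approval is issued. The sketch below, in Python with hypothetical names and example data, shows one way that final check might look; in practice it would usually live inside the defect tracking tool itself.

  # A minimal sketch of the resolution-and-approval check described above.
  from dataclasses import dataclass

  @dataclass
  class TrackedDefect:
      identifier: str
      description: str
      resolution_note: str = ""   # writer explains how (or why not) the defect was resolved
      verified: bool = False      # quality analyst confirms the resolution

  def ready_for_approval(defects):
      """The documentation can be certified only when every defect is resolved and verified."""
      return all(d.resolution_note and d.verified for d in defects)

  defects = [
      TrackedDefect("DOC-101", "Step 3 menu path is wrong",
                    resolution_note="Updated the path to match the current interface.",
                    verified=True),
      TrackedDefect("DOC-102", "Table 2 value does not match the source data"),
  ]

  print(ready_for_approval(defects))  # False until DOC-102 is resolved and verified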

A quality assurance testing process for documentation ensures that the information published with a product is accurate. Just as products go through extensive testing before being released, applying the same principles to documentation improves overall quality and customer satisfaction.

EMILY ALFSON is an STC Senior Member, a former president of STC New England, and currently serving on the Community Affairs Committee as the Leadership Day lead. She is the manager of the Boston-area InterChange conference for technical communication professionals. Emily is the technical writer and knowledge manager at MOTOR Information Systems in Troy, Michigan.