Inside the Board: With Steven Jong

Today we bring you another look “Inside the Board,” from members of the STC Board of Directors on various topics of interest. Today, Steven Jong discusses the latest with the STC Certification Task Force.

Certification

The STC Certification Task Force (CTF) is charged with collecting the information necessary for the Board to make a yes-or-no decision on offering certification for technical communicators. I have chaired this task force since 2007, and we’ve researched every aspect of the certification issue. Any journalist would know the questions we’ve asked: Certification for whom? Certification of what? When? Where? Why? How? We have found answers to all these questions; I’d like to address the last one.

Granting certification means declaring that an applicant has been found to have reached a standard minimum level of competence. But how do you know an applicant is worthy of certification? On what basis do you make that judgment? The question goes to the matter of assessment, and any technical communicator interested in—or concerned about!—certification is interested in the answer.

There are several ways to assess applicants. Which is appropriate? Having studied quality as it applies to documentation, I know that any measurement has strengths and weaknesses, and any measurement can be subverted. But measuring along multiple dimensions gives a much more accurate result and is much harder to “trick.” It may be a good idea to establish multiple requirements for certification. Here are several different forms of requirement.

Passing an Examination

When we think of assessment, the first thing that comes to mind is an examination. There are many variants of the venerable sit-down exam, including multiple choice, fill in the blank, and my favorite, the essay. The strengths of this approach are that precise questions can be defined, graded objectively, and administered to many applicants economically, and that a passing score can be established. The weaknesses are that it rewards people who are good at taking tests, not necessarily people who can apply knowledge, and that determining what to ask can be very difficult. Professions with a defined body of knowledge use it to define their exams; we’re working on one, but we don’t have it yet.

But there are also tests that can establish that an applicant can accomplish some set task. For instance, to become a certified pump repairman, the applicant can be given a broken pump and expected to fix it. It’s hard to argue with that approach. Nevertheless, many technical communicators we’ve spoken to get nervous about the very idea of testing. So what else is there?

Level of Education

One possible requirement is that an applicant complete one or more prerequisite courses. For instance, we could declare that anyone who has earned one of a list of degrees, attended certain courses from an approved list of training companies, or accumulated a certain number of continuing-education credits in approved subjects can be certified. The strength of this approach is that we can piggyback on existing standards, which is quick and economical. The weakness is that it rewards taking courses, not doing work. (And who approves the lists of courses, degrees, and approved schools? That is accreditation, and it’s an equally vexing issue.)

Long-time working professionals hate this approach, pointing out that many have become technical writers with no formal education in the field. Conversely, academics and holders of technical-communication degrees naturally want credit for what they’ve accomplished.

Breadth or Depth of Experience

Another possible requirement is work experience. For instance, we could declare that anyone with N years of experience in the field can be certified. The strength of this approach is that we directly tie competence to holding down a job, for if you’ve done the work, you can do the work. The weakness is that the actual skills of people paid as “technical writers” can vary widely. And how do we determine the value of N?

Unsurprisingly, roles are reversed here. Working professionals love the idea of getting credit for experience; new graduates hate it.

Quality of Work Portfolio

Another possibility is to require a portfolio of documents, work products, or both. Now, this is familiar ground for STC and its members: it resembles the publication competitions conducted by a number of chapters and by the Society itself. The strengths of this approach are that it directly measures the ability of an applicant, and that we already know how to assess portfolios and have an infrastructure in place. We could even consider student portfolios. The weaknesses are the subjectivity of judging (a known issue with competitions), and that, like the competitions, the process doesn’t scale well: the more applicants we get, the more judges we need, and the greater the cost of assessment.

Strength of CV

Finally, we could assess the curriculum vitae of an applicant, in the manner of a university tenure committee. Again, STC has experience with this process; it resembles our Associate Fellow and Fellow nomination process. The strength of this approach is that we can combine education, experience, and endorsements (references). The weaknesses are possible breaches of anonymity and that, like portfolio assessment, the process can be subjective and doesn’t scale well.

Which One?

So, which method should we use for certifying technical communicators? We think none of them alone. But why not a combination of at least two or three? The combination that would give the most credible and prestigious certification is all of the above. However, we also realize that for an applicant, meeting all these requirements would be the most difficult approach.

The CTF would love to hear your thoughts on this question in particular and certification in general. What do you think?