How to Test Writing

The College Board has revamped the tests used by students at many colleges to either place out of introductory composition or earn credit for the course. The changes involve an additional type of essay -- more research-oriented and less philosophical -- as well as shifts in the multiple choice questions.

July 8, 2010

The changes follow modest dips in recent years in the number of students taking the tests (last year the figure was 35,000), as many colleges have rejected the idea of using a standardized test for writing placement and credit. While some college officials said that the College Board's move was a step in the right direction, others said it did not go far enough and predicted that skepticism about evaluating writing competence through tests would only grow.

The tests in question are the composition exams of the College Board's College-Level Examination Program, known by its acronym, CLEP. Currently three tests are used for composition: English composition with essay, English composition, and freshman college composition. They will be replaced by two college composition tests: one with multiple choice questions and essays, and one in which colleges can create and score their own essays.

The changes "are meant to bring the exams more in line with the way composition is taught these days," said Marc Singer, associate director of CLEP. He said that as the College Board saw declines in CLEP participation, it conducted surveys of composition instructors and then did follow-up interviews to get a better sense of what experts thought might be missing.

Several of the changes involve a greater emphasis on context. In the old tests' multiple choice questions, "there might have been a sentence and we asked, 'What is wrong with this sentence?' " Singer said. Now the questions will generally be "more passage-based, with the questions addressing the use of language and rhetoric in context." In examples released by the College Board, many feature multiple questions about passages.

The other major change -- also focused on context -- concerns the essays. The old tests featured one essay, and it fit the general model people associate with standardized writing tests: a prompt says something philosophical, to which test-takers must respond. This essay format is continuing in the new test, and the example released by the College Board features this prompt: "There are no challenges so difficult, no goals so impossible, as the ones we set for ourselves."

But the second essay being added will show "research skills" and the ability of a student to "look at a couple of sources and to synthesize the material into a coherent essay," Singer said.

In the example provided, students are asked to write about copyright issues and are given short excerpts from two articles -- one generally defending copyright regulations and one arguing for more open access to information -- and told to discuss the issues, citing the essays appropriately.

Singer said that the College Board hopes that the changes will increase the number of colleges participating in CLEP for placement or credit granting.

The ACT, the College Board's testing rival, does not have a test equivalent to CLEP's composition exams, but some colleges use ACT scores for placement, and the organization has seen a slight increase in the number of colleges doing so (although the data don't specify whether the writing scores or other sections are being used). The ACT is also reporting significant increases in the number of colleges, generally two-year institutions, using its COMPASS placement exams, but those data again span all disciplines, not just writing.

Among leading composition experts, enthusiasm for standardized testing in writing has long been minimal and the reaction to the CLEP changes was lukewarm.

Doug Hesse, professor and executive director of writing at the University of Denver, said that the shifts made sense within the framework of what a standardized test can do. "Any writing test that includes more actual writing is going to be a better test, provided the prompts are well-designed and the scoring guides are smart," he said. "That makes me feel better" about the new tests, he added. Denver does not currently use CLEP to award credit or place students, and while Hesse said he would study the new test, he doubted that policy would change. He said that from a scan of the comments on the e-mail lists of writing program administrators, it appears his colleagues are also dubious of starting to use CLEP.

He said that there are "unavoidable limitations in the kinds of writing skills" that can be demonstrated in these kinds of tests. "It's better to have students write than to complete multiple choice exams, if you want to evaluate writing," he said.

While Hesse gave some credit to the College Board, Les Perelman, director of the Writing Across the Curriculum Program at the Massachusetts Institute of Technology, was more critical. MIT does not award credit based on CLEP and Perelman's views on writing tests are summed up in a forthcoming volume, where his essay is called "Mass Market Writing Assessments as Bullshit."

He noted that while the College Board expects praise for adding a new form of essay, it is keeping the style of essay many educators find to have little value. "The first essay is the same kind of general argumentative stupid one-sentence prompt that is used on the SAT writing section," he said. (Perelman regularly coaches students on how to write laughably poor essays that nonetheless appeal to the College Board's scoring rubric and win top ratings.)

After reviewing the College Board's sample questions, he also said they showed problems. For instance, one question asks what the letters "n.p." mean in a term paper: "(A) the source has several publication dates; (B) no page number for the quotation is available; (C) the quotation is from section n.p. of a source.... (D) a new paragraph begins here in the quotation; (E) the quotation is from section n.p. of the Constitution."

The correct answer is "no page," but Perelman said that's true only under the style guide of the American Psychological Association. Under the style guide of the Modern Language Association, "n.p." has a different meaning (no place of publication, or no publisher), and a writer would use "n. pag." to indicate the lack of a page number. Perelman said he saw no reason the College Board should be deciding which style guide is superior. More broadly, he questioned why quizzing students on such matters was a good way to judge whether a freshman could skip an introductory composition course.

"These kinds of details are now handled primarily by word processors," he said. "Students should be learning more important things than fourth-level differences between the two citation styles."

Singer of CLEP said that officials realized that there could be differences of opinion on the best style guide. But he said that "a student should be able to figure out [the answer]. You're not going to spend your life in MLA-world."

More generally, he said that skeptics of testing may not see the value in helping students earn college credit for prior learning, as measured through testing.

The skeptics of CLEP, Singer said, can be found at "a lot of schools that are saying that you need to be branded with the experience that the school is all about." But he said that "on the other hand, there's an opposing force, which addresses the economic realities of schools' budgets and students' budgets and ability to finish, so another group of schools is saying what's the best way to maximize our dollars." And those colleges are finding that letting students demonstrate competency sufficient to skip the introductory writing course has its benefits, he said.

For some new students, "it's a confidence-builder" to obtain credit or advanced standing on the basis of a good CLEP score, he said. "The test is not meant to teach anyone anything, but to see if there is something they have picked up on their own. They want to come in with some validation for what they have learned -- that goal [of a college degree] is a little closer and they see they can do it."

