How to Tell Whether Writing Instruction Works
Complaints about students' poor writing skills have prompted many colleges to create new programs or adopt curricular changes, but do these efforts work?
As writing program directors gathered Thursday at the annual meeting of the Modern Language Association, many voiced confidence that their efforts are making a difference. But at one of the meeting's kickoff sessions, several of these officials worried that views of their success rest more on hunches and intuition than on solid evidence. That may be changing, however, as composition scholars described a range of projects designed to test the effectiveness of their efforts. Some said they see a shift in composition away from theory and toward more practical research on student learning and instructor strategies.
"For writing centers and programs, the dearth of empirical research is dangerous," said Linda S. Bergmann, director of the Writing Lab at Purdue University. Too much of what writing instructors believe is based on "lore," she said. At a time of political demands for assessment and commercial companies promising quick results if they take over tutoring services, writing instructors need evidence of what works, she said.
The research projects described at the meeting, in Chicago, are generally small scale, involving one or two campuses each. But those conducting them -- and audience members -- said there was a need for more such studies, and for efforts to enlarge and replicate some of those being conducted.
The research projects presented covered topics including:
- How writing tutors and students set up relationships and agendas. Laurel Reinking of Purdue has been studying the conversations (through transcripts) between tutors and those they help in the writing center. Issues related to how students express their needs and accept (or reject) advice are crucial to the success of these tutoring programs, Reinking said. Her hope is to identify ways that tutors can get the information they need about students' needs without just giving in to what students say they want. "The bottom line is: We need to know what makes the agenda-setting part of this relationship work," she said.
- How peer advising on writing changes student learning. Dara Rossman Regaignon, director of writing at Pomona College, is testing the impact of "writing fellows," two students who are assigned to a course and who review student writing assignments and suggest revisions. Pomona is testing the impact of this approach by conducting surveys of students and professors in similar classes with and without the fellows, and by having outside experts examine portfolios of student writing in classes with and without writing fellows. The early results are encouraging, and suggest that the gut feeling of many that writing fellows help is something that can be backed up, Regaignon said.
- How teaching assistants teach writing. E. Shelley Reid, director of composition at George Mason University, is exploring which skills teaching assistants are confident in, and which they aren't, after they start teaching. Reid is also conducting surveys to see how much TAs use the pedagogy they are given in orientation programs prior to teaching. Among early findings: First-year male TAs have difficulty balancing time demands with responding to student writing. Second- and third-year female TAs are more likely to worry about pressure to give students higher grades than they think are deserved.
Chris Anson, director of the Campus Writing and Speaking Program at North Carolina State University, said that there were many reasons to support such research projects. Politically, he said, writing program directors need to be able to defend their programs. But educationally, he said, the reality is that research could find flaws in current practice. "We need to be ready to abandon cherished practices if they don't work," he said.
The projects discussed suggest "a reinvigoration of our research agenda," Anson said, one that could ultimately get at what really matters: finding out "what really works and what doesn't work."