The National Census of Writing on Monday released the results of its ambitious survey on writing centers and programs in the U.S., giving administrators, faculty members and researchers an open-access view of the national landscape of how writing is taught.
The release is the culmination of a nearly three-year effort to survey every college and university in the U.S. on the state of their writing programs, how they identify and support underprepared students, the administrators in charge, and more. Writing program administrators have for years toyed with the idea of such a survey, but the daunting logistics of the effort have served as a major deterrent.
The researchers behind the census say the survey is the most comprehensive of its kind conducted in more than three decades. In total, the census contains data from 900 colleges -- three-quarters of them four-year institutions, the rest two-year institutions -- that answered all or some of the 200-question, eight-section survey. Colleges had to complete at least one section for their responses to be collected.
“This will allow finer-grained analysis of the correlation between teaching, curricular, programmatic and administrative configurations and what students are learning as a result of those programs,” said Douglas Hesse, president-elect of the National Council of Teachers of English. “People are happy to make all sorts of claims about the efficacy of first-year writing programs. In fact, first-year writing is not a monolithic thing across the country. We know a fair amount in terms of differences in curricula, but what we don’t know much about -- except anecdotally -- is the conditions in which those programs exist.”
The database is housed at Swarthmore College, but a full list of contributors can be found here. It includes the Andrew W. Mellon Foundation, which provided funding to hire a developer.
To build on the data, which was collected between March 2013 and October 2014, the researchers plan to redo the census every four years. They are not publishing an official summary of findings, choosing instead to treat the data as a resource.
“The idea was to create a space where we could gather data so it could be open access and sustainable so that researchers and administrators could come and use the data in whichever way they like,” said Jill Gladstein, associate professor of English literature and director of Swarthmore’s Writing Associates Program. She served as project leader for the census.
Members of the NCTE are one potential audience. Hesse, also executive director of writing at the University of Denver, said he suspected the NCTE will “look with interest at the findings” in search of opportunities for follow-up studies.
For most of the faculty members and developers involved, the census is an additional commitment on top of full-time positions. Asked why he and others volunteered their time to design the survey, identify contacts at thousands of colleges and comb through data from their responses, Brandon D. Fralix, associate professor of English at Bloomfield College, stressed the importance of having easily accessible data.
“The discipline has largely used case studies of single campuses, anecdotal evidence and smaller-scale empirical research to answer questions about writing program administration, but the census will provide answers to some of the most commonly asked questions in writing program administration,” Fralix said in an email.
In other words, the census can serve as a starting point for an administrator who needs to make a case for his or her writing program or a researcher looking for a simple data point, Gladstein said. By visiting the census website, those and other users can base their decisions on data rather than anecdotes. With a few clicks, visitors can see that 39 percent of writing centers at four-year institutions are freestanding units, for example, or that 63 percent of two-year institutions allow students to place out of the first-semester first-year writing requirement.
Dozens of colleges, from Central Arizona College to Purdue University, also agreed to have their institutions publicly associated with their results. Institutions seeking to emulate other colleges can read how Grinnell College offers faculty development in the form of optional workshops, or that the average consultation at the University of Dayton writing center lasts 45 minutes, among other examples.
First-year writing is more likely to reside outside an official writing program or department (58 percent) than within one (42 percent), according to the census. At colleges in the former group, English departments are overwhelmingly responsible for teaching first-year writing (71 percent).
Over all, four-year institutions reported mostly similar results, with only slight variations between institutions of different sizes, Carnegie classifications and geographic regions. The exception is historically black colleges and universities (HBCUs), which in some cases reported data that tracked more closely to data collected from two-year institutions.
Fralix pointed out that while about one-third of all four-year institutions surveyed said they offer a writing major, only 20 percent of respondent HBCUs did. At 30 percent, minority-serving institutions in general also lagged behind on that measure.
“This may have some impact on the diversity of the discipline itself, given that [minority-serving institutions] educate such a large proportion of minority students,” Fralix wrote. “And there are other such correlations and comparisons that we could make such as class size, status of instructors and types of developmental writing.”
Writing centers at HBCUs, however, are more likely to be staffed by professional tutors. While most writing centers at four-year institutions (91 percent) are staffed with undergraduate students, more than half of HBCUs said they use professional tutors. Faculty members from other departments are also more likely to help out in the writing centers at those institutions, the census shows.
Gladstein said she was surprised by the lack of shared terminology in the field. The census was previously known as the Writing Program Administration Census Data Project, but it changed its name after several colleges responded to say they didn’t have an official writing program.
“For me, it’s raised all kinds of questions about how individuals within programs define themselves,” Gladstein said. In order to conduct a national survey on writing instruction, she said, “We have to figure out -- what are our similarities?”