In 1991, Elliot L. Gilbert, chair of English at the University of California at Davis, went to the hospital for what ought to have been some fairly routine surgery. Mistakes were made. He died in the recovery room. His widow, Sandra M. Gilbert (also a professor of English at Davis), brought suit – a case finally settled out of court, but not before she piled up a mound of documents that gave her some sense of just what had happened. In Wrongful Death: A Memoir (Norton, 1995), she wrote: "Responsibility in the often miraculous but always highly technologized realm of modern medicine is so dispersed, so fragmented, that finally it accrues to no one."
Years earlier -- long before her work with Susan Gubar on the landmark work of American feminist literary criticism The Madwoman in the Attic: The Woman Writer and the Nineteenth-Century Literary Imagination (Yale University Press, 1979) -- Gilbert had worked on a monograph she planned to call “‘Different, and Luckier’: Romantic and Post-Romantic Metaphors of Death.” The phrase in the title came from “Song of Myself,” in which Whitman declaimed that “to die is different from what anyone supposed, and luckier.”
A cosmic sentiment like that cannot do much to mitigate grief. But Gilbert dug the notes for that abandoned project out of her files, and has just published a remarkable book, Death’s Door: Modern Dying and the Ways We Grieve (also from Norton), which revisits her longstanding interest in elegy.
Calling Death’s Door a work of literary criticism, while accurate enough, seems very incomplete. Like Wrongful Death, it recounts the story of her husband’s death. It also offers a historical meditation on the emergence of what Gilbert calls the “technologies” of death and grief. (The famous “five stages” of confronting mortality, while originally meant as descriptive, now seem at times both prescriptive and somewhat compulsory. Woe to anyone who doesn’t follow the script.)
It is a rich book, and a deep one – and also, at times, somewhat terrifying to read, for it is the work of someone for whom “the denial of death” is simply not an option. While reading Death’s Door, I contacted the author to ask her a few questions. The following interview took place by e-mail.
Q: The book seems like a hybrid -- part memoir, part cultural history, part critical study. Those categories correspond reasonably well to the three big sections you've divided it into, but there are also margins of overlap. How did you come to understand just what kind of book Death's Door was turning out to be?
A: Yes, the book is indeed a kind of hybrid, or as my son put it, it's an attempt at "genre-bending." But I hadn't planned it that way. In fact, I began the work as a fairly traditional project in literary criticism. My goal was to explore what I called "the fate of the elegy" in the 20th century and beyond -- although even as I formulated that project the ambiguity of the word "fate" had begun to haunt me.
Did I intend to explore the evolution of the modern and contemporary elegy? Or did I want to explore modern ideas about fate in the elegy? If the latter, I was already moving beyond purely literary analyses into cultural studies. In any case, however, once I began researching and writing Death's Door it became clear to me that I was no longer able to do critical and scholarly work in the way I had.
As you've noted, following my husband's unexpected death in 1991, I felt compelled to tell his story -- including what I'd been able to reconstruct about the medical negligence that evidently killed him -- in my memoir, Wrongful Death. But the very mode in which I'd written that book, along with the elegiac poems clustered around it, had changed my way of writing. I had wanted to bear witness to my husband's loss of life and to my own grief. And now, as I began drafting Death's Door, I was still working in a testimonial mode, although with much greater self-awareness and, I think, with much larger ambitions, for now I was using my own case as an entry into meditations on the cultural formulations that shape our mourning and on the literary forms in which we mourn.
Of course, I should note here that, as a number of commentators have observed, beginning in the late '80s and '90s many academics in my own fields (literary studies, women's studies) had started writing autobiographically, testing their postulates, in effect, on their own pulses. So I wasn't alone in my sense that I needed a new and different way to approach my subject. And of course, as a feminist critic, I'd always argued that the personal was not only the political but the poetical.
Nonetheless, I suspect that the urgency of my need to "genre-bend" developed out of what I experienced, in early widowhood, as an urgent (indeed a surprisingly urgent) responsibility to testify about my own family's sorrow.
Q: On the one hand, you offer a phenomenology of death and grieving; that is, a description of the kinds of experience of care, fear, concern, etc. that seem to be just about as inescapable as mortality itself. On the other hand, you draw quite a bit on social and cultural history. That serves as a reminder that our vocabulary (and, to whatever degree, our experience) has been conditioned or "constructed." So at the risk of asking you to make an absurd choice: Which comes first? Which is definitive? The intimate level of experience or the social level of cultural meanings?
A: I think both are equally important but they're also inextricably related. As you point out, there's a phenomenology of death and grief that's as inescapable as mortality itself, and this manifests itself cross-culturally as well as trans-historically.
As I try to show in the book, almost every society imagines death as a kind of "place" that one enters, and all around the world people are haunted by what's often experienced as the nearness of the dead. (Are the dead on what George Eliot called -- in a different context -- "the other side of silence," merely separated from us by no more than "a thin piece of silk," as one of W.G. Sebald's narrators puts it?) And wherever the dead are believed mysteriously to survive, cultures have also experienced them as needy, often angry or sorrowful.
Throughout history, too, and worldwide, mourners have structured grief in special ways, elaborately patterning the prayers or diatribes with which the bereaved implore or reproach the gods, the fates, and the dead themselves. And again, almost everywhere the spouses -- especially the widows -- of those who die occupy a crucial place in ceremonies of grief. So these are all matters I investigate in the first major section of Death's Door, which takes as its intellectual starting point Zygmunt Bauman's comment that the omnipresence of "funeral rites and ritualized commemoration of the dead" along with the "discovery of graves and cemeteries" is thought by anthropologists to constitute "proof that a humanoid strain ... had passed the threshold of humanhood."
At the same time, as I argue throughout the second major section of Death's Door, history "makes" death, shaping both how we die and how we mourn. So our persistent human needs to imagine the fate of the dead and to pattern grief in special ways are formed, informed, and reformed by all kinds of cultural changes.
In most English-speaking nations, for instance -- and these are the societies that concern me most in the book -- the traditional visions of God and the afterlife that had already begun to disappear in the 19th century continued to erode throughout the 20th century, at least among the educated classes that produce poets, novelists, journalists, and film-makers. And as historians and sociologists from Philippe Ariès to David Moller have shown, everyone, no matter the class, dies differently now than in the past -- more privately yet often more technologically, in hospitals equipped with unnervingly complex machinery.
All of us, too, share a recent history of mass "death events," from the killing fields of the first World War to the Holocaust and Hiroshima in the second World War and on through Vietnam to the "shock and awe" of the present -- and surely this history has re-made our ideas of death and dying while changing our relationship to grief. The skeletons in the trenches of No Man's Land and the corpses charring in the crematoria of Auschwitz point down to an abyss of nihilism rather than up to heaven. But if we no longer hope for a redemptive heaven, then maybe we don't want to talk about death, maybe we need to deny its imminence.
Yet even while our theology and technology have grown increasingly nihilistic, we're quite literally haunted by images of the dead that refuse to leave us because they reside in celluloid or virtual permanence, populating our photo albums, movie screens, home videos, even digital libraries. How does this conflict between the real absence and the virtual presence of the dead change our modes of mourning?
Finally, then, as I worked on Death's Door I became increasingly conscious that the need to grieve whose urgencies I shared with mourners everywhere had a special 20th-century shape. For one thing (and this helped me understand a number of elegies I studied), I experienced my mourning as curiously embarrassing to many people I met, as if, because we fear death, we fear mourners too and suspect their sorrow might be somehow contaminating. In response to such embarrassment, I guess I sometimes become defiantly testimonial about my loss, both in prose (in Wrongful Death, for instance) and in poetry (in the elegies I published in my collection Ghost Volcano). And countless memoirists have done the same thing (most recently and famously Joan Didion) along with contemporary poets from Allen Ginsberg ( Kaddish) to Sharon Olds ( The Father), Ted Hughes ( Birthday Letters), and Donald Hall ( Without).
Q: The deep, dark core of the book is the contrast you make between "expiration" and "termination." It seems like that distinction is where the elements of memoir, cultural history, and literary analysis all link up.
A: "The deep dark core of the book." Thank you. That's a really incisive and insightful point because the basic argument of the book -- certainly the argument about the "fate of the elegy" -- began with my own experience of that distinction.
In chapter six, I tell the story of two episodes that powerfully moved me. In the first, the surgeon who was in charge of my husband's case testified that he had arrived at the hospital when his patient (my husband) was "terminating" -- i.e., dying. In the second, a nurse, more than three decades earlier, told me that my first child (a very premature baby who survived a few days) had "expired" -- i.e., died. After the doctor talked about "termination," the two words became so resonant for me that I brooded on them for quite some time.
To "terminate" is to come to a flat end. To "expire" is to breathe out something -- a breath that represents, perhaps, a soul. So each word seemed to me to have key metaphysical implications. "Termination," I decided, is modernity's definition of death; "expiration" the more traditional western (Christian) notion. For "termination" leads to Beckett, to what in Waiting for Godot Lucky calls "the earth abode of stones" while "expiration" empowers Milton, whose "Lycidas" has breathed out a soul that ultimately lands in heaven, where "entertain him all the saints above." So "termination" is terrifying, makes death almost unspeakably scary, and leads toward horror, repression, and denial, while "expiration" leaves us with some hope -- or anyway it used to.
Q: Your book isn't anti-technology, as such. But I did get the sense you were making the case for literature (and poetry in particular) as capable of providing something unavailable from the medical system. Almost an old-fashioned notion of the humanities as corrective -- if not to science, then to the scientistic or technocratic mentality. Or is that reading of your project off, in some way?
A: I'm not sure that I want to make a case for poetry, and more generally the humanities, as corrective, curative, or medicinal. But I do think I want to note that poets (and novelists and memoirists too, but especially poets) have refused to deny death and grief in a culture that finds these tokens of inescapable mortality at the least embarrassing because at the worst horrifying.
Poets testify, bear witness to the particulars of pain, the details of loss that technology flattens or sometimes even seeks to annihilate with words like "termination." I don't mean to suggest that those who work among the dying -- doctors in hospitals, medics on battlefields -- don't notice these details, but the language of science is in its way sedative, just as medicine's goals are (often appropriately) sedative and palliative. Poets remind us of what really happens. They don't take away the pain: on the contrary, they teach us how to feel it, to meet it, to know it.
Q: With all the quotations you incorporate, Death's Door serves (de facto anyway) as a kind of anthology. Was there a particular poem or passage that you recall as really being definitive for you? (In whatever way you'd construe "definitive" as meaning: epiphanic, consoling, etc.)
A: No, there was no one poem that dramatized for me the practice of contemporary elegists, although there were several works that functioned for me as aesthetic manifestos -- most notably, perhaps, W.C. Williams's "Tract" (about "how to perform a funeral") and Stevens's "The Owl in the Sarcophagus" (about the "mythology of modern death" and its "monsters of elegy").
But before I began drafting Death's Door I had put together an anthology of traditional and modern elegies in a book called Inventions of Farewell, and in assembling this volume I found that, taken together, the elegies poets have produced from the mid-twentieth century onwards functioned for me as radiant examples of what I mean when I say that recent poets insist with unprecedented passion on the particulars of pain and grief.
Think of the resonant specifics Thom Gunn compiles in The Man with Night Sweats or, earlier, the details Ginsberg unflinchingly offers in Kaddish, Olds in The Father, Hall in Without. But I could go on and on about this historically "monstrous" elegiac genre, which dates back to the poems Wilfred Owen, Siegfried Sassoon and others sent back from the Front during the first World War or, even earlier, to Hardy's Poems of 1912-13. These writers won't let us forget -- as Tolstoy wouldn't either, in The Death of Ivan Ilyich -- that death and its sorrows are often excruciating physical processes whose course usually binds and bends the spirit to the body's sufferings.
Such art may not be "consoling" in the traditional sense, but it consoles because it confronts pain and because in doing so it helps us accept loss, lets us know we aren't alone, and teaches us to hope that if we can articulate our suffering we can somehow master it or at least pass through and beyond it.
Paula M. Krebs has been a professor of English at Wheaton College, a selective New England liberal arts college, for 15 years, since earning her Ph.D. at Indiana University. Her sister Mary Krebs Flaherty has been an administrative assistant at Rutgers University’s Camden campus for a year longer than Paula has been at Wheaton. Last fall Mary taught her first course, Basic Writing Skills III, on the inner-city campus of a two-year college, Camden County College. She teaches on her lunch break from her job at Rutgers. Mary has been taking evening classes toward her M.A. for three years, ever since she finished her B.A. at Rutgers via the same part-time route. This article is the first in a series in which Paula and Mary will discuss what it’s like to teach English at their respective institutions.
Paula: My place is about as different from yours as can be, I know. I often find myself longing for your city setting, your students who are so motivated. At the same time, I realize that teaching my students is a real privilege -- I can push them in exciting ways. Wheaton’s admissions standards keep going up, and I’m starting to see it in my classes. This semester my sophomores in English 290, Approaches to Literature and Culture, seemed to finish with a really good sense of how they can use literary criticism and theory in writing essays for their other English classes. They weren’t intimidated by the critics and theorists they were reading -- they actually used them well in their final essays. If only they could follow MLA style and prepare a proper Works Cited!
Mary: MLA style is something my students can do. They were able to pick up on it easily -- I think that’s because they take well to the idea of structure. They like the five-paragraph theme. The part of the class they had the most difficulty with was the content of their papers -- they couldn’t find their voice at all, let alone critique literary theorists.
Paula: Oh, mine had plenty of voice. Sometimes I wished for a bit less voice and a bit more work. I think sometimes that the sense of entitlement many of them have means that they don’t necessarily understand that their word isn’t always good enough. They need to cite some authorities, place their work in a larger context, indicate their scholarly debts. They have pretty good skills coming in, so it’s sometimes difficult to make clear to them how they can push to the next level. If they’ve been getting A’s on their five-paragraph themes in high school, they find it difficult to understand why their first efforts, in English 101 or a beginning lit class, are producing C+’s or B-’s. Some are grade-grubbers, but most just don’t understand what makes a college A.
Mary: Just a week before the semester ended, one of my students finally understood what makes a college B. In the beginning of the term, her grades were “R’s,” which means that the paper cannot receive a grade; it must be revised. When she failed her midterm portfolio, she cried to me that she couldn’t see her mistakes so she couldn’t fix them. She continued to work on her essays and revise them, over and over. Close to the end of the semester, she approached me before class and said, “Mary, please take a look at this paper that someone wrote for another class and tell me what you think.” Knowing that I was being set up, I quickly looked over the essay. Out of the corner of my eye, I could see her smirking, so I told her “You’re right, I wouldn’t have graded this paper.” She shouted, “I knew it! Look at the subject-verb agreement error in the first sentence. There’s even a fragment in the introduction!” Not wanting to trash another teacher’s grading, I pointed out to her that the most important thing was how she had changed since midterm -- that she was now able to identify mistakes so she could correct her own. She passed the course with a B and I am so proud of her.
Paula: See, that’s what’s so great about teaching! I knew you’d love it. That pleasure when you see the lightbulb go on over their heads. That’s the same at Camden County as at Wheaton. But I think you have to do a different kind of work than I have to do in order to get it to happen. In some ways, both our students believe in the value of what we’re teaching, but we both have to do some convincing as well.
Mary: Mine need convincing that what they have to say is important and that saying it in an academic format is worth the effort. Most of the Camden campus students are from Camden city, recently awarded the dubious distinction of being named the most dangerous city in the nation for the second year in a row. They are typically from poor or working class families whose parent(s) may or may not have a high school diploma; many students are parents themselves, and most are minorities: African-American, Latino, or Asian-American. Many CCC students test into basic writing or reading skills classes, which is an indicator that their high school education did not prepare them well enough for college. In an informal discussion, I asked several students about their high school experience, and they claimed that they were never asked to write for content in English class -- the focus was on grammar and fill-in-the-blank or short answer tests. This explains why they are more comfortable with the grammar portion of the writing skills class, as well as how easily they grasp the five-paragraph essay structure. Following the rules is easy for these students, but finding something to say is much more difficult. I am there to assist them in this writing process and hopefully to convince them that they can grow as individuals and be successful in the academic community.
Paula: I have to do some of that, too. But we’re starting from such different places. Mine come to college because it’s expected of them. They need convincing that a liberal arts education really can bring them advantages after they graduate -- that digging into how a literary text works, learning to put together a really well researched essay, or understanding the connections between Darwin and the poetry of Robert Browning is worth the money their parents are investing and the time the students are investing. In some ways, it’s a harder sell than yours. I have the luxury of time, though, in a way you sure don’t. My teaching is my full-time job, and my teaching load is relatively light. I can’t even imagine what it is like for you, working full time and taking classes while learning to teach in probably the most challenging of circumstances -- as an adjunct at a community college. I know how hard it is for you to keep all these balls in the air. Do you think it’ll be worth it in the long run?
Mary: I certainly hope so. That’s the reason I’m teaching this year -- to find out the answer to that very question.
Paula M. Krebs and Mary Krebs Flaherty
Paula and Mary's next exchange will be about the out-of-classroom work they can ask of students.
Normally my social calendar is slightly less crowded than that of Raskolnikov in Crime and Punishment. (He, at least, went out to see the pawnbroker.) But late last month, in an unprecedented burst of gregariousness, I had a couple of memorable visits with scholars who had come to town – small, impromptu get-togethers that were not just lively but, in a way, remarkable.
The first occurred just before Christmas, and it included (besides your feuilletonist reporter) a political scientist, a statistician, and a philosopher. The next gathering, also for lunch, took place a week later, during the convention of the Modern Language Association. Looking around the table, I drew up a quick census. One guest worked on British novels of the Victorian era. Another writes about contemporary postcolonial fiction and poetry. We had two Americanists, but of somewhat different specialist species; besides, one was a tenured professor, while the other is just starting his dissertation. And, finally, there was, once again, a philosopher. (Actually it was the same philosopher, visiting from Singapore and in town for a while.)
If the range of disciplines or specialties was unusual, so too was the degree of conviviality. Most of us had never met in person before -- though you’d never have known that from the flow of the conversation, which never seemed to slow down for very long. Shared interests and familiar arguments (some of them pretty esoteric) kept coming up. So did news about an electronic publishing initiative some of the participants were trying to get started. At least once during each meal, someone had to pull out a notebook to have someone else jot down an interesting citation to look up later.
In each case, the members of the ad hoc symposium were academic bloggers who had gotten to know one another online. That explained the conversational dynamics -- the sense, which was vivid and unmistakable, of continuing discussions in person that hadn’t started upon arriving at the restaurant, and wouldn’t end once everyone had dispersed.
The whole experience was too easygoing to call impressive, exactly. But later -- contemplating matters back at my hovel, over a slice of black bread and a bowl of cold cabbage soup -- I couldn’t help thinking that something very interesting had taken place. Something having little to do with blogging, as such. Something that runs against the grain of how academic life in the United States has developed over the past two hundred years.
At least that’s my impression from having read Thomas Bender’s book Intellect and Public Life: Essays on the Social History of Academic Intellectuals in the United States, published by Johns Hopkins University Press in 1993. That was back when even knowing how to create a Web page would raise eyebrows in some departments. (Imagine the warnings that Ivan Tribble might have issued, at the time.)
But the specific paper I’m thinking of – reprinted as the first chapter – is even older. It’s called “The Cultures of Intellectual Life: The City and the Professions,” and Bender first presented it as a lecture in 1977. (He is currently professor of history at New York University.)
Although he does not exactly put it this way, Bender’s topic is how scholars learn to say “we.” An intellectual historian, he writes, is engaged in studying “an exceedingly complex interaction between speakers and hearers, writers and readers.” And the framework for that “dynamic interplay” has itself changed over time. Recognizing this is the first step towards understanding that the familiar patterns of cultural life – including those that prevail in academe – aren’t set in stone. (It’s easy to give lip service to this principle. Actually thinking through its implications, though, not so much.)
The history of American intellectual life, as Bender outlines it, involved a transition from civic professionalism (which prevailed in the 18th and early 19th centuries) to disciplinary professionalism (increasingly dominant after about 1850).
“Early American professionals,” he writes, “were essentially community oriented. Entry to the professions was usually through local elite sponsorship, and professionals won public trust within this established social context rather than through certification.” One’s prestige and authority were strongly linked to a sense of belonging to the educated class of a given city.
Bender gives as an example the career of Samuel Bard, the New York doctor who championed building a hospital to improve the quality of medical instruction available from King’s College, as Columbia University was known back in the 1770s. Bard had studied in Edinburgh and wanted New York to develop institutions of similar caliber; he also took the lead in creating a major library and two learned societies.
“These efforts in civic improvement were the product of the combined energies of the educated and the powerful in the city,” writes Bender, “and they integrated and gave shape to its intellectual life.”
Nor was this phenomenon restricted to major cities in the East. Visiting the United States in the early 1840s, the British geologist Charles Lyell noted that doctors, lawyers, scientists, and merchants with literary interests in Cincinnati “form[ed] a society of a superior kind.” Likewise, William Dean Howells recalled how, at his father’s printing office in a small Ohio town, the educated sort dropped in “to stand with their back to our stove and challenge opinion concerning Holmes and Poe, Irving and Macaulay....”
In short, a great deal of one’s sense of cultural “belonging” was bound up with community institutions -- whether that meant a formally established local society for the advancement of learning, or an ad hoc discussion circle warming its collective backside near a stove.
But a deep structural change was already taking shape. The German model of the research university came into ever greater prominence, especially in the decades following the Civil War. The founding of Johns Hopkins University in 1876 defined the shape of things to come. “The original faculty of philosophy,” notes Bender, “included no Baltimoreans, and no major appointments in the medical school went to members of the local medical community.” William Welch, the first dean of the Johns Hopkins School of Medicine, “identified with his profession in a new way; it was a branch of science -- a discipline -- not a civic role.”
Under the old regime, the doctors, lawyers, scientists, and literary authors of a given city might feel reasonably comfortable in sharing the first-person plural. But life began to change as, in Bender’s words, “people of ideas were inducted, increasingly through the emerging university system, into the restricted worlds of specialized discourse.” If you said “we,” it probably referred to the community of other geologists, poets, or small-claims litigators.
“Knowledge and competence increasingly developed out of the internal dynamics of esoteric disciplines rather than within the context of shared perceptions of public needs,” writes Bender. “This is not to say that professionalized disciplines or the modern service professions that imitated them became socially irresponsible. But their contributions to society began to flow from their own self-definitions rather than from a reciprocal engagement with general public discourse.”
Now, there is a definite note of sadness in Bender’s narrative – as there always tends to be in accounts of the shift from Gemeinschaft to Gesellschaft. Yet it is also clear that the transformation from civic to disciplinary professionalism was necessary.
“The new disciplines offered relatively precise subject matter and procedures,” Bender concedes, “at a time when both were greatly confused. The new professionalism also promised guarantees of competence -- certification -- in an era when criteria of intellectual authority were vague and professional performance was unreliable.”
But in the epilogue to Intellect and Public Life, Bender suggests that the process eventually went too far. “The risk now is precisely the opposite,” he writes. “Academe is threatened by the twin dangers of fossilization and scholasticism (of three types: tedium, high tech, and radical chic). The agenda for the next decade, at least as I see it, ought to be the opening up of the disciplines, the ventilating of professional communities that have come to share too much and that have become too self-referential.”
He wrote that in 1993. We are now more than a decade downstream. I don’t know that anyone else at the lunchtime gatherings last month had Thomas Bender’s analysis in mind. But it has been interesting to think about those meetings with reference to his categories.
The people around the table, each time, didn’t share a civic identity: We weren’t all from the same city, or even from the same country. Nor was it a matter of sharing the same disciplinary background – though no effort was made to be “interdisciplinary” in any very deliberate way, either. At the same time, I should make clear that the conversations were pretty definitely academic: “How long before hundreds of people in literary studies start trying to master set theory, now that Alain Badiou is being translated?” rather than, “Who do you think is going to win American Idol?”
Of course, two casual gatherings for lunch do not a profound cultural shift make. But it was hard not to think something interesting had just transpired: A new sort of collegiality, stretching across both geographic and professional distances, fostered by online communication but not confined to it.
The discussions were fueled by the scholarly interests of the participants. But there was a built-in expectation that you would be willing to explain your references to someone who didn’t share them. And none of it seems at all likely to win the interest (let alone the approval) of academic bureaucrats.
Surely other people must be discovering and creating this sort of thing -- this experience of communitas. Or is that merely a dream?
It is not a matter of turning back the clock -- of undoing the division of labor that has created specialization. That really would be a dream.
But as Bender puts it, cultural life is shaped by “patterns of interaction” that develop over long periods of time. For younger scholars, anyway, the routine give-and-take of online communication (along with the relative ease of linking to documents that support a point or amplify a nuance) may become part of the deep grammar of how they think and argue. And if enough of them become accustomed to discussing their research with people working in other disciplines, who knows what could happen?
“What our contemporary culture wants,” as Bender put it in 1993, “is the combination of theoretical abstraction and historical concreteness, technical precision and civic give-and-take, data and rhetoric.” We aren’t there, of course, or anywhere near it. But sometimes it does seem as if there might yet be grounds for optimism.
Amelia, a university sophomore, scores a 60 on her first academic paper. On her second she scores a 60 again. On her third paper, she pulls up to an 80 -- mostly due to extensive rewrites. Yet on her midterm and final, she scores an astounding 90 and 85. Not only are her paragraph structure and use of quotations significantly better, but her ability to sequence ideas and support claims has taken a leap. Even her mechanics (grammar, sentence structure, and punctuation) have improved.
I'd like to say that these two high scores came at the end of the semester; this would prove what an effective instructor I was. Instead, they came at odd times -- the first A came just after the second paper (which scored a D). The solid B paper did come at the end of the semester. The difference was in how the papers were produced. Both the 90 and 85 papers were handwritten in-class timed essays that constituted the midterm and final. The much lower scores were for computer-generated papers that she produced out of class. These, of course, could be rewritten over and over before the due dates.
I'd like to say that Amelia's experience is an anomaly. But I can't. In fact, this semester, 8 of my 20 sophomore English composition students scored significantly better on in-class essays written by hand in a timed situation. Some jumped more than a full grade level. In my three freshman composition classes, almost 20 of 60 students excelled when allowed to write in class rather than compose typed papers on their own time. Indeed, at a large community college in California where I taught for six years, I frequently saw 10 to 25 percent of my developmental- and freshman-level writers do significantly better when asked to compose in class, with a topic given just before a two-hour writing period.
How can I make sense of this? Of course I immediately considered my grading rubric. Was I somehow more relaxed when grading handwritten essays? Possible. But that could not explain jumps from 75 to 90. Yes, I was somewhat easier on misspelled words when grading handwritten essays. Yes, I may have been swayed by a student's handwriting -- in fact, studies have shown that instructors are often influenced to grade slightly higher or lower depending on a student's handwriting. Still, there must have been something more to explain jumps of more than a full grade level.
Finally I typed up a student's handwritten midterm and compared it to two computer-generated essays. The handwritten midterm was so much smoother -- I was shocked. Transitions abounded. Other than a few run-ons, the sentence structure was fluid. One idea followed another. Claims were supported. The writer seemed to have hit a stride that held through the required three pages. The computer-generated essays were passable. The ideas were sound, but the writing seemed awkward in every sense. Setting aside the possibility that my grading was flawed, there were several explanations for this jump.
First, the process of writing in class in a timed situation seemed to discourage the kind of overwrought, constipated writing that some students produce with a typed paper. In my courses, I appeal to the high-context student. After wrangling syllabi for seven years, I've come to the conclusion that I like giving students the necessary information on the front end. After the first class, students walk away with a course outline that gives specific due dates for all papers -- along with general topics. Those who are worried about their ability to produce college-level work may start on a paper ahead of time and rewrite up until the due date.
Although my office hours are busier at the end of the semester, I do notice an influx of students a week before each paper is due. The good news is that some of these students are producing better work -- their essay structure is sound, their now-approved thesis statements are well supported, and their conclusions don't sound tossed together. The bad news is that some of these well-intentioned students work, rethink, and rewrite their papers until the writing becomes stiff and self-conscious. They rehash each sentence, tormenting themselves, rewriting until they can no longer see what works. Suddenly their original draft has become stilted and mechanical -- and the due date is looming.
These students often equate the number of hours spent with their final grade: every weekend they have poured into an eight-page study of the topic should, they assume, translate into a 10 percent jump in grade. Unfortunately, trying to infuse light and spontaneity into a paper that has been reworked several times is impossible. So the end product is dull and overworked -- and their grade less than what they expected.
In-class writing, on the other hand, is a completely different form of exercise. Instead of dumping hours and hours into a format that already feels old and overdone, students are given a topic at the top of the hour. True, some students choke. They deliver half a paper. What is on the page is poorly thought-out and incoherent. Yet some, relieved of the need to think and rethink the topic, find themselves rising to the challenge. After outlining for 15 minutes, they find themselves churning out coherent paragraphs that stand together as a unified essay. I've never been able to predict which way a student will perform. It is only when I've graded their midterm that I can make observations about which process seems to produce the best written work.
Next, handwriting encourages students to focus on the writing process; for those less experienced with computers, keyboarding encourages a focus on the end product. When asked to type up a sample paragraph in a classroom computer lab, all 20 of my English composition students spent more than 15 minutes setting up a document in MS Word: setting margins, choosing a font, centering a title, and typing their names, instructor's name, and class name at the top so that it sat flush right. This left a disappointing 30 minutes of actual composing of text -- and of that, approximately five to nine more minutes were wasted as students insisted on particular line breaks, tried to change the amount of space between lines, and attempted to remove forced underlining of URLs.
Students' questions were not about how to approach the topic but about the particular mechanics of the assignment: how many words they would have to provide, whether they could use grammar- and spell-check, whether the sample was to be single-, one-and-a-half-, or double-spaced, whether one-inch margins were acceptable, and the like. I started to feel like a software instructor instead of an English composition teacher. My frustration was compounded when students either couldn't print out their single paragraphs -- or attempted to e-mail them to me.
Finally, handwriting brings writers closer to their work -- which may encourage excellence in particular students. Daniel Chandler, a scholar at the University of Wales, has done extensive research on how students learn. His article "The Phenomenology of Writing by Hand" comments on the conditions present when writers write by hand rather than by computer -- and the effect on the end product. In effect, the neurophysiological mechanism of each process is different. And although both handwriting and typing are under the control of the central nervous system, the dynamics are noticeably different.
With substantial practice at the keyboard, I do believe that students can become more "fluent" at writing and produce work as creative as that produced by handwriting. In fact, studies often show that students do as well composing on a computer as they do handwriting compositions.
In the end, questions still remain for me. How does the time constraint affect the end product? Do some students simply do better under pressure? Is there something about the timed in-class work that encourages a more focused end product? Does typing a work directly somehow encourage a piecemeal approach? If offered an in-class essay exam with computers, would students then do substantially better than those who chose handwriting? How do typing speed and familiarity with software and hardware affect a student's work?
What about the "power of print"? Isn't it true that students often view a typed paper as an "end product," whereas handwritten work feels like a step in a process? And, of course, how exactly can ideas be more "fluid" with the preferred composition method -- whether it be writing by hand or word processing? With research, more will be revealed. Until then, I will give my students the benefit of both methods. I will continue to offer both in- and out-of-class writing. Those who flourish with the additional time for writing will produce more polished work; those who chafe under the weight of long-term deadlines will rush into the midterm and final to write well -- and ultimately both groups will find the process that produces the best work. Those students who then hone their ability to do both handwriting and word processing may do better in all areas; the resulting degreed professionals may find that both processes serve them well.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.
About 10 minutes into last week's now legendary episode of Oprah (the show that made it to the front page of newspapers; the one that left "memoirist" James Frey on the verge of confessing that he possibly made up his own name, but couldn’t be sure), one part of my mind was riveted to the tube while another part wandered off to conduct an intensive seminar about the whole thing, complete with PowerPoint slides containing extensive quotations from Foucault’s late writings on the "technologies of the self."
This happens a lot, actually. What Steve Martin once said about philosophy also applies to cultural theory: "When you study it in college, you learn just enough to screw you up for the rest of your life."
Well, it turns out that a certain amount of my seminar was just a repetition of work already done in the field that we might well call "Oprah studies." It has a substantial literature, including four academic books and numerous journal articles, most of which I have read over the past few days. Some of it is smart and insightful. Some of it consists of banalities gussied up with footnotes. In other words, it's like Shakespeare criticism, only there isn't as much of it.
Though there's plenty, to be sure. I've now spent more time reading the literature than I ever have watching the show. Some of it has been very instructive. There was, for example, a journal article from a few years ago complaining that other scholars had not grasped Oprah's postmodernity because they had failed to draw on Mikhail Bakhtin’s work on dialogism.
What important results follow from applying Bakhtin? Well, the concept of dialogism reveals that on talk shows, people talk to one another.
We may not have realized that before. But we do now. Scholarship is cumulative.
Indeed, by 2003, there were grounds to think that Oprah was not postmodern, but an alternative to postmodernity. So it was revealed when the first book-length study of the daytime diva appeared from Columbia University Press: Oprah Winfrey and the Glamour of Misery: An Essay on Popular Culture, by Eva Illouz, a professor of sociology and anthropology at Hebrew University of Jerusalem.
"Far from confirming Fredric Jameson's view that postmodern culture lacks emotionality or intensity because cultural products are disconnected from the people who produced them," writes Illouz, "Oprah Winfrey suggests that both the meaning and the emotional intensity of her products are closely intertwined with her narrative authority." Her programs, books, movies, magazine, and other cultural commodities all add up to "nothing less than a narrative work [able] to restore the coherence and unity of contemporary life."
For an example of this redemptive process in action, we might turn to the program from six years ago called "Men Whose Hair Is Too Long" -- during which, as Illouz describes it, "Oprah brought to the stage women who told the audience of their desire to have their sons, lovers, brothers, or husbands change a ‘hairy part’ of their body (mustache, hair, beard)." The menfolk are briefly “exposed to the public” and then “taken to a back room” – from which they later emerge with “a change supposed to effect a spectacular transformation.”
Such transformations are part of the Oprah metanarrative, as we might want to call it.
“The ‘hairy parts’ are exposed as a transactional object in a domestic, intimate relationship that is constructed as contentious,” as Illouz explains. “The haircut or moustache shave provides a double change, in the man’s physical appearance and in his intimate relationship with a close other. The show’s pleasure derives from the instantaneous transformation -- physical or psychic -- undergone by the guests and their relationships, which in turn promote closer bonds.”
This all sounds deeply transformative, to be sure. It made me want to go get a haircut.
But something about the whole argument -- Illouz’s reference to Oprah’s “narrative authority”; the framing of makeover as ritual of self-transfiguration; the blurring of the line between intimate relationship and televised spectacle -- is really frustrating to consider.
It is hard not to think of Richard Sennett’s argument in The Fall of Public Man: On the Social Psychology of Capitalism (Knopf, 1977), for example, that we have been on a long, steady march towards “the tyranny of intimacy,” in which every aspect of the social conversation gets reduced to the level of the personal. “It is the measurement of society in psychological terms,” as Sennett put it. “And to the extent that this seductive tyranny succeeds, society itself is deformed.”
But no! Such worries are part of an “elite” cultural discourse, according to Sherryl Wilson’s book Oprah, Celebrity, and Formations of Self, published by Palgrave in 2003. A whole raft of theorists (the Frankfurt School, David Riesman in The Lonely Crowd (1950), the arguments about the rise of “psychological man” and “the culture of narcissism” in the writings of Philip Rieff and Christopher Lasch, and so on) have treated mass society as creating an almost inescapable force of consumerism and privatized experience. The fascination with celebrities is part of this process. Their every quirk and mishap becomes news.
To the “elitist” eye, then, Oprah might look like just another symptom. But according to Wilson (who is a lecturer in media theory at Bournemouth University in the UK) the Oprah phenomenon belongs to an altogether different cultural logic. It is a mistake to regard her program as just another version of therapeutic discourse. It draws, rather, on feminist and African-American understandings of dialogue -- the public sharing of pain, survival, and mutual affirmation -- as a necessary means of transcending the experience of degradation.
The unusually intense relationship between Oprah and her audience would probably have impressed a stodgy old Marxist like Theodor Adorno as evidence of alienation under advanced capitalism. Wilson regards “the apparent closing of the gap between the star self and the personal self” as something quite different.
“Rather than the participants seeking to transcend their ‘ordinariness’ by emulating the persona of a celebrity,” writes Wilson, “it is the ‘ordinary’ and everyday experience of Oprah which works to validate the personal stories recounted by the guests. In other words, those who speak on the show, and who participate through viewing at home, do not position themselves within the aura of a persona anchored in a glamour that for the majority is unattainable; rather, empowerment is located within the realm of everyday life.”
While the star does possess an undeniable charisma, Oprah’s is the glamour of simple decency. “Irrespective of the topic of the day or the treatment through which the topic is handled,” as Wilson puts it, “Oprah’s performance is guaranteed to be inclusive, (generally) nonjudgmental, (often) humorous, and (almost always) empathic.”
How that amiable persona then generated certain massive effects in the literary sphere is a matter addressed in the two scholarly volumes devoted to analyzing the Oprah Book Club.
Each book has a defensive quality; the authors seem to want to defend the book club nearly as much as to analyze it. “From its inception in September 1996,” notes Kathleen Rooney in Reading with Oprah (University of Arkansas Press, 2005), “OBC was commandeered as a rallying point around which both cultural commentators and common people positioned themselves in perpetuation of America’s ongoing struggle of highbrow versus lowbrow. Both sides made reductive use of the club to galvanize themselves either as populist champions of literature for the masses or as intellectual defenders of literature from the hands of the incompetent.”
But Rooney contends that a closer look at the club, and at the books themselves, suggests “that there exists a far greater fluidity among the traditional categories of artistic classification than may initially meet the eye; that we needn’t shove every text we encounter into a prefabricated box labeled ‘high,’ ‘low,’ or ‘middle.’”
Farr’s argument in Reading Oprah converges with Rooney’s -- finding in the conversational praxis of the book club something like a down-home version of Barbara Herrnstein Smith’s Contingencies of Value: Alternative Perspectives for Literary Theory (Harvard University Press, 1988).
The book club has embodied “contingent relativism,” writes Farr, “constructed not in the absence of truth, but in the context of many truths, negotiated truths, truths that people arrive at in conversation with others and with their own often contradictory values.” Hence the need to discuss the reading, to embed the books in a conversation. They need to “have a talking life” so that readers can “explore and work their way through the myriad of possible responses.”
Given their inclination to give Oprah’s aesthetic and ethical stances the benefit of the doubt, it is all the more striking when either author admits to feeling some reservations about the program. While doing her research, Farr recalls, she “tuned into a pre-Christmas program” that proved to be “an hour-long consumer frenzy.”
This was an “O List” show, which is evidently a major event among the Oprahites. The celebrity “gives away literally hundreds of dollars worth of free stuff to every guest in her audience,” writes Farr. “Pants, candles, shoes, electronics – you name it. If Oprah likes it, she’s giving it away on this show.... I watched open-mouthed, both appalled and envious. Was this incredibly tacky or unbelievably generous? Did I want to run screaming from the room or do my best to get on the next show? Both/and. It was a moment of genuine American ambivalence.”
The protocols of the book club were also grounds for concern, at least for Rooney. “Once the tape started rolling,” she writes, “neither Winfrey nor her readers seemed permitted to remark critically on the selections, or to advance beyond any but the most immature, advertisement-like, unconditionally loving responses to every single novel they encountered.”
What made last week’s program with James Frey so fascinating was the sudden revelation of another side of the Oprah persona. Gone was the branded performance as “inclusive, (generally) nonjudgmental, (often) humorous, and (almost always) empathic.” Her manner had scarcely any trace left of its familiar “I’m OK, you’re OK” spirit.
Oprah was angry, and Frey was some very considerable distance from OK. She was also indignant to discover that the publishing industry makes no real effort to enforce the implicit contract between reader and writer that goes with a book being shelved as nonfiction. This seems terribly naive on her part. But no doubt most of her audience shared her surprise. (“She wants publishers to fact-check their books?” I thought. “Hell, they don’t even edit them.”)
Remarkable as the spectacle was, however, it did not come as a total surprise. Perhaps I will give myself away as an “elitist” here, in the terms that Sherryl Wilson uses in Oprah, Celebrity, and Formations of Self. But at the end of the day, the therapeutic ethos is not antithetical to a deep yearning for authority (a craving then met by the stentorian Dr. Phil, whom scholars have yet to analyze, oddly enough).
Nor is there any deep discontinuity between the conspicuous consumption of an “O List show” and the completely uncritical attitude towards whatever book Oprah has selected for the month. If anything, they seem like two sides of the same coin.
In search of a different perspective on the matter, I contacted Cecilia Konchar Farr – whose book Reading Oprah seems, on the whole, an endorsement of the “individual pluralism” of the show’s ethos. What did she make of l’affaire Frey?
“It seems apparent to me,” Farr told me by e-mail, “that Oprah started out with a viewpoint that most experienced readers would have in this situation, that the facts aren't as important as the more general truthfulness of the story in a novel or memoir. Most readers surely took some of Frey's aggrandizements and exaggerations with a grain of salt from the beginning, while still enjoying the character he was constructing, still enjoying the story, and still finding the book powerful and interesting....
“My guess is that the righteous indignation we saw on last week's show comes from Oprah representing the less experienced readers who needed Frey's memoir to be true in a journalistic sense. Her chastisement of the publishing industry was the first real exertion of her authority I have seen beyond her selection of books. She's earned that authority, certainly, but it was surprising to see her use it. Still, I believe she used it on behalf of her readers.”
I was, to be honest, dumbfounded by this response. I printed it out, and read it a few times to make sure Farr had actually said what she seemed to be saying.
Her contention seemed to be that Oprah’s audience had become upset from mistakenly reading the book as “true in a journalistic sense” -- which was, somehow, a function of readerly inexperience, not of authorial dishonesty.
And from her account, it appeared that Frey’s memoir contained a "general truthfulness" -- one it would be naive to expect to be manifested at the level of occasional correspondence between the text's claims and ascertainable facts.
So I wrote her back, checking to see if I’d followed her.
“I think theorists and critics, especially, but also seasoned readers, read memoirs without an expectation of ‘correspondence between the text's claims and ascertainable facts,’” she responded. “Memoirists creatively construct characters and situations with a lot of license -- and readers and publishers have tacitly allowed that license. That's not to say Frey didn't take this license to its very limit. His constructions at times lose an even tenuous connection with ascertainable facts. When Frey pushed the limits, he drew intense attention to the slippage this connection has seen in recent years. But he wasn't the first to take such license, nor is he responsible for the larger changing perception of what ‘memoir’ (or ‘creative nonfiction’) means.”
Perhaps those terms now just mean “whatever you can get away with” -- though that seems vaguely insulting to honest writers working in those genres. (There is some difference, after all, between the tricks played by memory and the kind that a con man practices.)
Why the furor over Frey? “I think the vilification he has been subject to in the media is extreme,” writes Farr, “and probably stems from some larger discomfort about dishonesty from sources who are (and ought to be) culturally more responsible to the ‘ascertainable facts.’"
There may be something to that. And yet it raises any number of questions.
The man has made a small fortune off of fabricating a life and selling it -- while loudly talking, in the very same book, about the personally transformative power of “the truth.” Oprah Winfrey endorsed it, and (at first anyway) insisted that mere factual details were subordinate to a larger truth... a personal truth... a truth that, it seems, is accountable to nothing and nobody.
Suppose this becomes an acceptable aspect of public life – so that it seems naive to be surprised or angered by it. Then in what sense can we expect there to be institutions that, in Farr’s words, “are (and ought to be) culturally more responsible to the ‘ascertainable facts’”?
Let’s leave that topic for the Oprah scholars to consider. In the meantime, remember that her next selection is Elie Wiesel’s Night, a memoir about surviving the Nazi death camps. It might be an interesting discussion. Especially if the book club takes up the idea that there are forms of truth that, in the final analysis, have exactly nothing to do with self-esteem.
Many now consider the humanities to be facing a relevancy “crisis.” Partly because of the culture wars, the humanities -- if not the whole university -- appear to have lost their reason to be. To choose just one compelling example, Bill Readings argues in The University in Ruins that the primary role of the university is no longer to inculcate national culture, so it now resorts to rhetorically convenient but substantively empty and ideologically suspect vagaries like the term “excellence” to justify its existence. As one result, faculty in English and composition also suffer from what some recent publications are casting as a labor “crisis.”
While the public grows increasingly skeptical of the nature and purposes of liberal arts education, academics generally, and we suspect English scholars particularly, have not been as effective as they could, should, and must be when representing the value of their work, especially teaching. In a colloquial nutshell, public criticism tends to follow some version of this reasoning: English departments aren’t teaching my kids to write and read well enough because they’re too busy trying to turn them into Marxists, feminists, homosexuals, or -- worse -- grad students. Meanwhile, our scholarship is derided as obtuse, cryptic, or absurd. It matters little that such descriptions are inaccurate, unfair, and often advanced in service of narrow-minded ideologies at odds with the democratic underpinnings of a liberal arts education. The fact remains that our work is nevertheless perceived at turns as irrelevant or threatening, a fact which directly and indirectly contributes to the deplorable state of labor conditions in English.
Because the value of work in English studies is so poorly understood, even among ourselves, negative stereotypes become entrenched in the general cultural psyche in the form of common sense: e.g., literature is boring, difficult to understand, and best left to experts who talk about it in ways that are also boring and difficult to understand. And the value of writing is often reduced to its correctness, which, to many, is valuable only to the extent that it earns, as in earns good grades and jobs. This leads (or likely will lead) to further decreases in the number of English majors (currently about 4 in every 100) and this, in turn, will lead to fewer tenure-track lines and increased stratification of faculty, in the form of part-time and other non-tenurable lines. For example, a 1999 Modern Language Association survey found that only 37 percent of English faculty members were on tenure-track lines. While jobs in composition, tenure-track and otherwise, have proven more available than those in, say, 19th-century American literature, such jobs often consist of administrative positions, or what both critics and reformers are now calling the middle-management class of faculty, wherein one or two tenured faculty are charged with supervising a large and shifting class of part-time faculty.
As faculty continue to stratify, it will become increasingly difficult to represent the purpose, direction, and value of work in English studies beyond the rudiments of business writing and the cultural capital afforded by cocktail party knowledge of Shakespeare or Melville. The vicious cycle can be simplified as follows: A managed and stratified faculty often has difficulty representing itself effectively in the culture wars, which in turn exacerbates the level of stratification, which in turn leads to increasing difficulty with representation. The consequences of poor representation and increased stratification harm all faculty and students in nearly every imaginable category, including infringement on academic freedom, especially in matters of curriculum design and assessment, as well as decreasing job security, inequitable pay scales, little or no benefits, high teaching loads, large class sizes, and pitiful office conditions.
James Piereson, writing in a recent issue of the conservative periodical The Weekly Standard, reflects the views of many non-academics who haven’t been made to care, or care enough, about our problems and who, in fact, resent academics for our seeming disengagement with their values. He writes: “When this year’s freshmen enter the academic world, they will encounter a bizarre universe in […] institutions that define themselves in terms of left wing ideology. […] which is both anti-American and anticapitalist.” Piereson approvingly refers to university trustees who (in his words) contend that “if their institutions are to be rescued, they dare not rely on faculties to do it.” Piereson’s variety of culture-war mongering and his apparent comfort with making outlandish claims without much more than scatter-shot anecdotal evidence often lead to equally bombastic and antagonistic counter-statements, and so go the culture wars.
Citing findings from the National Center for Educational Statistics, Louis Menand points out in his 2005 contribution to MLA’s Profession that between 1970 and 2001 the number of English majors dropped, roughly, by a third; however, “the system is producing the same number of doctorates in English that it was producing back in 1970. These Ph.D.'s have trouble getting tenure-track jobs because fewer students major in English, and therefore the demand for English literature specialists has declined.” There are many theories about the causes of this discrepancy (e.g., students who would have previously majored in English are now turning to interdisciplinary programs, in, say, cultural studies, or students are driven by the increasing costs of college education to specialize in areas, such as, say, computer science, which have a reputation for more immediate financial pay off than does a B.A. in English). Regardless, more and more conversations in English studies seem to be focusing on ways to reinvigorate the work of English studies in the 21st century, so as to make it more relevant to the public, especially students.
The various strands of this already vast and quickly growing debate are difficult to summarize and properly attribute in the space that we have. For the moment, suffice it to say that the main idea is that work in the humanities, both critical and imaginative, seems increasingly alien and perhaps irrelevant to the public. It is often said that scholarship in the humanities has become too insular for its own good. One possible solution to the perceived problem of insularity is often described with the phrase “going public.” In 1995, Linda Ray Pratt used it in her contribution to the influential collection Higher Education Under Fire. In 1998, Peter Mortensen used the phrase as the title of his article in the journal College Composition and Communication. More recently, it has been invoked in a Duke University panel on academic publishing, and Henry Boyte made “going public” the focus of his 2005 occasional paper for the Kettering Foundation. If the catchphrase of the late 90s was “critical thinking,” the phrase for the early years of the 21st century may just be “going public.”
While we believe it is important to go public with academic work in the humanities, this phrase, however catchy, raises more questions than it answers. Go public with what, exactly? And what venues qualify as appropriately public? Further, Louis Menand invites us to consider the possibility that going public may not be as easy or as desirable as it may at first sound: "The last premise academic humanists should be accepting is that the value of their views is measured by the correspondence of those views to common sense and the common culture. Being an intellectual and thinking theoretically are going outside the parameters of a common culture and common sense." (Menand’s emphasis)
This is to say that the duty of academics, be they physicists or humanists, is not to the public but to knowledge, dare we say truth. And the public is not necessarily concerned with either. Menand concludes: "Ignorance has almost become an entitlement. We are living in a country in which liberals would rather move to the right than offend the superstitions of the uneducated. As always, the invitation to academics is to assist in the construction of the intellectual armature of the status quo. This is an invitation we should decline without regrets."
Here, Menand raises some valuable points of caution. In his line of argument, going public may mean caving in, stripping our ideas of nuance, and abandoning precision or critical thinking for the sake of public acceptance. Of course, most of us agree that teachers who passively abide by common-sense notions and status quo values are not acting like responsible academics, and none of us would endorse this behavior. However, as noteworthy as such cautions may be, the distinction between the academic and the public seems overdrawn here. After all, there are nearly 5,000 college campuses in the United States, enrolling more than 14 million students, with enrollments projected to increase through the year 2014. This is to say that the question of “going public” has already, to a very large extent, been settled: academic work is quite thoroughly situated in the public realm, and if the public considers ignorance to be “almost an entitlement,” then we are at least partly to blame for this state of affairs. Gerald Graff goes so far as to claim that the “university is itself popular culture -- what else should we call an institution that serves millions if not an agent of mass popularization? But the university still behaves as if it were unpopular culture, and the anachronistic opposition of academia and journalism continues to provide academics with an ironclad excuse for communicative ineptitude.”
Going public, therefore, is a useful but not entirely adequate phrase, since it does not explain how more public exposure will improve the current state of the humanities or the public’s view of the work done within them. For that reason, we would like to focus on improving the work that is, far and away, the most public and the most popular -- that is to say, our teaching. It will be necessary for educators in English studies to make the case for the work of English studies. Increased and accessible public discourse about teaching literature and writing may be a first step, but one that would require more questioning of what we mean by teaching, to whom it is valuable, and why. Rather than (re)fighting the culture wars with the likes of James Piereson, or resisting the public face of academic work, we might practice our discourse theories with the public, rather than merely attempt to report on them, even in jargon-free language. This assumes a dialogue that transforms not only the content of the humanities but also the participants in the conversation -- especially teachers and their students.
Taking up this point in his recent book, English Composition as a Happening, Geoffrey Sirc bemoans the dulling influence of academic routine, which has led many of us to (re)produce the sort of polemical prose and responses that have thus far not proven particularly effective in the culture wars. Instead, Sirc urges us, as educators and scholars, to define teaching and writing in ways that articulate the value of innovation and imaginative thinking. We would like to see Sirc’s suggestion enacted both internally and externally -- that is, in forums such as this one and in public venues such as newspapers, periodicals, and community meetings: in short, any of a variety of venues that serve to establish dialogue among academics, students, administrators, parents, media members, and legislators. The better we are able to do this, the better we will be able to supplant negative and inaccurate representations of our work.
While critics such as Sirc and Menand are clearly influential here, we understand this task to be of particular importance to graduate students, not least of all because the future of work in the humanities is quite literally in our hands. Should we continue the tradition of predominantly insular and/or antagonistic discourse, our degree of leverage and relevance with the public will continue to decrease, as will our prospects for tenure-line work. It is incumbent upon us to open the lines of communication and to make known the good work that is already being done in our classrooms.
Scholarship on this issue is already underway. For example, at the 2005 MLA conference, Michael Bérubé and Cary Nelson spoke to issues of contingent labor; others such as Peter Mortensen and David Shumway attended to matters of representation. We regard these two issues as linked; that is, the better we understand and represent our work (especially teaching), the better our working conditions stand a chance of improving. For this, we conclude with the following proposals that take from and build on the work of these and other scholars:
1. Cultivate existing trends toward interdisciplinarity, such as linked or clustered courses, in ways that effectively demonstrate the value of English studies, particularly in terms of accomplished reading and writing.
2. Realize that the Ph.D., as a credential for teaching, requires civic responsibility and ethical action. The better we collectively attend to this fact and make this work known, the better we will be able to build a platform from which to argue for improved working conditions.
3. Accept and embrace the possibility of working through cultural debates in ways and venues that are accessible to the general public. This is not to suggest necessary agreement with the public, but to encourage a variety of discourse that holds the public in vital partnership.
4. Encourage hiring, promotion, and tenure committees to value the above efforts or else they simply will not happen, or at least not to the extent that they should. In other words, in order to improve the representation of our work, it will be necessary to appeal effectively not only to the public but also to our senior colleagues.
Frank P. Gaughan and Peter H. Khost
Peter H. Khost is a lecturer in writing and rhetoric at the State University of New York at Stony Brook. Frank P. Gaughan is an instructor in English and first-year writing at Hofstra University. Frank and Peter are both doctoral candidates in English at the Graduate Center of the City University of New York. This article is adapted from a talk they gave at the annual meeting of the Modern Language Association.
On December 20, 2005, U.S. District Judge John Jones ordered the Dover, Pennsylvania, Area School Board to put science back in its place -- protected from intelligent design and other religious ideas. In Kansas, where we have had no such luck, I participated last semester in a new interdisciplinary college course also designed to put science in its place -- separate not only from religion, but from the humanities in general.
DAS 333 (the numerological implications are coincidental but amusing) -- Human Life and the Universe -- was the work of faculty in physics, geology, the life sciences, philosophy, and English, affiliated with the new Center for the Understanding of Origins at Kansas State University. The course was explicitly developed in the context of the evolution controversy to educate students about the fundamental constitution of science as a discipline. As the sole representative of the non-sciences in the course (the philosopher was a hard-nosed philosopher of science, no fuzzy humanist), I did not expect my contribution to come off as particularly consequential; I was merely there as a reminder of the Other to science, providing a sketch of non-scientific disciplinary thinking. All the action would take place in science's bailiwick. But, by the end of the course, I realized that I had not anticipated the dramatic though inchoate demand for what science cannot deliver. In a twist on C.P. Snow's classic criticism of the emerging chasm between science and the humanities, DAS 333 demonstrated both the necessity of their distinction and the urgent need for both.
A rigorous and reflexive approach to science education is the best way to manage the evolution controversy. The recent Fordham Institute report, "The State of State Science Standards," indirectly but forcefully underscores the wisdom of such an approach; Paul Gross et al., in the introduction to the report, conclude that the state of science in public schools does not so much reflect the impact of religiously driven anti-science or intelligent design as demonstrate a correlation between the weak handling of evolution and a general weakness in disciplinary content for science across the board. It follows that the most powerful redress to the resurgence of creationism is a strengthening of disciplinary content in the sciences. Anticipating this connection, DAS 333 provided serious although introductory college-level academic content in its science disciplines. Students calculated luminosities, grappled with the data responsible for the emergence of plate tectonics, and managed some of the microbiology involved in gene suppression. The philosopher of science then used this science content as a source of examples for demonstrating the interaction between theories, their auxiliary hypotheses, and observations, both clarifying the boundary between science and non-science and making the definition of a scientific "theory" clear -- and distinct from mere "opinion."
However, it became increasingly evident to students that the constraints on science, enabling progress in understanding nature, are disabling in other areas. Science cannot, for example, pronounce on the truth or falsehood of propositions like intelligent design. Essentially, the products of science are predictions of new observations consistent with the explanation of existing data. The product of science is not meaning. This set up my work: science, prediction; literature, meaning. I tried to counterpoint the science units with topical fictions -- for example, H.G. Wells' "The Star," Burroughs' The Land that Time Forgot, and Crichton's Jurassic Park -- in order to demonstrate how the use of language, including figures of speech and fabricated scenarios, elicits feelings and desires in order to construct meaning, in contrast to what scientific accounts do in response to the same world. This was, more or less, fine. But it was not enough. I discovered at the end that these forays into the literary formation of meaning sidestepped the real force of the humanities in a course like this.
It was during a final class on the implications of the limits of science in the Terri Schiavo case that the greatest challenge to the non-sciences emerged. All students saw the controversy over Schiavo's care as a dispute in which the person whose wishes should have been paramount, Schiavo herself, had no reliable input. They had no idea of how we might achieve greater consensus and resolution on this life-issue as a society. Nor do most of us. The problem with such a case, for college students and the general public alike, is that, like the issue of abortion, it requires navigating waters murky with emergent technologies, religious tenets, strong feelings, and massive distrust.
Yet this is precisely the miasma into which students must plunge as citizens and decision-makers, and it is for these eventualities that they need better and more intelligent preparation. The humanities cannot be content with just developing and promoting the ethical imagination for private use; they must also do much more to connect minds so enriched (all minds, not just those of future lawyers and bioethicists) to complex situations that demand such resources be put into action. The humanities must do more to offset the temptation of either authoritarian or excessively personal solutions to complex problems that science often creates but whose solution is beyond its reach.
One of the most revealing features of the evolution controversy, as well as so many of the controversies that, properly or improperly, connect religion to public life, is the degree to which it subtly relegates non-science to the personal and the private, to a space beyond the pale of public education. And so it is with the text of the now-infamous statement that Dover, Pennsylvania Area School District science teachers were required to read to students in ninth grade biology. The statement portrays the difference between science and non-science as a distinction between publicly-acknowledged fact and private opinion. Essentially, the creationist strategy is to eliminate the teaching of evolution by privatizing it. "Because Darwin's Theory is a theory," the statement reads, students should be encouraged to explore other views--specifically, "Intelligent Design ... an explanation of the origin of life that differs from Darwin's view."
As a result of the mere theory-status of Darwinism, "students are encouraged to keep an open mind. The school leaves discussion of the Origins of Life to individual students and their families." In other words, given that evolution, as theory, does not have the form of other scientific propositions constituting or based on laws (gravitation, thermodynamics, and so forth), determination of the origin of species is left to the individual and family, that is, to those realms beyond the institution and beyond the state and its laws. In essence, according to this line of reasoning, scientific "fact" is law, in contrast to mere theory. In the absence of scientific law, the formulation of beliefs about the world is essentially beyond education. Ironically, perhaps, it is now against the law in Dover, Pennsylvania, to exempt evolution from the authority of science that the anti-evolution forces narrowly equate with scientific law.
In any case, for creationists and their teach-the-controversy fellow travelers, "theory" on the one hand, and "authority," "fact," and "law" on the other, exist at opposite ends of the spectrum. Furthermore, where authority, fact, and law end is precisely where "each individual must decide for him or herself" begins; there is no controversy, no public arena in which private individuals, informed by science, by traditions religious and secular, by rational ethics and embodied sentiment and compassion can use their wisdom and knowledge to work out a course of action, and it appears that even the possibility of such deliberation is an alien concept to our students -- as it is to the Dover Area School Board. It was clear to Judge Jones, moreover, that the appeal to individual opinion on the part of the Dover Area School Board was a thinly-disguised effort to support religious authoritarianism; he opined that by reminding school children they can maintain beliefs taught by their parents, critical thinking is stifled, not promoted.
While it is conceivable, even likely, that science will succeed in reasserting methodological naturalism as its fundamental feature, and so reduce the inroads of creationism into science curricula to occasional nuisances easily parried by existing institutions, this is not enough to restore the integrity of public education generally. Just as the weakness of science education accompanies avoidance of evolution, the weakness of humanities education emerges as an avoidance of a host of issues that polarize American social and political life. It is hard to calculate the greater omission.
Linda Brigham is head of the English department and a member of the Center for the Understanding of Origins at Kansas State University.
Paula M. Krebs has been a professor of English at Wheaton College, a selective New England liberal arts college, for 15 years. Her sister Mary Krebs Flaherty teaches writing as an adjunct at the inner-city campus of Camden County College, a two-year institution. They are writing a series of articles about what it's like to teach English at their respective institutions.
Paula: I'm trying not to be annoyed at my students who have e-mailed me that they won't be in class today and tomorrow because their flights back to school were cancelled due to the snow. What business, I wonder, do they have flying out of town three weeks into the semester? And this snowstorm was predicted all week -- they knew there was a chance they'd not get back for classes. Then I remind myself that one said she'd left for a "family emergency" and another because his sister had just given birth. They have a right to set their own priorities -- it's up to me how to handle those decisions in terms of grading.
Mary: Very few of my students have a computer at home, let alone Internet access, so they can't e-mail me about problems that come up, such as not being able to attend class due to a snowstorm. None of my women students with children attended class during the snowstorm -- not because they couldn't make the commute, but because they didn't have a babysitter for their kids and the elementary schools were closed in Camden. The priority for these students is exactly that -- their children first, class second. I am acutely aware of the time restrictions that my students face in their personal lives. Most, if not all, have part-time or full-time jobs, and as I said before, many of my female students have parenting duties when they get home. I find that I have to make homework assignment decisions based on what I think they can actually accomplish without overwhelming them.
Paula: Mine would love it if I took into account their part-time jobs and other obligations when I assigned homework, but I can’t do that. This is a residential college (more than 90 percent of our students live on campus), and I operate on the assumption that taking classes is their full-time job. So I assume that they’ll spend at least three hours outside of class for every hour they spend in class, and I assign reading and writing accordingly. They grumble, but most of them do it.
Mary: I would love half of that time commitment from my students! Instead, I have accepted doctor’s notes for prenatal care appointments and family court documents from students who wanted “excused” absences from class. If a student wants to see me before or after class for additional help, I feel that I have to be generous with my schedule to accommodate them given that, as an adjunct, I have no office or office hours. Since most students have part-time jobs and several even work full time, they have to balance outside work, family obligations, and homework. I admire their tenacity, but I also have to make sure that they are doing a fair amount of school work outside the classroom. This is especially difficult because for many of them their only access to a computer is on campus, and they have to alter work schedules and family schedules to type their papers. To add to their schedules, I encourage them to participate in a campus book club called Mental Elevations, which is one of only three school-sponsored clubs on the Camden campus.
Paula: On my campus, most students have part-time jobs, but many also participate in activities on campus -- theater, singing groups, clubs, and, of course, sports. Scheduling events outside of class is always problematic. We have to work around rehearsals, practices, and working hours. I have never had a student with childcare responsibilities. For me the biggest problem is to make sure they see the relevance to their future careers of what I’m asking them to do. The value of a liberal arts education is clear to the faculty, but it isn’t necessarily self-evident to a 19-year-old how reading Elizabeth Gaskell will help in the world of high finance or state government or retail management.
Mary: It’s much easier for me to make clear to my students that effective writing carries over into their other academic courses as well as future careers. We read paragraphs and essays in different rhetorical patterns that directly correlate to specific career choices. We recently worked on the process essay (“how-to”), and I told them to think about being a human resource manager who had to write a training manual. Before that, we went over the narration paragraph, which corresponded with a nurse’s record of a patient. For me, translating the usefulness of effective writing is relatively easy -- getting the students to believe that writing is a skill that they can learn is the difficult part. They bring a "one and done" attitude into the class, and I need to help them come to think about writing as a process. By following certain steps, they can learn to be effective writers.
Paula: Your students must have pretty clear career goals or aspirations that bring them to a community college at a nontraditional age.
Mary: My class dynamic is definitely interesting because I do have some students directly out of high school (with children of their own), as well as a number of returning students who have now realized that, say, having a CNA certificate (Certified Nursing Assistant) is not as valuable or rewarding as an RN degree. In either case, the beginning students in Basic Skills classes seem to share a simple practicality: college equals money and better opportunities in their future careers.
Paula: I think liberal arts colleges like mine want to have it both ways, really. The students and their parents are investing huge amounts of money in this bachelor’s degree, so they want a return on that investment in the form of a job. At the same time, they have chosen a liberal arts college and not a community college or a state college or university, so they also have a sense that they want an education that is more training in critical thinking, writing, and arts and sciences than it is job training or vocation-oriented, as in engineering or business school. So in our courses we treat knowledge and inquiry as valuable in and of themselves, but outside of class we stress internships, networking, and job and graduate school placement.
Mary: I find myself having this exact duality in my role as graduate student and as a teacher. There is a huge gap between critically discussing 19th century novels like Bleak House at night with fellow graduate students, then turning around and teaching the concept of concrete supporting details in a basic skills class the next morning. What makes this even harder is the fact that in between teaching and being a grad student is working 40 hours a week at a job that doesn't have any relevance to my academic life. But it's the job that pays the bills, and allows for my education, so it has first priority. Maybe this is why I have so much empathy for my students....
Paula M. Krebs and Mary Krebs Flaherty
The previous column by Paula M. Krebs and Mary Krebs Flaherty explored grading and other measures of academic performance.
At December’s meeting of the Modern Language Association, the Committee on Information Technology sponsored two special sessions on electronic scholarship and publication in literary studies. Rather than the familiar panels consisting of three 15-20 minute papers, a pitcher of water, and a brief Q&A (time permitting), these meetings were structured as “electronic poster sessions.” Multiple presenters stationed around the perimeter of the room in front of easel displays and laptop computers demonstrated their projects and spoke to anyone who stopped by with questions.
Conversational and (literally) interactive, these presentations sometimes lasted a few minutes, sometimes longer, depending on the size of the group standing at the station and the nature of the questions asked. In the room as a whole, the general hum and the constant movement of people from station to station produced a very different atmosphere from what might be expected in the usual MLA session. It was noisier, more chaotic and informal, more the result of many localized one-to-one encounters. All of this may be new at the MLA, with its culture of paper-readings as public performances, but poster sessions in general have been around for a long time in other disciplines, and sessions like these have long been the norm at science and technology conferences, and even, for example, at the annual meetings of the interdisciplinary Association for Computers and the Humanities. Instead of paper (or foamcore) posters containing labeled graphs, charts, or other visualizations, technology poster sessions usually involve a computer screen. More important, they actually provide for live, hands-on demonstrations of the tools and resources being presented.
This was the second year the MLA has included this kind of technology poster session in its schedule, and this year participation markedly increased. Co-organizer Michael Groden of the University of Western Ontario speculated that simply changing the name in the program from “poster sessions” (which may have puzzled some MLA members) to “digital demonstrations” probably helped. This year’s sessions saw a constant stream of people moving into the room, making their way slowly around the various stations, and then leaving, to be replaced by others. Outside the room a sign containing graphical “thumbnail” images of the posters displayed inside -- a kind of poster of posters -- worked to attract some curious passersby, who wandered in to see what was being demonstrated. Inside, almost every station sustained a rotating cluster of two or three, or six or seven people, sometimes more, watching demonstrations, trying out resources on the computer, or talking to the presenters. Two presenters worked at some stations, in order to be able to divide their attention between two different groups of questioners.
The advent of this new format, along with the increased interest at the sessions, coincides with an increased attention by the MLA to the problem of how to acknowledge new forms of research in general -- and digital projects in particular. The crisis in book and journal publishing has been on the agenda for several years now, but this year a special panel was convened by the MLA to discuss strategies for changing expectations for tenure reviews and to encourage the profession to take account of “‘multiple pathways’ to demonstrating research excellence,” other than the traditional letterpress monograph, among them forms of scholarship produced and published online. These new forms include peer-reviewed online journals, which were the focus of the first poster session, featuring my own Web site’s Romantic Circles Praxis Series, as well as The Writing Instructor, and Language and Learning Technology, among other projects.
Though peer reviewed in the traditional way, carefully edited, and indexed by the MLA Bibliography under its unique ISSN, our Praxis Series is also able to take advantage of the flexibilities of scale, inventive genres of scholarship, and more malleable production schedules made possible by online publication. Multimedia essays, illustrated or even incorporating audio recordings, are among the contributions to its 28 volumes to date. And of course they are searchable, both internally and via Google, and in the future may be linked to growing clusters of interoperable digital scholarship.
Besides online electronic journals, which extend traditional forms of scholarship, alternative “pathways” to scholarly research (and publication of the results of that research) will undoubtedly also lead through the kinds of archival, editorial, and analytical work represented in projects at the second poster session: “New Technologies of Literary Investigation: Digital Demonstrations.” These included the William Blake Archive (recently awarded the MLA’s Prize for a Distinguished Scholarly Edition and granted the approval of the MLA’s Committee on Scholarly Editions, both firsts for a digital edition), the Stolen Time Archive, the Mark Twain Digital Project, text-analysis tools such as TAPoR and Tamarind (the latter of which aims to process large collections of XML documents for text-mining and visualization by the larger NORA project), and research interface tools such as the Litgloss Collaboratory for collecting and sharing annotated foreign-language texts, or Turning the Pages (already in use at the British Library and other places as an interface that allows for virtual, animated page-reading, magnification, and other digital manipulations).
These new forms of scholarship call for new forms of presentation beyond the traditional paper-reading panel. Though C.P. Snow surely exaggerated the divide between the two cultures (literary intellectuals are no longer, if they ever were, “natural Luddites”), these unfamiliar forms of presentation, associated for many with the sciences, may require a slight adjustment in expectations and conventional roles on the part of some MLA conventioneers. As the morning poster session was preparing to get underway and many of us were still booting laptops, arranging our tables, or setting out handouts, a group of several participants entered the hall carrying book bags and sat down in the handful of chairs left at the back of the room still arranged in rows. Facing the “front” of the room, they waited patiently for the panel to begin -- until it gradually dawned on them that there was no panel and no real front to the room, and that they were supposed to stand and circulate around the exhibits on their own.
Those attendees serve to remind us that the inertia of “conventional” culture must be overcome if these new forms of presentation are really to become accepted at MLA meetings. They also stand as a reminder of all the potential users out there who may be waiting passively for online scholarship to begin to speak to them where they “sit,” who may not yet know how to seek out and actively engage digital resources. A change in the culture of the discipline is needed if the new tools and resources are to have their desired impact among scholars. And some of those potential users still require basic education about digital scholarship, how to use it -- and how it might change what they do and how they think about their subject matter.
Kari Kraus of the William Blake Archive told me that “about 70 percent of the people were asking very basic questions about the material,” and that she found herself introducing the archive more often than she had expected. “But that’s fine,” she added, “I love talking about it.” At that point in our conversation we were interrupted by someone approaching the station: “Hi,” said Kraus, “Are you familiar with the William Blake Archive?” She was eventually able to demonstrate the visually striking (paper) print standing on her easel -- a pair of landscape-format plates from Blake’s "The Song of Los," digitally reconstructed by Blake scholar (and one of the archive’s editors) Joseph Viscomi from digital facsimiles downloaded from the archive. Using materials freely available at the archive and opening them in Photoshop on his own desktop, Viscomi was able to make and print out a new facsimile of Blake’s etched plates, reunified in order to demonstrate Blake’s own “virtual designs” according to his original intentions. This is only one example (though a particularly vivid one) of the kinds of individual acts of do-it-yourself scholarship now possible as a result of the many years of collaborative editorial work done to build and edit the massive image and text archive.
Despite the lack of live Internet connections (the hotel’s rates were exorbitant), and despite the newness of the format for some MLA members, most of the participants I spoke with -- presenters and questioners alike -- responded very positively to the sessions. In fact, I learned that the line between presenter and questioner was not always perfectly clear. There was no dais, no microphone, no chairs in rows -- except for those few left in the morning session that confused the early attendees. Sometimes a passerby turned out to be working in a similar field, or was trying to get support for a new digital project at his or her own institution, or was a would-be contributor to an electronic journal on display, or even a long-distance collaborator at several removes -- such is the nature of the networked world of digital scholarship. Some people making their way around the poster-session rooms were conducting business, exchanging cards or URLs, offering informal proposals to journal editors; others were engaged in detailed technical discussions about problems of textual encoding or user interface.
During the second session it suddenly dawned on me where I had seen this general kind of activity before: not just at humanities computing conferences, but (of course!) at the MLA’s own legendary and massive “poster session” -- the publishers’ book exhibit hall. At a moment when the MLA is discussing ways to encourage “multiple pathways” to demonstrating excellence in research, including a “commitment to treating electronic work with the same respect accorded to work published in print,” and when many in the profession are seeking alternatives to the monographs produced in ever-diminishing numbers by university presses, the center of gravity may have shifted ever so slightly away from the cavernous exhibit hall, with its familiar displays, and toward these buzzing, teeming poster sessions, with their multiple demonstrations of new digital research projects.
Steven E. Jones
Steven E. Jones is professor of English at Loyola University Chicago and is co-creator and co-editor of the Romantic Circles Web site.
At my university, I chair a faculty committee charged with reviewing and revising our general education curriculum. Over the past two and a half years, we have examined programs at similar colleges and studied best practices nationwide. In response, we have begun to propose a new curriculum that responds to some of the weaknesses in our current program (few shared courses and little curricular oversight), and adds what we believe will be some new strengths (first-year seminars and a junior-level multidisciplinary seminar).
In addition, we are proposing that we dispense with our standard second course in research writing, revise our English 101 into an introduction to academic writing, and institute a writing-across-the-curriculum program. Our intention is to infuse the general education curriculum with additional writing practice and to prompt departments to take more responsibility for teaching the conventions of research and writing in their disciplines. As you might imagine, this change has fostered quite a bit of anxiety (and in some cases, outright outrage) on the part of a few colleagues who believe that if we drop a course in writing, we have shirked our duty to ensure that all students can write clearly and correctly. They claim that their students don’t know how to write as it is, and that our proposal will only make matters worse.
I believe most faculty think that when they find an error in grammar or logic or format, it is because their students don’t know “how” to write. When I find significant errors in student writing, I chalk it up to one of three reasons: they don’t care, they don’t know, or they didn’t see it. And I believe that the first and last are the most frequent causes of error. In other words, when push comes to shove, I’ve found that most students really do know how to write -- that is, if we can help them learn to value and care about what they are writing and then help them manage the time they need to compose effectively.
Still, I sympathize with my colleagues who are frustrated with the quality of writing they encounter. I have been teaching first-year writing for many years, and I have directed rhetoric and composition programs at two universities. During this time, I have had many students who demonstrate passive-aggressive behavior when it comes to completing writing projects. The less they can get away with doing, or the later they can turn it in, the better. I have also had students with little interest in writing because they had no personally satisfying experiences with writing in high school. Then there are those students who fail to give themselves enough time to handle the complex process of planning, drafting, revising, and editing their work.
But let’s not just blame the students. Most college professors would rather complain about poor writing than simply refuse to accept it. Therefore, students rarely experience any significant penalties for their bad behaviors in writing. They may get a low mark on an assignment, but it would be a rare event indeed if a student failed a course for an inadequate writing performance. Just imagine the line at the dean’s door!
This leads me to my modest proposal. First, let me draw a quick analogy between driving and writing. Most drivers are good drivers because the rules of the road are public and shared, they are consistently enforced, and the consequences of bad driving are clear. I believe most students would become better writers if the rules of writing were public and shared, they were consistently enforced, and the consequences of bad writing were made clear.
Therefore, I propose that all institutions of higher learning adopt the following policy. All faculty members are hereby authorized to challenge their students’ writing proficiency. Students who fail to demonstrate the generally accepted minimum standards of proficiency in writing may be issued a “writing ticket” by their instructors. Writing tickets become part of students’ institutional “writing records.” Students may have tickets removed from their writing records by completing requirements identified by their instructors. These requirements may include substantially revising the paper, attending a writing workshop, taking a writing proficiency examination, or registering for a developmental writing course. Students who fail to have tickets removed from their records will receive additional penalties, such as a failing grade for the course, academic probation, or the inability to register for classes.
What would the consequences of such a policy be? First of all, it would mean that we would have to take writing-across-the-curriculum more seriously than most of us do now. We would have to institute placement and assessment procedures to ensure that students receive effective introductory instruction and can demonstrate proficiency in writing at an appropriate level before moving forward.
Professors would also be required to get together, talk seriously and openly, and come to agreements about what they think are “generally accepted minimum standards of proficiency in writing” at various levels, in each discipline, and across the board. We would be required to develop more consistent ways of assigning, responding to, and evaluating writing. We would also have to join with our colleagues in academic support services to recruit, hire, and train effective tutors.
And we would have to issue tickets. Lots of them. But not so many after a while, once students learn the consequences of going too fast, too slow, or in the wrong direction, stopping in the wrong place or failing to stop altogether, forgetting to signal when making a turn, or just ending up in a wreck. Then there is the growing problem of students who take someone else’s car for a joy ride.
Here’s your badge.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.