If we could retire for good one old expression from the Culture Wars, I’d like to nominate "the literary canon." Is there anything new to say about it? Has even the most gung-ho Culture Warrior seized a new bit of territory within recent memory? It looks as if all the positions have been occupied, and the battles fought to a dull standstill.
On the one side, Bill O’Reilly and his ilk passionately love Shakespeare. Or rather, they at least enjoy the idea that somebody else will be forced to read him. And on the other side, the fierce struggle to “open the canon” usually looks like an effort to break down an unlocked door.
Checking the entry in New Keywords: A Revised Vocabulary of Culture and Society -- a reference work recently published by Blackwell -- I learn that the canon is, by definition, always something open to revision. Which would, of course, come as a really big surprise to many generations of rabbis, priests, and imams.
But perhaps that underscores the real problem here. The term "canon" rests on an analogy between an established set of cultural masterpieces, on the one hand, and the authoritative body of scriptures, on the other hand. And the problem with this comparison is that, deep down, it is almost impossible to take seriously. "Canon" is not so much a concept as a dead metaphor -- or rather, perhaps, a stillborn one.
If you are a full-fledged resident of secular modernity (that is, somebody accustomed to the existence of a deep moat of separation between sacred and worldly institutions) then the strongest original sense of “the canon” is just barely imaginable.
And if you have rejected secular modernity altogether -- if you believe that God once broke into human affairs long enough to make perfectly clear what He has in mind for us -- then the notion of secular literary works as having some vaguely comparable degree of authority must seem absurd. Or blasphemous.
Once in a great while, a writer or thinker reframes things so that the expression seems to come back to life. The late Northrop Frye, for example, took seriously William Blake’s aphorism calling the Bible "the Great Code of Art." Frye worked out a theory of literature that, in effect, saw the entire DNA of Western literature as contained in Judeo-Christian scripture. And then there is the example of Adonis, the great Lebanese author, who has pointed to the challenge of creating poetry in Arabic. How can you obey the modernist imperative to "make it new" in a language deeply marked by the moment when it was used to record the commands of God?
But Frye and Adonis are exceptions. Usually, when we talk about "the canon," it is without any strong sense of a complicated relationship between literature and authority. Between words and the Word.
Instead, the debates are really over the allocation of resources -- and the economy of prestige within academic institutions. To say that a given literary figure is "part of the canon" actually means any number of profitable investments have been made in the study of that author. Conversely, to "question the canon" is a strategic move with consequences for the bottom line. (As in, "Do we really need to hire a Miltonist?")
But that means we’ll never get rid of that expression “the literary canon” -- if only because it sounds more dignified than “the literary spreadsheet.”
Is that too cynical? Can’t we assume that works defined as canonical possess some quality that places them above the give-and-take of institutional horse trading?
As a roundabout way of thinking about such questions, let me direct your attention to a seemingly unrelated item that appeared in The Washington Post over the weekend.
It seems that there has recently been an intense discussion on an Internet bulletin board in China devoted to the work of Lu Xun, an author who lived from 1881 to 1936. The exchanges concerned one text in particular, his essay “In Memory of Ms. Liu Hezhen” -- a work unfortunately not available online in English, so far as I can tell.
The essay appeared in April 1926, a few weeks after government troops opened fire on a demonstration, killing 40 students. One of them, a 22-year-old woman named Liu Hezhen, had been a devoted reader of Lu Xun’s literary magazine The Wilderness and attended his lectures on Chinese literature at National Beijing Women's Normal University.
She was, Lu wrote, “a student of mine. At least, I used to think of her as one.... She, as a young Chinese woman who has dedicated her life to the nation, is no longer a student of a person like me, who still lingers on superfluously in this world.” (All quotations are from the translation appearing in Women in Republican China: A Sourcebook, edited by Hua R. Lan and Vanessa L. Fong, published by M.E. Sharpe in 1999.)
It is a moving essay, and there is now a substantial body of scholarly commentary on it. But as the Post article reported, the sudden interest in Lu Xun’s essay suggests that people are using it “as a pretext to discuss a more current and politically sensitive event -- the Dec. 6 police shooting of rural protesters in the southern town of Dongzhou in Guangdong province.” Despite the official news blackout and the Chinese government’s efforts to censor the Internet, it seems that information about the Dongzhou massacre is spreading.
This development raises complex questions about the role that new media play in developing countries, and under authoritarian regimes. This being the age of high tech, people always want to discuss it -- and, of course, we’d damned well better.
But to be honest, I found it a lot more interesting that people were using Lu Xun’s essay as a reference point. It points to questions about the relationship between literary power and political authority. That Chinese citizens are using the Web and instant messaging to execute an end-run around official censorship is certainly interesting and important. But so is the classic author they are rereading while so engaged.
It is hard to overstate the role that Lu Xun has played in Chinese culture over most of the past century. His martyred student Liu Hezhen was only one of thousands of young readers inspired by his work in the 1920s. He did not join the Communist Party, but drew close to it in the years before his death in 1936. And after the revolutionaries came to power in 1949, Lu was “canonized” in the strongest sense possible for a completely secular regime.
At the height of the Cultural Revolution (when, as a friend who lived through it once told me, the morning class in elementary school was math, and the afternoon was Mao), the selected quotations of Lu Xun were available in a little red book, just as the Great Helmsman’s were. And even after Mao’s own legacy was quietly downplayed in later decades, the field of “Lu Xun studies” continued as a basic part of Chinese scholarly life.
The novelist Ha Jin, professor of English at Boston University, gives some sense of the author’s continuing prominence in his introduction to a recent edition of Lu’s short stories. “Hundreds of books have been written on his life and writings,” he notes, “and several officially funded journals have been devoted to him. There are even papers on his real estate contracts, the aesthetics of the designs of his books, the rents he paid, and his favorite Japanese bookstores. Novels have appeared based on different periods and aspects of his life, not to mention movies, operas, and TV shows adapted from his fiction.”
All of this might look like evidence for the simplest model of how a literary canon is formed: An author gives voice to the ideology of the powers-that-be -- whether dead white property-owning European males, or revolutionary communist Chinese bureaucrats, or whatever. And those powers then return the favor by making the author a “classic.” All very clearcut, yes?
Actually, no. It happens that Lu Xun gained his prominence, not as an ideologue, but as a writer of great power -- a figure embodying both moral authority and a capacity for literary innovation.
His earliest work was written in the classic or high style of literary language. He gave an important course of lectures on the history of Chinese fiction, and was a master practitioner of the “eight-legged essay” (a very formal structure once used in civil-service exams for the Imperial bureaucracy).
But at some point in his 30s, Lu Xun had a creative breakthrough. He published a series of classic short stories combining sophisticated fictional technique with colloquial language. I don’t know Chinese, and must rely on the accounts of those who do. But even scholars disgusted by the official Maoist cult around Lu Xun admire his profound effect on the literary resources of the language. For example, in his book Lu Xun and Evolution (SUNY Press, 1998), James Reeve Pusey writes that the author “ ‘found himself’ in the creation of a new language, a highly literary, iconoclastically erudite, powerfully subtle vernacular that no one has since used with such mastery.”
And some of his power comes through even in translation. One of Lu Xun’s classic stories is “Diary of a Madman,” in which the everyday corruption and brutality of village life is seen as refracted through the mind of someone sinking ever deeper into paranoia. The narrator becomes convinced that the people around him practice cannibalism. His only hope, he confides to his diary, is that a few young people haven’t tasted human flesh. The final line of the story reads: “Save the children....”
Around the time government troops were shooting down students in 1926, Lu was drifting away from fiction. He instead concentrated on writing what were called zagan (“sundry thoughts”) or zawen (“miscellaneous writings”) -- short, topical prose compositions on whatever caught his attention. The state of his country worried him, and he poured his anger into hundreds of short pieces.
Not everyone liked this phase of his work. Have a look at the following bitter comment from 1931, by a critic who disliked Lu Xun’s later writings: “Zagan compositions, limited to a paltry thousand words, can naturally be done in one sweep of the brush. You catch at a thought, and in the time it takes to smoke a cigarette your thousand words are produced.... There is just one formula for zagan compositions: either heated abuse or cold sarcasm. If you can append a word or two of cold sarcasm to the heated abuse, or insert some heated abuse amidst the cold sarcasm, that is all to the good.”
In short, Lu Xun invented the blog entry. (I’m sure that somewhat anachronistic thought has already occurred to people in China, who are discussing recent events via commentary on his work.)
His topics were as ephemeral as any newspaper article. But there is enough wordplay, historical allusion, metaphorical resonance, and heartfelt passion to make them something more than that. A whole scholarly industry is devoted to analyzing these essays. Indeed, by the late 1980s, the field of Lu Xun studies had become so “professionalized” (as that favorite expression of the MLA has it) that one young scholar was worried that it had become completely disconnected from anything of interest to the average reader.
So Lu Xun remains, by any definition, part of the Chinese literary canon, to use that word once again. (And if you see the revolutionary ideologies of the 20th century as continuing the old Gnostic heresy of “immanentizing the eschaton” -- as one school of conservative thinkers does -- then I suppose even the quasi-scriptural overtones might also apply.)
But does that mean China would be democratizing only if Lu Xun lost his place? Or to put it more broadly: Are cultural and social power necessarily related? Don’t literary authority and political regimes tend to be mutually reinforcing?
Those are open questions. But I can’t help thinking of another question -- one that someone reportedly asked Mao in the late 1950s. What would Lu Xun’s role be if he were still alive? Mao answered that Lu would either remain quiet or go to jail. (And this from the man who canonized him.)
Rereading “In Memory of Ms. Liu Hezhen” this week, I was struck in particular by the second of the essay’s seven parts. The translation is a little stiff, but the passage is worth quoting in full:
“A real hero should dare to face the tragedy of life and look unwaveringly at bloodshed. It is at once sorrowful and joyful! But the Creator has determined for the sake of the ordinary people to let time heal all the wounds and to leave behind only slight traces of blood and sorrow. It is in these traces of blood and sorrow that people get a humble life and manage to keep this woeful world going. When shall we see the light at the end of such a tunnel, I do not know.”
Imagine how much has been written about that passage over the past couple of weeks. And think of all the questions it must raise -- about the past, about the future.
If I were a Chinese official with some interest in the long-term welfare of my own hide, then I might have a strong interest, right about now, in “opening up the canon.” (Or abolishing it.) Perhaps literature is an unreliable way of shoring up the established order and transmitting stabilizing cultural values. It might be a good idea to discourage the reading of Lu Xun, and get people to watch "Fear Factor" instead.
If my recent experiences are any indication, we professors face a daunting challenge: The polarized American political environment has conditioned our students to see life in monochrome. The Right tells them to view all as either black or white, while the Left insists that everything is a shade of gray.
We’ve long struggled with the either/or student, the one who writes a history essay in which events are stripped of nuance and presented as the working out of God’s preordained plan; or the sociology student who wants to view poverty as a modern variant of 19th century Social Darwinism. These students -- assuming they’re not acting out some ideological group’s agenda -- can be helped along simply by designing lessons that require them to argue opposing points of view.
Yet despite all the hoopla about the resurgence of conservatism, I encounter more students whose blinders are postmodern than traditional. That is to say, many of them don’t see the value of holding a steadfast position on much of anything, nor do they exhibit much understanding of those who do. They live in worlds of constant parsing and exceptions. Let me illustrate with two examples.
In history classes dealing with the Gilded Age I routinely assign Edward Bellamy’s utopian novel Looking Backward. In brief, protagonist Julian West employs a hypnotist for his insomnia and retires to an underground chamber. His Boston home burns in 1887, and West is not discovered until 2000, when he is revived by Dr. Leete. He awakens to a cooperative socialist utopia. West’s comments on his time say much about late 19th century social conflict, and Leete’s description of utopian Boston makes for interesting class discussion. I know that some students will complain about the novel’s didactic tone, others will argue that Bellamy’s utopia is too homogeneous, and a few will assert that Bellamy’s explanation of how utopia emerged is contrived. What I had not foreseen is how many students find the very notion of a utopia so far-fetched that they can’t move beyond incredulity to consider other themes.
When I paraphrase Oscar Wilde that a map of the world that doesn’t include Utopia isn’t worth glancing at, some students simply don’t get it. “Utopia is impossible” is the most common remark I hear. “Perhaps so,” I challenge, “but is an impossible quest the same as a worthless quest?” That sparks some debate, but the room lights up when I ask students to explain why a utopia is impossible. Their reasons are rooted more in contemporary frustration than historical failure. Multiculturalism is often cited. “The world is too diverse to ever get people to agree” is one rejoinder I often receive.
It’s disturbing enough to contemplate that a social construct designed to promote global understanding can be twisted to justify existing social division, but far more unsettling is what often comes next. When I ask students if they can envision dystopia, the floodgates open. No problems on that score! In fact, they draw upon popular culture to chronicle various forms of it: Escape From New York, Blade Runner, Planet of the Apes…. “Could any of these happen?” I timidly ask. “Oh sure, these could happen easily,” I’m told.
My second jolt came in a different form: an interdisciplinary course I teach in which students read Tim O’Brien’s elegantly written Vietnam War novel The Things They Carried. O’Brien violates old novelistic standards; his book is both fictional and autobiographical, with the lines between the two left deliberately blurred. My students adored the book and looked at me as if they had just seen a Model T Ford when I mentioned that a few critics felt the book was dishonest because it did not distinguish fact from imagination. “It says right on the cover ‘a work of fiction,’” noted one student. When I countered that we ourselves were using it to discuss the actual Vietnam War, several students immediately defended the superiority of metaphorical truth because it “makes you think more.” I then asked students who had seen the film The Deer Hunter whether the famed Russian roulette scene was troubling, given that there was no recorded incident of such events taking place in Vietnam. None of them were bothered by this.
I mentioned John Sayles’ use of composite characters in the film Matewan. They had no problem with that, though none could tell me what actually happened during the bloody coal strikes that convulsed West Virginia in the early 1920s. When I probed whether writers or film makers have any responsibility to tell the truth, not a single student felt they did. “What about politicians?” I asked. While many felt that truth-telling politicians were no more likely than utopia, the consensus view was that they should tell the truth. I then queried, “So who gets to say who has to tell the truth and who gets to stretch it?” I was prepared to rest on my own clever laurels, until I got the students’ rejoinder! Two of my very best students said, in essence, that all ethics are situational, with one remarking, “No one believes there’s an absolute standard of right and wrong.” I tentatively reminded him that many of the 40 million Americans who call themselves "evangelical Christians" believe rather firmly in moral absolutes. From the back of the room piped a voice, “They need to get over themselves.”
I should interject that this intense give-and-take was possible because I let my students know that their values are their own business. In this debate I went out of my way to let them know I wasn’t condemning their values; in fact, I share many of their views on moral relativism, the ambiguity of truth, and artistic license. But I felt I could not allow them to dismiss objective reality so cavalierly. Nor, if I am true to my professed belief in the academy as a place where various viewpoints must be engaged, could I allow them to refuse to consider anyone who holds fast to moral absolutism.
The stories have semi-happy endings. I eventually got my history students to consider the usefulness of utopian thinking. This happened after I suggested that people of the late 19th century had better imaginations than those of the early 21st, which challenged them to contemplate the link between utopian visions and reform, and to see how a moralist like Bellamy could inspire what they would deem more pragmatic social changes. My O’Brien class came through when I taught the concept of simulacra, showed them a clip from the film Wag the Dog and then asked them to contemplate why some see disguised fiction as dangerous. (Some made connections to the current war in Iraq, but that’s another story!)
My goal in both cases was to make students see points of view other than their own. Both incidents also reminded me it’s not just the religious or conservative kids who need to broaden their horizons. We need to get all students to see the world in Technicolor, even when their own social palettes are monochromatic. Indeed, the entire academy could do worse than remember the words of Dudley Field Malone, one of the lawyers who defended John T. Scopes. Malone remarked, “I have never in my life learned anything from any man who agreed with me.”
Robert E. Weir
Robert E. Weir is a visiting professor at Commonwealth College of the University of Massachusetts at Amherst and in the American studies program at Smith College.
I fell in love in New Orleans. I was wandering through a crammed antique shop, meandering without any sense of time or purpose, enjoying the handsome legacies of earlier artisans. On the second floor, my eyes met the desk of my dreams. The slender, 18th-century spinet-shaped desk was made for writing. Slots, drawers and cubby-holes for inks, pens, paper, sealing wax faced the writer while the curved and sloping sides encircled the desktop, and each lifted to reveal a hidden pocket, waiting for some secret cache of letters, or poems. I lingered, stroking the numinous wood, imagining Jane Austen writing there. As I explored each clever nook, I saw the carpenter’s joy at his work, his art. In the presence of that creative genius, I wanted to write, to put his beauty to work for my art, to create in appreciation of, and inspired by, his creation.
I called my friend, also a writer, over and we admired the piece together, respectful of its sanguine beauty, and appreciative of the talent of its maker. We imagined who might have owned such a luxurious piece. We imagined the brilliant writing we’d surely produce if we owned a desk like this one. We imagined how we’d die a little if one of our cats should scratch its surface. If I’d had the money, I would have paid it without blinking; however, imagining was all I could do, given my tenuous employment, my small salary, and the $4,000 price tag.
Still I lingered at that desk, as now it lingers in my memory, and when I eventually came away, I possessed something more significant, sparked by the artist who, two centuries earlier, had put his hands to work. What emanated from his art, and that whole city, was the creative radiance of inspired delight.
That was my last visit to New Orleans, six years ago. My life changed that year. I lost the tenuous job, moved suddenly to a new, equally tenuous, one. When that ended too, I was adrift in unspent wishes and altered dreams. I moved home to Austin, a city of uncommon lives, and into my parents’ house. There I began a long, slow rebuilding. In the midst of my personal chaos came the larger chaos of September 11th. Though Septima Clark may “consider chaos a gift,” I could not rejoice. In the midst of the chaos, my creative brain was off.
Instead, I leaned on the creativity of others. Digging to the foundations of my education, I listened to Emerson telling me to be self-reliant, to have the courage to try many things, to be undaunted by the challenges we name failures. I re-read Thoreau. Since I felt I had nothing, his edict to simplify seemed easy enough to follow! I thought many times of the unconventional life of Emily Dickinson, living in her parents’ home her whole life. I looked around and saw others in my city living creative lives, living “weird” as we proudly say. I let go of orthodoxy, focusing instead on joy.
In Finding Your Own North Star, Martha Beck stipulates two rules for using joy to chart a course toward your north star:
Rule 1: If it brings you joy, do it.
Rule 2: No, really, if it brings you joy, do it.
Of course, she also cautions that it’s not as easy as it sounds. It is impossible in the midst of chaos. It can, however, be a way out of chaos. After some of my chaos settled, I laughed that I had read the Transcendentalists for personal gain. What a clue to who I am! I noticed my obsession with writing -- another little clue to that north star of mine. I returned to teaching, as an adjunct instructor, and loved it anew. I cobbled together writing and teaching, and also built a practice as a writing mentor. In this creative city, no one batted an eye, accepting my weird life as normal, and it wasn’t nearly as weird as some! Surrounded by creative lives, I found the courage to begin again. Eventually, I found I had something to say, and my writing erupted because, as Alice Walker writes, “there is a place the loss must go. There is a place the gain must go. The leftover love.” After great chaos, creativity arises. In the middle of creativity, creativity flourishes.
If you are in the middle of great chaos, anchor in the safe harbor of others’ creativity. “Human life itself may be almost pure chaos,” Katherine Anne Porter wrote, “but the work of the artist ... is to take these handfuls of confusion and disparate things, things that seem to be irreconcilable, and put them together in a frame to give them some kind of shape and meaning.” Seek music, seek literature, seek art. Stand outside in the creative genius of nature. Put your hands on a fine piece of furniture to feel the spirit of the carpenter who loved his work. Connect with all surrounding creators. Begin rebuilding.
Amy L. Wink
Amy L. Wink teaches English at Southwestern University, in Georgetown, Tex., and at Austin Community College.
Graphs, Maps, Trees: Abstract Models for a Literary History is a weird and stimulating little book by Franco Moretti, a professor of English and comparative literature at Stanford University. It was published a few months ago by Verso. But observation suggests that its argument, or rather its notoriety, now has much wider circulation than the book itself. That isn’t, I think, a good thing, though it is certainly the way of the world.
In a few months, Princeton University Press will bring out the first volume of The Novel: History, Geography, and Culture -- a set of papers edited by Moretti, based on the research program that he sketches in Graphs, Maps, Trees. (The Princeton edition of The Novel is a much-abridged translation of a work running to five volumes in Italian.) Perhaps that will redefine how Moretti’s work is understood. But for now, its reputation is a hostage to somewhat lazy journalistic caricature -- one mouthed, sometimes, even by people in literature departments.
What happened, it seems, is this: About two years ago, a prominent American newspaper devoted an article to Moretti’s work, announcing that he had launched a new wave of academic fashion by ignoring the content of novels and, instead, just counting them. Once, critics had practiced “close reading.” Moretti proposed what he called “distant reading.” Instead of looking at masterpieces, he and his students were preparing gigantic tables of data about how many books were published in the 19th century.
Harold Bloom, when reached for comment, gave one of those deep sighs for which he is so famous. (Imagine Zero Mostel playing a very weary Goethe.) And all over the country, people began smacking their foreheads in exaggerated gestures of astonishment. “Those wacky academics!” you could almost hear them say. “Counting novels! Whoever heard of such a thing? What’ll those professors think of next -- weighing them?”
In the meantime, it seems, Moretti and his students have been working their way across 19th century British literature with an adding machine -- tabulating shelf after shelf of Victorian novels, most of them utterly forgotten even while the Queen herself was alive. There is something almost urban legend-like about the whole enterprise. It has the quality of a cautionary tale about the dangers of pursuing graduate study in literature: You start out with a love of Dickens, but end up turning into Mr. Gradgrind.
That, anyway, is how Moretti’s “distant reading” looks ... well, from a distance. But things take on a somewhat different character if you actually spend some time with Moretti’s work itself.
As it happens, he has been publishing in English for quite some while: His collection of essays called Signs Taken for Wonders: On the Sociology of Literary Forms (Verso, 1983) was, for a long time, the only book I’d ever read by a contemporary Italian cultural theorist not named Umberto Eco. (It has recently been reissued as volume seven in Verso’s new Radical Thinkers series.) The papers in that volume include analyses of Restoration tragedy, of Balzac’s fiction, and of Joyce’s Ulysses.
In short, then, don’t believe the hype -- the man is more than a bean-counter. There is even an anecdote circulating about how, during a lecture on “distant reading,” Moretti let slip a reference that he could only have known via close familiarity with an obscure 19th century novel. When questioned later -- so the story goes -- Moretti made some excuse for having accidentally read it. (Chances are this is an apocryphal story. It sounds like a reversal of David Lodge’s famous game of “intellectual strip-poker” called Humiliation.)
And yet it is quite literally true that Moretti and his followers are turning literary history into graphs and tables. So what’s really going on with Moretti’s work? Why are his students counting novels? Is there anything about “distant reading” that would be of interest to people who don’t, say, need to finish a dissertation on 19th century literature sometime soon? And the part, earlier, about how the next step would be to weigh the books -- that was a joke, right?
To address these and many other puzzling matters, I have prepared the following Brief Guide to Avoid Saying Anything Too Dumb About Franco Moretti.
He is doing literary history, not literary analysis. In other words, Moretti is not asking “What does [insert name of famous author or novel here] mean?” but rather, “How has literature changed over time? And are there patterns to how it has changed?” These are very different lines of inquiry, obviously. Moretti’s hunch is that it might be possible to think in a new way about what counts as “evidence” in cultural history.
Yes, in crunching numbers, he is messing with your head. The idea of using statistical methods to understand the long-term development of literary trends runs against some deeply entrenched patterns of thought. It violates the old idea that the natural sciences are engaged in the explanation of mathematically describable phenomena, while the humanities are devoted to the interpretation of meanings embedded in documents and cultural artifacts.
Many people in the humanities are now used to seeing diagrams and charts analyzing the structure of a given text. But there is something disconcerting about a work of literary history filled with quantitative tables and statistical graphs. In filling his pages with them, Moretti is not just being provocative. He’s trying to get you to “think outside the text,” so to speak.
Moretti is taking the long view.... A basic point of reference for his “distant reading” is the work of Fernand Braudel and the Annales school of historians who traced the very long-term development of social and economic trends. Instead of chronicling events and the doings of individuals (the ebb and flow of history), Braudel and company looked at tendencies taking shape over decades or centuries. With his tables and graphs showing the number (and variety) of novels offered to the reading public over the years, Moretti is trying to chart the longue durée of literary history, much as Braudel did the centuries-long development of the Mediterranean.
Some of the results are fascinating, even to the layperson’s eye. One of Moretti’s graphs shows the emergence of the market for novels in Britain, Japan, Italy, Spain, and Nigeria between about 1700 and 2000. In each case, the number of new novels produced per year grows -- not at the smooth, gradual pace one might expect, but with the wild upward surge of a lab rat’s increasing interest in a liquid cocaine drip.
“Five countries, three continents, over two centuries apart,” writes Moretti, “and it’s the same pattern ... in twenty years or so, the graph leaps from five [to] ten new titles per year, which means one new novel every month or so, to one new novel per week. And at that point, the horizon of novel-reading changes. As long as only a handful of new titles are published each year, I mean, novels remain unreliable products, that disappear for long stretches of time, and cannot really command the loyalty of the reading public; they are commodities, yes, but commodities still waiting for a fully developed market.”
But as that market emerges and consolidates itself -- with at least one new title per week becoming available -- the novel becomes “the great capitalist oxymoron of the regular novelty: the unexpected that is produced with such efficiency and punctuality that readers become unable to do without it.”
And then the niches emerge: The subgenres of fiction that appeal to a specific readership. On another table, Moretti shows the life-span of about four dozen varieties of fiction that scholars have identified as emerging in British fiction between 1740 and 1900. The first few genres appearing in the late 18th century (for example, the courtship novel, the picaresque, the “Oriental tale,” and the epistolary novel) tend to thrive for long periods. Then something happens: After about 1810, new genres tend to emerge, rise, and decline in waves that last about 25 years each.
“Instead of changing all the time and a little at a time,” as Moretti puts it, “the system stands still for decades, and is then ‘punctuated’ by brief bursts of invention: forms change once, rapidly, across the board, and then repeat themselves for two [to] three decades....”
Genres as distinct as the “romantic farrago,” the “silver-fork novel,” and the “conversion novel” all appear and fade at about the same time -- to be replaced by a different constellation of new forms. It can’t, argues Moretti, just be a matter of novelists all being inspired at the same time. (Or running out of steam all at once.) The changes reflect “a sudden, total change of their ecosystem.”
Moretti is a cultural Darwinist, or something like one. Anyway, he is offering an alternative to what we might call the “intelligent design” model of literary history, in which various masterpieces are the almost sacramental representatives of some Higher Power. (Call that Power what you will -- individual genius, “the literary imagination,” society, Western Civilization, etc.) Instead, the works and the genres that survive are, in effect, literary mutations that possess qualities that somehow permit them to adapt to changes in the social ecosystem.
Sherlock Holmes, for example, was not the only detective in Victorian popular literature, nor even the first. So why is it that we still read his adventures, and not those of his competitors? Moretti and his team looked at the work of Conan Doyle’s rivals. While clues and deductions were scattered around in their texts, the authors were often a bit off about how they were connected. (A detective might notice the clues, then end up solving the mystery through a psychic experience, for example.)
Clearly the idea of solving a crime by gathering clues and decoding their relationship was in the air. It was Conan Doyle’s breakthrough to create a character whose “amazing powers” were, effectively, just an extremely acute version of the rational powers shared by the reader. But the distinctiveness of that adaptation only comes into view by looking at hundreds of other texts in the literary ecosystem.
This is the tip of the tip of the iceberg. Moretti’s project is not limited by the frontiers of any given national literature. He takes seriously Goethe’s idea that all literature is now world literature. In theory, anyway, it would be possible to create a gigantic database tracking global literary history.
This would require enormous computational power, of course, along with an army of graduate students. (Most of them getting very, very annoyed as they keypunched data about Icelandic magazine fiction of the 1920s into their laptops.)
My own feeling is that life is much too short for that. But perhaps a case can be made for the heuristic value of imagining that kind of vast overview of how cultural forms spread and mutate over time. Only in part is Moretti’s work a matter of counting and classifying particular works. Ultimately, it’s about how literature is as much a part of the infrastructure of ordinary life as the grocery store or Netscape. And like them, it is caught up in economic and ecological processes that do not respect local boundaries.
That, anyway, is an introduction to some aspects of Moretti’s work. I’ve just learned that Jonathan Goodwin, a Brittain Postdoctoral Fellow at Georgia Tech, is organizing an online symposium on Moretti that will start next week at The Valve.
Goodwin reports that there is a chance Moretti himself may join the fray. In the interim, I will be trying to untangle some thoughts on whether his “distant reading” might owe something to the (resolutely uncybernetic) literary theory of Georg Lukacs. And one of the participants will be Cosma Shalizi, a visiting assistant professor of statistics at Carnegie Mellon University.
It probably wouldn’t do much good to invite Harold Bloom into the conversation. He is doubtless busy reciting Paradise Lost from memory, and thinking about Moretti would not be good for his health. Besides, all the sighing would be a distraction.
In 1991, Elliot L. Gilbert, chair of English at the University of California at Davis, went to the hospital for what ought to have been some fairly routine surgery. Mistakes were made. He died in the recovery room. His widow, Sandra M. Gilbert (also a professor of English at Davis), brought suit -- a case finally settled out of court, but not before she piled up a mound of documents that gave her some sense of just what had happened. In Wrongful Death: A Memoir (Norton, 1995), she wrote: "Responsibility in the often miraculous but always highly technologized realm of modern medicine is so dispersed, so fragmented, that finally it accrues to no one."
Years earlier -- long before her work with Susan Gubar on the landmark work of American feminist literary criticism The Madwoman in the Attic: The Woman Writer and the Nineteenth-Century Literary Imagination (Yale University Press, 1979) -- Gilbert had worked on a monograph she planned to call “‘Different, and Luckier’: Romantic and Post-Romantic Metaphors of Death.” The phrase in the title came from “Song of Myself,” in which Whitman declaimed that “to die is different from what anyone supposed, and luckier.”
A cosmic sentiment like that cannot do much to mitigate grief. But Gilbert dug the notes for that abandoned project out of her files, and has just published a remarkable book, Death’s Door: Modern Dying and the Ways We Grieve (also from Norton), which revisits her longstanding interest in elegy.
Calling Death’s Door a work of literary criticism, while accurate enough, seems very incomplete. Like Wrongful Death, it recounts the story of her husband’s death. It also offers a historical meditation on the emergence of what Gilbert calls the “technologies” of death and grief. (The famous “five stages” of confronting mortality, while originally meant as descriptive, now seem at times both prescriptive and somewhat compulsory. Woe to anyone who doesn’t follow the script.)
It is a rich book, and a deep one -- and also, at times, somewhat terrifying to read, for it is the work of someone for whom “the denial of death” is simply not an option. While reading Death’s Door, I contacted the author to ask her a few questions. The following interview took place by e-mail.
Q: The book seems like a hybrid -- part memoir, part cultural history, part critical study. Those categories correspond reasonably well to the three big sections you've divided it into, but there are also margins of overlap. How did you come to understand just what kind of book Death's Door was turning out to be?
A: Yes, the book is indeed a kind of hybrid, or as my son put it, it's an attempt at "genre-bending." But I hadn't planned it that way. In fact, I began the work as a fairly traditional project in literary criticism. My goal was to explore what I called "the fate of the elegy" in the 20th century and beyond -- although even as I formulated that project the ambiguity of the word "fate" had begun to haunt me.
Did I intend to explore the evolution of the modern and contemporary elegy? Or did I want to explore modern ideas about fate in the elegy? If the latter, I was already moving beyond purely literary analyses into cultural studies. In any case, however, once I began researching and writing Death's Door it became clear to me that I was no longer able to do critical and scholarly work in the way I had.
As you've noted, following my husband's unexpected death in 1991, I felt compelled to tell his story -- including what I'd been able to reconstruct about the medical negligence that evidently killed him -- in my memoir, Wrongful Death. But the very mode in which I'd written that book, along with the elegiac poems clustered around it, had changed my way of writing. I had wanted to bear witness to my husband's loss of life and to my own grief. And now, as I began drafting Death's Door, I was still working in a testimonial mode, although with much greater self-awareness and, I think, with much larger ambitions, for now I was using my own case as an entry into meditations on the cultural formulations that shape our mourning and on the literary forms in which we mourn.
Of course, I should note here that, as a number of commentators have observed, beginning in the late 80's and 90's many academics in my own fields (literary studies, women's studies) had started writing autobiographically, testing their postulates, in effect, on their own pulses. So I wasn't alone in my sense that I needed a new and different way to approach my subject. And of course, as a feminist critic, I'd always argued that the personal was not only the political but the poetical.
Nonetheless, I suspect that the urgency of my need to "genre-bend" developed out of what I experienced, in early widowhood, as an urgent (indeed a surprisingly urgent) responsibility to testify about my own family's sorrow.
Q: On the one hand, you offer a phenomenology of death and grieving; that is, a description of the kinds of experience of care, fear, concern, etc. that seem to be just about as inescapable as mortality itself. On the other hand, you draw quite a bit on social and cultural history. That serves as a reminder that our vocabulary (and, to whatever degree, our experience) has been conditioned or "constructed." So at the risk of asking you to make an absurd choice: Which comes first? Which is definitive? The intimate level of experience or the social level of cultural meanings?
A: I think both are equally important but they're also inextricably related. As you point out, there's a phenomenology of death and grief that's as inescapable as mortality itself, and this manifests itself cross-culturally as well as trans-historically.
As I try to show in the book, almost every society imagines death as a kind of "place" that one enters, and all around the world people are haunted by what's often experienced as the nearness of the dead. (Are the dead on what George Eliot called -- in a different context -- "the other side of silence," merely separated from us by no more than "a thin piece of silk," as one of W.G. Sebald's narrators puts it?) And wherever the dead are believed to survive in some mysterious way, cultures have also experienced them as needy, often angry or sorrowful.
Throughout history, too, and worldwide, mourners have structured grief in special ways, elaborately patterning the prayers or diatribes with which the bereaved implore or reproach the gods, the fates, and the dead themselves. And again, almost everywhere the spouses -- especially the widows -- of those who die occupy a crucial place in ceremonies of grief. So these are all matters I investigate in the first major section of Death's Door, which takes as its intellectual starting point Zygmunt Bauman's comment that the omnipresence of "funeral rites and ritualized commemoration of the dead" along with the "discovery of graves and cemeteries" is thought by anthropologists to constitute "proof that a humanoid strain ... had passed the threshold of humanhood."
At the same time, as I argue throughout the second major section of Death's Door, history "makes" death, shaping both how we die and how we mourn. So our persistent human needs to imagine the fate of the dead and to pattern grief in special ways are formed, informed, and reformed by all kinds of cultural changes.
In most English-speaking nations, for instance -- and these are the societies that concern me most in the book -- the traditional visions of God and the afterlife that had already begun to disappear in the 19th century continued to erode throughout the 20th century, at least among the educated classes that produce poets, novelists, journalists, and film-makers. And as historians and sociologists from Philippe Ariès to David Moller have shown, everyone, no matter the class, dies differently now than in the past -- more privately yet often more technologically, in hospitals equipped with unnervingly complex machinery.
All of us, too, share a recent history of mass "death events," from the killing fields of the first World War to the Holocaust and Hiroshima in the second World War and on through Vietnam to the "shock and awe" of the present -- and surely this history has re-made our ideas of death and dying while changing our relationship to grief. The skeletons in the trenches of No Man's Land and the corpses charring in the crematoria of Auschwitz point down to an abyss of nihilism rather than up to heaven. But if we no longer hope for a redemptive heaven, then maybe we don't want to talk about death, maybe we need to deny its imminence.
Yet even while our theology and technology have grown increasingly nihilistic, we're quite literally haunted by images of the dead that refuse to leave us because they reside in celluloid or virtual permanence, populating our photo albums, movie screens, home videos, even digital libraries. How does this conflict between the real absence and the virtual presence of the dead change our modes of mourning?
Finally, then, as I worked on Death's Door I became increasingly conscious that the need to grieve whose urgencies I shared with mourners everywhere had a special 20th-century shape. For one thing (and this helped me understand a number of elegies I studied), I experienced my mourning as curiously embarrassing to many people I met, as if, because we fear death, we fear mourners too and suspect their sorrow might be somehow contaminating. In response to such embarrassment, I guess I sometimes become defiantly testimonial about my loss, both in prose (in Wrongful Death, for instance) and in poetry (in the elegies I published in my collection Ghost Volcano). And countless memoirists have done the same thing (most recently and famously Joan Didion) along with contemporary poets from Allen Ginsberg (Kaddish) to Sharon Olds (The Father), Ted Hughes (Birthday Letters), and Donald Hall (Without).
Q: The deep, dark core of the book is the contrast you make between "expiration" and "termination." It seems like that distinction is where the elements of memoir, cultural history, and literary analysis all link up.
A: "The deep dark core of the book." Thank you. That's a really incisive and insightful point because the basic argument of the book -- certainly the argument about the "fate of the elegy" -- began with my own experience of that distinction.
In chapter six, I tell the story of two episodes that powerfully moved me. In the first, the surgeon who was in charge of my husband's case testified that he had arrived at the hospital when his patient (my husband) was "terminating" -- i.e., dying. In the second, a nurse, more than three decades earlier, told me that my first child (a very premature baby who survived a few days) had "expired" -- i.e., died. After the doctor talked about "termination," the two words became so resonant for me that I brooded on them for quite some time.
To "terminate" is to come to a flat end. To "expire" is to breathe out something -- a breath that represents, perhaps, a soul. So each word seemed to me to have key metaphysical implications. "Termination," I decided, is modernity's definition of death; "expiration" the more traditional western (Christian) notion. For "termination" leads to Beckett, to what in Waiting for Godot Lucky calls "the earth abode of stones" while "expiration" empowers Milton, whose "Lycidas" has breathed out a soul that ultimately lands in heaven, where "entertain him all the saints above." So "termination" is terrifying, makes death almost unspeakably scary, and leads toward horror, repression, and denial, while "expiration" leaves us with some hope -- or anyway it used to.
Q: Your book isn't anti-technology, as such. But I did get the sense you were making the case for literature (and poetry in particular) as capable of providing something unavailable from the medical system. Almost an old-fashioned notion of the humanities as corrective -- if not to science, then to the scientistic or technocratic mentality. Or is that reading of your project off, in some way?
A: I'm not sure that I want to make a case for poetry, and more generally the humanities, as corrective, curative, or medicinal. But I do think I want to note that poets (and novelists and memoirists too, but especially poets) have refused to deny death and grief in a culture that finds these tokens of inescapable mortality at the least embarrassing because at the worst horrifying.
Poets testify, bear witness to the particulars of pain, the details of loss that technology flattens or sometimes even seeks to annihilate with words like "termination." I don't mean to suggest that those who work among the dying -- doctors in hospitals, medics on battlefields -- don't notice these details, but the language of science is in its way sedative, just as medicine's goals are (often appropriately) sedative and palliative. Poets remind us of what really happens. They don't take away the pain: on the contrary, they teach us how to feel it, to meet it, to know it.
Q: With all the quotations you incorporate, Death's Door serves (de facto anyway) as a kind of anthology. Was there a particular poem or passage that you recall as really being definitive for you? (In whatever way you'd construe "definitive" as meaning: epiphanic, consoling, etc.)
A: No, there was no one poem that dramatized for me the practice of contemporary elegists, although there were several works that functioned for me as aesthetic manifestos -- most notably, perhaps, W.C. Williams's "Tract" (about "how to perform a funeral") and Stevens's "The Owl in the Sarcophagus" (about the "mythology of modern death" and its "monsters of elegy").
But before I began drafting Death's Door I had put together an anthology of traditional and modern elegies in a book called Inventions of Farewell, and in assembling this volume I found that, taken together, the elegies poets have produced from the mid-twentieth century onwards functioned for me as radiant examples of what I mean when I say that recent poets insist with unprecedented passion on the particulars of pain and grief.
Think of the resonant specifics Thom Gunn compiles in The Man with Night Sweats or, earlier, the details Ginsberg unflinchingly offers in Kaddish, Olds in The Father, Hall in Without. But I could go on and on about this historically "monstrous" elegiac genre, which dates back to the poems Wilfred Owen, Siegfried Sassoon and others sent back from the Front during the first World War or, even earlier, to Hardy's Poems of 1912-13. These writers won't let us forget -- as Tolstoy wouldn't either, in The Death of Ivan Ilyich -- that death and its sorrows are often excruciating physical processes whose course usually binds and bends the spirit to the body's sufferings.
Such art may not be "consoling" in the traditional sense, but it consoles because it confronts pain and because in doing so it helps us accept loss, lets us know we aren't alone, and teaches us to hope that if we can articulate our suffering we can somehow master it or at least pass through and beyond it.
Paula M. Krebs has been a professor of English at Wheaton College, a selective New England liberal arts college, for 15 years, since earning her Ph.D. at Indiana University. Her sister Mary Krebs Flaherty has been an administrative assistant at Rutgers University’s Camden campus for a year longer than Paula has been at Wheaton. Last fall Mary taught her first course, Basic Writing Skills III, on the inner-city campus of a two-year college, Camden County College. She teaches on her lunch break from her job at Rutgers. Mary has been taking evening classes toward her M.A. for three years, ever since she finished her B.A. at Rutgers via the same part-time route. This article is the first in a series in which Paula and Mary will discuss what it’s like to teach English at their respective institutions.
Paula: My place is about as different from yours as can be, I know. I often find myself longing for your city setting, your students who are so motivated. At the same time, I realize that teaching my students is a real privilege -- I can push them in exciting ways. Wheaton’s admissions standards keep going up, and I’m starting to see it in my classes. This semester my sophomores in English 290, Approaches to Literature and Culture, seemed to finish with a really good sense of how they can use literary criticism and theory in writing essays for their other English classes. They weren’t intimidated by the critics and theorists they were reading -- they actually used them well in their final essays. If only they could follow MLA style and prepare a proper Works Cited!
Mary: MLA style is something my students can do. They were able to pick up on it easily -- I think that’s because they take well to the idea of structure. They like the five-paragraph theme. The part of the class they had the most difficulty with was the content of their papers -- they couldn’t find their voice at all, let alone critique literary theorists.
Paula: Oh, mine had plenty of voice. Sometimes I wished for a bit less voice and a bit more work. I think sometimes that the sense of entitlement many of them have means that they don’t necessarily understand that their word isn’t always good enough. They need to cite some authorities, place their work in a larger context, indicate their scholarly debts. They have pretty good skills coming in, so it’s sometimes difficult to make clear to them how they can push to the next level. If they’ve been getting A’s on their five-paragraph themes in high school, they find it difficult to understand why their first efforts, in English 101 or a beginning lit class, are producing C+’s or B-‘s. Some are grade-grubbers, but most just don’t understand what makes a college A.
Mary: Just a week before the semester ended, one of my students finally understood what makes a college B. In the beginning of the term, her grades were “R’s,” which means that the paper cannot receive a grade; it must be revised. When she failed her midterm portfolio, she cried to me that she couldn’t see her mistakes so she couldn’t fix them. She continued to work on her essays and revise them, over and over. Close to the end of the semester, she approached me before class and said, “Mary, please take a look at this paper that someone wrote for another class and tell me what you think.” Knowing that I was being set up, I quickly looked over the essay. Out of the corner of my eye, I could see her smirking, so I told her “You’re right, I wouldn’t have graded this paper.” She shouted, “I knew it! Look at the subject-verb agreement error in the first sentence. There’s even a fragment in the introduction!” Not wanting to trash another teacher’s grading, I pointed out to her that the most important thing was how she had changed since midterm -- that she was now able to identify mistakes so she could correct her own. She passed the course with a B and I am so proud of her.
Paula: See, that’s what’s so great about teaching! I knew you’d love it. That pleasure when you see the lightbulb go on over their heads. That’s the same at Camden County as at Wheaton. But I think you have to do a different kind of work than I have to do in order to get it to happen. In some ways, both our students believe in the value of what we’re teaching, but we both have to do some convincing as well.
Mary: Mine need convincing that what they have to say is important and that saying it in an academic format is worth the effort. Most of the Camden campus students are from Camden city, recently awarded the dubious distinction of being named the most dangerous city in the nation for the second year in a row. They are typically from poor or working class families whose parent(s) may or may not have a high school diploma; many students are parents themselves, and most are minorities: African-American, Latino, or Asian-American. Many CCC students test into basic writing or reading skills classes, which is an indicator that their high school education did not prepare them well enough for college. In an informal discussion, I asked several students about their high school experience, and they claimed that they were never asked to write for content in English class -- the focus was on grammar and fill-in-the-blank or short answer tests. This explains why they are more comfortable with the grammar portion of the writing skills class, as well as how easily they grasp the five-paragraph essay structure. Following the rules is easy for these students, but finding something to say is much more difficult. I am there to assist them in this writing process and hopefully to convince them that they can grow as individuals and be successful in the academic community.
Paula: I have to do some of that, too. But we’re starting from such different places. Mine come to college because it’s expected of them. They need convincing that a liberal arts education really can bring them advantages after they graduate -- that digging into how a literary text works, learning to put together a really well-researched essay, or understanding the connections between Darwin and the poetry of Robert Browning is worth the money the parents are investing and the time the students are investing. In some ways, it’s a harder sell than yours. I have the luxury of time, though, in a way you sure don’t. My teaching is my full-time job, and my teaching load is relatively light. I can’t even imagine what it is like for you, working full time and taking classes while learning to teach in probably the most challenging of circumstances -- as an adjunct at a community college. I know how hard it is for you to keep all these balls in the air. Do you think it’ll be worth it in the long run?
Mary: I certainly hope so. That’s the reason I’m teaching this year -- to find out the answer to that very question.
Paula M. Krebs and Mary Krebs Flaherty
Paula and Mary's next exchange will be about the out-of-classroom work they can ask of students.
Normally my social calendar is slightly less crowded than that of Raskolnikov in Crime and Punishment. (He, at least, went out to see the pawnbroker.) But late last month, in an unprecedented burst of gregariousness, I had a couple of memorable visits with scholars who had come to town -- small, impromptu get-togethers that were not just lively but, in a way, remarkable.
The first occurred just before Christmas, and it included (besides your feuilletonist reporter) a political scientist, a statistician, and a philosopher. The next gathering, also for lunch, took place a week later, during the convention of the Modern Language Association. Looking around the table, I drew up a quick census. One guest worked on British novels of the Victorian era. Another writes about contemporary postcolonial fiction and poetry. We had two Americanists, but of somewhat different specialist species; besides, one was a tenured professor, while the other is just starting his dissertation. And, finally, there was, once again, a philosopher. (Actually it was the same philosopher, visiting from Singapore and in town for a while.)
If the range of disciplines or specialties was unusual, so was the degree of conviviality. Most of us had never met in person before -- though you’d never have known that from the flow of the conversation, which never seemed to slow down for very long. Shared interests and familiar arguments (some of them pretty esoteric) kept coming up. So did news about an electronic publishing initiative some of the participants were trying to get started. At least once during each meal, someone had to pull out a notebook so that someone else could jot down an interesting citation to look up later.
In each case, the members of the ad hoc symposium were academic bloggers who had gotten to know one another online. That explained the conversational dynamics -- the sense, which was vivid and unmistakable, of continuing discussions in person that hadn’t started upon arriving at the restaurant, and wouldn’t end once everyone had dispersed.
The whole experience was too easygoing to call impressive, exactly. But later -- contemplating matters back at my hovel, over a slice of black bread and a bowl of cold cabbage soup -- I couldn’t help thinking that something very interesting had taken place. Something having little to do with blogging, as such. Something that runs against the grain of how academic life in the United States has developed over the past two hundred years.
At least that’s my impression from having read Thomas Bender’s book Intellect and Public Life: Essays on the Social History of Academic Intellectuals in the United States, published by Johns Hopkins University Press in 1993. That was back when even knowing how to create a Web page would raise eyebrows in some departments. (Imagine the warnings that Ivan Tribble might have issued, at the time.)
But the specific paper I’m thinking of -- reprinted as the first chapter -- is even older. It’s called “The Cultures of Intellectual Life: The City and the Professions,” and Bender first presented it as a lecture in 1977. (He is currently professor of history at New York University.)
Although he does not exactly put it this way, Bender’s topic is how scholars learn to say “we.” An intellectual historian, he writes, is engaged in studying “an exceedingly complex interaction between speakers and hearers, writers and readers.” And the framework for that “dynamic interplay” has itself changed over time. Recognizing this is the first step towards understanding that the familiar patterns of cultural life -- including those that prevail in academe -- aren’t set in stone. (It’s easy to give lip service to this principle. Actually thinking through its implications, though, not so much.)
The history of American intellectual life, as Bender outlines it, involved a transition from civic professionalism (which prevailed in the 18th and early 19th centuries) to disciplinary professionalism (increasingly dominant after about 1850).
“Early American professionals,” he writes, “were essentially community oriented. Entry to the professions was usually through local elite sponsorship, and professionals won public trust within this established social context rather than through certification.” One’s prestige and authority was very strongly linked to a sense of belonging to the educated class of a given city.
Bender gives as an example the career of Samuel Bard, the New York doctor who championed building a hospital to improve the quality of medical instruction available from King’s College (as Columbia University was known back in the 1770s). Bard had studied in Edinburgh and wanted New York to develop institutions of similar caliber; he also took the lead in creating a major library and two learned societies.
“These efforts in civic improvement were the product of the combined energies of the educated and the powerful in the city,” writes Bender, “and they integrated and gave shape to its intellectual life.”
Nor was this phenomenon restricted to major cities in the East. Visiting the United States in the early 1840s, the British geologist Charles Lyell noted that doctors, lawyers, scientists, and merchants with literary interests in Cincinnati “form[ed] a society of a superior kind.” Likewise, William Dean Howells recalled how, at his father’s printing office in a small Ohio town, the educated sort dropped in “to stand with their back to our stove and challenge opinion concerning Holmes and Poe, Irving and Macaulay....”
In short, a great deal of one’s sense of cultural “belonging” was bound up with community institutions -- whether that meant a formally established local society for the advancement of learning, or an ad hoc discussion circle warming its collective backside near a stove.
But a deep structural change was already taking shape. The German model of the research university came into ever greater prominence, especially in the decades following the Civil War. The founding of Johns Hopkins University in 1876 defined the shape of things to come. “The original faculty of philosophy,” notes Bender, “included no Baltimoreans, and no major appointments in the medical school went to members of the local medical community.” William Welch, the first dean of the Johns Hopkins School of Medicine, “identified with his profession in a new way; it was a branch of science -- a discipline -- not a civic role.”
Under the old regime, the doctors, lawyers, scientists, and literary authors of a given city might feel reasonably comfortable in sharing the first-person plural. But life began to change as, in Bender’s words, “people of ideas were inducted, increasingly through the emerging university system, into the restricted worlds of specialized discourse.” If you said “we,” it probably referred to the community of other geologists, poets, or small-claims litigators.
“Knowledge and competence increasingly developed out of the internal dynamics of esoteric disciplines rather than within the context of shared perceptions of public needs,” writes Bender. “This is not to say that professionalized disciplines or the modern service professions that imitated them became socially irresponsible. But their contributions to society began to flow from their own self-definitions rather than from a reciprocal engagement with general public discourse.”
Now, there is a definite note of sadness in Bender’s narrative – as there always tends to be in accounts of the shift from Gemeinschaft to Gesellschaft. Yet it is also clear that the transformation from civic to disciplinary professionalism was necessary.
“The new disciplines offered relatively precise subject matter and procedures,” Bender concedes, “at a time when both were greatly confused. The new professionalism also promised guarantees of competence -- certification -- in an era when criteria of intellectual authority were vague and professional performance was unreliable.”
But in the epilogue to Intellect and Public Life, Bender suggests that the process eventually went too far. “The risk now is precisely the opposite,” he writes. “Academe is threatened by the twin dangers of fossilization and scholasticism (of three types: tedium, high tech, and radical chic). The agenda for the next decade, at least as I see it, ought to be the opening up of the disciplines, the ventilating of professional communities that have come to share too much and that have become too self-referential.”
He wrote that in 1993. We are now more than a decade downstream. I don’t know that anyone else at the lunchtime gatherings last month had Thomas Bender’s analysis in mind. But it has been interesting to think about those meetings with reference to his categories.
The people around the table, each time, didn’t share a civic identity: We weren’t all from the same city, or even from the same country. Nor was it a matter of sharing the same disciplinary background – though no effort was made to be “interdisciplinary” in any very deliberate way, either. At the same time, I should make clear that the conversations were pretty definitely academic: “How long before hundreds of people in literary studies start trying to master set theory, now that Alain Badiou is being translated?” rather than, “Who do you think is going to win American Idol?”
Of course, two casual gatherings for lunch do not a profound cultural shift make. But it was hard not to think something interesting had just transpired: A new sort of collegiality, stretching across both geographic and professional distances, fostered by online communication but not confined to it.
The discussions were fueled by the scholarly interests of the participants. But there was a built-in expectation that you would be willing to explain your references to someone who didn’t share them. And none of it seems at all likely to win the interest (let alone the approval) of academic bureaucrats.
Surely other people must be discovering and creating this sort of thing -- this experience of communitas. Or is that merely a dream?
It is not a matter of turning back the clock -- of undoing the division of labor that has created specialization. That really would be a dream.
But as Bender puts it, cultural life is shaped by “patterns of interaction” that develop over long periods of time. For younger scholars, anyway, the routine give-and-take of online communication (along with the relative ease of linking to documents that support a point or amplify a nuance) may become part of the deep grammar of how they think and argue. And if enough of them become accustomed to discussing their research with people working in other disciplines, who knows what could happen?
“What our contemporary culture wants,” as Bender put it in 1993, “is the combination of theoretical abstraction and historical concreteness, technical precision and civic give-and-take, data and rhetoric.” We aren’t there, of course, or anywhere near it. But sometimes it does seem as if there might yet be grounds for optimism.
Amelia, a university sophomore, scored a 60 on her first academic paper. On her second she scored a 60 again. On her third paper, she pulled up to an 80 -- mostly due to extensive rewrites. Yet on her midterm and final, she received an astounding 90 and 85. Not only were her paragraph structure and use of quotations significantly better, but her ability to sequence ideas and support claims had taken a leap. Even her mechanics (grammar, sentence structure, and punctuation) had improved.
I'd like to say that these two high scores came at the end of the semester; this would prove what an effective instructor I was. Instead, they came at odd times -- the first A came just after the second paper (which scored a D). The solid B paper did come at the end of the semester. The difference was in how the papers were produced. Both the 90 and 85 papers were handwritten in-class timed essays that constituted the midterm and final. The much lower scores were for computer-generated papers that she produced out of class. These, of course, could be rewritten over and over before the due dates.
I'd like to say that Amelia's experience is an anomaly. But I can't. In fact, this semester, 8 of my 20 sophomore English composition students scored significantly better on in-class essays written by hand in a timed situation. Some jumped more than a full grade level. In my three freshman composition classes, almost 20 of 60 students excelled when allowed to write in class rather than compose typed papers on their own time. In fact, at a large community college in California where I taught for six years, I frequently saw 10 to 25 percent of my developmental- and freshman-level writers do significantly better when asked to compose in-class with a topic given just before a two-hour writing period.
How can I make sense of this? Of course I immediately considered my grading rubric. Was I somehow more relaxed when grading handwritten essays? Possible. But in my mind, that could not explain jumps from 75 to 90. Yes, I was somewhat easier on misspelled words when grading handwritten essays. Yes, I may have been swayed by a student's handwriting -- in fact, studies have shown that instructors are often influenced to grade slightly higher or lower depending on a student's handwriting. But there must have been something more to explain jumps of more than a full grade level.
Finally I typed up a student's handwritten midterm and compared it to two computer-generated essays. The handwritten midterm was so much smoother -- I was shocked. Transitions abounded. Other than a few run-ons, sentence structure was fluid. One idea followed another. Claims were supported. The writer seemed to have hit a stride that held out for the required three pages. The computer-generated essays were passable. The ideas were sound, but the writing seemed awkward in every sense. Other than the possibility that I was flawed in my grading, there were several explanations for this jump.
First, the process of writing in class in a timed situation seemed to discourage the kind of overwrought, constipated writing that some students produce with a typed paper. In my courses, I appeal to the high-context student. After wrangling syllabi for seven years, I've come to the conclusion that I like giving students the necessary information on the front end. After the first class, students walk away with a course outline that gives specific due dates for all papers -- along with general topics. Those who are worried about their ability to produce college-level work may start on a paper ahead of time and rewrite up until the due date.
Although my office hours are busier at the end of the semester, I do notice an influx of students a week before each paper is due. The good news is that some of these students are producing better work -- their essay structure is sound, their now-approved thesis statements are well supported, and their conclusion doesn't sound tossed-together. The bad news is that some of these well-intentioned students are working, rethinking, and rewriting their papers until they become stiff and self-conscious. They rehash each sentence, tormenting themselves, rewriting until they can no longer see what works anymore. Suddenly their original draft has become stiff and mechanical -- and the due date is looming.
These students often equate the number of hours spent with their final grade. Thus, every weekend they have poured into an eight-page study of the topic should translate into a 10 percent jump in grade. Unfortunately, the reality is that trying to infuse light and spontaneity into a paper that has been reworked several times is impossible. So the end product is dull and overworked -- and their grade less than what they expected.
In-class writing, on the other hand, is a completely different form of exercise. Instead of dumping hours and hours into a format that already feels old and overdone, students are given a topic at the top of the hour. True, some students choke. They deliver half a paper. What is on the page is poorly thought-out and incoherent. Yet some, relieved of the need to think and rethink the topic, find themselves rising to the challenge. After outlining for 15 minutes, they find themselves churning out coherent paragraphs that stand together as a unified essay. I've never been able to predict which way a student will perform. It is only when I've graded their midterm that I can make observations about which process seems to produce the best written work.
Second, handwriting encourages students to focus on the writing process; for those less experienced with computers, keyboarding encourages a focus on the end product. When asked to type up a sample paragraph in a classroom computer lab, all 20 of my English composition students spent more than 15 minutes setting up a document in MS Word: setting margins, choosing a font, centering a title, and typing their names, instructor's name, and class name at the top so that it sat flush right. This left a disappointing 30 minutes of actual composing -- and of that, approximately five to nine more minutes were wasted as students insisted on particular line breaks in the text, tried to change the amount of space between lines, and attempted to remove the forced underlining of URLs.
Students' questions were not about how to approach the topic -- but were focused on the particular mechanics of the assignment: how many words they would have to provide, whether they could utilize grammar- and spell-check, whether the sample was to be single, one-and-a-half, or double-spaced, if one-inch margins were acceptable and the like. I started to feel like a software instructor instead of an English composition teacher. My frustration was compounded when students either couldn't print out their single paragraphs -- or attempted to e-mail them to me.
Third, handwriting brings writers closer to their work -- which may encourage excellence in particular students. Daniel Chandler, a scholar at the University of Wales, has done extensive research on how students learn. His article, "The Phenomenology of Writing by Hand," comments on the conditions present when writers write by hand rather than by computer -- and the effect on the end product. In effect, the neurophysiological mechanism of each process is different. And although both handwriting and typing are under the influence of the central nervous system, the dynamics are noticeably different.
With substantial practice at the keyboard, I do believe that students can become more "fluent" at writing and produce work as creative as that produced by handwriting. In fact, studies often show that students do as well composing on a computer as they do writing by hand.
In the end, questions still remain for me. How does the time constraint affect the end product? Do some students simply do better under pressure? Is there something about the timed in-class work that encourages a more focused end product? Does directly typing a work somehow encourage a piecemeal approach? If offered an in-class essay exam with computers, would students then do substantially better than those who chose handwriting? How do typing speed and familiarity with software and hardware affect a student's work?
What about the "power of print"? Isn't it true that students often view a typed paper as an "end product" whereas handwritten work feels like a step in a process? And, of course, how exactly can ideas be more "fluid" with one composition method or the other -- whether it be writing by hand or word processing? With research, more will be revealed. Until then, I will give my students the benefit of both methods. I will continue to offer both in- and out-of-class writing. Those who flourish with the additional time for writing will produce more polished work; those who chafe under the weight of long-term deadlines will rush into the midterm and final to write well -- and ultimately both groups will find the process that produces their best work. Those students who then hone their ability to do both handwriting and word processing may do better in all areas; the resulting degreed professionals may find that both processes serve them well.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.
About 10 minutes into last week's now legendary episode of Oprah (the show that made it to the front page of newspapers; the one that left "memoirist" James Frey on the verge of confessing that he possibly made up his own name, but couldn’t be sure), one part of my mind was riveted to the tube while another part wandered off to conduct an intensive seminar about the whole thing, complete with PowerPoint slides containing extensive quotations from Foucault’s late writings on the "technologies of the self."
This happens a lot, actually. What Steve Martin once said about philosophy also applies to cultural theory: "When you study it in college, you learn just enough to screw you up for the rest of your life."
Well, it turns out that a certain amount of my seminar was just a repetition of work already done in the field that we might well call "Oprah studies." It has a substantial literature, including four academic books and numerous journal articles, most of which I have read over the past few days. Some of it is smart and insightful. Some of it consists of banalities gussied up with footnotes. In other words, it's like Shakespeare criticism, only there isn't as much of it.
Though there's plenty, to be sure. I've now spent more time reading the literature than I ever have watching the show. Some of it has been very instructive. There was, for example, a journal article from a few years ago complaining that other scholars had not grasped Oprah's postmodernity because they had failed to draw on Mikhail Bakhtin’s work on dialogism.
What important results follow from applying Bakhtin? Well, the concept of dialogism reveals that on talk shows, people talk to one another.
We may not have realized that before. But we do now. Scholarship is cumulative.
Indeed, by 2003, there were grounds to think that Oprah was not postmodern, but an alternative to postmodernity. So it was revealed when the first book-length study of the daytime diva appeared from Columbia University Press: Oprah Winfrey and the Glamour of Misery: An Essay on Popular Culture, by Eva Illouz, a professor of sociology and anthropology at Hebrew University of Jerusalem.
"Far from confirming Fredric Jameson's view that postmodern culture lacks emotionality or intensity because cultural products are disconnected from the people who produced them," writes Illouz, "Oprah Winfrey suggests that both the meaning and the emotional intensity of her products are closely intertwined with her narrative authority." Her programs, books, movies, magazine, and other cultural commodities all add up to "nothing less than a narrative work [able] to restore the coherence and unity of contemporary life."
For an example of this redemptive process in action, we might turn to the program from six years ago called "Men Whose Hair Is Too Long" -- during which, as Illouz describes it, "Oprah brought to the stage women who told the audience of their desire to have their sons, lovers, brothers, or husbands change a ‘hairy part’ of their body (mustache, hair, beard)." The menfolk are briefly “exposed to the public” and then “taken to a back room” – from which they later emerge with “a change supposed to effect a spectacular transformation.”
Such transformations are part of the Oprah metanarrative, as we might want to call it.
“The ‘hairy parts’ are exposed as a transactional object in a domestic, intimate relationship that is constructed as contentious,” as Illouz explains. “The haircut or moustache shave provides a double change, in the man’s physical appearance and in his intimate relationship with a close other. The show’s pleasure derives from the instantaneous transformation -- physical or psychic -- undergone by the guests and their relationships, which in turn promote closer bonds.”
This all sounds deeply transformative, to be sure. It made me want to go get a haircut.
But something about the whole argument -- Illouz’s reference to Oprah’s “narrative authority”; the framing of makeover as ritual of self-transfiguration; the blurring of the line between intimate relationship and televised spectacle -- is really frustrating to consider.
It is hard not to think of Richard Sennett’s argument in The Fall of Public Man: On the Social Psychology of Capitalism (Knopf, 1977), for example, that we have been on a long, steady march towards “the tyranny of intimacy,” in which every aspect of social conversation gets reduced to the level of the personal. “It is the measurement of society in psychological terms,” as Sennett put it. “And to the extent that this seductive tyranny succeeds, society itself is deformed.”
But no! Such worries are part of an “elite” cultural discourse, according to Sherryl Wilson’s book Oprah, Celebrity, and Formations of Self, published by Palgrave in 2003. A whole raft of theorists (the Frankfurt School, David Riesman in The Lonely Crowd (1950), Philip Rieff and Christopher Lasch on the rise of “psychological man” and “the culture of narcissism,” and so on) has treated mass society as an almost inescapable force of consumerism and privatized experience. The fascination with celebrities is part of this process: their every quirk and mishap becomes news.
To the “elitist” eye, then, Oprah might look like just another symptom. But according to Wilson (who is a lecturer in media theory at Bournemouth University in the UK) the Oprah phenomenon belongs to an altogether different cultural logic. It is a mistake to regard her program as just another version of therapeutic discourse. It draws, rather, on feminist and African-American understandings of dialogue -- the public sharing of pain, survival, and mutual affirmation -- as a necessary means of transcending the experience of degradation.
The unusually intense relationship between Oprah and her audience would probably have impressed a stodgy old Marxist like Theodor Adorno as evidence of alienation under advanced capitalism. Wilson regards “the apparent closing of the gap between the star self and the personal self” as something quite different.
“Rather than the participants seeking to transcend their ‘ordinariness’ by emulating the persona of a celebrity,” writes Wilson, “it is the ‘ordinary’ and everyday experience of Oprah which works to validate the personal stories recounted by the guests. In other words, those who speak on the show, and who participate through viewing at home, do not position themselves within the aura of a persona anchored in a glamour that for the majority is unattainable; rather, empowerment is located within the realm of everyday life.”
While the star does possess an undeniable charisma, Oprah’s is the glamour of simple decency. “Irrespective of the topic of the day or the treatment through which the topic is handled,” as Wilson puts it, “Oprah’s performance is guaranteed to be inclusive, (generally) nonjudgmental, (often) humorous, and (almost always) empathic.”
How that amiable persona then generated certain massive effects in the literary sphere is a matter addressed in the two scholarly volumes devoted to analyzing the Oprah Book Club.
Each book has a defensive quality; the authors seem to want to defend the book club nearly as much as to analyze it. “From its inception in September 1996,” notes Rooney, “OBC was commandeered as a rallying point around which both cultural commentators and common people positioned themselves in perpetuation of America’s ongoing struggle of highbrow versus lowbrow. Both sides made reductive use of the club to galvanize themselves either as populist champions of literature for the masses or as intellectual defenders of literature from the hands of the incompetent.”
But Rooney contends that a closer look at the club, and at the books themselves, suggests “that there exists a far greater fluidity among the traditional categories of artistic classification than may initially meet the eye; that we needn’t shove every text we encounter into a prefabricated box labeled ‘high,’ ‘low,’ or ‘middle.’”
Farr’s argument in Reading Oprah converges with Rooney’s -- finding in the conversational praxis of the book club something like a down-home version of Barbara Herrnstein Smith’s Contingencies of Value: Alternative Perspectives for Literary Theory (Harvard University Press, 1988).
The book club has embodied “contingent relativism,” writes Farr, “constructed not in the absence of truth, but in the context of many truths, negotiated truths, truths that people arrive at in conversation with others and with their own often contradictory values.” Hence the need to discuss the reading, to embed the books in a conversation. They need to “have a talking life” so that readers can “explore and work their way through the myriad of possible responses.”
Given their interest in giving Oprah’s aesthetic and ethical stances the benefit of the doubt, it is all the more striking when either author admits to feeling some reservations about the program. While doing her research, Farr recalls, she “tuned into a pre-Christmas program” that proved to be “an hour-long consumer frenzy.”
This was an “O List show” which is evidently a major event among the Oprahites. The celebrity “gives away literally hundreds of dollars worth of free stuff to every guest in her audience,” writes Farr. “Pants, candles, shoes, electronics – you name it. If Oprah likes it, she’s giving it away on this show....I watched open-mouthed, both appalled and envious. Was this incredibly tacky or unbelievably generous? Did I want to run screaming from the room or do my best to get on the next show? Both/and. It was a moment of genuine American ambivalence.”
The protocols of the book club were also grounds for concern, at least for Rooney. “Once the tape started rolling,” she writes, “neither Winfrey nor her readers seemed permitted to remark critically on the selections, or to advance beyond any but the most immature, advertisement-like, unconditionally loving responses to every single novel they encountered.”
What made last week’s program with James Frey so fascinating was the sudden revelation of another side of the Oprah persona. Gone was the branded performance as “inclusive, (generally) nonjudgmental, (often) humorous, and (almost always) empathic.” Her manner had scarcely any trace left of its familiar “I’m OK, you’re OK” spirit.
Oprah was angry, and Frey was some very considerable distance from OK. She was also indignant to discover that the publishing industry makes no real effort to enforce the implicit contract between reader and writer that goes with a book being shelved as nonfiction. This seems terribly naive on her part. But no doubt most of her audience shared her surprise. (“She wants publishers to fact-check their books?” I thought. “Hell, they don’t even edit them.”)
Remarkable as the spectacle was, however, it did not come as a total surprise. Perhaps I will give myself away as an “elitist” here, in the terms that Sherryl Wilson uses in Oprah, Celebrity, and Formations of Self. But at the end of the day, the therapeutic ethos is not antithetical to a deep yearning for authority (a craving then met by the stentorian Dr. Phil, whom scholars have yet to analyze, oddly enough).
Nor is there any deep discontinuity between the conspicuous consumption of an “O List show” and the completely uncritical attitude towards whatever book Oprah has selected for the month. If anything, they seem like two sides of the same coin.
In search of a different perspective on the matter, I contacted Cecilia Konchar Farr – whose book Reading Oprah seems, on the whole, an endorsement of the “individual pluralism” of the show’s ethos. What did she make of l’affaire Frey?
“It seems apparent to me,” Farr told me by e-mail, “that Oprah started out with a viewpoint that most experienced readers would have in this situation, that the facts aren't as important as the more general truthfulness of the story in a novel or memoir. Most readers surely took some of Frey's aggrandizements and exaggerations with a grain of salt from the beginning, while still enjoying the character he was constructing, still enjoying the story, and still finding the book powerful and interesting....
“My guess is that the righteous indignation we saw on last week's show comes from Oprah representing the less experienced readers who needed Frey's memoir to be true in a journalistic sense. Her chastisement of the publishing industry was the first real exertion of her authority I have seen beyond her selection of books. She's earned that authority, certainly, but it was surprising to see her use it. Still, I believe she used it on behalf of her readers.”
I was, to be honest, dumbfounded by this response. I printed it out, and read it a few times to make sure Farr had actually said what she seemed to be saying.
Her contention seemed to be that Oprah’s audience had become upset from mistakenly reading the book as “true in a journalistic sense” -- which was, somehow, a function of readerly inexperience, not of authorial dishonesty.
And from her account, it appeared that Frey’s memoir contained a "general truthfulness" -- one it would be naive to expect to be manifested at the level of occasional correspondence between the text's claims and ascertainable facts.
So I wrote her back, checking to see if I’d followed her.
“I think theorists and critics, especially, but also seasoned readers, read memoirs without an expectation of ‘correspondence between the text's claims and ascertainable facts,’” she responded. “Memoirists creatively construct characters and situations with a lot of license -- and readers and publishers have tacitly allowed that license. That's not to say Frey didn't take this license to its very limit. His constructions at times lose an even tenuous connection with ascertainable facts. When Frey pushed the limits, he drew intense attention to the slippage this connection has seen in recent years. But he wasn't the first to take such license, nor is he responsible for the larger changing perception of what ‘memoir’ (or ‘creative nonfiction’) means.”
Perhaps those terms now just mean “whatever you can get away with” -- though that seems vaguely insulting to honest writers working in those genres. (There is some difference, after all, between the tricks played by memory and the kind that a con man practices.)
Why the furor over Frey? “I think the vilification he has been subject to in the media is extreme,” writes Farr, “and probably stems from some larger discomfort about dishonesty from sources who are (and ought to be) culturally more responsible to the ‘ascertainable facts.’"
There may be something to that. And yet it raises any number of questions.
The man has made a small fortune off of fabricating a life and selling it -- while loudly talking, in the very same book, about the personally transformative power of “the truth.” Oprah Winfrey endorsed it, and (at first anyway) insisted that mere factual details were subordinate to a larger truth... a personal truth... a truth that, it seems, is accountable to nothing and nobody.
Suppose this becomes an acceptable aspect of public life – so that it seems naive to be surprised or angered by it. Then in what sense can we expect there to be institutions that, in Farr’s words, “are (and ought to be) culturally more responsible to the ‘ascertainable facts’”?
Let’s leave that topic for the Oprah scholars to consider. In the meantime, remember that her next selection is Elie Wiesel’s Night, a memoir about surviving the Nazi death camps. It might be an interesting discussion. Especially if the book club takes up the idea that there are forms of truth that, in the final analysis, have exactly nothing to do with self-esteem.
Many now consider the humanities to be facing a relevancy “crisis.” Partly because of the culture wars, the humanities -- if not the whole university -- appear to have lost their reason to be. To choose just one compelling example, Bill Readings argues in The University in Ruins that the primary role of the university is no longer to inculcate national culture, so it now resorts to rhetorically convenient but substantively empty and ideologically suspect vagaries like the term “excellence” to justify its existence. As one result, faculty in English and composition also suffer from what some recent publications are casting as a labor “crisis.”
While the public grows increasingly skeptical of the nature and purposes of liberal arts education, academics generally, and we suspect English scholars particularly, have not been as effective as they could, should, and must be in representing the value of their work, especially teaching. In a colloquial nutshell, public criticism tends to follow some version of this reasoning: English departments aren’t teaching my kids to write and read well enough because they’re too busy trying to turn them into Marxists, feminists, homosexuals, or -- worse -- grad students. Meanwhile, our scholarship is derided as abstruse, cryptic, or absurd. It matters little that such descriptions are inaccurate, unfair, and often advanced in service of narrow-minded ideologies at odds with the democratic underpinnings of a liberal arts education. The fact remains that our work is nevertheless perceived, by turns, as irrelevant or threatening -- a fact that directly and indirectly contributes to the deplorable state of labor conditions in English.
Because the value of work in English studies is so poorly understood, even among ourselves, negative stereotypes become entrenched in the general cultural psyche in the form of common sense: e.g., literature is boring, difficult to understand, and best left to experts who talk about it in ways that are also boring and difficult to understand. And the value of writing is often reduced to its correctness, which, to many, is valuable only to the extent that it earns, as in earns good grades and jobs. This leads (or likely will lead) to further decreases in the number of English majors (currently about 4 in every 100) and this, in turn, will lead to fewer tenure-track lines and increased stratification of faculty, in the form of part-time and other non-tenurable lines. For example, a 1999 Modern Language Association survey found that only 37 percent of English faculty members were on tenure-track lines. While jobs in composition, tenure-track and otherwise, have proven more available than those in, say, 19th century American literature, such jobs often consist of administrative positions, or what both critics and reformers are now calling the middle-management class of faculty, wherein one or two tenured faculty are charged with supervising a large and shifting class of part-time faculty.
As faculty continue to stratify, it will become increasingly difficult to represent the purpose, direction, and value of work in English studies beyond the rudiments of business writing and the cultural capital afforded by cocktail party knowledge of Shakespeare or Melville. The vicious cycle can be simplified as follows: A managed and stratified faculty often has difficulty representing itself effectively in the culture wars, which in turn exacerbates the level of stratification, which in turn leads to increasing difficulty with representation. The consequences of poor representation and increased stratification harm all faculty and students in nearly every imaginable category, including infringements on academic freedom, especially in matters of curriculum design and assessment, as well as decreasing job security, inequitable pay scales, few or no benefits, high teaching loads, large class sizes, and pitiful office conditions.
James Piereson, writing in a recent issue of the conservative periodical The Weekly Standard, reflects the views of many non-academics who haven’t been made to care, or to care enough, about our problems and who, in fact, resent academics for our seeming disengagement from their values. He writes: “When this year’s freshmen enter the academic world, they will encounter a bizarre universe in […] institutions that define themselves in terms of left wing ideology. […] which is both anti-American and anticapitalist.” Piereson approvingly refers to university trustees who (in his words) contend that “if their institutions are to be rescued, they dare not rely on faculties to do it.” Piereson’s variety of culture-war mongering, and his apparent comfort with making outlandish claims on little more than scattershot anecdotal evidence, often lead to equally bombastic and antagonistic counter-statements, and so go the culture wars.
Citing findings from the National Center for Education Statistics, Louis Menand points out in his 2005 contribution to MLA’s Profession that between 1970 and 2001 the number of English majors dropped by roughly a third; however, “the system is producing the same number of doctorates in English that it was producing back in 1970. These Ph.D.'s have trouble getting tenure-track jobs because fewer students major in English, and therefore the demand for English literature specialists has declined.” There are many theories about the causes of this discrepancy (e.g., students who would previously have majored in English are now turning to interdisciplinary programs in, say, cultural studies, or students are driven by the increasing costs of college education to specialize in areas, such as computer science, that have a reputation for a more immediate financial payoff than does a B.A. in English). Regardless, more and more conversations in English studies seem to be focusing on ways to reinvigorate the work of English studies in the 21st century, so as to make it more relevant to the public, especially students.
The various strands of this already vast and quickly growing debate are difficult to summarize and properly attribute in the space that we have. For the moment, suffice it to say that the main idea is that work in the humanities, both critical and imaginative, seems increasingly alien and perhaps irrelevant to the public. It is often said that scholarship in the humanities has become too insular for its own good. One possible solution to the perceived problem of insularity is often described with the phrase “going public.” In 1995, Linda Ray Pratt used it in her contribution to the influential collection Higher Education Under Fire. In 1998, Peter Mortensen used the phrase as the title of his article in the journal College Composition and Communication. More recently, it has been invoked in a Duke University panel on academic publishing, and Harry Boyte made “going public” the focus of his 2005 occasional paper for the Kettering Foundation. If the catchphrase of the late 1990s was “critical thinking,” the phrase for the early years of the 21st century may just be “going public.”
While we believe it is important to go public with academic work in the humanities, this phrase, however catchy, raises more questions than it answers. Go public with what, exactly? And what venues qualify as appropriately public? Further, Louis Menand invites us to consider the possibility that going public may not be as easy or as desirable as it may at first sound: "The last premise academic humanists should be accepting is that the value of their views is measured by the correspondence of those views to common sense and the common culture. Being an intellectual and thinking theoretically are going outside the parameters of a common culture and common sense." (Menand’s emphasis)
This is to say that the duty of academics, be they physicists or humanists, is not to the public but to knowledge, dare we say truth. And the public is not necessarily concerned with either. Menand concludes: "Ignorance has almost become an entitlement. We are living in a country in which liberals would rather move to the right than offend the superstitions of the uneducated. As always, the invitation to academics is to assist in the construction of the intellectual armature of the status quo. This is an invitation we should decline without regrets."
Here, Menand raises some valuable points of caution. In his line of argument, going public may mean caving in, stripping our ideas of nuance, and abandoning precision or critical thinking for the sake of public acceptance. Of course, most of us agree that teachers who passively abide by commonsense notions and status quo values are not acting like responsible academics, and none of us would endorse that behavior. However, as noteworthy as such cautions may be, the distinction between the academic and the public seems overdrawn here. After all, there are nearly 5,000 college campuses in the United States, enrolling more than 14 million students, with enrollments projected to increase through the year 2014. This is to say that the question of “going public” has already, to a very large extent, been settled: academic work is quite thoroughly situated in the public realm, and if the public considers ignorance to be “almost an entitlement,” then we are at least partly to blame for this state of affairs. Gerald Graff goes so far as to claim that the “university is itself popular culture -- what else should we call an institution that serves millions if not an agent of mass popularization? But the university still behaves as if it were unpopular culture, and the anachronistic opposition of academia and journalism continues to provide academics with an ironclad excuse for communicative ineptitude.”
Going public, therefore, is a useful but not entirely adequate phrase, since it does not explain how more public exposure will improve the current state of the humanities or the public’s view of the work done within them. We would therefore like to focus on improving the work that is, far and away, the most public and the most popular -- that is to say, our teaching. It will be necessary for educators in English studies to make the case for the work of English studies. Increased and accessible public discourse about teaching literature and writing may be a first step, but one that would require more questioning of what we mean by teaching, to whom it is valuable, and why. Rather than (re)fighting the culture wars with those like James Piereson, or resisting the public face of academic work, we might practice our discourse theories with the public rather than merely attempting to report on them, even in jargon-free language. This assumes a dialogue that transforms not only the content of the humanities but also the participants in the conversation -- especially teachers and their students.
Taking up this point in his recent book, English Composition as a Happening, Geoffrey Sirc bemoans the dulling influence of academic routine, which has led many of us to (re)produce the sort of polemical prose and responses that have, thus far, not proven particularly effective tactics in the culture wars. Instead, Sirc urges us, as educators and scholars, to define teaching and writing in ways that articulate the value of innovation and imaginative thinking. And we would like to see Sirc’s suggestion enacted both internally and externally -- that is, in forums such as this one and in public venues such as newspapers, periodicals, and community meetings: in short, in any of a variety of venues that serve to establish dialogue among academics, students, administrators, parents, media members, and legislators. The better we are able to do this, the better we will be able to supplant negative and inaccurate representations of our work.
While critics such as Sirc and Menand are clearly influential here, we understand this task to be of particular importance to graduate students, not least of all because the future of work in the humanities is quite literally in our hands. Should we continue the tradition of predominantly insular and/or antagonistic discourse, our degree of leverage and relevance with the public will continue to decrease, as will our prospects for tenure-line work. It is incumbent upon us to open the lines of communication and to make known the good work that is already being done in our classrooms.
Scholarship on this issue is already underway. For example, at the 2005 MLA conference, Michael Bérubé and Cary Nelson spoke to issues of contingent labor; others, such as Peter Mortensen and David Shumway, attended to matters of representation. We regard these two issues as linked; that is, the better we understand and represent our work (especially teaching), the better the chance that our working conditions will improve. To that end, we conclude with the following proposals, which draw on and build from the work of these and other scholars:
1. Cultivate existing trends toward interdisciplinarity, such as linked or clustered courses, in ways that effectively demonstrate the value of English studies, particularly in terms of accomplished reading and writing.
2. Realize that the Ph.D., as a credential for teaching, requires civic responsibility and ethical action. The better we collectively attend to this fact and make this work known, the better we will be able to build a platform from which to argue for improved working conditions.
3. Accept and embrace the possibility of working through cultural debates in ways and venues that are accessible to the general public. This is not to suggest necessary agreement with the public, but to encourage a variety of discourse that holds the public in vital partnership.
4. Encourage hiring, promotion, and tenure committees to value the above efforts; otherwise they simply will not happen, or at least not to the extent that they should. In other words, in order to improve the representation of our work, it will be necessary to appeal effectively not only to the public but also to our senior colleagues.
Frank P. Gaughan and Peter H. Khost
Peter H. Khost is a lecturer in writing and rhetoric at the State University of New York at Stony Brook. Frank P. Gaughan is an instructor in English and first-year writing at Hofstra University. Frank and Peter are both doctoral candidates in English at the Graduate Center of the City University of New York. This article is adapted from a talk they gave at the annual meeting of the Modern Language Association.