
No Field, No Future

I gave a paper recently as part of a colloquium at George Washington University whose general title was "Futures of the Field." The tension in that plural -- "Futures" -- carried the weight of much of what I had to say about the current state of literary study.

My audience and I were seated around a seminar table in what has long been called, and continues to be called, an "English" department. The name "English," I pointed out, designates a primary activity involving the reading and interpreting of literary texts in English. (This would include foreign literature translated into English.) If we want primarily to involve ourselves with historical texts, we go over to the history department; philosophical, the philosophy department, and so forth. What distinguishes our department, as Judith Halberstam wrote in her essay, is the "appraisal of aesthetic complexity through close readings." Not philosophical or historical, but aesthetic complexity.

This model of the English department, along with the carefully chosen canon of great aesthetic works that comprised its content, has in most colleges and universities collapsed. The value and nature of our reading (that is, when English departments feature reading at all -- film, television, music, and material culture courses having displaced written texts to some extent in many schools) have radically changed: the inclusion of cheap detective novels and poorly written political essays, for instance, is now routine in departments that used to disdain prose exhibiting little aesthetic complexity or stylistic distinction.

At the other extreme, there's now also the inclusion in our courses of advanced critical texts that are notoriously over-complex -- to the point of unintelligibility, never mind stylistic ugliness. A character in Don DeLillo's White Noise says of his university's English department, "There are full professors in this place who do nothing but read cereal box tops." But there are as many professors there who read nothing but the densest, most arcane, and most poorly written critical theory.

All of which is to say that there is no "field," so there can't be any "future" or even "futures." That "s" in our GW lecture series title is trying to reassure us that, instead of a profession-killing chaos, what we have now is a profession-enhancing variety, richness, flexibility, ferment, inclusiveness -- choose your reassuring term. Yet when there's not even a broadly conceived field of valuable objects around which we all agree our intellectual and pedagogical activity should revolve, there's no discipline of any kind.

Instead, there's a strong tendency, as Louis Menand puts it, toward "a predictable and aimless eclecticism." A young English professor who writes a column under the name Thomas Hart Benton in The Chronicle of Higher Education puts it this way: "I can't even figure out what 'English' is anymore, after ten years of graduate school and five years on the tenure track. I can't understand eighty percent of PMLA, the discipline's major journal. I can't talk to most people in my own profession, not that we have anything to say to each other. We don't even buy one another's books; apparently they are not worth reading. We complain about how awful everything is, how there's no point to continuing, but nobody has any idea what to do next."

The English department mainly survives as a utilitarian administrative conceit, while the English profession operates largely as a hiring and credentialing extension of that conceit.

If we wish to say that we've retained disciplinary integrity based on our continued close attention to texts of all kinds -- aesthetic and non-aesthetic -- that sharpen our ideological clarity about the world (or, as Menand puts it, texts that allow us to "examine the political implications of culture through the study of representations"), then we have already conceded the death of the English department, as Halberstam rightly notes. Indeed, since highly complex aesthetic texts tend to be concerned with personal, moral, and spiritual, rather than political, matters, we shouldn't be surprised to find in Halberstam an outright hostility to precisely the imaginative written texts in English that have more or less from the outset comprised the English department's objects of value and communal study.

Menand notes that the "crisis of rationale" I'm describing here has had serious negative consequences. Among a number of humanities departments that are losing disciplinary definition, English, he says, is the most advanced in this direction: "English has become almost completely post-disciplinary." (Menand has earlier pointed out the inaccuracy of another reassuring word -- interdisciplinary: "The collapse of disciplines must mean the collapse of interdisciplinarity as well; for interdisciplinarity is the ratification of the logic of disciplinarity. The very term implies respect for the discrete perspectives of different disciplines.") The absence of disciplines means the "collapse of consensus about the humanities curriculum," and this at a time of rapidly escalating outside scrutiny of the intellectual organization and justification of the expensive American university.

Further, "the discipline acts as a community that judges the merit of its members' work by community standards." When there's no self-defining and self-justifying community, English departments, Menand continues, become easy marks for downsizing administrators. "Administrators would love to melt down the disciplines, since this would allow them to deploy faculty more efficiently - and the claim that disciplinarity represents a factitious organization of knowledge is as good an excuse as any. Why support separate medievalists in your history department, your English department, your French department, and your art history department, none of them probably attracting huge enrollments, when you can hire one interdisciplinary super-medievalist and install her in a Medieval Studies program whose survival can be made to depend on its ability to attract outside funding?"

Halberstam acknowledges these effects and proposes that we "update our field before it is updated by some administrations wishing to downsize the humanities." By "update," though, she means provide a decent burial: "The discipline is dead, we willingly killed it," and we must "now decide what should replace it." In place of the "elitism" inherent in close readings of aesthetically complex works, Halberstam proposes an education in "plot summary," a better skill for making sense of our current reactionary political moment (as Halberstam sees it).

Indeed throughout her essay, Halberstam attacks religious Americans, conflating religious seriousness with politically reactionary positions.

Now, a huge amount of Western culture's high literature involves religious seriousness. If, like Halberstam, you regard contemporary America as a fundamentalist nightmare, and if your very definition of the American university is that it is, as she writes, "the last place in this increasingly conservative and religious country to invest in critical and counter-hegemonic discourse," then you have a problem. You either want to steer your students away from much of this literature, since, though perhaps not fundamentalist, it assumes a world permeated with religious belief (or, as in much literary modernism of Kafka's sort, suffering from an absence of it), or you want to present this literature in a way that undermines, to the extent possible, its own status as a document that takes religion seriously.

It's just this sort of cognitive dissonance -- toward the very body of knowledge that Halberstam, as an English professor, has been trained to teach -- that in part accounts for the death of English. Halberstam's primary motive as a university professor is political and social -- she has situated herself in an American university because that location is our last best hope for changing the politics of the country. Indeed, if there is a "consensus" about anything in many English departments, it lies here, in the shared conviction, articulated by Halberstam, that focusing upon and changing immediate political arrangements in this country is our primary function as teachers and scholars.

One assumes, that is, a socially utilitarian attitude toward what one teaches.

There was nothing inevitable about this turn outward to the immediate exigencies of the political and social world, by the way. As Theodor Adorno writes in Minima Moralia, the intellectual is, more often than not, "cut off from practical life; revulsion from it has driven him to concern himself with so-called things of the mind." But this withdrawal also drives the intellectual's critical power: "Only someone who keeps himself in some measure pure has hatred, nerves, freedom, and mobility enough to oppose the world."

No one's arguing here that we return to a very narrow canon, to uncritical piety in regard to the literature of our culture, or to monastic withdrawal from the world. Instead, what I'd like to suggest is that we return to the one discrete thing that our discipline used to do, and still, in certain departments, does.

A few years back, in The New York Review of Books, Andrew Delbanco, an English professor at Columbia University, announced "the sad news ... that teachers of literature have lost faith in their subject and themselves.... English today exhibits the contradictory attributes of a religion in its late phase -- a certain desperation to attract converts, combined with an evident lack of convinced belief in its own scriptures and traditions."

Delbanco continues: "The even sadder news is that although students continue to come to the university with the human craving for contact with works of art that somehow register one's own longings and yet exceed what one has been able to articulate by and for oneself, this craving now, more often than not, goes unfulfilled, because the teachers of these students have lost faith." In similar language, Robert Scholes writes, "As our Romantic faith in the spiritual value of literary texts has waned, we have found ourselves more and more requiring knowledge about texts instead of encouraging the direct experience of these texts."

Notice the language here: direct experience, contact. The political and more broadly theoretical abstractions that have been thrown over the artwork from the outset, as it's often presented in class, block precisely this complex, essentially aesthetic experience. This experience, triggered by a patient engagement of some duration with challenging and beautiful language, by entry into a thickly layered world which gives shape and substance to one's own inchoate "cravings" and "longings," is the very heart, the glory, of the literary. Students -- some students -- arrive at the university with precisely these powerful ontological energies. Certain novels, poems, and plays, if students let them, can surprise those students, both by anticipating particularly acute states of consciousness and by placing those states within formally ordered literary structures.

One of the noblest and most disciplinarily discrete things we can do in the classroom is to take those ontological drives seriously, to suggest ways in which great works of art repeatedly honor and clarify them as they animate them through character, style, and point of view.

One of the least noble and most self-defeating things we can do is avert our students' eyes from the peculiar, delicate, and enlightening transaction I'm trying to describe here. When we dismiss this transaction as merely "moral" -- or as proto-religious -- rather than political, when we rush our students forward to formulated political beliefs, we fail them and we fail literature. Humanistic education is a slow process of assimilation, without any clear real-world point to it. We should trust our students enough to guide them lightly as they work their way toward the complex truths literature discloses.

Author/s: 
Margaret Soltan
Author's email: 
info@insidehighered.com

Margaret Soltan's blog, University Diaries, chronicles all aspects of contemporary American university life. Her essay "Don DeLillo and Loyalty to Reality" appears in the MLA's forthcoming Approaches to Teaching White Noise. She and Jennifer Green-Lewis are completing a manuscript titled The Promise of Happiness: The Return of Beauty to Literary Studies.

Thinking at the Limits

The curtain rises on a domestic scene -- though not, the audience soon learns, a tranquil one. It is the apartment of the philosopher Louis Althusser and his wife Hélène Rytman, on an evening in November, a quarter century ago. The play in question, which opened last month in Paris, is called The Caïman. That’s an old bit of university slang referring to Althusser's job as the “director of studies” -- an instructor who helps students prepare for the final exam at the École Normale Supérieure, part of what might be called the French Ivy League.

The caïman whose apartment the audience has entered was, in his prime, one of the “master thinkers” of the day. In the mid-1960s, Althusser conducted an incredibly influential seminar that unleashed structuralist Marxism on the world. He played a somewhat pestiferous role within the French Communist Party, where he was spokesman for Mao-minded student radicals. And he served as tutor and advisor for generations of philosophers-in-training.

At Althusser’s funeral in 1990, Jacques Derrida recalled how, “beginning in 1952 ... the caïman received in his office the young student I then was.” One of the biographers of Michel Foucault (another of his pupils) describes Althusser as an aloof and mysterious figure, but also one known for his gentleness and tact. When a student turned in an essay, Althusser wrote his comments on a separate sheet of paper -- feeling that there would be something humiliating about defacing the original with his criticisms.

But everyone in the audience knows how Althusser’s evening at home with his wife in November 1980 will end. How could they not? And even if you know the story, it is still horrifying to read Althusser’s own account of it. In a memoir that appeared posthumously, he recalls coming out of a groggy state the next morning, and finding himself massaging Hélène’s neck, just as he had countless times in the course of their long marriage.

“Suddenly, I was terror-struck,” he wrote. “Her eyes stared interminably, and I noticed the tip of her tongue was showing between her teeth and lips, strange and still.” He ran to the École, screaming, “I’ve strangled Hélène!”

He was whisked away for psychiatric evaluation, which can’t have taken long: Althusser’s entire career had been conducted between spells of hospitalization for manic-depression. In one autobiographical fragment from the late 1970s -- presumably written while on a manic high -- he brags about sneaking aboard a nuclear submarine and taking it for a joy-ride when no one was looking. If ever there were reason to question legal guilt on grounds of insanity, the murder of Hélène Rytman would seem to qualify.

He underwent a long spell of psychiatric incarceration -- a plunge, as he later wrote, back into the darkness from which he had awakened that morning. In the late 1980s, after he was released, the philosopher could be seen wandering in the streets, announcing “I am the great Althusser!” to startled pedestrians.

It became the stuff of legend. In the early 1980s, as a student at the University of Texas at Austin, I heard what turns out to have been an apocryphal account of that morning. A small crowd of Althusser’s students, it was said, routinely gathered outside his apartment to greet him each day. When he emerged, disheveled and shrieking that he was a murderer, everyone laughed and clapped their hands. They thought (so the story went) that Althusser was clowning around.

That rumor probably says more about American attitudes towards French thinkers than it does about Althusser himself, of course. The murder has become a standard reference in some of the lesser skirmishes of the culture wars -- with Hélène Rytman’s fate a sort of morbid punch-line.

Althusser’s philosophical work took as its starting point the need to question, and ultimately to dissolve, any notion that social structures and historical changes are the result of some basic human essence. Somewhat like Foucault, at least in this regard, he treated the idea of “man” as a kind of myth. Instead, Althusser conceived of history as “a process without a subject” -- something operating in ways not quite available to consciousness. Economic and linguistic structures interacted to “articulate” the various levels of life and experience.

Althusser called this perspective “theoretical anti-humanism.” And for anyone who loathes such thinking, the standard quip is that he practiced his anti-humanism at home.

That strikes me as being neither funny nor fair. At the risk of sounding like a pretty old-fashioned bourgeois humanist, I think you have to treat his ideas as ... well, ideas. Not necessarily as good ones, of course. (In his seminar, Althusser and his students undertook a laborious and ultimately preposterous effort to figure out when and how Marx became a Marxist, only to conclude that only a few of his works really qualified.)  But however you judge his writings, they make sense as part of a conversation that started long before Althusser entered the room -- one that will continue long after we are all dead.

One way to see his “theoretical anti-humanism,” for example, is as a retort to Jean-Paul Sartre’s “Existentialism is a Humanism” -- the lecture that drew standing-room only crowds in 1945, at just about the time Althusser was resuming an academic career interrupted by the war. (The Germans held him as a POW for most of it.) It was the breeziest of Sartre’s introductions to his basic themes: We are free -- deep down, and for good. That freedom may be unbearable at times. But it never goes away. No matter what, each individual is always radically responsible for whatever action and meaning is possible in a given circumstance.

“Man,” Sartre told his listeners, “is nothing else but what he makes of himself.” But that “nothing” is, after all, everything. “There is no universe other than a human universe, a universe of human subjectivity.”

For Althusser, this is all completely off track. It rests on the idea that individuals are atoms who create their own meaning -- and then somehow link up to form a society. A very different conception is evident in “Ideology and Ideological State Apparatuses,” a paper from 1970 that is about as close to a smash-hit, era-defining performance as Althusser ever got. Which is to say, not that close at all. But extracts are available in The Norton Anthology of Theory and Criticism, and passages have turned up in countless course packets in lit-crit and cultural studies over the years.

For Althusser, it’s exactly backwards to start from the individual as a basic unit capable, through its own imagination and endeavor, of creating a world of meaning. On the contrary, the starting point is a society seeking to reproduce itself over time, not just by producing material goods (that too) but by imposing and enforcing order.

The police, military, and penal systems have an obvious role. Althusser calls them the Repressive State Apparatuses. But he’s much more interested in what he calls the Ideological State Apparatuses -- the complex array of religious institutions, legal processes, communication systems, schools, etc. that surround us. And, in effect, create us. They give us the tools to make sense of the world. Most of all, the ISAs convey what the social order demands of us. And for anyone who doesn’t go along.... Well, that’s when the Repressive State Apparatuses might just step in to put you in line.

Why has this idea been so appealing to so many academics -- and for such a long time? Well, at the time, it tended to confirm the sense that you could effect radical social change via “the long march through the institutions.” By challenging how the Ideological State Apparatuses operated, it might be possible to shift the whole culture’s center of gravity. And Althusser placed special emphasis on educational institutions as among the most important ISAs in capitalist society.

Such was the theory. In practice, of course, the social order tends to push back -- and not necessarily through repression. A handful of non-academic activists became interested in Althusser for a while; perhaps some still are. But for the most part, his work ended up as a fairly nonthreatening commodity within the grand supermarket of American academic life.

The brand is so well-established, in fact, that the thinker’s later misfortunes are often dismissed with a quick change of subject. The effect is sometimes bizarre.

In 1996, Columbia University Press issued a volume by Althusser called Writings on Psychoanalysis: Freud and Lacan. Surely an appropriate occasion for some thoughtful essays on how the theorist’s own experience of mental illness might have come into play in his work, right? Evidently not: The book contains only a few very perfunctory references to “temporary insanity” and psychiatric care. Presumably Althusser’s editors will be more forthcoming next summer, with the publication by Verso of Philosophy of the Encounter: Later Writings, 1978-1987. The catalog text for the book refers to these years as “his most prolific period.” But it was also a period when much of his writing was done while hospitalized.

Is it possible to say anything about his work and his illness that doesn’t amount to a roundabout denunciation of Althusser? I think perhaps it is.

On one level, his theory about the Ideological State Apparatuses looks ... maybe not optimistic, exactly, but like a guide to transforming things. From this point of view, each individual is a point of convergence among several ISAs. In other words, each of us has assimilated various codes and rules about how things are supposed to be. And if there are movements underway challenging how the different ISAs operate, that might have a cumulative effect. If, say, feminists and gay rights activists are transforming the rules about how gender is constructed, that creates new ways of life. (Though not necessarily a social revolution, as Althusser wanted. Capitalism is plenty flexible if there’s a buck to be extracted.)

But that notion of the individual as the intersection of rules and messages also has a melancholy side. It somewhat resembles the experience of depression. If a person suffering from depression is aware of anything, it is this: The self is a product of established patterns ... fixed structures ... forces in the outside world that are definitive, and sometimes crushing.

Any Sartrean talk of “radical freedom” makes no sense whatever to anyone in that condition -- which is, rather, a state of radical loss. And as the German poet Hans Magnus Enzensberger puts it in a recent essay, the most extreme “radical loser” may find his only transcendence in an act of violence.

“He can explode at any moment,” writes Enzensberger. “This is the only solution to his problem that he can imagine: a worsening of the evil conditions under which he suffers.... At last, he is master over life and death.”

Is that what happened in Althusser’s apartment, 25 years ago? That, or something like it. 

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

The Formless Form

Perhaps December is not a good time to sing the praises of the essay. The days grow shorter; the semester reaches its crisis; and right about now, any reference to the essay will call to mind not a literary genre, but a set of obligations -- even the occasion for profound weariness of the soul.

This is the time of year my academic friends send e-mail messages containing evidence that they have made no difference whatsoever in their students' capacity to assimilate information, let alone understand the world. It's bad enough when you can tell that from a multiple-choice exam. When it's demonstrated in consecutive paragraphs.... The word "essay" comes from a French root, essayer, meaning "to try" -- an experiment, in other words. But to anyone facing a stack of dubiously argued and episodically coherent undergraduate essays, a different sort of trial probably comes to mind. One with a medieval ambiance. Something involving the rack.

In that case, maybe it would be best to wait until after the holidays to read Tracing the Essay: Through Experience to Truth by G. Douglas Atkins, published last month by the University of Georgia Press. But I'll recommend it anyway -- both as an antidote to the seasonal malady, and as evidence that academic lit-crit can sometimes be of interest to people not engaged in writing academic lit-crit.

Atkins, who is a professor of English at the University of Kansas, offers a small volume of (what else?) essays on the literary form now sometimes called "literary nonfiction." (The label is graceless, though not quite so oxymoronic as its detractors seem to think.) He looks at work by Montaigne, E. B. White, Cynthia Ozick, and other essayists -- trying to find the moments of "embodied truth" in their writing. An essay is, he writes, "the intersection of experience and meaning, idea and form (or body)."

And yet the form itself has a kind of second-class citizenship in the world of literature. "Refusing to 'take on airs,'" Atkins writes, "the essay often appears so unassuming as to be self-effacing, which is certainly part of its charm as well as of its power. Its brevity -- the essay can often be read in one sitting -- militates against epic pretensions or visions of grandeur, before which the maker of essays is, besides, uncomfortable and wary.... The essay has, typically, manifested contentment with itself, though just as typically it manifests discontentment with the world surrounding it."

Arguably, discontentment with academic routines or expectations is one of the more durable fuels of the essayistic flame. Atkins doesn't quite say so himself. But he mentions discovering the work of contemporary literary essayists only in the mid-1980s -- after years of graduate school and tenure-track immersion in the discourse of the professional "article," a format Atkins calls "scholarly, impersonal, important." Stray references indicate that he was also emerging from a midlife crisis involving divorce, alcohol, and the usual unfinished business of the ego.

The moments of personal revelation are few, and discrete. But it sounds as if reading the work of contemporary authors such as Edward Hoagland and Richard Selzer was something akin to a conversion experience. "When I discovered these essayists," Atkins writes, "my life, both professional and personal, changed."

What he found in their work was "the essayist's wondrous mapping of the undulations of his own, simple, humble cogitations."

To the naked eye -- or, conversely, to the academic eye habituated to checking the endnotes every few minutes -- that sort of "mapping" can look pretty casual, even haphazard. And it may reek of egotism. Montaigne confronted that charge head-on more than 400 years ago. Some people think, he said, "that to be occupied with oneself means to cherish oneself too much." But not really. Socrates followed the oracle’s command to "know thyself" -- and, as Montaigne writes, "because by that study he had come to despise himself, he alone was worthy of the name wise."

The problem being, of course, that in the age of Oprah "know thyself" usually just means "love thyself." (Which in turn means "buy something nice for thyself.") Atkins acknowledges that the essay is a genre that "feeds into, and derives sustenance from, the culture of self-esteem." But it might also serve as a corrective. "Were I to look honestly into my heart," he writes, "I might think less well of myself, which, in our me culture, would be the worst of sinning."

Now, some 20 years ago, around the time Atkins was discovering the contemporary non-academic essay, I read another book of his. It was called Reading Deconstruction/Deconstructive Reading (University Press of Kentucky) -- a title that has a certain I Love the '80s-ish nostalgia factor now, though as I recall it offered a pretty solid basic introduction to the initial phase of Derrida's thought.  

Just a ghost of that earlier emphasis can be found in Atkins’s more recent work. The title Tracing the Essay, for instance, alludes to Derrida's thinking about "the structure of the trace." (To simplify wildly: Any given instant in time is always constituted by a past that is gone and a future that doesn’t yet exist. The simplest moment of perception is always marked by traces that complicate it beyond all telling.)

None of the old theoretical scaffolding is still standing now. Or rather, there is (if you’ll forgive the expression) just a trace of it between the lines of Atkins' prose -- along with echoes of Theodor Adorno's praise for the essay as the form that resists the ambitions of any system-builder.

"The essayist's is a literal imagination, his eye trained on the letter, on concrete particulars, on details," writes Atkins. "Spirit is something he can and often does reach, but only via the a posteriori path that spiritualists, Gnostics, and theorists alike eschew. One of the essay's great, enduring contributions lies just here, in its clear-sightedness and its stubborn refusal to pass too quickly beyond the commonplace."

Tracing the Essay is something rare: a book that is learned but plain-spoken, very personal yet also discreet. It may be that the author can say of it what Montaigne announced in creating the genre: "What I write here is not my teaching, but my study; it is not a lesson for others, but for me."

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Aiming the Can(n)on

If we could retire for good one old expression from the Culture Wars, I’d like to nominate "the literary canon." Is there anything new to say about it? Has even the most gung-ho Culture Warrior seized a new bit of territory within recent memory? It looks as if all the positions have been occupied, and the battles fought to a dull standstill.

On the one side, Bill O’Reilly and his ilk passionately love Shakespeare. Or rather, they at least enjoy the idea that somebody else will be forced to read him. And on the other side, the fierce struggle to “open the canon” usually looks like an effort to break down an unlocked door.

Checking the entry in New Keywords: A Revised Vocabulary of Culture and Society -- a reference work recently published by Blackwell -- I learn that the canon is, by definition, always something open to revision. Which would, of course, come as a really big surprise to many generations of rabbis, priests, and imams.

But perhaps that underscores the real problem here. The term "canon" rests on an analogy between an established set of cultural masterpieces, on the one hand, and the authoritative body of scriptures, on the other hand. And the problem with this comparison is that, deep down, it is almost impossible to take seriously. "Canon" is not so much a concept as a dead metaphor -- or rather, perhaps, a stillborn one.

If you are a full-fledged resident of secular modernity (that is, somebody accustomed to the existence of a deep moat of separation between sacred and worldly institutions) then the strongest original sense of “the canon” is just barely imaginable.

And if you have rejected secular modernity altogether -- if you believe that God once broke into human affairs long enough to make perfectly clear what He has in mind for us -- then the notion of secular literary works as having some vaguely comparable degree of authority must seem absurd. Or blasphemous.

Once in a great while, a writer or thinker reframes things so that the expression seems to come back to life. The late Northrop Frye, for example, took seriously William Blake’s aphorism calling the Bible "the Great Code of Art." Frye worked out a theory of literature that, in effect, saw the entire DNA of Western literature as contained in Judeo-Christian scripture. And then there is the example of Adonis, the great Lebanese author, who has pointed to the challenge of creating poetry in Arabic. How can you obey the modernist imperative to "make it new" in a language deeply marked by the moment in time it was used to record the commands of God?

But Frye and Adonis are exceptions. Usually, when we talk about "the canon," it is without any strong sense of a complicated relationship between literature and authority. Between words and the Word.

Instead, the debates are really over the allocation of resources -- and the economy of prestige within academic institutions. To say that a given literary figure is "part of the canon" actually means any number of profitable investments have been made in the study of that author. Conversely, to "question the canon" is a strategic move with consequences for the bottom line. (As in, "Do we really need to hire a Miltonist?")

But that means we’ll never get rid of that expression “the literary canon” -- if only because it sounds more dignified than “the literary spreadsheet.”

Is that too cynical? Can’t we assume that works defined as canonical possess some quality that places them above the give-and-take of institutional horse trading?

As a roundabout way of thinking about such questions, let me point your attention to a seemingly unrelated item that appeared in The Washington Post over the weekend.

It seems that there has recently been an intense discussion on an Internet bulletin board in China devoted to the work of Lu Xun, an author who lived between 1881 and 1936. The exchanges concerned one text in particular, his essay "In Memory of Miss Liu Hezhen" -- a work unfortunately not available online in English, so far as I can tell.

The essay appeared in April 1926, a few weeks after government troops opened fire on a demonstration, killing 40 students. One of them, a 22-year-old woman named Liu Hezhen, had been a devoted reader of Lu Xun’s literary magazine The Wilderness and attended his lectures on Chinese literature at National Beijing Women's Normal University.

She was, Lu wrote, “a student of mine. At least, I used to think of her as one.... She, as a young Chinese woman who has dedicated her life to the nation, is no longer a student of a person like me, who still lingers on superfluously in this world.” (All quotations are from the translation appearing in Women in Republican China: A Sourcebook, edited by Hua R. Lan and Vanessa L. Fong, published by M.E. Sharpe in 1999.)

It is a moving essay, and there is now a substantial body of scholarly commentary on it. But as the Post article reported, the sudden interest in Lu Xun’s essay suggests that people are using it “as a pretext to discuss a more current and politically sensitive event -- the Dec. 6 police shooting of rural protesters in the southern town of Dongzhou in Guangdong province.” Despite the official news blackout and the Chinese government’s efforts to censor the Internet, it seems that information about the Dongzhou massacre is spreading.

This development raises complex questions about the role that new media play in developing countries, and under authoritarian regimes. This being the age of high tech, people always want to discuss it -- and, of course, we’d damned well better.

But to be honest, I found it a lot more interesting that people were using Lu Xun’s essay as a reference point. It points to questions about the relationship between literary power and political authority. That Chinese citizens are using the Web and instant messaging to execute an end-run around official censorship is certainly interesting and important. But so is the classic author they are rereading while so engaged. 

It is hard to overstate the role that Lu Xun has played in Chinese culture over most of the past century. His martyred student Liu Hezhen was only one of thousands of young readers inspired by his work in the 1920s. He did not join the Communist Party, but drew close to it in the years before his death in 1936. And after the revolutionaries came to power in 1949, Lu was “canonized” in the strongest sense possible for a completely secular regime.

At the height of the Cultural Revolution (when, as a friend who lived through it once told me, the morning class in elementary school was math, and the afternoon was Mao), the selected quotations of Lu Xun were available in a little red book, just as the Great Helmsman’s were. And even after Mao’s own legacy was quietly downplayed in later decades, the field of “Lu Xun studies” continued as a basic part of Chinese scholarly life.

The novelist Ha Jin, professor of English at Boston University, gives some sense of the author’s continuing prominence in his introduction to a recent edition of Lu’s short stories. “Hundreds of books have been written on his life and writings,” he notes, “and several officially funded journals have been devoted to him. There are even papers on his real estate contracts, the aesthetics of the designs of his books, the rents he paid, and his favorite Japanese bookstores. Novels have appeared based on different periods and aspects of his life, not to mention movies, operas, and TV shows adapted from his fiction.”

All of this might look like evidence for the simplest model of how a literary canon is formed: An author gives voice to the ideology of the powers-that-be -- whether dead white property-owning European males, or revolutionary communist Chinese bureaucrats, or whatever. And those powers then return the favor by making the author a “classic.” All very clearcut, yes?

Actually, no. It happens that Lu Xun gained his prominence, not as an ideologue, but as a writer of great power -- a figure embodying both moral authority and a capacity for literary innovation.

His earliest work was written in the classic or high style of literary language. He gave an important course of lectures on the history of Chinese fiction, and was a master practitioner of the “eight-legged essay” (a very formal structure once used in civil-service exams for the Imperial bureaucracy).

But at some point in his 30s, Lu Xun had a creative breakthrough. He published a series of classic short stories combining sophisticated fictional technique with colloquial language. I don’t know Chinese, and must rely on the accounts of those who do. But even scholars disgusted by the official Maoist cult around Lu Xun admire his profound effect on the literary resources of the language. For example, in his book Lu Xun and Evolution (SUNY Press, 1998), James Reeve Pusey writes that the author “ ‘found himself’ in the creation of a new language, a highly literary, iconoclastically erudite, powerfully subtle vernacular that no one has since used with such mastery.”

And some of his power comes through even in translation. One of Lu Xun’s classic stories is “Diary of a Madman,” in which the everyday corruption and brutality of village life is seen as refracted through the mind of someone sinking ever deeper into paranoia. The narrator becomes convinced that the people around him practice cannibalism. His only hope, he confides to his diary, is that a few young people haven’t tasted human flesh. The final line of the story reads: “Save the children....”

Around the time government troops were shooting down students in 1926, Lu was drifting away from fiction. He instead concentrated on writing what were called zagan (“sundry thoughts”) or zawen (“miscellaneous writings”) -- short, topical prose compositions on whatever caught his attention. The state of his country worried him, and he poured his anger into hundreds of short pieces.

Not everyone liked this phase of his work. Have a look at the following bitter comment from 1931, by a critic who disliked Lu Xun’s later writings: “Zagan compositions, limited to a paltry thousand words, can naturally be done in one sweep of the brush. You catch at a thought, and in the time it takes to smoke a cigarette your thousand words are produced.... There is just one formula for zagan compositions: either heated abuse or cold sarcasm. If you can append a word or two of cold sarcasm to the heated abuse, or insert some heated abuse amidst the cold sarcasm, that is all to the good.”

In short, Lu Xun invented the blog entry. (I’m sure that somewhat anachronistic thought has already occurred to people in China, who are discussing recent events via commentary on his work.)

His topics were as ephemeral as any newspaper article. But there is enough wordplay, historical allusion, metaphorical resonance, and heartfelt passion to make them something more than that. A whole scholarly industry is devoted to analyzing these essays. Indeed, by the late 1980s, the field of Lu Xun studies had become so “professionalized” (as that favorite expression of the MLA has it) that one young scholar was worried that it had become completely disconnected from anything of interest to the average reader.

So Lu Xun remains, by any definition, part of the Chinese literary canon, to use that word once again. (And if you see the revolutionary ideologies of the 20th century as continuing the old Gnostic heresy of “immanentizing the eschaton” -- as one school of conservative thinkers does -- then I suppose even the quasi-scriptural overtones might also apply.)

But does that mean China would be democratizing only if Lu Xun lost his place? Or to put it more broadly: Are cultural and social power necessarily related? Don’t literary authority and the political regime tend to be mutually reinforcing?

Those are open questions. But I can’t help thinking of another question -- one that someone reportedly asked Mao in the late 1950s. What would Lu Xun’s role be if he were still alive? Mao answered that Lu would either remain quiet or go to jail. (And this from the man who canonized him.)

Rereading “In Memory of Miss Liu Hezhen” this week, I was struck in particular by the second of the essay’s seven parts. The translation is a little stiff, but the passage is worth quoting in full:

“A real hero should dare to face the tragedy of life and look unwaveringly at bloodshed. It is at once sorrowful and joyful! But the Creator has determined for the sake of the ordinary people to let time heal all the wounds and to leave behind only slight traces of blood and sorrow. It is in these traces of blood and sorrow that people get a humble life and manage to keep this woeful world going. When shall we see the light at the end of such a tunnel, I do not know.”

Imagine how much has been written about that passage over the past couple of weeks. And think of all the questions it must raise -- about the past, about the future.

If I were a Chinese official with some interest in the long-term welfare of my own hide, then I might have a strong interest, right about now, in “opening up the canon.” (Or abolishing it.) Perhaps literature is an unreliable way of shoring up the established order and transmitting stabilizing cultural values. It might be a good idea to discourage the reading of Lu Xun, and get people to watch "Fear Factor" instead.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Technicolor Dreams

If my recent experiences are any indication, we professors face a daunting challenge: The polarized American political environment has conditioned our students to see life in monochrome. The Right tells them to view all as either black or white, while the Left insists that everything is a shade of gray.

We’ve long struggled with the either/or student: the one who writes a history essay in which events are stripped of nuance and presented as the working out of God’s preordained plan, or the sociology student who wants to explain poverty in terms that recall 19th century Social Darwinism. These students -- assuming they’re not acting out some ideological group’s agenda -- can be helped along simply by designing lessons that require them to argue opposing points of view.

Yet despite all the hoopla about the resurgence of conservatism, I get more students whose blinders are postmodern rather than traditional. This is to say that many of them don’t see the value of holding a steadfast position on much of anything, nor do they exhibit much understanding of those who do. They live in worlds of constant parsing and exceptions. Let me illustrate with two examples.

In history classes dealing with the Gilded Age I routinely assign Edward Bellamy’s utopian novel Looking Backward. In brief, protagonist Julian West employs a hypnotist for his insomnia and retires to an underground chamber. His Boston home burns in 1887 and West is not discovered until 2000, when he is revived by Dr. Leete. He awakens to a cooperative socialist utopia. West’s comments on his time say much about late 19th century social conflict, and Leete’s description of utopian Boston makes for interesting class discussion. I know that some students will complain about the novel’s didactic tone, others will argue that Bellamy’s utopia is too homogeneous, and a few will assert that Bellamy’s explanation of how utopia emerged is contrived. What I had not foreseen was how many students find the very notion of a utopia so far-fetched that they can’t move beyond incredulity to consider other themes.

When I paraphrase Oscar Wilde that a map of the world that doesn’t include Utopia isn’t worth glancing at, some students simply don’t get it. “Utopia is impossible” is the most common remark I hear. “Perhaps so,” I challenge, “but is an impossible quest the same as a worthless quest?” That sparks some debate, but the room lights up when I ask students to explain why a utopia is impossible. Their reasons are rooted more in contemporary frustration than historical failure. Multiculturalism is often cited. “The world is too diverse to ever get people to agree” is one rejoinder I often receive.

It’s disturbing enough to contemplate that a social construct designed to promote global understanding can be twisted to justify existing social division, but far more unsettling is what often comes next. When I ask students if they could envision dystopia, the floodgates open. No problems on that score! In fact, they draw upon popular culture to chronicle various forms of it: Escape From New York, Blade Runner, Planet of the Apes.... “Could any of these happen?” I timidly ask. “Oh sure, these could happen easily,” I’m told.

My second jolt came in a different form: an interdisciplinary course I teach in which students read Tim O’Brien’s elegantly written Vietnam War novel The Things They Carried. O’Brien violates old novelistic standards; his book is both fictional and autobiographical, with the lines between the two left deliberately blurred. My students adored the book and looked at me as if they had just seen a Model-T Ford when I mentioned that a few critics felt that the book was dishonest because it did not distinguish fact from imagination. “It says right on the cover ‘a work of fiction,’” noted one student. When I countered that we ourselves were using it to discuss the actual Vietnam War, several students immediately defended the superiority of metaphorical truth because it “makes you think more.” I then asked students who had seen the film The Deer Hunter whether the famed Russian roulette scene was troubling, given that there was no recorded incident of such events taking place in Vietnam. None of them were bothered by this.

I mentioned John Sayles’ use of composite characters in the film Matewan. They had no problem with that, though none could tell me what actually happened during the bloody coal strikes that convulsed West Virginia in the early 1920s. When I probed whether writers or film makers have any responsibility to tell the truth, not a single student felt they did. “What about politicians?” I asked. While many felt that truth-telling politicians were no more likely than utopia, the consensus view was that they should tell the truth. I then queried, “So who gets to say who has to tell the truth and who gets to stretch it?” I was prepared to rest on my own clever laurels, until I got the students’ rejoinder! Two of my very best students said, in essence, that all ethics are situational, with one remarking, “No one believes there’s an absolute standard of right and wrong.” I tentatively reminded him that many of the 40 million Americans who call themselves "evangelical Christians" believe rather firmly in moral absolutes. From the back of the room piped a voice, “They need to get over themselves.”

I should interject that this intense give-and-take was possible because I let my students know that their values are their own business. In this debate I went out of my way to let them know I wasn’t condemning their values; in fact, I share many of their views on moral relativism, the ambiguity of truth, and artistic license. But I felt I could not allow them to dismiss objective reality so cavalierly. Nor, if I am true to my professed belief in the academy as a place where various viewpoints must be engaged, could I allow them to refuse to consider anyone who holds fast to moral absolutism.

The stories have semi-happy endings. I eventually got my history students to consider the usefulness of utopian thinking. This happened after I suggested that people of the late 19th century had better imaginations than those of the early 21st, which challenged them to contemplate the link between utopian visions and reform, and to see how a moralist like Bellamy could inspire what they would deem more pragmatic social changes. My O’Brien class came through when I taught the concept of simulacra, showed them a clip from the film Wag the Dog and then asked them to contemplate why some see disguised fiction as dangerous. (Some made connections to the current war in Iraq, but that’s another story!)

My goal in both cases was to make students see points of view other than their own. Both incidents also reminded me it’s not just the religious or conservative kids who need to broaden their horizons. We need to get all students to see the world in Technicolor, even when their own social palettes are monochromatic. Indeed, the entire academy could do worse than remember the words of Dudley Field Malone, one of the lawyers who defended John T. Scopes. Malone remarked, “I have never in my life learned anything from any man who agreed with me.” 

Author/s: 
Robert E. Weir
Author's email: 
info@insidehighered.com

Robert E. Weir is a visiting professor at Commonwealth College of the University of Massachusetts at Amherst and in the American studies program at Smith College.

Finding the Courage to Begin Again

"I exist as I am, that is enough."
--Walt Whitman

I fell in love in New Orleans. I was wandering through a crammed antique shop, meandering without any sense of time or purpose, enjoying the handsome legacies of earlier artisans. On the second floor, my eyes met the desk of my dreams. The slender, 18th-century spinet-shaped desk was made for writing. Slots, drawers and cubby-holes for inks, pens, paper, sealing wax faced the writer while the curved and sloping sides encircled the desktop, and each lifted to reveal a hidden pocket, waiting for some secret cache of letters, or poems. I lingered, stroking the numinous wood, imagining Jane Austen writing there. As I explored each clever nook, I saw the carpenter’s joy at his work, his art. In the presence of that creative genius, I wanted to write, to put his beauty to work for my art, to create in appreciation of, and inspired by, his creation.

I called my friend, also a writer, over and we admired the piece together, respectful of its sanguine beauty, and appreciative of the talent of its maker. We imagined who might have owned such a luxurious piece. We imagined the brilliant writing we’d surely produce if we owned a desk like this one. We imagined how we’d die a little if one of our cats should scratch its surface. If I’d had the money, I would have paid it without blinking; however, imagining was all I could do, given my tenuous employment, my small salary, and the $4,000 price tag.

Still I lingered at that desk, as now it lingers in my memory, and when I eventually came away, I possessed something more significant, sparked by the artist who, two centuries earlier, had put his hands to work. What emanated from his art, and that whole city, was the creative radiance of inspired delight.

That was my last visit to New Orleans, six years ago. My life changed that year. I lost the tenuous job, moved suddenly to a new, equally tenuous, one. When that ended too, I was adrift in unspent wishes and altered dreams. I moved home to Austin, a city of uncommon lives, and into my parents’ house. There I began a long, slow rebuilding. In the midst of my personal chaos came the larger chaos of September 11th. Though Septima Clark may “consider chaos a gift,” I could not rejoice. In the midst of the chaos, my creative brain was off.

Instead, I leaned on the creativity of others. Digging to the foundations of my education, I listened to Emerson telling me to be self-reliant, to have the courage to try many things, to be undaunted by the challenges we name failures. I re-read Thoreau. Since I felt I had nothing, his edict to simplify seemed easy enough to follow! I thought many times of the unconventional life of Emily Dickinson, living in her parents’ home her whole life. I looked around and saw others in my city living creative lives, living “weird” as we proudly say. I let go of orthodoxy, focusing instead on joy.

In Finding Your Own North Star, Martha Beck stipulates two rules for using joy to chart a course toward your north star:

  • Rule 1: If it brings you joy, do it.
  • Rule 2: No, really, if it brings you joy, do it.

Of course, she also cautions that it’s not as easy as it sounds. It is impossible in the midst of chaos. It can, however, be a way out of chaos. After some of my chaos settled, I laughed that I had read the Transcendentalists for personal gain. What a clue to who I am! I noticed my obsession with writing -- another little clue to that north star of mine. I returned to teaching, as an adjunct instructor, and loved it anew. I cobbled together writing and teaching, and also built a practice as a writing mentor. In this creative city, no one batted an eye, accepting my weird life as normal, and it wasn’t nearly as weird as some! Surrounded by creative lives, I found the courage to begin again. Eventually, I found I had something to say, and my writing erupted because, as Alice Walker writes, “there is a place the loss must go. There is a place the gain must go. The leftover love.” After great chaos, creativity arises. In the middle of creativity, creativity flourishes.

If you are in the middle of great chaos, anchor in the safe harbor of others’ creativity. “Human life itself may be almost pure chaos,” Katherine Anne Porter wrote, “but the work of the artist ... is to take these handfuls of confusion and disparate things, things that seem to be irreconcilable, and put them together in a frame to give them some kind of shape and meaning.” Seek music, seek literature, seek art. Stand outside in the creative genius of nature. Put your hands on a fine piece of furniture to feel the spirit of the carpenter who loved his work. Connect with all surrounding creators. Begin rebuilding.

Author/s: 
Amy L. Wink
Author's email: 
info@insidehighered.com

Amy L. Wink teaches English at Southwestern University, in Georgetown, Tex., and at Austin Community College.

Literature to Infinity

Graphs, Maps, Trees: Abstract Models for a Literary History is a weird and stimulating little book by Franco Moretti, a professor of English and comparative literature at Stanford University. It was published a few months ago by Verso. But observation suggests that its argument, or rather its notoriety, now has much wider circulation than the book itself. That isn’t, I think, a good thing, though it is certainly the way of the world.

In a few months, Princeton University Press will bring out the first volume of The Novel: History, Geography, and Culture -- a set of papers edited by Moretti, based on the research program that he sketches in Graphs, Maps, Trees. (The Princeton edition of The Novel is a much-abridged translation of a work running to five volumes in Italian.) Perhaps that will redefine how Moretti’s work is understood. But for now, its reputation is a hostage to somewhat lazy journalistic caricature -- one mouthed, sometimes, even by people in literature departments.

What happened, it seems, is this: About two years ago, a prominent American newspaper devoted an article to Moretti’s work, announcing that he had launched a new wave of academic fashion by ignoring the content of novels and, instead, just counting them. Once, critics had practiced “close reading.” Moretti proposed what he called “distant reading.” Instead of looking at masterpieces, he and his students were preparing gigantic tables of data about how many books were published in the 19th century.

Harold Bloom, when reached for comment, gave one of those deep sighs for which he is so famous. (Imagine Zero Mostel playing a very weary Goethe.) And all over the country, people began smacking their foreheads in exaggerated gestures of astonishment. “Those wacky academics!” you could almost hear them say. “Counting novels! Whoever heard of such a thing? What’ll those professors think of next -- weighing them?”

In the meantime, it seems, Moretti and his students have been working their way across 19th century British literature with an adding machine -- tabulating shelf after shelf of Victorian novels, most of them utterly forgotten even while the Queen herself was alive. There is something almost urban legend-like about the whole enterprise. It has the quality of a cautionary tale about the dangers of pursuing graduate study in literature: You start out with a love of Dickens, but end up turning into Mr. Gradgrind.

That, anyway, is how Moretti’s “distant reading” looks ... well, from a distance. But things take on a somewhat different character if you actually spend some time with Moretti’s work itself.

As it happens, he has been publishing in English for quite some time: His collection of essays called Signs Taken for Wonders: On the Sociology of Literary Forms (Verso, 1983) was, for a long time, the only book I’d ever read by a contemporary Italian cultural theorist not named Umberto Eco. (It has recently been reissued as volume seven in Verso’s new Radical Thinkers series.) The papers in that volume include analyses of Restoration tragedy, of Balzac’s fiction, and of Joyce’s Ulysses.

In short, then, don’t believe the hype -- the man is more than a bean-counter. There is even an anecdote circulating about how, during a lecture on “distant reading,” Moretti let slip a reference that he could only have known via close familiarity with an obscure 19th century novel. When questioned later -- so the story goes -- Moretti made some excuse for having accidentally read it. (Chances are this is an apocryphal story. It sounds like a reversal of David Lodge’s famous game of “intellectual strip-poker” called Humiliation.)

And yet it is quite literally true that Moretti and his followers are turning literary history into graphs and tables. So what’s really going on with Moretti’s work? Why are his students counting novels? Is there anything about “distant reading” that would be of interest to people who don’t, say, need to finish a dissertation on 19th century literature sometime soon? And the part, earlier, about how the next step would be to weigh the books -- that was a joke, right?

To address these and many other puzzling matters, I have prepared the following Brief Guide to Avoid Saying Anything Too Dumb About Franco Moretti.

He is doing literary history, not literary analysis. In other words, Moretti is not asking “What does [insert name of famous author or novel here] mean?” but rather, “How has literature changed over time? And are there patterns to how it has changed?” These are very different lines of inquiry, obviously. Moretti’s hunch is that it might be possible to think in a new way about what counts as “evidence” in cultural history.  

Yes, in crunching numbers, he is messing with your head. The idea of using statistical methods to understand the long-term development of literary trends runs against some deeply entrenched patterns of thought. It violates the old idea that the natural sciences are engaged in the explanation of mathematically describable phenomena, while the humanities are devoted to the interpretation of meanings embedded in documents and cultural artifacts.

Many people in the humanities are now used to seeing diagrams and charts analyzing the structure of a given text. But there is something disconcerting about a work of literary history filled with quantitative tables and statistical graphs. In filling his pages with them, Moretti is not just being provocative. He’s trying to get you to “think outside the text,” so to speak.

Moretti is taking the long view.... A basic point of reference for his “distant reading” is the work of Fernand Braudel and the Annales school of historians who traced the very long-term development of social and economic trends. Instead of chronicling events and the doings of individuals (the ebb and flow of history), Braudel and company looked at tendencies taking shape over decades or centuries. With his tables and graphs showing the number (and variety) of novels offered to the reading public over the years, Moretti is trying to chart the longue durée of literary history, much as Braudel did the centuries-long development of the Mediterranean.

Some of the results are fascinating, even to the layperson’s eye. One of Moretti’s graphs shows the emergence of the market for novels in Britain, Japan, Italy, Spain, and Nigeria between about 1700 and 2000. In each case, the number of new novels produced per year grows -- not at the smooth, gradual pace one might expect, but with the wild upward surge of a lab rat’s increasing interest in a liquid cocaine drip.

“Five countries, three continents, over two centuries apart,” writes Moretti, “and it’s the same pattern ... in twenty years or so, the graph leaps from five [to] ten new titles per year, which means one new novel every month or so, to one new novel per week. And at that point, the horizon of novel-reading changes. As long as only a handful of new titles are published each year, I mean, novels remain unreliable products, that disappear for long stretches of time, and cannot really command the loyalty of the reading public; they are commodities, yes, but commodities still waiting for a fully developed market.”

But as that market emerges and consolidates itself -- with at least one new title per week becoming available -- the novel becomes “the great capitalist oxymoron of the regular novelty: the unexpected that is produced with such efficiency and punctuality that readers become unable to do without it.”

And then the niches emerge: The subgenres of fiction that appeal to a specific readership. On another table, Moretti shows the life-span of about four dozen varieties of fiction that scholars have identified as emerging in British fiction between 1740 and 1900. The first few genres appearing in the late 18th century (for example, the courtship novel, the picaresque, the “Oriental tale,” and the epistolary novel) tend to thrive for long periods. Then something happens: After about 1810, new genres tend to emerge, rise, and decline in waves that last about 25 years each.

“Instead of changing all the time and a little at a time,” as Moretti puts it, “the system stands still for decades, and is then ‘punctuated’ by brief bursts of invention: forms change once, rapidly, across the board, and then repeat themselves for two [to] three decades....”

Genres as distinct as the “romantic farrago,” the “silver-fork novel,” and the “conversion novel” all appear and fade at about the same time -- to be replaced by a different constellation of new forms. It can’t, argues Moretti, just be a matter of novelists all being inspired at the same time. (Or running out of steam all at once.) The changes reflect “a sudden, total change of their ecosystem."
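To make the method a little more concrete: the counting behind such graphs and tables is not exotic. The sketch below is purely illustrative -- the file name, column names, and data are invented for the example, not drawn from Moretti's actual dataset -- but it shows the sort of tally that yields curves of new titles per year and the life-span of each genre.

    # A minimal, hypothetical sketch of "distant reading" as counting.
    # Assumes an invented file titles.csv with columns: title, year, genre.
    import csv
    from collections import Counter, defaultdict

    new_titles_per_year = Counter()
    genre_spans = defaultdict(lambda: [None, None])  # [first year seen, last year seen]

    with open("titles.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            year = int(row["year"])
            new_titles_per_year[year] += 1
            span = genre_spans[row["genre"]]
            span[0] = year if span[0] is None else min(span[0], year)
            span[1] = year if span[1] is None else max(span[1], year)

    # Per-year counts are the raw material of the growth curves;
    # the spans correspond to a table of genre life-spans.
    for year in sorted(new_titles_per_year):
        print(year, new_titles_per_year[year])
    for genre, (first, last) in sorted(genre_spans.items()):
        print(genre, first, "to", last, "({} years)".format(last - first))

The interest, of course, lies not in the counting itself but in what the resulting curves and life-spans reveal -- which is the part Moretti actually argues about.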

Moretti is a cultural Darwinist, or something like one. Anyway, he is offering an alternative to what we might call the “intelligent design” model of literary history, in which various masterpieces are the almost sacramental representatives of some Higher Power. (Call that Power what you will -- individual genius, “the literary imagination,” society, Western Civilization, etc.) Instead, the works and the genres that survive are, in effect, literary mutations that possess qualities that somehow permit them to adapt to changes in the social ecosystem.

Sherlock Holmes, for example, was not the only detective in Victorian popular literature, nor even the first. So why is it that we still read his adventures, and not those of his competitors? Moretti and his team looked at the work of Conan Doyle’s rivals. While clues and deductions were scattered around in their texts, the authors were often a bit off about how they were connected. (A detective might notice the clues, then end up solving the mystery through a psychic experience, for example.)

Clearly the idea of solving a crime by gathering clues and decoding their relationship was in the air. It was Conan Doyle’s breakthrough to create a character whose “amazing powers” were, effectively, just an extremely acute version of the rational powers shared by the reader. But the distinctiveness of that adaptation only comes into view by looking at hundreds of other texts in the literary ecosystem.

This is the tip of the tip of the iceberg. Moretti’s project is not limited by the frontiers of any given national literature. He takes seriously Goethe’s idea that all literature is now world literature. In theory, anyway, it would be possible to create a gigantic database tracking global literary history.

This would require enormous computational power, of course, along with an army of graduate students. (Most of them getting very, very annoyed as they keypunched data about Icelandic magazine fiction of the 1920s into their laptops.)

My own feeling is that life is much too short for that. But perhaps a case can be made for the heuristic value of imagining that kind of vast overview of how cultural forms spread and mutate over time. Only in part is Moretti’s work a matter of counting and classifying particular works. Ultimately, it’s about how literature is as much a part of the infrastructure of ordinary life as the grocery store or Netscape. And like them, it is caught up in economic and ecological processes that do not respect local boundaries.

That, anyway, is an introduction to some aspects of Moretti’s work. I’ve just learned that Jonathan Goodwin, a Brittain Postdoctoral Fellow at Georgia Tech, is organizing an online symposium on Moretti that will start next week at The Valve.

Goodwin reports that there is a chance Moretti himself may join the fray. In the interim, I will be trying to untangle some thoughts on whether his “distant reading” might owe something to the (resolutely uncybernetic) literary theory of Georg Lukacs. And one of the participants will be Cosma Shalizi, a visiting assistant professor of statistics at Carnegie Mellon University.

It probably wouldn’t do much good to invite Harold Bloom into the conversation. He is doubtless busy reciting Paradise Lost from memory, and thinking about Moretti would not be good for his health. Besides, all the sighing would be a distraction.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Beyond Consolation

In 1991, Elliot L. Gilbert, chair of English at the University of California at Davis, went to the hospital for what ought to have been some fairly routine surgery. Mistakes were made. He died in the recovery room. His widow, Sandra M. Gilbert (also a professor of English at Davis), brought suit -- a case finally settled out of court, but not before she piled up a mound of documents that gave her some sense of just what had happened. In Wrongful Death: A Memoir (Norton, 1995), she wrote: "Responsibility in the often miraculous but always highly technologized realm of modern medicine is so dispersed, so fragmented, that finally it accrues to no one."

Years earlier -- long before her work with Susan Gubar on the landmark work of American feminist literary criticism The Madwoman in the Attic: The Woman Writer and the Nineteenth Century Literary Imagination (Yale University Press, 1979) -- Gilbert had worked on a monograph she planned to call “‘Different, and Luckier’: Romantic and Post-Romantic Metaphors of Death.” The phrase in the title came from “Song of Myself,” in which Whitman declaimed that “to die is different from what anyone supposed, and luckier.”

A cosmic sentiment like that cannot do much to mitigate grief. But Gilbert dug the notes for that abandoned project out of her files, and has just published a remarkable book, Death’s Door: Modern Dying and the Ways We Grieve (also from Norton), which revisits her longstanding interest in elegy.

Calling Death’s Door a work of literary criticism, while accurate enough, seems very incomplete. Like Wrongful Death, it recounts the story of her husband’s death. It also offers a historical meditation on the emergence of what Gilbert calls the “technologies” of death and grief. (The famous “five stages” of confronting mortality, while originally meant as descriptive, now seem at times both prescriptive and somewhat compulsory. Woe to anyone who doesn’t follow the script.)

It is a rich book, and a deep one -- and also, at times, somewhat terrifying to read, for it is the work of someone for whom “the denial of death” is simply not an option. While reading Death’s Door, I contacted the author to ask her a few questions. The following interview took place by e-mail.

Q: The book seems like a hybrid -- part memoir, part cultural history, part critical study. Those categories correspond reasonably well to the three big sections you've divided it into, but there are also margins of overlap. How did you come to understand just what kind of book Death's Door was turning out to be?

A: Yes, the book is indeed a kind of hybrid, or as my son put it, it's an attempt at "genre-bending." But I hadn't planned it that way. In fact, I began the work as a fairly traditional project in literary criticism. My goal was to explore what I called "the fate of the elegy" in the 20th century and beyond -- although even as I formulated that project the ambiguity of the word "fate" had begun to haunt me.

Did I intend to explore the evolution of the modern and contemporary elegy? Or did I want to explore modern ideas about fate in the elegy? If the latter, I was already moving beyond purely literary analyses into cultural studies. In any case, however, once I began researching and writing Death's Door it became clear to me that I was no longer able to do critical and scholarly work in the way I had.

As you've noted, following my husband's unexpected death in 1991, I felt compelled to tell his story -- including what I'd been able to reconstruct about the medical negligence that evidently killed him -- in my memoir, Wrongful Death. But the very mode in which I'd written that book, along with the elegiac poems clustered around it, had changed my way of writing. I had wanted to bear witness to my husband's loss of life and to my own grief. And now, as I began drafting Death's Door, I was still working in a testimonial mode, although with much greater self-awareness and, I think, with much larger ambitions, for now I was using my own case as an entry into meditations on the cultural formulations that shape our mourning and on the literary forms in which we mourn.

Of course, I should note here that, as a number of commentators have observed, beginning in the late 80's and 90's many academics in my own fields (literary studies, women's studies) had started writing autobiographically, testing their postulates, in effect, on their own pulses. So I wasn't alone in my sense that I needed a new and different way to approach my subject. And of course, as a feminist critic, I'd always argued that the personal was not only the political but the poetical.

Nonetheless, I suspect that the urgency of my need to "genre-bend" developed out of what I experienced, in early widowhood, as an urgent (indeed a surprisingly urgent) responsibility to testify about my own family's sorrow.

Q: On the one hand, you offer a phenomenology of death and grieving; that is, a description of the kinds of experience of care, fear, concern, etc. that seem to be just about as inescapable as mortality itself. On the other hand, you draw quite a bit on social and cultural history. That serves as a reminder that our vocabulary (and, to whatever degree, our experience) has been conditioned or "constructed." So at the risk of asking you to make an absurd choice: Which comes first? Which is definitive? The intimate level of experience or the social level of cultural meanings?

A: I think both are equally important but they're also inextricably related. As you point out, there's a phenomenology of death and grief that's as inescapable as mortality itself, and this manifests itself cross-culturally as well as trans-historically.

As I try to show in the book, almost every society imagines death as a kind of "place" that one enters, and all around the world people are haunted by what's often experienced as the nearness of the dead. (Are the dead on what George Eliot called -- in a different context -- "the other side of silence," merely separated from us by no more than "a thin piece of silk," as one of W.G. Sebald's narrators puts it?) Wherever we believe the dead mysteriously survive, many cultures have also experienced the dead as needy, often angry or sorrowful.

Throughout history, too, and worldwide, mourners have structured grief in special ways, elaborately patterning the prayers or diatribes with which the bereaved implore or reproach the gods, the fates, and the dead themselves. And again, almost everywhere the spouses -- especially the widows -- of those who die occupy a crucial place in ceremonies of grief. So these are all matters I investigate in the first major section of Death's Door, which takes as its intellectual starting point Zygmunt Bauman's comment that the omnipresence of "funeral rites and ritualized commemoration of the dead" along with the "discovery of graves and cemeteries" is thought by anthropologists to constitute "proof that a humanoid strain ... had passed the threshold of humanhood."

At the same time, as I argue throughout the second major section of Death's Door, history "makes" death, shaping both how we die and how we mourn. So our persistent human needs to imagine the fate of the dead and to pattern grief in special ways are formed, informed, and reformed by all kinds of cultural changes.

In most English-speaking nations, for instance -- and these are the societies that concern me most in the book -- the traditional visions of God and the afterlife that had already begun to disappear in the 19th century continued to erode throughout the 20th century, at least among the educated classes that produce poets, novelists, journalists, and film-makers. And as historians and sociologists from Philippe Ariès to David Moller have shown, everyone, no matter the class, dies differently now than in the past -- more privately yet often more technologically, in hospitals equipped with unnervingly complex machinery.

All of us, too, share a recent history of mass "death events," from the killing fields of the first World War to the Holocaust and Hiroshima in the second World War and on through Vietnam to the "shock and awe" of the present -- and surely this history has re-made our ideas of death and dying while changing our relationship to grief. The skeletons in the trenches of No Man's Land and the corpses charring in the crematoria of Auschwitz point down to an abyss of nihilism rather than up to heaven. But if we no longer hope for a redemptive heaven, then maybe we don't want to talk about death, maybe we need to deny its imminence.

Yet even while our theology and technology have grown increasingly nihilistic, we're quite literally haunted by images of the dead that refuse to leave us because they reside in celluloid or virtual permanence, populating our photo albums, movie screens, home videos, even digital libraries. How does this conflict between the real absence and the virtual presence of the dead change our modes of mourning?

Finally, then, as I worked on Death's Door I became increasingly conscious that the need to grieve whose urgencies I shared with mourners everywhere had a special 20th-century shape. For one thing (and this helped me understand a number of elegies I studied), I experienced my mourning as curiously embarrassing to many people I met, as if, because we fear death, we fear mourners too and suspect their sorrow might be somehow contaminating. In response to such embarrassment, I guess I sometimes become defiantly testimonial about my loss, both in prose (in Wrongful Death, for instance) and in poetry (in the elegies I published in my collection Ghost Volcano). And countless memoirists have done the same thing (most recently and famously Joan Didion) along with contemporary poets from Allen Ginsberg (Kaddish) to Sharon Olds (The Father), Ted Hughes (Birthday Letters), and Donald Hall (Without).

Q: The deep, dark core of the book is the contrast you make between "expiration" and "termination." It seems like that distinction is where the elements of memoir, cultural history, and literary analysis all link up.

A: "The deep dark core of the book." Thank you. That's a really incisive and insightful point because the basic argument of the book -- certainly the argument about the "fate of the elegy" -- began with my own experience of that distinction.

In chapter six, I tell the story of two episodes that powerfully moved me. In the first, the surgeon who was in charge of my husband's case testified that he had arrived at the hospital when his patient (my husband) was "terminating" -- i.e., dying. In the second, a nurse, more than three decades earlier, told me that my first child (a very premature baby who survived a few days) had "expired" -- i.e., died. After the doctor talked about "termination," the two words became so resonant for me that I brooded on them for quite some time.

To "terminate" is to come to a flat end. To "expire" is to breathe out something -- a breath that represents, perhaps, a soul.  So each word seemed to me to have key metaphysical implications. "Termination," I decided, is modernity's definition of death; "expiration" the more traditional western (Christian) notion.  For "termination" leads to Beckett, to what in Waiting for Godot Lucky calls "the earth abode of stones" while "expiration" empowers Milton, whose "Lycidas" has breathed out a soul that ultimately lands in heaven, where "entertain him all the saints above." So "termination" is terrifying, makes death almost unspeakably scary, and leads toward horror, repression, and denial, while "expiration" leaves us with some hope -- or anyway it used to.

Q: Your book isn't anti-technology, as such. But I did get the sense you were making the case for literature (and poetry in particular) as capable of providing something unavailable from the medical system. Almost an old-fashioned notion of the humanities as corrective -- if not to science, then to the scientistic or technocratic mentality. Or is that reading of your project off, in some way?

A: I'm not sure that I want to make a case for poetry, and more generally the humanities, as corrective, curative, or medicinal. But I do think I want to note that poets (and novelists and memoirists too, but especially poets) have refused to deny death and grief in a culture that finds these tokens of inescapable mortality at the least embarrassing because at the worst horrifying.  

Poets testify, bear witness to the particulars of pain, the details of loss that technology flattens or sometimes even seeks to annihilate with words like "termination." I don't mean to suggest that those who work among the dying -- doctors in hospitals, medics on battlefields -- don't notice these details, but the language of science is in its way sedative, just as medicine's goals are (often appropriately) sedative and palliative. Poets remind us of what really happens. They don't take away the pain: on the contrary, they teach us how to feel it, to meet it, to know it.

Q: With all the quotations you incorporate, Death's Door serves (de facto anyway) as a kind of anthology. Was there a particular poem or passage that you recall as really being definitive for you? (In whatever way you'd construe "definitive" as meaning: epiphanic, consoling, etc.)

A: No, there was no one poem that dramatized for me the practice of contemporary elegists, although there were several works that functioned for me as aesthetic manifestos -- most notably, perhaps, W.C. Williams's "Tract" (about "how to perform a funeral") and Stevens's "The Owl in the Sarcophagus" (about the "mythology of modern death" and its "monsters of elegy").

But before I began drafting Death's Door I had put together an anthology of traditional and modern elegies in a book called Inventions of Farewell, and in assembling this volume I found that, taken together, the elegies poets have produced from the mid-twentieth century onwards functioned for me as radiant examples of what I mean when I say that recent poets insist with unprecedented passion on the particulars of pain and grief.

Think of the resonant specifics Thom Gunn compiles in The Man with Night Sweats or, earlier, the details Ginsberg unflinchingly offers in Kaddish, Olds in The Father, Hall in Without. But I could go on and on about this historically "monstrous" elegiac genre, which dates back to the poems Wilfred Owen, Siegfried Sassoon and others sent back from the Front during the First World War or, even earlier, to Hardy's Poems of 1912-13. These writers won't let us forget -- as Tolstoy wouldn't either, in The Death of Ivan Ilyich -- that death and its sorrows are often excruciating physical processes whose course usually binds and bends the spirit to the body's sufferings.

Such art may not be "consoling" in the traditional sense, but it consoles because it confronts pain and because in doing so it helps us accept loss, lets us know we aren't alone, and teaches us to hope that if we can articulate our suffering we can somehow master it or at least pass through and beyond it.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Two Takes on Teaching

Paula M. Krebs has been a professor of English at Wheaton College, a selective New England liberal arts college, for 15 years, since earning her Ph.D. at Indiana University. Her sister Mary Krebs Flaherty has been an administrative assistant at Rutgers University’s Camden campus for a year longer than Paula has been at Wheaton. Last fall Mary taught her first course, Basic Writing Skills III, on the inner-city campus of a two-year college, Camden County College. She teaches on her lunch break from her job at Rutgers. Mary has been taking evening classes toward her M.A. for three years, ever since she finished her B.A. at Rutgers via the same part-time route. This article is the first in a series in which Paula and Mary will discuss what it’s like to teach English at their respective institutions.

Paula: My place is about as different from yours as can be, I know. I often find myself longing for your city setting, your students who are so motivated. At the same time, I realize that teaching my students is a real privilege -- I can push them in exciting ways. Wheaton’s admissions standards keep going up, and I’m starting to see it in my classes. This semester my sophomores in English 290, Approaches to Literature and Culture, seemed to finish with a really good sense of how they can use literary criticism and theory in writing essays for their other English classes. They weren’t intimidated by the critics and theorists they were reading -- they actually used them well in their final essays. If only they could follow MLA style and prepare a proper Works Cited!

Mary: MLA style is something my students can do. They were able to pick up on it easily -- I think that’s because they take well to the idea of structure. They like the five-paragraph theme. The part of the class they had the most difficulty with was the content of their papers -- they couldn’t find their voice at all, let alone critique literary theorists.

Paula: Oh, mine had plenty of voice. Sometimes I wished for a bit less voice and a bit more work. I think sometimes that the sense of entitlement many of them have means that they don’t necessarily understand that their word isn’t always good enough. They need to cite some authorities, place their work in a larger context, indicate their scholarly debts. They have pretty good skills coming in, so it’s sometimes difficult to make clear to them how they can push to the next level. If they’ve been getting A’s on their five-paragraph themes in high school, they find it difficult to understand why their first efforts, in English 101 or a beginning lit class, are producing C+’s or B-‘s. Some are grade-grubbers, but most just don’t understand what makes a college A.

Mary: Just a week before the semester ended, one of my students finally understood what makes a college B. In the beginning of the term, her grades were “R’s,” which means that the paper cannot receive a grade; it must be revised. When she failed her midterm portfolio, she cried to me that she couldn’t see her mistakes so she couldn’t fix them. She continued to work on her essays and revise them, over and over. Close to the end of the semester, she approached me before class and said, “Mary, please take a look at this paper that someone wrote for another class and tell me what you think.” Knowing that I was being set up, I quickly looked over the essay. Out of the corner of my eye, I could see her smirking, so I told her “You’re right, I wouldn’t have graded this paper.” She shouted, “I knew it! Look at the subject-verb agreement error in the first sentence. There’s even a fragment in the introduction!” Not wanting to trash another teacher’s grading, I pointed out to her that the most important thing was how she had changed since midterm -- that she was now able to identify mistakes so she could correct her own. She passed the course with a B and I am so proud of her.

Paula: See, that’s what’s so great about teaching! I knew you’d love it. That pleasure when you see the lightbulb go on over their heads. That’s the same at Camden County as at Wheaton. But I think you have to do a different kind of work than I have to do in order to get it to happen. In some ways, both our students believe in the value of what we’re teaching, but we both have to do some convincing as well.

Mary: Mine need convincing that what they have to say is important and that saying it in an academic format is worth the effort. Most of the Camden campus students are from Camden city, recently awarded the dubious distinction of being named the most dangerous city in the nation for the second year in a row. They are typically from poor or working class families whose parent(s) may or may not have a high school diploma; many students are parents themselves, and most are minorities: African-American, Latino, or Asian-American. Many CCC students test into basic writing or reading skills classes, which is an indicator that their high school education did not prepare them well enough for college. In an informal discussion, I asked several students about their high school experience, and they claimed that they were never asked to write for content in English class -- the focus was on grammar and fill-in-the-blank or short answer tests. This explains why they are more comfortable with the grammar portion of the writing skills class, as well as how easily they grasp the five-paragraph essay structure. Following the rules is easy for these students, but finding something to say is much more difficult. I am there to assist them in this writing process and hopefully to convince them that they can grow as individuals and be successful in the academic community.

Paula: I have to do some of that, too. But we’re starting from such different places. Mine come to college because it’s expected of them. They need convincing that a liberal arts education really can bring them advantages after they graduate -- that digging into how a literary text works, learning to put together a really well-researched essay, or understanding the connections between Darwin and the poetry of Robert Browning is worth the money their parents are investing and the time the students are investing. In some ways, it’s a harder sell than yours. I have the luxury of time, though, in a way you sure don’t. My teaching is my full-time job, and my teaching load is relatively light. I can’t even imagine what it is like for you, working full-time and taking classes while learning to teach in probably the most challenging of circumstances -- as an adjunct at a community college. I know how hard it is for you to keep all these balls in the air. Do you think it’ll be worth it in the long run?

Mary: I certainly hope so. That’s the reason I’m teaching this year -- to find out the answer to that very question.

Author/s: 
Paula M. Krebs and Mary Krebs Flaherty
Author's email: 
info@insidehighered.com

Paula and Mary's next exchange will be about the out-of-classroom work they can ask of students.

Notes from the Underground

Normally my social calendar is slightly less crowded than that of Raskolnikov in Crime and Punishment. (He, at least, went out to see the pawnbroker.) But late last month, in an unprecedented burst of gregariousness, I had a couple of memorable visits with scholars who had come to town -- small, impromptu get-togethers that were not just lively but, in a way, remarkable.

The first occurred just before Christmas, and it included (besides your feuilletonist reporter) a political scientist, a statistician, and a philosopher. The next gathering, also for lunch, took place a week later, during the convention of the Modern Language Association. Looking around the table, I drew up a quick census. One guest worked on British novels of the Victorian era. Another writes about contemporary postcolonial fiction and poetry. We had two Americanists, but of somewhat different specialist species; besides, one was a tenured professor, while the other is just starting his dissertation. And, finally, there was, once again, a philosopher. (Actually it was the same philosopher, visiting from Singapore and in town for a while.)

If the range of disciplines or specialties was unusual, so was the degree of conviviality. Most of us had never met in person before -- though you’d never have known that from the flow of the conversation, which never seemed to slow down for very long. Shared interests and familiar arguments (some of them pretty esoteric) kept coming up. So did news about an electronic publishing initiative some of the participants were trying to get started. On at least one occasion at each meal, someone had to pull out a notebook so that someone else could jot down an interesting citation to look up later.

In each case, the members of the ad hoc symposium were academic bloggers who had gotten to know one another online. That explained the conversational dynamics -- the sense, which was vivid and unmistakable, of continuing discussions in person that hadn’t started when we arrived at the restaurant, and wouldn’t end once everyone had dispersed.

The whole experience was too easygoing to call impressive, exactly. But later -- contemplating matters back at my hovel, over a slice of black bread and a bowl of cold cabbage soup -- I couldn’t help thinking that something very interesting had taken place. Something having little to do with blogging, as such. Something that runs against the grain of how academic life in the United States has developed over the past two hundred years.

At least that’s my impression from having read Thomas Bender’s book Intellect and Public Life: Essays on the Social History of Academic Intellectuals in the United States, published by Johns Hopkins University Press in 1993. That was back when even knowing how to create a Web page would raise eyebrows in some departments. (Imagine the warnings that Ivan Tribble might have issued, at the time.)

But the specific paper I’m thinking of -- reprinted as the first chapter -- is even older. It’s called “The Cultures of Intellectual Life: The City and the Professions,” and Bender first presented it as a lecture in 1977. (He is currently professor of history at New York University.)

Although he does not exactly put it this way, Bender’s topic is how scholars learn to say “we.” An intellectual historian, he writes, is engaged in studying “an exceedingly complex interaction between speakers and hearers, writers and readers.” And the framework for that “dynamic interplay” has itself changed over time. Recognizing this is the first step towards understanding that the familiar patterns of cultural life -- including those that prevail in academe -- aren’t set in stone. (It’s easy to give lip service to this principle. Actually thinking through its implications, though, not so much.)

The history of American intellectual life, as Bender outlines it, involved a transition from civic professionalism (which prevailed in the 18th and early 19th centuries) to disciplinary professionalism (increasingly dominant after about 1850).

“Early American professionals,” he writes, “were essentially community oriented. Entry to the professions was usually through local elite sponsorship, and professionals won public trust within this established social context rather than through certification.” One’s prestige and authority were very strongly linked to a sense of belonging to the educated class of a given city.

Bender gives as an example the career of Samuel Bard, the New York doctor who championed building a hospital to improve the quality of medical instruction available from King’s College (as Columbia University was known back in the 1770s). Bard had studied in Edinburgh and wanted New York to develop institutions of similar caliber; he also took the lead in creating a major library and two learned societies.

“These efforts in civic improvement were the product of the combined energies of the educated and the powerful in the city,” writes Bender, “and they integrated and gave shape to its intellectual life.”

Nor was this phenomenon restricted to major cities in the East. Visiting the United States in the early 1840s, the British geologist Charles Lyell noted that doctors, lawyers, scientists, and merchants with literary interests in Cincinnati “form[ed] a society of a superior kind.” Likewise, William Dean Howells recalled how, at his father’s printing office in a small Ohio town, the educated sort dropped in “to stand with their back to our stove and challenge opinion concerning Holmes and Poe, Irving and Macauley....”

In short, a great deal of one’s sense of cultural “belonging” was bound up with community institutions -- whether that meant a formally established local society for the advancement of learning, or an ad hoc discussion circle warming its collective backside near a stove.

But a deep structural change was already taking shape. The German model of the research university came into ever greater prominence, especially in the decades following the Civil War. The founding of Johns Hopkins University in 1876 defined the shape of things to come. “The original faculty of philosophy,” notes Bender, “included no Baltimoreans, and no major appointments in the medical school went to members of the local medical community.” William Welch, the first dean of the Johns Hopkins School of Medicine, “identified with his profession in a new way; it was a branch of science -- a discipline -- not a civic role.”

Under the old regime, the doctors, lawyers, scientists, and literary authors of a given city might feel reasonably comfortable in sharing the first-person plural. But life began to change as, in Bender’s words, “people of ideas were inducted, increasingly through the emerging university system, into the restricted worlds of specialized discourse.” If you said “we,” it probably referred to the community of other geologists, poets, or small-claims litigators.

“Knowledge and competence increasingly developed out of the internal dynamics of esoteric disciplines rather than within the context of shared perceptions of public needs,” writes Bender. “This is not to say that professionalized disciplines or the modern service professions that imitated them became socially irresponsible. But their contributions to society began to flow from their own self-definitions rather than from a reciprocal engagement with general public discourse.”

Now, there is a definite note of sadness in Bender’s narrative -- as there always tends to be in accounts of the shift from Gemeinschaft to Gesellschaft. Yet it is also clear that the transformation from civic to disciplinary professionalism was necessary.

“The new disciplines offered relatively precise subject matter and procedures,” Bender concedes, “at a time when both were greatly confused. The new professionalism also promised guarantees of competence -- certification -- in an era when criteria of intellectual authority were vague and professional performance was unreliable.”

But in the epilogue to Intellect and Public Life, Bender suggests that the process eventually went too far. “The risk now is precisely the opposite,” he writes. “Academe is threatened by the twin dangers of fossilization and scholasticism (of three types: tedium, high tech, and radical chic). The agenda for the next decade, at least as I see it, ought to be the opening up of the disciplines, the ventilating of professional communities that have come to share too much and that have become too self-referential.”

He wrote that in 1993. We are now more than a decade downstream. I don’t know that anyone else at the lunchtime gatherings last month had Thomas Bender’s analysis in mind. But it has been interesting to think about those meetings with reference to his categories.

The people around the table, each time, didn’t share a civic identity: We weren’t all from the same city, or even from the same country. Nor was it a matter of sharing the same disciplinary background -- though no effort was made to be “interdisciplinary” in any very deliberate way, either. At the same time, I should make clear that the conversations were pretty definitely academic: “How long before hundreds of people in literary studies start trying to master set theory, now that Alain Badiou is being translated?” rather than, “Who do you think is going to win American Idol?”

Of course, two casual gatherings for lunch do not a profound cultural shift make. But it was hard not to think something interesting had just transpired: A new sort of collegiality, stretching across both geographic and professional distances, fostered by online communication but not confined to it.

The discussions were fueled by the scholarly interests of the participants. But there was a built-in expectation that you would be willing to explain your references to someone who didn’t share them. And none of it seems at all likely to win the interest (let alone the approval) of academic bureaucrats.

Surely other people must be discovering and creating this sort of thing -- this experience of communitas. Or is that merely a dream? 

It is not a matter of turning back the clock -- of undoing the division of labor that has created specialization. That really would be a dream.

But as Bender puts it, cultural life is shaped by “patterns of interaction” that develop over long periods of time. For younger scholars, anyway, the routine give-and-take of online communication (along with the relative ease of linking to documents that support a point or amplify a nuance) may become part of the deep grammar of how they think and argue. And if enough of them become accustomed to discussing their research with people working in other disciplines, who knows what could happen?

“What our contemporary culture wants,” as Bender put it in 1993, “is the combination of theoretical abstraction and historical concreteness, technical precision and civic give-and-take, data and rhetoric.” We aren’t there, of course, or anywhere near it. But sometimes it does seem as if there might yet be grounds for optimism.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com
