I know who I want to be when I grow up. I want to be Stephen Colbert. I even know the title of the book I’ll publish: I Am Academic (And So Can You!). Unfortunately, I’ve already had, I think, my 15 minutes of fame.
Part I: I Get a Thrill
The experience was distinctly postmodern: minimalist, ironic, and as deflating as it was exhilarating. I could say that I had spent my entire life up until that point preparing for my moment of celebrity; on the other hand, it wasn’t quite what I had dreamed of.
I can still recall how many conversations my friends and I, standing around on the playground outside St. Patrick’s Grammar School, had on the subject of being famous. Blame it on the post-World War II atmosphere of fear and longing, blame it on the space race or Tiger Beat Magazine, but my baby-boomer generation was obsessed with fame -- or at least being noticed. (Current celebrities, take note: You’re not even original in wanting to be celebrities.) My earliest plans had to do with receiving an Oscar for a dramatic tap-dancing role in a film that combined the most poignant moments of The Five Pennies and The Nun’s Story. This fantasy was followed by a phase in which I spent long hours in my pink bedroom writing variations on Frost’s “Death of the Hired Man”; perhaps a president-elect would invite me to share his inaugural stage.
By the time Maya Angelou read “On the Pulse of Morning” at President Clinton’s invitation in 1993, I had earned several degrees, given birth to several children, taught writing at several colleges, and published a number of poems and essays, but fame had eluded me. There had been a number of indirect links: I knew a few poets with national reputations; I knew professors who were either reputable critics or who peppered their lectures with references to their reputable mentors. My first college roommate went on to become the editor-in-chief of two glossy women’s magazines. The lead articles in her publications, which I surreptitiously scanned while on line at the supermarket, were depressingly like the surveys she had conducted in our dorm room after “lights out.” I thought she was just making conversation, not planning how she would achieve fame.
Oh, there were moments when fame seemed close at hand. Once, another mother at my children’s bus stop asked me to autograph her copy of my article “The Two-Year-Old’s Guide to Dressing, Dining and Shopping,” which she had discovered in a free parenting magazine in the waiting room of her ob/gyn. And there were the three odd cases of mistaken identity. In an earnest discussion with her pre-school teacher, my oldest child incorrectly attributed the authorship of “Hickory, Dickory Dock” to me; a very tiny, very old woman followed me into the women’s room after the opening of Love Serenade in Manhattan, insisting that I was Shirley Barrett, the film’s Australian director; and once, in the elevator of a conference hotel, an overeager graduate student mistook me for Joyce Carol Oates.
And then it happened. I inadvertently stumbled upon what is apparently the universal subject in academe. It does not involve politics, theory, or tenure; it does cut across gender, race, academic majors, and all levels of faculty and staff. My topic was the student excuse.
Written in a white heat, the morning after a student attempted to justify missing the first two sessions of a class that met only once a week, "The Dog Ate My Disk and Other Tales of Academic Woe" was a simple classification piece, the sort of exercise I used to assign in Basic Composition. My thesis appeared at the end of the introductory paragraph. Excuses from college students, I explained, fell into five broad categories: the family, the best friend, the evils of dorm life, the evils of technology, and the totally bizarre.
I was pleased that the editors liked my essay, although I thought it was slighter than my first Chronicle of Higher Education piece, published several months earlier. That one had garnered a few congratulatory notes and several comments in the halls of my building, along with a single request for reprinting, and then it subsided into a line on my résumé, which often seems more alive than I am. I knew the second piece was scheduled for August 2000, but by the time it appeared, I was immersed in syllabi for the fall. And, truthfully, the earliest indications on the home front weren’t promising. My two younger children looked at the illustration and said “cool,” but declined my offers to let them actually hold the paper and read the piece. My oldest child, now in college but possibly still cautious since the pre-school fiasco, said, “I think your piece on Don DeLillo and your horoscope is much funnier. You know, the one where you freak out just because some phony astrologer said ‘Good day for industrial secrets’ under your sign. Why didn’t you send that one?” My mother said she didn’t think “throwing up blood” was a nice thing to write about.
The first e-mail message was equally disheartening. Its author delivered a lengthy lecture on compassion, liberally laced with insulting epithets for me. The rest of the mail, fortunately for my fragile ego, was positive. I heard from administrators as well as from adjuncts, lecturers, and full-time tenured professors at public and private universities, small liberal arts schools like mine, and community colleges. I received requests for reprints and an invitation to be on talk radio “to discuss this national problem.”
Everyone, it seemed, had a story. “Everyone has a story” is a maxim I tell my students, and it was heartening to find such evidence. Deans wrote fondly about professors from 20 and 30 years ago who had called their undergraduate bluffs; professors relayed stories involving plots that rivaled those of Oprah’s book club selections; and I received enough tales involving body parts and organs to lead me to conclude that there should be a separate category of excuses under the heading “Mutilations” or “Excuses Inspired by American Psycho.” No one ever questioned the veracity of my anecdotes involving dangerous machinery or the pope. In fact, the chair of a math department at a private university wrote to me “on behalf of several colleagues” to check the initials of the student involved in what I had referred to as “The Pennsylvania Chain Saw Episode.” They were certain she had matriculated at their school before moving on to mine. I was grateful to the chair of an education department who offered (unsolicited) verification of the phrase that had troubled my mother. I was less certain how to respond to the reader who said he “particularly enjoyed the bloody parts.”
Marvelously inventive stories poured in for months. While all those who contacted me had had dealings with students on one level or another, there was one writer who had a personal interest. From her office across our (small) campus, she e-mailed me to ask if her daughter, whom I’d had in several classes, was responsible for any of the stories. (The daughter was innocent.)
Part II: The Afterlife
As I opened that last message, it occurred to me that my celebrity was largely electronic and ultimately solitary -- much like the process of writing itself. This was 21st-century virtual fame. In fact, the next phase was something of a virtual nightmare, involving my e-mailing institutions and individuals who had posted my essay on their Web sites without permission. The copyright offenders included a Southern church; a professor of communications who, according to her home page, had a doctorate in journalism ethics; and a sociology professor who explained that his “[W]ebsite [was] intended to . . . introduce you to the many ways which [sic] you can utilize the World Wide Web in your sociological endeavors” -- those endeavors beginning, apparently, with piracy.
The essay’s inclusion in composition texts was another mixed -- and humbling -- experience. The promotional material for one of the earliest texts featured the title of my piece, referring to me as “a lesser known writer.” (There is something worse than being a lesser-known writer -- it’s seeing that fact announced in print.) In the tables of contents of anthologies, my name hovers, Zelig-like, alongside those of Amy Tan and Shakespeare. And then there are the instructors’ manuals (which I secretly scan the way I used to read magazines in the supermarket), where David Sedaris rates the adjective “hilarious,” while I am described as merely “very funny.” As for the suggested essay question, “Do you think Segal is being unfair?” I want to write my own 500-word answer. It is one thing to be a misunderstood poet; I’m not certain that I can bear going through the rest of my life as a lesser-known and misunderstood essayist.
I will admit to one glorious moment of pleasure early on, when I sat (alone) at my desk and thought, “They like me. They really like me.” The teaching load at a small liberal arts college, however, does not leave much time for basking in the limelight. Moreover, as the semesters progressed, the excuses began to mount. I realized that I had thought my essay was a sort of talisman: I mistakenly believed that having articulated my -- and my students’ -- griefs and grievances, I had put an end to all excuses. But they continued to come, as varied, creative, and astounding as ever. There was the Hemingway-esque “Something tragic happened,” stunning in its brevity and stoicism, and the Zen-like explanation of “I know you allow only two cuts and this was my third, but I was with you in spirit.”
As for “The Dog Ate My Disk,” it lives on, in its final incarnation -- you can purchase, for a very small fee, analytical essays about my (famous) piece at various plagiarism sites.
Carolyn F. Segal is associate professor of English at Cedar Crest College.
The editors of the cultural magazine N+1 are publishing a booklet called What We Should Have Known: Two Discussions that they have prepared for undergraduates. Copies have only just come back from the printer, it seems, but I’ve had a look at a prepublication PDF and now feel a certain evangelizing fervor for the whole project.
Its topic, in brief, is the relationship between education and regret – how each one creates the conditions for the other. The books you read at a certain age can put you on the wrong path, even though you don’t recognize it at the time. You are too naively ambitious to get much out of them -- or too naive, perhaps, not that it makes much difference either way. And by the time you realize what you should have read, it’s too late. You would understand things differently, and probably better, had you made different choices. You would be a different person. Instead, you wasted a lot of time. (I know I did. There are nights when I recall all the time spent on the literary criticism of J. Hillis Miller and weep softly to myself.)
The booklet consists of transcripts of two meetings of N+1 contributors (a mixture of writers and academics, most in their 20s and 30s) as they discuss what they regret about their educations. Each contributor also submits a list of eight “Books That Changed My Life.”
The structure here seems to involve a rather intricate bit of irony. There is an explicit address to smart people in their teens, or barely out of them, offering suggestions on what to read, and how. It can be taken as a guide to how to avoid regret. The reflections and checklists are all well-considered. You could do a lot worse for an advice manual.
But the task is impossible. Avoiding regret is not an option, whether in your formal education or your love life; and it’s the price of the ticket that you must learn this the hard way. There are no shortcuts between naivete and sophistication. Or rather, there are a lot of shortcuts – but all of them will lead you astray.
Among the approaches tried and found wanting by participants in the discussion are:
The Dartmouth Review’s list of timeless classics by dead white European males.
The cultural studies templates for subverting DWEM hegemony.
Extremely intense close reading of the finest works of literature ever written.
Extremely intense close reading of the densest works of theory ever written.
Becoming so immersed in the works of a particular master-thinker (for example, Foucault) or author (Emerson, maybe) that you end up quoting them all the time.
Just trying to keep up with whatever is on the syllabus as you move from semester to semester in a contemporary American university’s smorgasbord of electives.
Whichever path you follow, then, is bound to involve the risk of ending up someplace you might have qualms about, later. You just have to strike out and take your chances anyway. Regret will come, and you'll have to learn from it, too.
This candor is remarkable. And so is the hard edge of respect for the intellectual seriousness of young people. It reminded me, at several points, of a wonderful passage in an essay by Adorno:
“The naivete of the student who finds difficult and formidable things good enough for him has more wisdom in it than a grown-up pedantry that shakes its finger at thought, warning it that it should understand the simple things before it tackles the complex ones, which, however, are the only ones that tempt it. Postponing knowledge in this way only obstructs it.”
This booklet is a reflection on the difference between education and Bildung. That is, between the experience of moving through a given social institution, on the one hand, and the process of being inwardly “formed” by what you’ve learned, on the other.
It’s not an attempt to recast the curriculum, then. Or a polemic in the culture wars. Or a blueprint for reforming the vast multi-billion dollar research-and-entertainment complex known as “higher ed.” In some respects, it is much broader in focus than that; in others, it addresses the particularity of individual experience.
The emphasis falls on how books can influence a reader in ways having little to do with career, and everything to do with a sense of life. (Not that the participants are terribly solemn about this. One of them says, deadpan: “It’s like after I read Crime and Punishment in high school, I wanted to kill an old lady.”)
But there is also an undercurrent of disappointment with the university running throughout the discussion. “Our educations take place in institutions that are divided up in these ways that may not bear idealistic close inspection,” says Meghan Falvey, a graduate student in sociology at New York University. “You can really end up studying the wrong thing, sitting around a table with the wrong people, whose concerns are not your own. Almost inevitably it seems like you won’t know what your concerns are until you’re older or better read or something.”
Perhaps that is inevitable – a human problem, rather than the failing of any pedagogical arrangement that could be reformed. But other comments in What We Should Have Known suggest deep reservations about the university as an institution.
“I realized, the further I went on,” says Marco Roth, a doctoral candidate in literature at Yale, “that almost everyone in academia feels like an outsider, nobody knows what’s going on. Academia’s an empty vessel, but the ones who don’t realize it end up going all the way and end up in charge....They believe in the system. That there’s something they can conform to and master. And the proof is that they’ve stuck it out while so many others drop by the wayside into ‘obscurity.’”
An empty vessel is not worthless, of course. (It has its uses.) The complaint here, rather, is about the routinized and often rather vacuous cult of “professionalization” in the humanities. William James worried about this more than a century ago. But really, he could never have imagined how far things would go. In the more inane extremities of the process, any expression of doubt about the effects of professionalization will now immediately be denounced as “anti-intellectual” -- a tendency reflecting an incredibly impoverished conception of the life of the mind.
The participants in the discussions presented in What We Should Have Known are smart enough to know better; and none of them sounds timid enough to give a damn. The combination of seriousness and playfulness here is inspiring. My only regret is that I did not read this pamphlet a long time ago.
(Information about ordering What We Should Have Known is available here.)
"But publishers said their biggest hope was that Kindle would expand sales of books to a new generation of gadget lovers." --The New York Times
When I ask my English students to set learning goals for themselves at the beginning of each term, I’m interested in finding out how they want to improve as readers and writers. I am even more interested in getting at what they think it means to be good readers and good writers.
The most common "reading" goal my students set for themselves is to read faster. Because they have so much going on in their young lives, they are looking for the most efficient ways of getting their reading finished. They also mention that they want to improve their comprehension of what they read, but more importantly they want to learn how to do it quickly.
They soon learn I don’t teach speed reading. I teach slow reading. I teach slow, concentrated, finger-on-the-page reading. I read to my students in my slow Texas drawl. I crawl with them through the passages and passageways. We mosey. We copy down sentences. We write paraphrases. We imitate sentences. We read a couple of lines. We ask questions. We pause. We read those lines again. I dedicate entire classes to silent, sustained, shared reading. We call it “reading lab.”
The relationship we have with text is called reading. The quality of that relationship depends upon what we bring to it. Improving the relationships college students have with text is our primary responsibility. And it chiefly occurs when we read to and for them. I know many professors think that students already know how to read when they come to college (or should), but there is a real difference between knowing how to read words on the page and having a productive relationship with those words.
Having a productive relationship with text is also dependent upon hearing the text. Many of my students cannot hear what they read. Perhaps it is because they were not read to as children. Whatever the cause, they often cannot hear the voice of the text. Their eyes may be working, but their ears aren’t. Nothing on the lips and tongue either. What they can’t taste, they can’t consume. That’s why I read to my students. I’m their hearing aid. Their sommelier. Given my experience with the text, I help them learn the lay of the land. I help them find the right narrative path so they can follow it page after page. I’m an English instructor who also teaches voice.
I also know that many students sometimes go blind when they see text. It’s a shameful state of cultural affairs. Poetry-blindness is particularly tragic. Poetry unsettles the eye. It can make us dizzy, all this reading back and forth, up and down the page. But students easily go blind in the face of other texts, too. Lost and wandering aimlessly, they might as well give up, shut their eyes, and fall asleep for good.
So it shouldn’t be surprising that many students should go silent in the company of text. That they are unresponsive in class. That they should go dumb after going deaf and blind. That they have no sense and sensation of what they’ve read. That they look to their professors for short cuts, quick reads, and knowledge patches.
It also shouldn’t be surprising that the solution is to teach students to hear and see and speak the words we assign them. To accomplish this, we not only have to slow down our students, we have to slow down ourselves. Do more with less. At its best, reading should be a sort of textual genuflection, the sign of the cross we make between our eyes, ears, mouth, and mind to enliven the soul.
However, the current frantic pace of school work is not conducive to learning how to read the variety of texts students are assigned across the curriculum. Learning to read well is also dependent on reflection -- time to weigh, consider, accommodate, connect, synthesize, incorporate, sort the wheat from the chaff. If reflection is rarely available (or if there’s rarely time to help students learn how to reflect), then learning to read is rare.
Our learning culture is awash in technology so that information can be delivered in the blink of an eye at any time of the day or night. It's true that more information is flowing, but it doesn't always result in more knowing. In this hypersphere, it may be that students are reading and writing more than ever before. But practice doesn't make perfect. It could just as easily wear us down as lift us up.
This is all a prelude to my short take on Amazon’s new product, Kindle, a wireless and portable handheld device designed to make books instantly accessible. It’s actually a graven image. A false gadget god engineered in the service of efficient data transfer and consumer credit. Don’t be fooled, Kindle is no innocent tool. It's not a gift that keeps on giving. It holds a charge so it can keep on charging.
My dear colleague, you will soon be expected to try it. And then you will be expected to buy it. To embrace its efficiencies and remarkable cost savings. To order your textbooks through it. To order your students to use it.
Someone will put it in your hands. Don’t ask where it came from. Or who made it. Just raise it and praise it, dummy. Look how lightweight and lovely. See how quickly you can turn the page!
Laurence Musgrove is an associate professor of English at Saint Xavier University, in Chicago.
If you love a book, there is a special thrill that comes from seeing the phrase "soon to be a major motion picture." It is a thrill of dread. In the case of Brian Morton's novel Starting Out in the Evening, though, my initial reaction was disbelief. Starting Out, first published in 1998 and a finalist for the PEN/Faulkner Award that year, is one of the best things I've ever read -- possibly the best -- about being a writer. But that makes it seem unfilmable almost by definition.
At its center is the relationship between an elderly novelist and the young woman who is writing a thesis on him. The most important and difficult truth Morton portrays about the life of an author is that so much of it must be spent alone. It can color a writer's dealings with other people in various ways, some of them quite complicated -- but that's not always the same thing as being dramatic. So how would this be portrayed in a movie? For that matter, could it be?
Well, in any case it has been, in a film that opens in a few days. "Starting Out in the Evening" already has some critics mentioning an Academy Award nomination for Frank Langella's performance as a novelist in the final season of his career. Earlier this week, Langella was named Best Actor at the Boston Film Critics Awards and runner-up (to Daniel Day-Lewis) at the LA Film Critics Awards. Lili Taylor plays his daughter Ariel; as ever, the fact that Taylor is in the film is itself a recommendation.
Hurl all the accusations of phallocratic ocularocentrism you want, but I do enjoy looking at Lauren Ambrose. She plays Heather, the graduate student who hopes to edit The Leonard Schiller Reader for the University of Chicago Press. Unfortunately the screenplay leaves her character rather thin -- as it does that of Casey, played by Adrian Lester. As Brian Morton originally portrayed him, Casey seemed very much to be one of the young African-American public intellectuals who were assuming the cultural role played by Jewish writers of an earlier generation. Despite a fine performance by Lester, that very mid-1990s dimension of the novel does not make it to the screen.
It is difficult to picture Brian Morton himself -- a wry and quiet man who teaches writing at Sarah Lawrence College -- walking down the red carpet during the Oscars. But who knows; the experience might make for another novel. At the very least, Starting Out in the Evening should now reach a new audience. The author answered some questions by e-mail, from which the following interview was assembled.
Q: I read "Starting Out" right after it appeared and identified most with Heather: the young person going to New York and trying to find her way, driven by admiration for literary heroes, but also by a good bit of raw ambition. About two years later, I reread the book and found that it was actually Schiller whose life felt most familiar, this time. His reputation, never huge, is fading, but he keeps on working, because persevering is the best a writer can manage, most of the time.
You somehow conveyed both of those kinds of experience without simply playing one of them off the other as superior -- the younger person treated as full of illusion, for example, or the older as being just bitter, or out of touch. There is a graceful acceptance of both phases of life as necessary and, in their way, right about things. At the same time, they have their limits. Heather really is a bit callow, and Schiller has armored himself against life in ways that he comes to regret. The balance is extraordinary. How did you do it? Where did the novel come from?
A: First of all, thank you. If I did portray both of these characters persuasively, I guess it was because I identified with them both.
In answer to where the novel came from—to a large extent it came from my attempt to work through my disappointment about the fate of my first novel, The Dylanist. The Dylanist was published in the summer of 1991, and went out of print less than six months later. Before it came out, I was aware that it might not live forever in the annals of literature, but I didn't really anticipate that it might have approximately the shelf life of milk. I was already 36 by the time the book was published; I'd spent thirteen years writing as seriously as I knew how; and it was devastating to see the book go instantly out of print.
Schiller, the main character in Starting Out in the Evening, is a 71-year-old novelist who has written four books, all of which are out of print. So, although I'm not sure I was perfectly conscious of this when I was actually writing the book, I think that by writing about him, I was asking myself what it would be like if I spent the rest of my life writing novels that didn't do any better than The Dylanist had. I was trying to ask myself whether a writing life that came to nothing in terms of external recognition would be worth living.
I used different parts of my own experience in writing about Heather. Her initial love for Schiller's work, her feeling that reading him was such a profound communion that it almost felt as if he was somehow interested in her as deeply as she was interested in him, seemed like an experience that any reader has from time to time. (Schiller's nothing like Raymond Williams, but I kept having this experience as a reader of Raymond Williams's work all through my twenties and thirties.) After she meets him and grows disillusioned with him, starting to suspect that his monomaniacal focus on writing had drained his later work of vitality—well, the questions she was asking about him are questions I've asked about myself.
Q: Interesting to think of Raymond Williams as a source for Schiller. You edited the review section of Dissent when Irving Howe, one of its founders, was alive. I always figured he was there in the novel, too, somewhere. Is that wrong?
A: No, it's right. I worked with Irving for 10 years, and learned from him, and loved him. My mental picture of Schiller's body -- his height and weight and the way he held himself and moved -- is drawn almost completely from Irving, or rather, from the way Irving appeared near the end of his life. And you could say that Schiller's attitude toward his own writing had something in common with Irving's attitude toward democratic socialism.
By the end of his life, I sometimes thought that Irving's fidelity to democratic socialism might be summed up in T.S. Eliot's line: "Sometimes we must fight for something not in the belief that it will triumph, but in order to keep the idea of it alive." (I can't remember exactly how it goes, but it's something like that.) Schiller was completely uncertain about the strength of his own gifts; he kept writing not because of any faith that his work would live on, but simply in order to pay tribute to his own conception of beauty, whether or not anything he wrote would ever fulfill it.
Also, Ilana Howe, Irving's widow, thinks that Schiller's nearly empty refrigerator was based on hers and Irving's, but I think I was just describing my own.
Q: Some scenes in your novel are not so much satirical as sharply observed. There's a bit about how all the up-and-coming literary editors in New York have exactly the same editions of the same authors on their shelves, for example, and how someone could sneak into their apartments late at night and exchange their libraries without anyone noticing. In another scene, you describe how a young writer on the make is just a little too amused at the jokes of a magazine bigwig. Did anybody reading the book protest? When I first read it, the part about the guy laughing too hard gave me a brief, paranoid flashback to my 20s.
A: No, nobody's ever complained. In some of my books I've had characters who were too obviously based on people I knew, and who were portrayed very unkindly -- caricatured -- and I've hurt a few people that way, which is something I'm not proud of. But that's a different story. I can't remember anyone feeling personally insulted by any of the scenes from literary life.
About the time we met in 1990 -- are you implying that the things I said that day weren't really that amusing?
Q: Let me plead the Fifth on that one..... Some novels -- even works of "literary fiction," as the expression goes -- feel destined to end up on screen. The possibility of adaptation for film now often seems to condition the writing of a novel, or the experience of reading it, or both. But I've never thought that was the case with your work. How did it come to pass that Starting Out in the Evening turned into a movie?
A: I never imagined it as a movie either. There's so much interiority in the book -- so much "Was she thinking I was thinking what she was thinking I was thinking?" Kind of hard to film.
It became a movie because Fred Parnes thought he could see a movie in it. Fred is the kid brother of one of my best friends from high school, and he'd already made two movies -- a documentary about the a cappella group The Persuasions called "Spread the Word," and an indie comedy called "A Man Is Mostly Water." Fred wrote a screenplay along with his writing partner, Andrew Wagner, who ended up directing the movie. They put it through many, many drafts, none of which I saw. They asked me if I wanted to look at it, but I didn't. I understood that in order to turn the book into a movie, they'd have to change a lot of things around.
I knew Fred well enough to respect his integrity -- I knew that whatever changes he made, he wouldn't turn Schiller into an elderly New York Intellectual who had a little business selling skag on the side. We wouldn't have a scene where Schiller, sick and tired of years of critical neglect, sticks a Beretta under his belt and goes out to gun down James Wood. So, since I trusted Fred's integrity, I didn't want to be standing there breathing down his neck, saying "Schiller would never do that! Heather would never say that!"
Q: What's it like to see your characters on screen, in the shape of famous actors?
A: When they were shooting the movie, Fred told me that it was remarkable to see Langella arrive on the set each day, a strapping Italian in a leather jacket, and then, after putting on a button-down shirt and a tie and a pair of glasses, transform himself into an infirm Jewish intellectual. I only visited the set for one afternoon, but I instantly saw what he meant.
They were preparing a scene; people were bustling around and making a lot of noise; and Langella was sitting in a corner, buried in Leonard Schiller's overcoat, looking down, reading something from an index card he was holding in his hands. Looking at him, solitary in the midst of all that activity, it seemed as if he'd somehow managed to create a zone of quiet around himself. You could almost touch it. As I watched him, I thought, "He's got it."
John le Carré had a character named George Smiley in many of his books; after Alec Guinness played Smiley in two miniseries -- played him brilliantly -- le Carré said that he couldn't write about Smiley anymore. He said that Guinness took the character away from him. I'd never had any plans to write about Schiller again, but if I had -- well, I won't say that Langella's performance would have made it impossible. But it was so damn good that I would have had to work hard to wrestle him back.
Q: Langella's performance really makes the film. It's no surprise that the expression "Oscar-worthy" has come up in describing it. I found the final scene overwhelming -- lump in my throat, tears in my eyes, a sense that the whole course of Schiller's life was concentrated there in the expression on Langella's face.
But.... how to put this.... An awful lot of your novel isn't on screen. Most of the characters and incidents are there, but only a very small part of the spectrum of tone. As a movie, "Starting Out in the Evening" is pretty solemn, while one of the things I love about the novel is how it moves between serious and comic perspectives. How do you feel about that? Was it something you just accepted as inevitable?
A: Well, if you have a song and somebody does a cover version, you have to expect that they're going to interpret it in their own way. Mostly, I'm flattered that Fred and Andrew made a movie of it, and I'm glad that it's led a few people to discover the book.
Q: You teach writing at Sarah Lawrence. Have you had students who know you as the author of Starting Out in the Evening? Who imagine themselves as the Heather to your Schiller, perhaps? Do you expect a rush of people trying to audit your classes and show you their screenplays?
A: The student community at Sarah Lawrence has somehow intuited that I shun the limelight, and has tactfully conspired to help me feel as if I'm working in obscurity, a condition in which I thrive.
Writing from the other side of the world, Sothearwath surprised me by asking a favor: “Do you have any research or study on reading and writing? I am here ok, but I have lots to read.”
Until recently, Sothearwath (not his real name) taught English at the Royal University of Phnom Penh (RUPP), in Cambodia. Now, he had just begun a doctoral program at a university in another Asian country. For the past four years, I’ve spent each January at RUPP working with him and his colleagues. I endeavor not only to help the Cambodian instructors teach more effectively, but also to learn how to be a better teacher back home. I lead workshops on various topics -- learning theory, assessment, responding to student writing -- usually in classrooms as dingy as they are airless. But, most importantly, I hold follow-up individual coaching sessions with the faculty. We meet in their departmental office, a cramped but slightly air-conditioned double room. Most of the teachers share “desk space” at three round tables and take turns sliding in and out of the seat next to mine at their appointment times.
There have been no head starts for these teachers, and they have very little usable history. In Cambodia, the living either survived the Khmer Rouge (1975-79) or descended from that era’s millions of dead or disabled. Despite these hardships, perhaps because of them, most of the teachers I’ve met are engaged, professional, thoughtful, immensely social and, frankly, fun. We work hard, but we also laugh often.
Sothearwath rarely laughed. In his early 40s, reserved, proud but diffident, he was an experienced English teacher, and his linguistic virtuosity -- emblematic of his country’s tortured history of war, uneasy alliance, and occupation -- included Russian and some Vietnamese as well. But his comfort zone was circumscribed by lexicons and rote-learning lesson plans. So, although he attended all of my workshops, Sothearwath was initially reluctant to meet with me and explore his teaching plans and practices in depth. By my second January, though, he began requesting individual consultations. Still, he was just as likely to cancel an appointment as keep it. By my fourth stay, we had achieved a polite rapport but not the kind of connection that would have made it easy for him to contact me.
I replied to his e-mail at once, reminding him of his strengths as an educator and his potential as a scholar. I urged him to define a narrow question that he could answer through his research, offering examples of hypotheses derived from broad topics, and reminded him to consult with his librarians, noting that I used their expertise often. I asked him to stay in touch.
I have provided long-distance academic coaching to several of my Cambodian colleagues now pursuing graduate work abroad. At home, they have reached the top of their fields. However, in foreign universities, where they must work exclusively in another language and another culture, to standards and conventions for which some are unprepared, they can quickly find themselves unmoored.
Yet they are far from alone in this dilemma. At Smith College, I work extensively with M.S.W. and Ph.D. candidates in the college’s School for Social Work, in an academic and writing support program I co-designed: a program consisting almost entirely of individualized academic coaching. These advanced students benefit from assistance with writing techniques, and we almost always discuss methods of analysis and critical thinking. Social work strategies are effective tools with all learners: Meet the learners where they are and start with the strengths they possess. From this point, one can begin to coach people to confidence in a world that many -- even those accomplished in other or earlier pursuits -- find strange and new.
Sothearwath responded to my e-mail as quickly as the 12-hour time difference would allow. His usually careful English was in chaos, and I had trouble understanding his dilemma: “I am only one foreign student in my class and the class started 2 day later after I arrived…I did not know anything about my school. Therefore the study topics that we needed were selected by others.” He listed three broad topics on literacy and asked me to e-mail scholarship on one of them. Sothearwath seemed to believe that the only research he required was what I would send him. I had no sense that he had been to his library -- or wanted to go. He ended his e-mail, “Help me.”
Sothearwath’s anguish felt familiar. Recently, I worked with a middle-aged American who had returned to school for a Ph.D. in social work. A sophisticated, learned, and skilled professional, Nancy (another pseudonym) had managed to avoid typing anything for so many years that she had no idea about basic manuscript form, let alone how to use a computer or fathom the style requirements of the American Psychological Association used at Smith. Lacking these bedrock skills in a wired world so overwhelmed her ability to function that she was unable to address concepts and content that should have been readily accessible to her. She considered abandoning her program.
I find the Khmer phrase for one’s intense interests, chap arom, “to capture one’s consciousness,” useful at these moments. Before Nancy could move forward, she needed to reconnect to her original desire for her work. We talked about her satisfying experiences with clients, her successes with past writing projects and graduate programs, and the professional advantages a Ph.D. would bring. I also assured her that she could learn the skills she lacked. Our individual meetings were often bi-weekly; she was fragile but functioning, and she finished her first semester committed to the second. When I next saw Nancy, she was ebullient. In a card she wrote, “I passed. (Of course.)” She scheduled only a few appointments with me in her second term.
Clearly, and in very many ways, American and Cambodian educational settings are different. Yet one can use similar methods with advanced learners in disparate worlds. Personal coaching is well-established in professional environments and an increasing presence in undergraduate education. It can be labor-intensive and expensive -- but also highly effective.
Teachers know that the single greatest predictor of all learners’ success is their engagement in their academic endeavors. Thus, to enable learning, academic coaches can explore and enhance each learner’s connection and commitment to the work, particularly when the learner is destabilized by finding himself or herself stumbling. What captured his consciousness? Why? What sustains her interest and commitment? What are the details of your writing process -- what works, what doesn’t?
Sothearwath had achieved his great goal: He was studying abroad, in a place of more. Still, in Cambodia, resources are few, Internet access expensive, and even the best professionals remain the products of an intensely traditional, hierarchical and dangerously dictatorial culture. All serious Cambodian learners must overcome this. What would enable him to do so now?
I continued my correspondence with Sothearwath, sometimes twice a day. (He was clearly awake all night, many nights.) I reminded him of his abilities and resilience; I sent research databases; and I continued asking targeted questions about his research: What writing assignments build critical thinking skills? Why? How? His return e-mails revealed some progress, his language becoming clearer as his thinking found form. Like my American social work student, he needed confidence building as much as skill sets. But Sothearwath also, and expectedly, needed more. He revealed that, facing what he felt was a dooming deadline to critique existing research, he took some of my curriculum materials (giving me full credit for my work), mocked up a study based on them, slapped together a PowerPoint presentation, and presented it in a class. “I had no choice because I had to give them something,” he explained.
At least he met his deadline and passed the assignment. And he was honest with me about what he did. Still, I expressed my surprise to him in an e-mail, withholding (I thought) my disappointment -- yet, he clearly seemed to sense it. He responded: “During the course work, it is hard for the scholars to consult with anyone. Actually, we can talk to the professor of each course, but you can imagine how much time the professor can have for students.”
“The worst thing that will ever happen to us,” another American consultant told me during my third January in Cambodia, “is that we’ll be escorted to the airport and deported. So never do anything to get a Cambodian in trouble -- because they have to live here.” To that end, discussions of serious controversy, politics and religion are avoided at the university. So, in my workshops and coaching sessions with the RUPP faculty, I found safer ways to discuss questioning assumptions, assessing evidence, and crafting sound arguments. But could I have done this differently? I was an “expert,” modeling academic propriety. I might have reinforced Sothearwath’s limited sense of scholarship.
Yet I also realize that the Cambodians I worked with who are flourishing, both there and abroad, demonstrated in many ways that they were engaged and active -- even when our discussion examples were more limited than I would have liked. They made extensive use of our coaching sessions, using the time to consider ideas and transfer knowledge to new situations; they argued and critiqued, sometimes even on risky topics. Sothearwath had been hesitant to accept coaching then; he was desperate to do so now.
Sothearwath -- and many learners like him -- knows he needs more. He needs time, some successes, and continuing coaching from educators. And with this assistance, he can achieve more: the original desire, context and self-assurance necessary for the work ahead of him. In a recent e-mail, he attached a study he’d found and was beginning to evaluate. He was still struggling, but also beginning to make his own way in a world that he knew he had to make his own.
Debra Carney is a writing counselor and lecturer in English at Smith College.
Ever since coming back from MLA in Chicago, I’ve been thinking about Arthur Rimbaud. This isn’t a matter of having attended any sessions on his poetry. Though, come to think of it, he was mentioned at one point in an interesting session on the Beat Generation. This was held at 8:30 on a Saturday morning. You have to wonder sometimes if the people who schedule these things are making a little joke. No beatniks would ever have attended a session of the MLA held at 8:30 on a Saturday morning. Apart from being square, it would have meant staying up past their bedtime.
Rather, I’ve been thinking of Rimbaud in consequence of a wracking cough picked up from some blast of cold in Chicago. As you may recall, he proclaimed that a writer should cultivate his visionary genius through “a systematic derangement of the senses,” through wild experiences and consciousness-altering substances. Alas, the codeine in my prescription cough syrup is not having the desired effect. I sit down to write this in a state of unsystematic derangement.
So instead of hallucinatory conceptual riffs performed in spontaneous bop prosody, I’m going to claim the old columnist’s privilege of “going casual.” Here follow a few quick recommendations of things you might find interesting.
Once upon a time, the question at MLA each year seemed to be, “Who are the exciting new critical theorists, now?”
Then for a while it became, “So why don’t there seem to be any exciting new theoretical approaches?”
After a while, this mutated: “How much longer are we supposed to wait? Hey, wasn’t this panel called ‘Can We Queer the Subaltern Cyborg?’ also in the program for 1995?”
And then it seemed like all anyone wanted to talk about was the job crisis. In 2003, I recall hearing numerous references to an essay in Social Text arguing that the Ph.D. in some fields – for example, English – was a waste product of the academic economy. Certain departments required a steady influx of cheap labor, i.e. graduate students, to teach lower-division classes. Their own coursework would supposedly prepare Ph.D. candidates to be admitted into a profession. But most of them, degree in hand, would never find regular teaching employment.
This was not a failure of the system that could be corrected by reducing the number of graduate students admitted, went the argument. Rather, the system was working just fine. Cheap labor was consumed, and the Ph.D.-holder was excreted, and the bottom line was met.
The shift from vague discussions of Bataille's "general economy" to hard-edged considerations of questions about academic labor was certainly very striking. A few years earlier, people had theorized about abjection. Now they seemed to be living it.
The author of “The Waste Product of Graduate Education” was Marc Bousquet, now an associate professor of English at Santa Clara University, who has expanded the argument into a new book called How the University Works: Higher Education and the Low-Wage Nation, which does for academe what Upton Sinclair’s The Jungle did for breakfast sausage.
It should have traction outside the ranks of MLA. Some of the grumbling heard during the American Historical Association meeting in Washington, DC over the weekend suggests that people in other fields may read it with a shock of recognition. I had dinner recently with a historian who said, more or less, “People refer to the crisis as one of the ‘job market,’ but that’s misleading. Academic employment isn’t a market in the literal sense.” As it happens, that is one of Bousquet’s arguments -- although the historian saying it hadn’t heard of him or read his book.
How the University Works has spawned a blog of the same name that has very quickly emerged as a prime venue for muckraking, agitation, and YouTube interviews with known troublemakers. In other words, it’s really good to see, and I urge you to take a look.
Also recommended is Framing Theory’s Empire, edited by John Holbo and recently issued by Parlor Press. It assembles several phases of a symposium, held at The Valve in 2005, about the volume Theory’s Empire (Columbia University Press, 2005) – which was, in turn, a kind of rejoinder to The Norton Anthology of Theory and Criticism (Norton, 2001).
In other words, it is an anthology of responses to an anthology intended to negate another anthology. Maybe it should have an ouroboros on the cover?
In any case, the book stands as a critique not so much of “Theory” (nor, for that matter, of belletristic or neo-traditionalist “anti-Theory”) as of the familiar routines by which certain arguments have unfolded over the years. Instead of the usual “complaint and rejoinder” mode, the exchange moves in an altogether more shambolic and crabwise manner. That quality reflects its origins in an online colloquy. The effort to transfer the discussion from the blogosphere to book format is not always successful. So much of the flow of online discourse runs through the channels of direct linkage, while a printed book involves very different sorts of connectivity. Then again, it may be that the difference between such modes of reading and writing will become ever more salient for literary discussions as old-fashioned debates over “Theory” fade into the background.
So I tried to hint in an essay written to introduce the collection. A copy of the book itself just arrived a few days ago. Some degree of prejudice against print-on-demand publishing is bound to continue for a while – but let me note for the record that the finished product seems altogether indistinguishable from any paperback from a traditional academic press.
It is, by the way, cheaper to purchase Framing Theory’s Empire directly from Parlor Press than via an online bookseller. And you can download the whole thing in PDF for free.
The single richest and most thought-provoking discussion of reading (the kind of thing you do with books, as opposed to other modes of “media consumption” now available) is an essay by Caleb Crain that ran last month in The New Yorker. Anyone can complain about shrinking attention spans -- or, conversely, pick tiny holes in recent statistical claims about the decline of literacy. Impressionistic muttering is easy. In "Twilight of the Books," Crain does something completely different. He synthesizes a wide range of material on the history, economics, and even the physiology of reading, and does so with understated elegance.
No surprise, that. I've envied his knack for doing so ever since we were both writing for Lingua Franca (way back when). An important difference now, however, is that -- whatever his misgivings about "new media" -- Crain is able to supplement the polished final product with a set of blog entries on the sources he consulted. Items such as "Is Literacy Declining?" and "Does Television Impair Intellect?" amount to valuable bibliographical essays in their own right.
As it happens, the latest Cliopatria Awards name Caleb Crain as "Best Writer" of 2007 for his blog Steamboats Are Ruining Everything. So I learned last Friday, during the Cliopatria banquet held amidst the American Historical Association meeting, when presiding eminence Ralph Luker circulated the final list around the table.
Not entirely sure whether this recollection was for real, or if the cough medicine was just acting up, I checked the formal announcement and found that it reads: "The judges' aim was to reward writing that is well tailored to the history blogosphere, accessible, memorable and consistently history-oriented. Caleb Crain is always readable and thought-provoking; an engaging writer who pays attention to the constraints of the blog format but breaks them with style on occasion." Quite right, and congratulations to the recipient for an honor that is certainly deserved.
Finally: "The Vietnam War is now as far in the past as the Second World War was at the beginning of the Vietnam War," wrote Daniel Davies recently in a post at Crooked Timber. "There has, basically, been at least one complete political and cultural generation turned over since the 1960s. I therefore declare 2008 to be officially The Year That We No Longer Have The 1960s To Blame. Making a small exception for the purely demographic effects of the Baby Boomers on economic and political issues of relevance, any and all remaining social problems are our own fault."
So what do you say, everybody? Is it a deal? Can we move boldly into the future by finding some other decade to complain about? I've always tended to blame everything on the 1980s, myself, but the last seven years almost make that look like a golden age.
Late last year, The New York Review of Books ran a full-page advertisement fairly glowing with the warmth of the enthusiasm it projected for the work of Bob Avakian. In case that name does not ring a bell, Bob Avakian is Chairman of the Central Committee of the Revolutionary Communist Party, USA. Once upon a time, Avakian was a student of Stanley Fish at the University of California at Berkeley; but amidst all the excitement of the late 1960s, the poetry of Milton could not compete with the slogans coming out of the Great Proletarian Cultural Revolution in China, and so a leader of the American masses emerged, even if the masses themselves didn't notice.
The NYRB ad praised Avakian’s combination of “an unsparing critique of the history and current direction of American society with a sweeping view of world history and the potential for humanity.” It called upon readers to “engage” with his work. As it happens, I was once in a punk rock band with a former Avakianite. (This was back when one of the party’s slogans was “Revolution in the ‘80s – Go For It!”) Having thus already had the opportunity to (as they say) “engage” with Avakian’s work, I will testify that he is, at the very least, prolific and capable of extensive discourse. Nearly all of his writings are based on speeches to the party, and they do go on a bit.
In any case, the content of the full-page proclamation was much less interesting, all in all, than the list of people endorsing it. Among them were a few prominent academics, Cornel West being one; members of the Harvard faculty were also among the signatories. Ubiquitous cultural theorist Slavoj Zizek has recently added his name to an online version. The list also includes famous entertainers such as Public Enemy rapper Chuck D and Rickie Lee Jones, the folk-rock chanteuse. (The text and the most recently updated set of signatories can now be found here.)
Without quite endorsing the RCP slogan “Mao More Than Ever,” all of them had “come away from encounters with Avakian provoked and enriched in our own thinking.” Or so the text of the ad put it.
In the weeks since it appeared, a few friends who knew of my longstanding fascination with the Chairman Bob phenomenon asked about the New York Review ad. They were surprised to see it, and wondered whether all these people had actually taken up the cause of Avakianism.
My best guess, rather, was that very few of the signatories had read much Avakian. The abundance and verbosity of his pamphlets would exceed the stamina of any but the most disciplined of revolutionary intellectuals. What probably happened, I surmised, was that party cadres had pointed out various anti-Bush statements by Avakian in order to harvest a bunch of signatures from people who were angered by the course of recent history.
At the same time, it was easy to imagine how other people would probably understand the ad. They would look at it and conclude that the signatories were, in fact, hardcore militants looking to Avakian for leadership in establishing a revolutionary dictatorship of the proletariat and peasantry.
The belief that academia contains literally tens of thousands of such people has, of course, no basis in reality. But it is evidently quite profitable. There is an audience for such claims (the rate of propagation of suckers-per-minute having intensified since P.T. Barnum’s day) and it constitutes a more robust market than the one for Marxist-Leninist pamphlets. One pictures right-wing interns stuffing envelopes with reprinted copies of the NYRB advertisement and sending it to the hinterlands – and humming “We’re in the Money” all the while.
Well, not that it will slow down the fund-raising campaign one bit, but an article that ran on Sunday in the Ideas section of The Boston Globe helps clarify the motives of some of those who lent their signatures. Mark Oppenheimer, the editor of a new journal called The New Haven Review of Books, contacted some of the professors who endorsed the ad. He reports that they were much more interested in upholding Avakian’s right to free speech than in the content of his revolutionary doctrine.
There also may be a little nostalgia going on. Avakian is “a living link to the '60s,” writes Oppenheimer, “an era when American campus radicalism reached its apogee of influence. And he was an outspoken atheist back in the day, too, before Christopher Hitchens and others found bestsellerdom in unbelief; one professor told me he admired Avakian’s stand against religious fundamentalism. But above all the Avakian narrative allows civil libertarians to register a vote for free speech, even if they have to ignore the fact that Avakian's speech is in no danger of being suppressed. Rightly concerned about Guantanamo and the Patriot Act, they figure that Avakian is a good proxy fight, or good enough.”
This strikes me as a judicious estimate. But while for the most part concurring with the article (for which Oppenheimer interviewed me about my own sad misadventure of trying to arrange an interview with Chairman Bob), I think there is a little more going on with that manifesto than meets the eye.
Buying a full-page ad in America’s premier journal of public-intellectual commentary is an expensive proposition for a small group on the far left. And it is not necessarily the most obvious use of resources for revolutionaries who have otherwise spent much of their energy trying to build “base areas” (as Maoist theory puts it) in the ghettos.
To understand what was really happening, we might take a quick glance at what looks like a very different sort of cultural artifact: the leaked video clip of Tom Cruise speaking about Scientology, which recently showed up on YouTube. Here’s a link, for as long as it may be good.
A couple of weeks ago, a researcher for one of the television networks asked me if I might be willing to discuss the clip on one of the prime-time news programs. As with being interviewed for the Boston Globe article, this was a delayed side-effect of having once been in a punk-rock band – for another member of the group was a Scientologist. (A career as armchair subcultural anthropologist and the loss of hearing in my right ear seem to be closely related.)
It seemed as if a much better guest for the program might be Roy Wallis, whose excellent book The Road to Total Freedom: A Sociological Analysis of Scientology was published by Columbia University Press in 1976. But Wallis is now teaching in Belfast, while I live about two blocks from one of the network’s studios. Gore Vidal is said to have remarked that one should never turn down an opportunity either to have sex or go on TV. That seems like incredibly bad advice from the standpoint of hygiene, literal or spiritual. Still, I agreed to take a look at the clip to see if there were anything interesting to say about it.
And indeed there was. The video shows the famously enthusiastic actor discussing the miraculous powers he has gained from his years in the Church. The clip also demonstrates that Cruise can speak advanced Scientology jargon with a certain fluency.
Some commentary about his performance has been remarkably off-base -- treating it simply as a kind of recruitment film starring an extremely prominent celebrity. In fact, most of what Cruise says would be utterly incomprehensible to any potential recruit. You have to know the code, the inner lingo of the movement, to understand the implications of the points he was making.
Having studied Wallis’s monograph, I was able to follow the message almost like a native speaker. And that message was aimed squarely at anyone in the Church inclined to doubt its leadership. Cruise was pretty clearly warning members that their only hope lay in the authority of its established hierarchy.
So I explained in a short memorandum for the TV people – who thanked me, then decided another talking head wasn’t required for their program, after all. Gore Vidal might be unhappy, but I was slightly relieved. (Getting the Scientologists mad at you is no picnic. We’re talking about a church for which litigation is practically a sacrament.)
With hindsight, I think the general point of my analysis applies to that full-page ad as well. Whatever the intention of Cornel West or Slavoj Zizek in signing the appeal from the Committee to Project and Protect the Voice of Bob Avakian, the most important audience for its message was not the public-intellectual world served by The New York Review of Books.
The force of the discourse was, in important respects, centripetal. Its real audience was the party faithful. Or rather, those supporters who, at certain moments, feel doubt about whether Chairman Bob Avakian Thought actually can change the world. (The Chairman himself thinks that failure to appreciate his contributions is a major weakness among his followers, according to recent discussion among people formerly close to the party.)
There is nothing like a full-page ad in NYRB – endorsed by celebrities, no less – to make the road forward look that much brighter for the rank-and-file. It must also lift the Chairman’s own spirits. After all, the job of providing Maoist leadership in the world’s most highly developed country, with not a peasant in sight, has to get kind of depressing, at times.
Public intellectuals in America have good reason to be discouraged. And so do those who look to them for intellectual leadership. Currently, it almost seems that the more public the intellectual, the less seriously he or she is taken by other intellectuals. Nevertheless, public intellectuals today have more media outlets and markets available to them than ever before. Due primarily to the rise of new technologies, the circulation and recirculation of their ideas are reaching wider and wider audiences. Consequently, as the intellectual influence of public intellectuals over other intellectuals (viz., non-public intellectuals) wanes, the market for their ideas and their entertainment value skyrocket.
An additional cause of discouragement for public intellectuals and those who look to them for intellectual leadership is that society at large just doesn’t seem to afford its iconic or star public intellectuals much respect anymore. Public intellectuals in America are merely "one side of an argument," so to speak. From the general public’s point of view, they are either Republican or Democrat; liberal or conservative; left-wing or right-wing; pro-choice or pro-life; and so on. Public intellectuals signify, or are reduced by the general public to, nothing more than a position -- and usually an extreme one -- on a topic of contemporary social and political concern.
The reduction of the discourse of public intellectuals to mere polarized positions is the most observable sign of a lack of respect. It serves to short-circuit and obviate subtleties of argument and render superfluous the need for evidence. Respect is afforded public intellectuals not by the mere “declaration” or “assertion” of a position (anyone can merely declare or assert a position). Rather, respect is granted to them through the opportunity to articulate and defend their positions in some detail or depth to a wide audience. It is further confirmed when their defense is thoughtfully received by an attentive audience. Public intellectuals are respected for the depth of their knowledge, and efforts to suppress it, such as the reduction of their knowledge to a mere position, are ultimately a sign of disrespect for them as intellectuals.
The lack of respect afforded our public intellectuals today is a major cause for concern. The current situation can be put into better context when one recalls that the history of public intellectualism in America includes figures such as Ralph Waldo Emerson, William James, Max Weber, and John Dewey -- figures who still have a powerful presence in the world of ideas. For much of the last century, Dewey, for example, was regarded not merely as another expert commenting on the public school system in America; rather, he was treated as one of America’s finest philosophers, who happened to be sharing his ideas on education with a respectful and attentive national audience. At the opening of the 21st century, however, the situation is much different: public intellectualism in America is preoccupied more with the idea-in-itself being promoted than with the person promoting it.
The final cause for discouragement regarding public intellectuals is the tug of war between academe and the public-private sector in which public intellectuals currently find themselves. Public intellectuals play a crucial role in the circulation, production and identity of knowledge, though the two worlds they inhabit -- academe and the public-private sector -- both compete for their allegiance and affiliation. The interests of these two worlds are very different, with the most obvious difference being that academe privileges highly specialized modes of discourse, whereas the public-private world favors generalized ones.
I believe that the fundamental terms of the relationship of public intellectuals to the academic and public-private sectors must be changed. I will even go so far as to offer that we might consider replacing the phrase "public intellectual" with the arguably more apt (albeit controversial) one, "corporate intellectual." The motivation for my case, however, will come from a most unlikely and unconventional source -- Emerson. Even though Emerson was writing well before the rise of academe and the university in America, his thoughts on academics and public intellectuals are extremely insightful and provide a unique point of entry regarding the issues at hand.
Critical reflection on the role of public intellectuals in America is important at this particular time in our history. Recent social and political events such as the war in Iraq, the mistreatment of prisoners at Guantanamo Bay and Abu Ghraib, and our responses to disasters such as global warming and Hurricane Katrina reveal that our society seems to have lost its ability to question authority, to separate knowledge from opinion, and to discern what is valuable from what is worthless. Public intellectuals can potentially play a central role in directing -- or even redirecting -- the social and political agenda of the nation as well as provide the public with reliable insight. However, the academy’s move toward increasingly specialized knowledge and discourse and the public-private sector’s movement toward increasingly generalized (and polarizing) discourse and knowledge place public intellectuals in a difficult position from which to accomplish these ends. If public intellectuals are to become relevant and respected again, viz., be able to (re)direct social and political beliefs and aims, the terms of their relationship with the public-private and academic spheres must be changed.
Affiliations and Academic Values
Academe is frequently characterized as an oasis from the market-driven forces of the public-private sector. Within the academy, ideas are said to be pursued without regard to their market value by individuals dedicated to the life of the mind. Students and teachers enjoy in academe a reprieve from the pressure to conform their practices to the requirements of "cash value" or "public sentiment." Academe is a site where knowledge is disseminated, discovered, and debated, and academic values are directly linked to these knowledge-driven practices.
The public-private sector, however, is associated with a different set of activities and values. Moreover, arguably, this set of activities and values is defined as the opposite of those of academe. For example, if academe is dedicated to the life of the mind, then the public-private sector is not; if academe disseminates, discovers, and debates knowledge and ideas, then the public-private sector does not; if academe is not motivated by market values, then the public-private sector is. In sum, the public-private sector is a site where ends are pursued relative to their potential either to appease public and private sentiment or produce "cash value," whereas the academy is not.
Affiliation with the public-private sector is often akin in the academy to "selling out," namely, abandoning the pursuit of knowledge for the pursuit of market share. This perception is part of the reason that terms such as "public intellectual" and "academic" are at times used in a mutually exclusive manner: either one is a public intellectual or one is an academic. One cannot be both.
Public intellectuals promote or sell ideas whereas academics pursue or discover ideas; public intellectuals speak to and for the masses, whereas academics speak to and for academics. Moreover, public intellectuals are often distinguished by considerations of quantity, whereas academics are differentiated by considerations of quality. For public intellectuals, the more attention that their ideas or they themselves receive, the more valued they are as public intellectuals. In other words, one cannot be a valuable public intellectual without a public, and the greater the public, the greater the value that is ascribed to the public intellectual. Academics, however, are valued differently.
The key factor in judging the value of academics is quality: quality research in their discipline, quality teaching of their students, and quality service to their institution and community. While quantity can sometimes positively influence determinations of academic value, quantitative value is always tempered by considerations of quality. Standards of academic quality are determined within the academic community and may vary from discipline to discipline. In large part, quality in academia is a relative and subjective affair, as much depends on the standards established by the community. This is particularly true within the humanities, but arguably holds as well in the sciences. Quality, the relative and subjective factor at the center of determinations of academic value, is much different from the key factor used to determine the value of public intellectuals. Issues of quantity are largely objective and empirical. As we shall see, for some, one needs only a tally-sheet and a calculator to determine the value of a public intellectual, whereas one needs very discipline-specific information to determine the value of an academic. This lack of reliance on discipline-specific information in quality judgments of public intellectuals is troubling.
The Decline of Public Intellectuals
We are living in a time when both the meaning and function of public intellectuals are being radically reshaped. The rise of new media and the growth of the entertainment industry have created an unprecedented demand for individuals to participate in them. Increasing numbers of academics are entering this growing marketplace for ideas, while at the same time the number of institutionally unaffiliated persons is decreasing. And while the "decline" of the public intellectual in America has been presented in numerous ways by numerous commentators, the most notorious and noteworthy example is the recent study by the legal commentator Richard Posner.
In his widely debated book Public Intellectuals (2002), Posner argues that American public intellectualism is in "decline" and presents a range of empirical evidence to support this conclusion. By a variety of methodologically questionable means, including statistics on media mentions, Internet traffic, and scholarly citations, Posner compiles a list of 546 major public intellectuals. He also offers a list of the top 100 public intellectuals most frequently mentioned in the media, with Henry Kissinger, Pat Moynihan, George Will, Larry Summers, William Bennett, Robert Reich, and Sidney Blumenthal at the top. Posner’s taxonomy of public intellectuals is as worthless in some respects as E.D. Hirsch’s list of “What Every Literate American Knows” in Cultural Literacy (1987) or Robert Maynard Hutchins’ selection of the Great Books of the Western World (1952). Nevertheless, it is as symptomatic of our times as People magazine’s annual personality taxonomies or David Letterman’s nightly Top Ten Lists.
While Posner’s study of public intellectuals is interesting and well intentioned, the fact that his quest for the biggest figures in the intellectual world is based solely on quantitative factors, and never on qualitative ones, is disappointing. Posner’s method furthers the notion that public intellectualism is merely a matter of "getting noticed" and never a matter of the quality of the contribution one is making, let alone its epistemological, social and political value. Work like Posner’s continues to promote the unfortunate notion that public intellectuals are identifiable and worthy of merit based solely on the size of the market for their ideas, with no methodological allowance made for the quality of their contributions to public discourse. In addition, Posner treats public intellectualism in America as though it were merely part of the entertainment industry -- which it very well may be -- and, as such, judged by standards more akin to the Nielsen ratings than the tribunal of reason.
Work on public intellectuals by cultural theorists like David Shumway, Jeffrey J. Williams, Sharon O’Dair and Cary Nelson is vastly superior to work like Posner’s. Their work seldom gets bogged down in the quantitative and who’s-who aspects of public intellectualism, but rather focuses on the cultural and disciplinary logic of what they call “the star system.” Effectively, their work on the star system is a commentary on the transition of some individuals from (private) academics to public intellectuals: a transition noteworthy for, among other things, its shift between differing criteria of value.
One aspect of the star system is that a small coterie of academics makes the transformation from being merely the most recognizable faces of the life of the mind (academic stars) to being quite literally part of the entertainment industry (super-stars). As super-stars, their entertainment qualities and market value exceed those of mere academic stars. They operate in a value system more like that of movie stars than that of academic stars. If one can raise a stir, then one achieves a higher value in this system.
The nature of public intellectualism in America is in crisis partly because a wedge has been driven between the interests of academe and the interests of public-private sectors. One is either a mere academic or one is a mere public figure. As an academic, one’s audience is at best the members of one’s profession, and at worst, the members of one sub-area of one’s profession. In either case, the audience is strictly delimited. As a public intellectual, while one finds one’s audience expanded beyond the limits of one’s profession, one also finds it increasingly difficult in America to carry on a high and relevant level of discourse.
Given the unfortunate situation of academic and public intellectuals in America today, it might be instructive to look back to a time in America when the promise of a strong relationship between intellectuals and both academe and the public-private spheres existed and then ask how this relationship might be re-established. In looking back, I would like to comment on Emerson, in whose work there is the promise of a compromise between mere academics and mere public intellectuals; in looking forward I would like to suggest that we consider abandoning the academic-public intellectual dichotomy and establish a new category that might be called the "corporate intellectual" -- a term more consonant with the values of the new academy as well as with the public-private sector.
In his 1837 address to the Phi Beta Kappa Society, “The American Scholar,” Emerson envisioned the American scholar as a person who would do whatever possible to communicate ideas to the world, not just to fellow intellectuals. Emerson regarded the American scholar as a whole person in the act of thinking. As a whole person, the American scholar would speak and think from the position of the “One Man,” which “is not a farmer, or a professor, or an engineer, but he is all. Man is priest, and scholar, and statesman, and producer, and soldier."
In the act of thinking, the intellectual becomes this whole person. Emerson writes: "In this distribution of functions the scholar is the delegated intellect. In the right state he is Man Thinking. In the degenerate state, when the victim of society, he tends to become a mere thinker, or still worse, the parrot of other men’s thinking."
Isn’t this still true today? Doesn’t public intellectualism suffer from the exact form of degeneracy noted by Emerson? Are there not too many public intellectuals who are parrots in the public arena, speaking merely from the parameters laid out for them by others? Is regurgitating established discourses and strictly defined conceptual frameworks a sign of public intellectualism or public propaganda? Emerson is right in asserting that such things both discredit the ideas of individuals and render suspect the quality of their thoughts.
In all fairness though, perhaps “parroting” is more of a practical necessity today than it was in Emerson’s time. The need to affiliate one’s ideas with a group, school or individual is perhaps a function of the sound-bite age, where metonymic or telegraphic communication abounds. We demand labels for and from our public intellectuals, and when we don’t have them, we become nervous. And the labels we put on and demand from our public intellectuals are perhaps more important than what they actually think. "He’s a Republican" or "She’s a feminist" go a long way in the public arena in terms of persuading people of the value of our "thinking"; phrases like she sides with "moral values" and he is "against big government" serve as short-hand for more complete explanations and serve to cut off public debate and thought. This labeling process presents the conditions for an unending repetition and circulation of crystallized, unchanging doctrines within the public sphere.
For Emerson, the scholar as public intellectual -- the whole person thinking -- wears a number of different hats. "The office of the scholar," writes Emerson, "is to cheer, to raise, and to guide men by showing them facts amidst appearances. He plies the slow, unhonored, and unpaid task of observation." "He is one who raises himself from private considerations and breathes and lives on public and illustrious thoughts. He is the world’s eye. He is the world’s heart." Emerson closes his address with a beautiful vision of public intellectuals as a group: "We will walk on our own feet; we will work with our own hands; we will speak our own minds. The study of letters shall be no longer a name for pity, for doubt, and for sensual indulgence."
Emerson provides us with a very clear response to the question of the relationship of intellectuals to the public-private and academic spheres. For him, intellectuals live among these spheres, but do not affiliate with either one exclusively. For him intellectuals are always already involved in the public and private spheres as well as in the academic sphere and others. The concept of an "intellectual" for him implies a relationship with public, private and academic interests. Emerson himself, perhaps the premier public intellectual of his day, if not in American history in general, both promoted and sold his ideas and worked hard to pursue and discover them; he spoke to and for the masses as well as to and for the scholar.
From Public Intellectuals to Corporate Intellectuals
Public intellectualism today seems remote from the ideals of the Emersonian intellectual. In contrast to Emerson’s notion of the intellectual, our own appears overly narrow. The notion of the intellectual as "trapped" between affiliating with academe and the public-private sector is foreign to Emerson’s all-embracing intellectual. The rise of the corporate university allegedly pulls intellectuals away from the realm of academic values and into the realm of corporate and market (or neo-liberal) values. The general conclusion of most commentary of this type is that the intellectual’s values and identity are compromised in some way -- a conclusion that is reached by assuming that corporate and academic values are fundamentally incompatible.
But why do we need to continue to regard corporate and academic values as incompatible? Can there not be some common ground between them that allows not only for the continuing integrity of academic values in themselves but also of corporate values in themselves? Furthermore, what would happen if we postulate the intellectual from the position of the compatibility of academic and corporate values? Would the resultant intellectual be admirable or despicable? Progressive or reactionary? A monster or an angel?
One might reasonably call the type of intellectual that results from the rise of the new corporate university a "corporate intellectual." This designation would be not only appropriate, but also ultimately a fair one. While some might look upon the designation "corporate intellectual" with fear and disdain, I will offer that it deserves no more disdain than the shopworn and outmoded designation “public intellectual.” More often than not, public intellectuals function in America today as part of the entertainment industry -- as part of a space set apart from academe. Most American academics are not public intellectuals, even if many of America’s public intellectuals are academics.
The recent rise of the corporate university leads one to the conclusion that academe is no longer, nor will it ever again be, an oasis divorced from private and public interests. Therefore, if intellectuals believe that the recent demand to straddle academe and the public-private sector is the continuing condition of the academy, they will be obligated to develop a sense of intellectual self-identity that does not view itself as "trapped" or "compromised." As the nature of academic identity changes, so too, of necessity, will the identity of intellectuals.
These changes in the configuration of the university call for academics to consider the markets for their ideas. In other words, instead of merely pursuing ideas in themselves or ideas as such, academics would weigh the market value of their ideas along with more purely knowledge-based considerations. This would simply be an extension of market-based practices already well established in academia. For example, most doctoral candidates balance the knowledge-based virtues of possible dissertation topics against the potential of these topics being appealing to prospective employers. Moreover, this market-based decision making is not limited to graduate students alone.
Professors at all levels working on manuscripts with an eye toward publication are remiss if they do not consider the market for their manuscript in the early stages of its development. Academic presses are increasingly behaving like trade presses in that they are refusing, with growing frequency, to publish otherwise academically sound manuscripts that do not have much potential for sales. On the down-side, this trend puts more pressure on academics to publish books with appeal beyond a small coterie of specialists; on the up-side, it compels academics to think in terms of a wider audience for their ideas and to pursue projects that engage a broader set of interests and knowledge.
Furthermore, while it would be easy to be disdainful of the type of intellectual that results from this process, one should avoid this judgment and maintain an open mind as to the potential of these intellectuals for producing progressive change in both their particular professions and society at large. Corporate intellectuals would be persons who would always take into account at some level the market for their ideas and who would never merely pursue ideas as such. Market considerations of one’s ideas of necessity bring them into the public sphere -- and ultimately to a wider audience. Consequently, corporate intellectualism would in effect be a new type of public intellectualism. Moreover, given the current state of public intellectualism in America, this transition might not be a bad thing, particularly if it brings into the public sphere more of the progressive kinds of knowledge and questions pursued by academics.
The necessary condition for proper academic values and identity should not be gauged by one’s dissociation from the market. As “corporate intellectuals,” members of academe would configure their identity as allied both to the “insular” world of the academy and to the public sphere. Not only is this a potentially more positive, socially responsible identity for intellectuals, it is more in tune with the current and continuing material conditions of the academy. So, for example, in considering writing a book or offering a course, intellectuals would weigh market considerations alongside academic concerns, asking both whether the project would have a market and whether it would further academic discourse. This reconfigured identity will resonate with academics seeking ways to have more public influence.
Rather than feeling trapped between academe and the public-private sector, academics should take advantage of the opportunity to align their identity with the public-private sphere. One of our goals as intellectuals might be to find ways to bring the two spheres to work together more organically, exercising public accountability without compromising our intellectual freedoms. In the process, increasing numbers of academic intellectuals might come to be regarded as public intellectuals. While the phrase “corporate intellectual” might grate against those ideologically opposed in toto to the corporatization of the university, it will be much more difficult for them to reject prima facie the notion that academics should weigh market considerations along with purely knowledge-based ones. If nothing else, the phrase “corporate intellectual” will spark much-needed conversation about the positive role for academics in the emerging corporate university, particularly with regard to their relation to the public sphere. This will be one of the more encouraging consequences of the corporatization of the university, a material condition that does not appear to be passing away any time soon. In the end, these newly minted corporate intellectuals have the potential not only to alter the meaning and nature of the American intellectual, but also to capture, as Emerson says, the world’s eye and the world’s heart. Hopefully, this is something that they will be able to do without seriously jeopardizing the pursuit of knowledge.
Jeffrey R. Di Leo
Jeffrey R. Di Leo is dean of the School of Arts and Sciences at the University of Houston-Victoria. He is editor and publisher of the American Book Review and editor of symploke, where a version of this essay first appeared. His most recent publications include Affiliations: Identity in Academic Culture, On Anthologies: Politics and Pedagogy and Fiction's Present: Situating Contemporary Narrative Innovation (with R.M. Berry).
Valentine's Day. Time to pull out your Shakespeare's Sonnets, choose one to type up in a fancy font for that special someone, and deliver it with a box of chocolates. But as you thumb through the sonnets you begin to wonder why they ever got connected with romance in the first place.
Here's one that begins "When forty winters shall besiege thy brow / And dig deep trenches in thy beauty's field." A warning about growing less beautiful with age isn't going to win anyone's heart -- especially someone who's seen 40 winters.
"Shall I compare thee to a summer's day? / Thou art more lovely and more temperate"? Summer's better than winter. And this sonnet worked for Joseph Fiennes in Shakespeare in Love. But Gwyneth Paltrow stopped reading after the first few lines. Copying out the entire poem, you begin to worry that "Nor shall death brag thou wanderest in his shade, / When in eternal lines to time thou growest" will not leave the right impression. It's an attractive idea, that poetry will preserve the loved one in death. But it's a little obscurely stated and besides, you're looking for romance, not more worry about aging. You crumple up your paper and keep looking.
Leaf through other favorites. Sonnet 29: "When, in disgrace with fortune and men's eyes, / I all alone beweep my outcast state." Too pathetic; doesn't go with chocolates. Not even the sonnet's happier but still needy conclusion -- "For thy sweet love remembered such wealth brings / That then I scorn to change my state with kings" -- entirely helps. You start to wonder whether Shakespeare shares your fear of dying alone.
Determined not to use sonnet 116 -- "Let me not to the marriage of true minds / Admit impediments" -- which you're saving for your wedding, you peruse other well-known sonnets. 73: More aging and death. 129: "Th' expense of spirit in a waste of shame / Is lust in action." You wish. 130: "My mistress' eyes are nothing like the sun." Too arch. And what if that special someone won't like being referred to as a "mistress" or becoming the subject of a macho bragging contest ("And yet, by heaven, I think my love as rare / As any she belied with false compare")?
Try a new tack; look over the more unfamiliar sonnets; realize why they're unfamiliar: "But why thy odour matcheth not thy show, / The soil is this: that thou dost common grow" (69); "So shall I live supposing thou art true / Like a deceived husband" (93); "eyes corrupt by over-partial looks / Be anchored in the bay where all men ride" (137).
It looks like it will have to be 116 after all. But doubts remain. The sonnet is famous for being read at weddings, but it is alarmingly clingy for a courtship. Love never "bends with the remover to remove." What if there's a restraining order? Closing your book, you wonder why you can't find a single appropriately romantic sonnet by Shakespeare.
The answer, of course, is that good poetry isn't necessarily good for serving useful ends, even an end as apparently non-utilitarian as romance. We often connect Shakespeare's sonnets to romance -- heterosexual romance in particular -- but that's because of the way that they've been framed, and marketed, over the last half century. I have spent several years studying the reception of Shakespeare's sonnets, and my favorite example of this marketing comes from a book called Shakespeare in Love: The Love Poetry of William Shakespeare. Published by Miramax in 1998 to coincide with its release of the movie of the same name, it features stills of Paltrow and Fiennes. In one still, "Shakespeare" and "Viola" stare lovingly into one another's eyes. Juxtaposed to the photo is sonnet 138, which begins, "When my love swears that she is made of truth / I do believe her though I know she lies," and goes downhill from there.
I have marveled at how this particular picture got connected to that particular sonnet. Had the compiler of the book not read the sonnet? Was he or she counting on the book's supposed readers not to read it? Or, my favorite idea, was it a bit of mischievousness on the part of a bored aspiring poet or former English major, striking a blow against the corporate marketing of Shakespeare as a figure of heterosexual romance? If so: I got your message, brother (or sister).
Though they are seen this way today, Shakespeare's sonnets have not always been linked to heterosexual romance -- or even been very highly regarded. For nearly 200 years after their first publication in 1609, readers often considered them among the worst things Shakespeare ever wrote. Nathan Drake, writing in the last years of the 18th century, praised a 1793 edition of Shakespeare's Works for pointedly leaving the sonnets out. "For where is the utility," Drake asked, "of propagating compositions which no one can endure to read?"
Since the beginning of the 19th century the sonnets have become far more popular, but not without hesitations. Many readers are familiar with the fact that Shakespeare wrote the first 126 -- most scholars agree -- to a man, and just the last 28 to a woman. (Notably, all the really famous sonnets, except for "My mistress' eyes are nothing like the sun," come from the first 126.) For some readers, such as the early 19th-century poet Samuel Taylor Coleridge, the idea of Shakespeare writing love poetry to another man was intolerable. Coleridge imagined that the male recipient must really have been female, an idea perpetuated in the modern habit of putting women, or men and women (rather than, say, two men), on the covers of editions of the sonnets.
It is less well-known, however, that 19th- and early 20th-century readers were often more disturbed by the sonnets to the woman. For many Victorian readers especially, the sonnets' expressions of male-male love were completely familiar within the homosocial world (not to mention the public schools) of Victorian England, which considered one man's love for another a sign of proper manliness. Moreover, these Victorians, like many readers before and after them, were appalled by the bitter, lascivious, and adulterous sonnets to the woman who would become known as "the dark lady" -- itself a euphemism, since the sonnets make it clear she is no lady (she is the one sonnet 137 calls "the bay where all men ride").
While Coleridge fretted over Shakespeare’s sonnets to the young man, his friend and fellow poet William Wordsworth believed it was the sonnets to this dark lady that were “abominably harsh, obscure and worthless." The Victorian literary dynamo and Shakespeare editor F.J. Furnivall wrote that no one would doubt that the sonnets were autobiographical, if it were not for the fact that they told the story of Shakespeare's liaison with a married woman. For Furnivall, it was the story of love between men that redeemed the sonnets.
Modern marketing (and too often, teachers of English) has "solved" the "problems" of homoeroticism on the one hand, and misogynist, licentious, adulterous sex on the other, by this sleight of hand: Select the most appealing of the generally more appealing sonnets to the young man, and pretend that they're to a woman.
So what? Why should it matter to us today to whom the sonnets were written, or how earlier readerships received them? What's wrong with this romantic Shakespeare? Well, he's not really that romantic. And just as overly sentimental ideas of the sonnets reduce their range of emotion and psychological complexity -- one of the reasons that readers really do value them -- so do readings that ignore their historical meanings too easily make the sonnets a mirror of our own limited experience of the world. Good poetry should stretch minds, not be molded to them.
So read Shakespeare's sonnets, and read about them. But for Valentine's Day, give chocolates.
For countless dead bodies to become reanimated and swarm through the streets as cannibalistic ghouls would count as an apocalyptic development, by most people's standards. Then again, it is not one that we have to worry about all that much. Other possibilities of destruction tend to weigh more heavily on the mind. But if you combine extreme improbability with gruesome realism, the effect is a cinematic nightmare that won't go away -- one of the most durable and resonant forms of what Susan Sontag once described as "the imagination of disaster."
It all began with the release of George Romero's Night of the Living Dead in 1968: a low-budget independent film that more or less instituted the conventions of the cannibalistic zombie movie, as further developed in his Dawn of the Dead (1978) and Day of the Dead (1985). Other directors have played variations on his themes, but Romero remains the definitive zombie auteur -- not simply for founding the subgenre, but for making it apocalyptic in the richest sense. For the root meaning of "apocalypse," in Greek, is "an uncovering." Romero's zombies expose the dark underside of American culture: racism, consumerism, militarism, and so on.
His most recent addition to the zombie cycle, Diary of the Dead, which opened last Friday, returns viewers to the opening moments of the undead's onslaught. But while his first film, Night, was set in a world where radio and television were the only sources of information for panicking human refugees, Diary is a zombie film for the age of new media. Romero's band of survivors this time consists of a bunch of college students (and their alcoholic professor) who are busy making a film for class when the end of the world hits. One of them becomes obsessed with posting footage of the catastrophe online -- a chance for Romero to explore the ways that digital technology makes zombies of its users.
As an enthusiast for Romero's apocalyptic satire, I was somehow not terribly surprised to learn last year that Baylor University Press had published a book called Gospel of the Living Dead: George Romero's Visions of Hell on Earth. The author, Kim Paffenroth, is an associate professor of religious studies at Iona College in New Rochelle, New York.
Romero's zombie apocalypse brings "the complete breakdown of the natural world of food chains, social order, respect for life, and respect for death," writes Paffenroth, "because all those categories are meaningless and impossible to maintain in a world where one of the most fundamental limen, the threshold between alive and dead, has become a threshold that no one really crosses all the way over, but on which everyone lives suspended all the time." And in this moment of revelation, all the deadly sins stand fully revealed (and terribly rapacious).
The release of Diary of the Dead seemed a perfect excuse finally to interview Paffenroth. He answered questions by e-mail; the full transcript follows.
Q: You mention in your book that George Romero's work has literally given you nightmares. How did you go from watching his films to writing about them, and even publishing zombie fiction of your own?
A: Well, I was fascinated with the original Dawn when I was still a teen, but I'm afraid my level of commentary seldom got beyond -- "Zombies! Cool!" And then, to be honest, I didn't think of or watch any zombie films from the time Day came out until the Dawn remake was released. But during those years, I was just reading everything I could -- especially ancient and medieval literature, philosophy, and theology. So when I saw the Dawn remake, things clicked and I could give a more thorough and complicated response than I had when I was a youth, because I could then see how Romero was building on Dante and the Bible.
And to be frank, at that point I'd written a lot of books about the Bible and other theological topics, and no one read them. To an author, that's probably the worst disappointment imaginable. So I took a chance that if people didn't want to read about these theological subjects directly, maybe through the filter of their favorite monster genre, they'd be more open to the discussion and analysis. And it seems that they are.
As for making the transition to fiction writing, that's just crazy hubris that strikes all of us at some point -- the idea that anyone would want to read the tales we write -- and some of us are dogged and patient and lucky enough that it actually amounts to something. I never get over it, when I realize that there are some people who like my fiction and look forward to what I'll write next. That's a huge rush and I want to keep it going as long as I can.
Q: In the New Testament, Jesus dies, then comes back to life. His followers gather to eat his flesh and drink his blood. I am probably going to hell for this, but .... Is Christianity a zombie religion?
A: I think zombie movies want to portray the state of zombification as a monstrous perversion of the idea of Christian resurrection. Christians believe in a resurrection to a new, perfect state where there will be no pain or disease or violence. Zombies, on the other hand, are risen, but exist in a state where only the basest, most destructive human drive is left -- the insatiable urge to consume, both as voracious gluttons of their fellow humans, and as mindless shoppers after petty, useless, meaningless objects. It's both a profoundly cynical look at human nature, and a sobering indictment of modern, American consumer culture.
Q: The human beings in Romero's world are living through an experience of "hell on earth," as your subtitle says. There are nods toward some possible naturalistic explanation for the dead within the films (that a virus or "space radiation" somehow brought corpses back to life), but the cause is never very useful or important to any of the characters. And some characters do think mankind is finally being punished. Is the apocalyptic dimension just more or less inevitable in this kind of disaster, or is it deliberate? To what degree is Romero's social satire consciously influenced by Christian themes? Or are those themes just inevitably built into the scenario and imagery?
A: I think "apocalyptic" has just come to mean "end of civilization," so of course, any movie or book with that as its premise is, by definition, "apocalyptic." And even if we throw in the interpretation "God's mad at us -- that big, mean God!" I still don't think that's very close to real, biblical apocalyptic.
Romero's view is a lot closer to biblical apocalyptic or prophetic literature, for he seems to make it clear, over and over, that humanity deserves this horror, and the humans in the films go to great lengths to make the situation even worse than it is already -- by their cruelty, greed, racism, and selfishness. Whether this is conscious or accidental, I really can't address with certainty: I only note that his prophetic vision is compatible with a Christian worldview, not that it stems from that.
Q: The fifth movie in George Romero's zombie cycle, Diary of the Dead, opened over the weekend. Does it seem like a progression or development in his vision, or does it simply revisit his earlier concerns in a new setting?
A: I think each film in the series has a special target that is the particular focus of Romero's disgust at the moment. The media has always been at the periphery in each of the previous films -- cooperating with government ineptitude and coverup in the first two until the plug's pulled and there is no more media -- but now it's the main subject of this installment.
Romero does a great job capturing the sick voyeurism of addiction to cell-phone cameras and the Internet -- there are so many shots in this one where you just want to shout at the characters, "Put down the camera and HELP HER! SHE'S BEING EATEN ALIVE, YOU IDIOT!" It is surely no accident that the two people who most help our protagonists are either cut off from the media (the Amish man) or have themselves been the target of unfair representation in the media (black men who are called "looters," while white people during Katrina were said to be "salvaging" or "gathering" supplies). And the one time a crime is committed by one group of humans against another, the camera is forced off.
With that being said, I think in many ways it does return to the vision of Night of the Living Dead with its overwhelming cynicism and despair. Certainly the last shot is meant to evoke the same feeling of finality and doom as the first film, the gripping doubt that there's anything left in human society worth saving.
Q: It feels as if Romero is suggesting that Jason, the character holding the digital camera, is himself almost a zombie. There's something creepy about his detachment -- his appetite for just consuming what is going on around him, rather than acting to help anyone. But there are also indications that the cameraman does have a kind of moral commitment to what he is doing. He's trying to capture and transmit the truth of what is going on, because doing so might save lives. What did you make of that ambiguity? Is something redemptive going on here with behavior that otherwise seems quite inhuman?
A: I'd have to think about it in detail, once I have the DVD "text" to study. My initial reaction is that that interpretation mostly comes from the voice-over by Deb, his girlfriend and the narrator of Diary. The exact motives of Jason remain hazy to me. He says he doesn't want fame (what would it mean in their world?), yet he's obsessed with the 72,000 hits in 9 minutes. But he doesn't exactly explain why in that scene. I don't think he said that maybe some of the 72k people were saved or that he's doing a public service or helping save the world.
He just seems addicted and intoxicated by the 72k number itself -- like even if it's not fame, it's a junkie's fix, it's a validation of his value, as indeed is the chilling (and slightly comical) act of handing the camera to Deb at the end. As she keeps accusing him: if it doesn't happen on camera, it's like it doesn't happen.
So the camera is not reflecting reality, it's creating it. And Jason's version of reality is better than the government's falsified version of the first attack, because it's more accurate, but it's no less addictive or exploitive or inhumane by the end.
Q: Good points, but I still think there's some ambiguity about Jason's role, because this is a problem that comes up in debates over journalistic ethics -- whether the responsibility to report accurately and as a disengaged observer becomes, at some point, irresponsibility to any other standard of civilized behavior. Arguably Romero is having it both ways: criticizing Jason while simultaneously using the narrative format to ask whether or not his behavior might have some justification (however ex post facto or deluded).
A: Perhaps artists can have it both ways in a way journalists can't. Artists deal in ambiguities, journalists (supposedly) deal in facts. But with cell phones and the Internet, suddenly everyone is a potential "journalist" and the facts are even more malleable and volatile than they ever were.
Q: You note that this subgenre has proven itself to be both popular with audiences and marginal to Hollywood. "Zombie movies," you write in your book, "just offend too many people on too many levels to be conventional and part of the status quo." And while not quite as gory as some of Romero's earlier work, Diary ends with an image calculated to shock and disgust. Is this a matter of keeping the element of humor under control? While a spoof like Shaun of the Dead was an affectionate homage to Romero, the element of social satire there didn't really have much, well, bite....
A: That's a great way to put it -- that humorous homages use humor to offset the gore (look at the really over-the-top squashing scene in Hot Fuzz for an example of just how much gore you can offset, if the movie's funny enough!). But it also works the other way -- that biting social criticism needs some bite, needs to be a little out of control and not tamed or staid. I like that idea.
That being said, Romero makes my job a lot harder. The gore hounds sometimes put their hands over their ears and chant "LALALALA! I can't hear you!" if I say that some image they love on an aesthetic level might *mean* something -- while I think a lot of readers or viewers who might be receptive to criticism of our society just can't make it past the first disemboweling.
I would suppose it's an artistic judgment, and for me at least, Romero has been hitting the right balance for a long time, and is continuing to do so.