Poetry Can Be Dangerous

On April 19, after a day of teaching classes at Shippensburg University, I went out to my car and grabbed a box of old poetry manuscripts from the front seat of my little white Beetle and carried it across the street and put it next to the trashcan outside Wright Hall. The poems were from poetry contests I had been judging and the box was heavy. I had previously left my recycling boxes there and they were always picked up and taken away by the trash department.

A young man from ROTC was watching me as I got into my car and drove away. I thought he was looking at my car, which has black flower decals and sometimes inspires strange looks. I later discovered that I, in my dark skin, am sometimes not even a person to the people who look at me. Instead, in spite of my peacefulness, my committed opposition to all aggression and war, I am a threat by my very existence, a threat just living in the world as a Muslim body.

Upon my departure, he called the local police department and told them a man of Middle Eastern descent driving a heavily decaled white Beetle with out-of-state plates and no campus parking sticker had just placed a box next to the trash can. My car has New York State plates, but he got the rest of it wrong. I have two stickers on my car. One is my highly visible faculty parking sticker and the other, which I just don't have the heart to take off these days, says "Kerry/Edwards: For a Stronger America."

Because of my recycling the bomb squad came, the state police came. Because of my recycling buildings were evacuated, classes were canceled, campus was closed. No. Not because of my recycling. Because of my dark body. No. Not because of my dark body. Because of his fear. Because of the way he saw me. Because of the culture of fear, mistrust, hatred, and suspicion that is carefully cultivated in the media, by the government, by people who claim to want to keep us "safe."

These are the days of orange alert, school lock-downs, and endless war. We are preparing for it, training for it, looking for it, and so of course, in the most innocuous of places -- a professor wanting to hurry home, hefting his box of discarded poetry -- we find it.

That man in the parking lot didn't even see me. He saw my darkness. He saw my Middle Eastern descent. Ironic because though my grandfathers came from Egypt, I am Indian, a South Asian, and could never be mistaken for a Middle Eastern man by anyone who'd ever met one.

One of those in the gathering crowd, trying to figure out what had happened, heard my description -- a Middle Eastern man driving a white Beetle with out-of-state plates -- and knew immediately they were talking about me and realized that the box must have been manuscripts I was discarding. When the police were told I was a professor, immediately the question came back about where I was from.

At some length several of my faculty colleagues were able to get through to the police and get me on a cell phone where I explained to the university president and then to the state police that the box contained old poetry manuscripts that needed to be recycled. The police officer told me that in the current climate I needed to be more careful about how I behaved. "When I recycle?" I asked.

The university president appreciated my distress about the situation but denied that the call had anything to do with my race or ethnic background. The spokesman for the university called it an "honest mistake," not referring to the young man from ROTC giving in to his worst instincts and calling the police but referring to me, who made the mistake of being dark-skinned and putting my recycling next to the trashcan.

The university's bizarrely minimal statement lets everyone know that the "suspicious package" beside the trashcan ended up being, indeed, trash. It goes on to say, "We appreciate your cooperation during the incident and remind everyone that safety is a joint effort by all members of the campus community."

What does that community mean to me, a person who has to walk by the ROTC offices every day on my way to my own office just down the hall -- who was watched, noted, and reported, all in a day's work? Today we gave in willingly and whole-heartedly to a culture of fear and blaming and profiling. It is deemed perfectly appropriate behavior to spy on one another and police one another and report on one another. Such behaviors exist most strongly in closed and undemocratic and fascist societies.

The university report does not mention the root cause of the alarm. That package became "suspicious" because of who was holding it, who put it down, who drove away. Me.

It was poetry, I kept insisting to the state policeman who was questioning me on the phone. It was poetry I was putting out to be recycled.

My body exists politically in a way I cannot prevent. For a moment today, without even knowing it, driving away from campus in my little Beetle, exhausted after a day of teaching, listening to Justin Timberlake on the radio, I ceased to be a person when a man I had never met looked straight through me and saw the violence in his own heart.

Kazim Ali

Kazim Ali is a poet and novelist. He teaches at Shippensburg University and at Stonecoast, the low-residency MFA program of the University of Southern Maine. Eyewitnesses confirmed his account of the scene after he left the university. A university spokesman declined to discuss specifics of the incident or who was involved, but told Inside Higher Ed that "the response was appropriate based on the circumstances," and that "just days after the [Virginia Tech] massacre, everybody is looking out for each other."

Digital Masonry

Jacques-Alain Miller has delivered unto us his thoughts on Google. In case the name does not signify, Jacques-Alain Miller is the son-in-law of the late Jacques Lacan and editor of his posthumously published works. He is not a Google enthusiast. The search engine follows “a totalitarian maxim,” he says. It is the new Big Brother. “It puts everything in its place,” Miller declares, “turning you into the sum of your clicks until the end of time.”

Powerful, then. And yet – hélas! – Google is also “stupid.” It can “scan all the books, plunder all the archives [of] cinema, television, the press, and beyond,” thereby subjecting the universe to “an omniscient gaze, traversing the world, lusting after every little last piece of information about everyone.” But it “is able to codify, but not to decode. It is the word in its brute materiality that it records.” (Read the whole thing here. And for another French complaint about Google, see this earlier column.)

When Miller pontificates, it is, verily, as a pontiff. Besides control of the enigmatic theorist’s literary estate, Miller has inherited Lacan’s mantle as leader of one international current in psychoanalysis. His influence spans several continents. Within the Lacanian movement, he is, so to speak, the analyst of the analysts’ analysts.

He was once also a student of Louis Althusser, whose seminar in Paris during the early 1960s taught apprentice Marxist philosophers not so much to analyze concepts as to “produce” them. Miller was the central figure in a moment of high drama during the era of high structuralism. During Althusser’s seminar, Miller complained that he had been busy producing something he called “metonymic causality” when another student stole it. He wanted his concept returned. (However this conflict was resolved, the real winner had to be any bemused bystander.)

Miller is, then, the past master of a certain mode of intellectual authority – one that has been deeply shaped by (and is ultimately inseparable from) tightly restricted fields of communication and exchange.

Someone once compared the Lacanian movement to a Masonic lodge. There were unpublished texts by the founder that remained more than usually esoteric: they were available in typescript editions of just a few copies, and then only to high-grade initiates.

It is hard to imagine a greater contrast to that digital flatland of relatively porous discursive borders about which Miller complains now. As well he might. (Resorting to Orwellian overkill is, in this context, probably a symptom of anxiety. There are plenty of reasons to worry and complain about Google, of course. But when you picture a cursor clicking a human face forever, it lacks something in the totalitarian-terror department.)

Yet closer examination of Miller’s pronouncement suggests another possibility. It isn’t just a document in which hierarchical intellectual authority comes to terms with the Web's numbskulled leveling. For the way Miller writes about the experience of using Google is quite revealing -- though not about the search engine itself.

“Our query is without syntax,” declares Miller, “minimal to the extreme; one click ... and bingo! It is a cascade -- the stark white of the query page is suddenly covered in words. The void flips into plenitude, concision to verbosity.... Finding the result that makes sense for you is therefore like looking for a needle in a haystack. Google would be intelligent if it could compute significations. But it can’t.”

In other words, Jacques-Alain Miller has no clue that algorithms determine the sequence of hits you get back from a search. (However intelligent Google might or might not be, the people behind it are quite clearly trying to “compute significations.”) He doesn’t grasp that you can shape a query – give it a syntax – to narrow its focus and heighten its precision. Miller’s complaints are a slightly more sophisticated version of someone typing “Whatever happened to Uncle Fred?” into Google and then feeling bewildered that the printout does not provide an answer.
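
By way of a purely illustrative sketch (not anything drawn from Google's own documentation), the contrast in miniature looks like this: the same information need, phrased first as the vague question above and then narrowed with widely documented operators -- exact-phrase quotes, site:, and the minus sign for exclusion. The specific names and terms are hypothetical.

```python
# Illustrative only: an unshaped query versus one given some syntax.
# The search terms and site are hypothetical examples.
from urllib.parse import urlencode

naive = "Whatever happened to Uncle Fred?"
shaped = '"Frederick Smith" obituary site:nytimes.com -novelist'  # hypothetical

for query in (naive, shaped):
    # Build the search URL; the q parameter carries the query string.
    print("https://www.google.com/search?" + urlencode({"q": query}))
```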

For an informed contrast to Jacques-Alain Miller’s befuddled indignation, you might turn to Digital History Hacks, a very smart and rewarding blog maintained by William J. Turkel, an assistant professor of history at the University of Western Ontario. (As it happens, I first read about Miller in Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution by one Sherry Turkle. The coincidence is marred by a slip of the signifier: they spell their names differently.)

The mandarin complaint about the new digital order is that it lacks history and substance, existing in a chaotic eternal present – one with no memory and precious little attention span. But a bibliographical guide that Turkel posted in January demonstrates that there is now an extensive enough literature to speak of a field of digital history.

The term has a nice ambiguity to it – one that is worth thinking about. On the one hand, it can refer to the ways historians may use new media to do things they’ve always done – prepare archives, publish historiography, and so on. Daniel J. Cohen and Roy Rosenzweig’s Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web (University of Pennsylvania Press, 2006) is the one handbook that ought to be known to scholars even outside the field of history itself. The full text of it is available for free online from the Center for History and New Media at George Mason University, which also hosts a useful selection of essays on digital history.

But as some of the material gathered there shows, digitalization itself creates opportunities for new kinds of history – and new problems, especially when documents exist in formats that have fallen out of use.

Furthermore, as various forms of information technology become more and more pervasive, it makes sense to begin thinking of another kind of digital history: the history of digitality.

Impressed by the bibliography that Turkel had prepared – and by the point that it now represented a body of work one would need to master in order to do graduate-level work in digital history – I contacted him by e-mail to get more of his thoughts on the field.

“Digital history begins,” he says, “with traditional historical sources represented in digital form on a computer, and with 'born-digital' sources like e-mail, text messages, computer code, video games and digital video. Once you have the proper equipment, these digital sources can be duplicated, stored, accessed, manipulated and transmitted at almost no cost. A box of archival documents can be stored in only one location, has to be consulted in person, can be used by only a few people at a time, and suffers wear as it is used. It is relatively vulnerable to various kinds of disaster. Digital copies of those documents, once created, aren't subject to any of those limitations. For some purposes you really need the originals (e.g., a chemical analysis of ink or paper). For many or most other purposes, you can use digital representations instead. And note that once the chemical analysis is completed, it too becomes a digital representation.”

But that’s just the initial phase, or foundation level, of digital history – the scanning substratum, in effect, in which documents become more readily available. A much more complex set of questions comes up as historians face the deeper changes in their work made possible by a wholly different sort of archival space – what Roy Rosenzweig calls the "culture of abundance" created by digitality.

“He asks us to consider what it would mean to try and write history with an essentially complete archival record,” Turkel told me. “I think that his question is quite deep because up until now we haven't really emphasized the degree to which our discipline has been shaped by information costs. It costs something (in terms of time, money, resources) to learn a language, read a book, visit an archive, take some notes, track down confirming evidence, etc. Not surprisingly, historians have tended to frame projects so that they could actually be completed in a reasonable amount of time, using the availability and accessibility of sources to set limits.”

Reducing information costs in turn changes the whole economy of research – especially during the first phase, when one is framing questions and trying to figure out if they are worth pursuing.

“If you're writing about a relatively famous person,” as Turkel put it, “other historians will expect you to be familiar with what that person wrote, and probably with their correspondence. Obviously, you should also know some of the secondary literature. But if you have access to a complete archival record, you can learn things that might have been almost impossible to discover before. How did your famous person figure in people's dreams, for example? People sometimes write about their dreams in diaries and letters, or even keep dream journals. But say you wanted to know how Darwin figured in the dreams of African people in the late 19th century. You couldn't read one diary at a time, hoping someone had had a dream about him and written it down. With a complete digital archive, you could easily do a keyword search for something like 'Darwin NEAR dream' and then filter your results.”
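
What Turkel describes is, at bottom, a proximity query run over an entire digitized corpus rather than one document at a time. As a rough illustration only -- a minimal sketch, not Turkel's actual tooling -- the following fragment runs a crude "Darwin NEAR dream" search over a folder of plain-text transcriptions; the directory name and the 40-word window are assumptions made for the example.

```python
# A minimal sketch, not Turkel's workflow: a crude proximity search
# ("Darwin NEAR dream") over a folder of plain-text transcriptions.
# The corpus path and the 40-word window are illustrative assumptions.
import re
from pathlib import Path

def near(text, a, b, window=40):
    """True if words a and b occur within `window` words of each other."""
    words = re.findall(r"\w+", text.lower())
    pos_a = [i for i, w in enumerate(words) if w == a]
    pos_b = [i for i, w in enumerate(words) if w == b]
    return any(abs(i - j) <= window for i in pos_a for j in pos_b)

corpus = Path("archive/diaries")  # hypothetical directory of .txt files
hits = [p.name for p in corpus.glob("*.txt")
        if near(p.read_text(errors="ignore"), "darwin", "dream")]
print(hits)  # candidate diaries to read closely and filter by hand
```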

As it happens, I conducted this interview a few weeks before coming across Jacques-Alain Miller’s comments on Google. It seems like synchronicity that Turkel would mention the possibility of digital historians getting involved in the interpretation of dreams (normally a psychoanalyst’s preserve). But for now, it sounds as if most historians are only slightly more savvy about digitality than the Lacanian Freemasons.

“All professional historians have a very clear idea about how to make use of archival and library sources,” Turkel says, “and many work with material culture, too. But I think far fewer have much sense of how search engines work or how to construct queries. Few are familiar with the range of online sources and tools. Very few are able to do things like write scrapers, parsers or spiders.”

(It pays to increase your word power. For a quick look at scraping and parsing, start here. For the role of spiders on the Web, have a look at this.)
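
For readers who want a concrete sense of what a scraper or parser does, here is a minimal, stand-alone sketch (my illustration, not anything from Turkel's blog): it fetches a single page and pulls out its links, which is the raw material a spider would then follow. The URL is a placeholder.

```python
# A minimal illustration of scraping and parsing with the standard library:
# fetch one page and collect its links. A real spider would follow those
# links recursively, respect robots.txt, and rate-limit its requests.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href attribute of every anchor tag encountered.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = urlopen("https://example.org/").read().decode("utf-8", errors="ignore")
scraper = LinkScraper()
scraper.feed(page)
print(scraper.links)
```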

“I believe that these kind of techniques will be increasingly important,” says Turkel, “and someday will be taken for granted. I guess I would consider digital history to have arrived as a field when most departments have at least one person who can (and does) offer a course in the subject. Right now, many departments are lucky to have someone who knows how to digitize paper sources or put up web pages.”

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Speak, Memory

Last week, Intellectual Affairs took up the topic of what might be called scandal-mania -- the never-ending search for shock, controversy, and gratifying indignation regarding our “master thinkers.” Unfortunately there haven’t been enough “shocking revelations” recently to keep up with the demand. So the old ones are brought out of mothballs, from time to time.

A slightly different kind of case has come up recently involving Zygmunt Bauman, who is emeritus professor of sociology at the University of Leeds and the University of Warsaw. Bauman is a prolific author with a broad range of interests in social theory, but is probably best known for a series of books and essays analyzing the emergence of the new, increasingly fluid and unstable forms of cultural and social order sometimes called “postmodernism.”

No doubt that fact alone will suffice to convince a certain part of the public that he must be guilty of something. Be that as it may, Bauman is not actually a pomo enthusiast. While rejecting various strands of communitarianism, he is quite ambivalent about the fragmentation and confusion in the postmodern condition. His book Liquid Times: Living in an Age of Uncertainty, just issued by Polity, is quite typical of his work over the past few years -- a mixture of social theory and cultural criticism, sweeping in its generalizations but also alert to the anxieties one sees reflected in the newspaper and on CNN.

In March, a paragraph concerning Bauman appeared at Sign and Sight, a Web site providing capsule summaries in English of the Feuilletons (topical cultural articles) appearing in German newspapers and magazines. It noted the recent publication in the Frankfurter Allgemeine Zeitung of an article by a Polish historian named Bogdan Musial. The piece “uncovers the Stalinist past of the world famous sociologist,” as Sign and Sight put it.

It also quoted a bit of the article. "The fact is that Bauman was deeply involved with the violent communist regime in Poland for more than 20 years,” in Musial’s words, “fighting real and supposed enemies of Stalinism with a weapon in his hand, shooting them in the back. His activities can hardly be passed off as the youthful transgressions of an intellectual seduced and led astray by communist ideology. And it is astonishing that Bauman, who so loves to point the finger, does not reflect on his own deeds."

A few weeks later, another discussion of the matter appeared in The Irish Times -- this one by Andreas Hess, a senior lecturer in sociology at the University of Dublin. The piece bore what seems, with hindsight, the almost inevitable title of “Postmodernism Made Me Do It: A World Without Blame.” (The article is not available except to subscribers, but I’ll quote from a copy passed along by a friend.)

Summing up the charges in the German article, Hess said that secret files recently declassified in Poland revealed that Bauman “participated in operations of political cleansing of alleged political opponents in Poland between 1944 and 1954. The Polish files also show Bauman was praised by his superiors for having been quite successful in completing the tasks assigned, although he seems, as at least one note suggests, not to have taken any major part in direct military operations because of his ‘Semitic background.’ However, to be promoted to the rank of major at the youthful age of 23 was quite an achievement. As the author of the article [in the German newspaper] pointed out, Bauman remained a faithful member of the party apparatus.”

Hess goes on to suggest that “Bauman’s hidden past” is the key to his work as “one of the prophets of postmodernism.” This is not really argued so much as asserted -- and in a somewhat contradictory way.

On the one hand, it is implied that Bauman has used postmodern relativism as a way to excuse his earlier Stalinist crimes. Unfortunately for this argument, Bauman is actually a critic of postmodernism. And so, on the other hand, the sociologist is also guilty of attacking Western society by denouncing postmodernity. Whether or not this is a coherent claim, it points to some of what is at issue in the drama over “Bauman’s secret Stalinism,” as it’s called.

Now, I do not read German or Polish -- a decided disadvantage in coming to any sense of how the controversy has unfolded in Europe. Throughout the former Soviet sphere of influence, a vast and agonizingly complex set of problems has emerged surrounding “lustration” -- the process of "purifying" public life by legally disqualifying those who collaborated with the old Communist regimes from serving in positions of authority.

Debates over the politicized use of lustration in Poland have gone on for years. “What may look like an effort to reconcile with the Communist past,” wrote one Polish legal scholar not long ago, “is something else entirely. It is an assault on reconciliation and a generational bid for power.” There are bound to be implications to Bauman’s lustration that will be lost on those of us looking at it from a distance.

But let’s just look at the matter purely in terms of the academic scandal we’ve been offered. I have read some of Bauman’s work, but not a lot. Under the circumstances that may be an advantage. I am not a disciple – and by no means feel committed to defending him, come what may.

If he has hidden his past, then its revelation is a necessary thing. But then, that is the real issue at stake. Everything turns on that “if.”

What did we know about Zygmunt Bauman before the opening of his files? What could be surmised about his life based on interviews, his bibliographical record, and books about him readily available at a decent university library?

One soon discovers that “Bauman’s hidden past” was very badly hidden indeed. He has never published a memoir about being a Stalinist -- nor about anything else, so far as I know -- but he has never concealed that part of his life either. The facts can be pieced together readily.

He was born in Poland in 1925 and emigrated to the Soviet Union with his family at the start of World War II. This was an altogether understandable decision, questions of ideology aside. Stalin’s regime was not averse to the occasional half-disguised outburst of anti-Semitism, but that was not the central point of its entire agenda, at least; so it is hardly surprising that a Jewish family might respond to the partition of Poland in 1939 by heading East.

Bauman studied physics and dreamed, he says, of becoming a scientist. He served as a member of the Polish equivalent of the Red Army during the war. He returned to his native country as a fervent young Communist, eager, he says, to rebuild Poland as a modern, egalitarian society – a “people’s democracy,” as the Stalinist lingo had it. His wife, Janina Bauman, in her memoir A Dream of Belonging: My Years in Postwar Poland (Virago, 1988), portrays him as a true believer in the late 1940s and early 1950s.

But there is no sense in overstressing his idealism. To have been a member of the Polish United Workers Party was not a matter of teaching Sunday school classes on Lenin to happy peasant children. Bauman would have participated in the usual rounds of denunciation, purge, “thought reform,” and rationalized brutality. He was also an officer in the Polish army. The recent revelations specify that he belonged to the military intelligence division -- making him, in effect, part of the secret police.

But the latter counts as a “revelation” only to someone with no sense of party/military relations in the Eastern bloc. Not every member of the military was a Communist cadre -- and an officer who was also a member of the party had a role in intelligence-gathering, more or less by definition.

But a Jewish party member was in a precarious position – again, almost by definition. In 1953, he was forced out of the army during one of the regime’s campaigns against “Zionists” and “cosmopolitans.” He enrolled in the University of Warsaw and retrained as a social scientist. He began research on the history of the British Labour Party and the development of contemporary Polish society.

One ought not to read too much dissidence into the simple fact of doing empirical sociology. Bauman himself says he wanted to reform the regime, to bring it into line with its professed egalitarian values. And yet, under the circumstances, becoming a sociologist was at least a somewhat oppositional move. He published articles on alienation, the problems of the younger generation, and the challenge of fostering innovation in a planned economy.

And so he remained loyal to the regime -- in his moderately oppositional fashion -- until another wave of official anti-Semitism in 1968 made this impossible. In her memoir, Janina Bauman recalls their final weeks in Poland as a time of threatening phone calls, hulking strangers loitering outside their apartment, and TV broadcasts that repeated her husband’s name in hateful tones. “A scholarly article appeared in a respectable magazine,” she writes. “It attacked [Zygmunt] and others for their dangerous influence on Polish youth. It was signed by a close friend.”

Bauman and his family emigrated that year, eventually settling in Leeds. (He never faced a language barrier, having for some years been editor of a Polish sociological journal published in English.) His writings continued to be critical both of the Soviet system and of capitalism, and to support the labor movement. When Solidarity emerged in 1980 to challenge the state, Bauman welcomed it as the force that would shape the future of Poland.

These facts are all part of the record -- put there, most of them, by Bauman himself. By no means is it a heroic tale. From time to time, he must have named names, and written things he didn’t believe, and forced himself to believe things that he knew, deep down, were not true.

And yet Bauman did not hide his past, either. It has always been available for anyone trying to come to some judgment of his work. He has been accused of failing to reflect upon his experience. But even that is a dubious reading of the evidence. A central point of his work on the “liquid” social structure of postmodernism is its contrast with the modernity that went before, which he says was “marked by the disciplinary power of the pastoral state.” He describes the Nazi and Stalinist regimes as the ultimate, extreme cases of that “disciplinary power.”

Let’s go out on a limb and at least consider the possibility that someone who admittedly spent years serving a social system that he now understands as issuing from the same matrix as Hitler’s regime may perhaps be telling us (in his own roundabout, sociologistic way) that he is morally culpable, no matter what his good intentions may have been.

Alas, this is not quite so exciting as “Postmodernist Conceals Sinister Past.” It doesn’t even have the satisfying denouement found in “The God That Failed,” that standard of ex-Communist disillusionment. Sorry about that.... It’s just a tale of a man getting older and – just possibly – wiser. I tend to think of that as a happy story, even so.

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Dying to Teach

For several years I have been teaching personal writing courses in which students share with classmates self-disclosing essays on a wide variety of topics that are rarely discussed in the classroom, including eating disorders, sexual abuse, drug and alcohol addiction, depression, and suicide. Such "risky writing" -- the title of my 2001 book -- involves confronting painful or shameful emotions and requires a safe, empathic classroom atmosphere so that students who write about traumatic topics are not retraumatized. In March 2004, I decided to read aloud to my students the most personal writing of my life: a eulogy for my beloved wife.

Two years earlier it would have been unimaginable to believe that Barbara, who had been in excellent health, and who could have been a poster child for living an active, fulfilling life, would soon be diagnosed with one of the most dreaded diseases: pancreatic cancer. She had none of the risk factors except being over the age of 50. She appeared decades younger than her age; when our daughters were in college, she looked like their oldest sister rather than their mother. She never abused her body: never smoked, never drank excessively, exercised regularly, had annual physical exams, and always maintained a healthy weight. Perhaps equally important, there was no history of cancer on either side of her family: nearly all of her relatives lived to their 80s or 90s, including her parents and their many siblings. And so when Barbara was diagnosed with metastatic pancreatic cancer -- a redundancy since nearly all pancreatic cancer is metastatic by the time it is detected -- on August 12, 2002, one day after our 34th wedding anniversary, she was given less than a year to live.

Fear, shock, and horror followed the grim diagnosis, and for the next several months we were in and out of the hospital, undergoing tests, consultations, and treatments. There is no cure for pancreatic cancer -- it is one of the most virulent cancers, with a 99 percent mortality rate, and the standard treatment, chemotherapy, works only for a few months, if that long. From the moment of her diagnosis we were on a roller coaster -- there is no avoiding this overused metaphor. Unlike amusement roller coasters, in which thrill-seekers know in advance that they are paying for the illusion of danger, we knew that this ride would plunge Barbara lower and lower until its final crash. There were, to be sure, a few unexpected highs, when the disease seemed to be retreating, thanks to an experimental pancreatic cancer vaccine that Barbara took for 18 weeks. The vaccine supercharged the chemotherapy that followed, giving her several additional months of life; but when she was forced to end the chemotherapy after six months, because her white blood cell count was dangerously low, the cancer spread with a vengeance throughout her pancreas, liver, and abdomen, and all hope of remission vanished. Slowly and imperceptibly our attitude toward death changed from that of a dreaded adversary, to be avoided at all cost, to a welcome ally, signaling the end of the nearly 20-month ordeal.

When our doctors told us that Barbara was close to the end, in January 2004, I decided to write a eulogy. I could have waited until her actual death, but I didn't know whether I could write a eulogy in two or three days, the time interval, according to Jewish custom, between death and burial. Besides, I wanted as much time as possible to write what would surely be the most important speech of my life. And so I wrote a first draft that I continued to revise until her funeral three months later. I wanted to memorialize the woman who had been the center of my universe for four decades. She was not only my wife but also my best friend and soulmate, the person who had transformed my life from the moment we began dating.

I wrote the eulogy to celebrate Barbara's life rather than dwell on the wrenching details of her death. I wanted to bring smiles to the mourners, but I knew that my words would inevitably bring tears to their eyes: striking the right tonal balance would be a challenge. And so I decided to begin with light reminiscences, which would allow me to maintain my composure, and then I would move slowly toward the final months of Barbara's life, a subject that would make greater emotional demands on everyone in the funeral chapel. Here is part of the eulogy:

Barbara and I met in the fall of 1963 in our freshman English class at the University of Buffalo. She was not yet 17 years old. For me, though not for her, it was love at first sight: I couldn't take my eyes off her long flowing hair, green eyes, high cheek bones, olive complexion, and delicate nose. She had a natural, unselfconscious beauty that never faded, not even after her illness. Two of the black-and-white photos I took of her in 1967 now hang on my office wall at the university; students who walk into my office invariably comment on her exotic features. Barbara and I could not have been more different in class: I spoke incessantly, enraptured by my own words, while she remained silent like the sphinx, which only increased her mystery to me.

Our relationship began inauspiciously. Our first date was November 22, 1963, a day that no one of our generation will ever forget. After classes were cancelled because of President Kennedy's assassination, we decided to see a movie; we were among a handful of people in the theater as we watched Laurence Olivier play Heathcliff in Emily Brontë's Wuthering Heights. On our third date I walked her back to the dormitory and asked if I could kiss her goodnight. "No" was her immediate reply. I turned around and left, vowing never to ask her out again. A few months later I broke that promise, and we began seeing each other. When I later told her how hurt I was by her rejection, she replied, "It was a stupid question: you should have just kissed me."

Everything Barbara made was a work of art, and she was meticulous to a fault. Her eye invariably spotted misweaves and imperfections, and she demanded of others what she expected of herself, which was nothing short of perfection. It is not easy living with a person whose standards are so high; she was as mechanically inclined as I am mechanically declined, and I became dependent upon her ability to fix anything. She could repair faulty wiring, broken toilets, temperamental boilers, cracked floor tile, and leaky faucets. By contrast, I was hopeless. Her favorite story about me was the time I spent two hours replacing a headlight in our car, only to discover that I had replaced the wrong light. Once in exasperation I said to her, "You're such a perfectionist that I don't understand why you married me." Without hesitation she replied, "I didn't think about it very much." Lucky for me that she didn't.

Barbara and I did not spend much time talking about the unfairness of her illness. We had no regrets about anything except that we did not have more time together. She felt little anger and no bitterness. She died during what would have been the best time in her life, when her children were grown up, happily married to wonderful men, successful in their careers, and beginning families of their own. She delighted in our new grandson Nate the Great, who filled her heart with joy.

Premature death always raises the most fundamental religious and existential questions, and each person will answer these questions differently. Amidst tragedy, those with strong religious faith may have emotional resources lacking in those without religious faith. I wish I could believe that Barbara is now in a better world, that there is a reason for her death, and that one day I will be reunited with her. What I do believe is that she will always be alive to those of us who were privileged to know her. I want to end by quoting a passage from a recent film based on Charles Dickens's novel Nicholas Nickleby: "In every life, no matter how full or empty one's purse, there is tragedy. It is the one promise life always fulfills. Happiness is a gift and the trick is not to expect it, but to delight in it when it comes and to add to other people's store of it." Barbara was one of those rare people who increased the store of happiness in the world.

With Barbara's approval, I decided to read the eulogy to my writing class, not only as a trial run, but, more importantly, as a way to offer my students my own example of risky writing. My students had been sharing their personal writings with me throughout the semester, and I wanted to reciprocate. I have discovered in every personal writing class that self-disclosure begets self-disclosure. Moreover, I believe in the adage that authors write best from their own experience. Here was a real-life experience in which all of my students would find themselves one day: confronting the specter of death from the point of view of the dying person or the caregiver. Here was an opportunity to describe, to others and myself, how and why Barbara has meant the world to me, and how my world would be forever changed by her death. Here was an opportunity to put into practice the adage I give to my writing students every class: show instead of tell, use concrete details, avoid clichés, compress your language, revise until every sentence is grammatically correct and stylistically graceful, and make the reader see your story. Above all, I wanted to write truthfully, which means not only engaging the minds and hearts of readers but also describing my wife without distorting or idealizing her life. It is always problematic when writers are too close to their subjects, when they are so much "in love with" their characters that they cannot see their human failings. I was indeed in love with my subject, but my challenge was to allow my students to see Barbara as I have seen her, and to convey to them, as I would later convey to the mourners attending her funeral, the special qualities about her life.

Most of my colleagues knew about Barbara's illness, but I hadn't informed anyone in the Expository Writing course I taught in the spring of 2004. I told them in early March, when our doctors told us that Barbara's death was imminent. I announced at the beginning of the class that I wanted to reserve the last 20 minutes of the hour for reading an essay that I had just written. The students seemed mildly puzzled, but no one said anything. The class proceeded as usual, and then my turn came. In a quiet, measured voice I revealed that my wife was terminally ill with cancer and close to death. I told them that I wanted to share with them the eulogy that I hoped I would be able to deliver at her funeral. Anyone who wished to leave class before hearing the eulogy could do so, I added. Finally I said that the class would be over when I finished reading the eulogy. And with that, I began.

During the reading I didn't lift my eyes from the copy of the eulogy on my desk. I didn't dare look up, fearful that I would be unable to continue reading if I saw anyone teary-eyed. I could hear several students from different sides of the classroom fighting back tears, but apart from that there was eerie silence, quite different from the ubiquitous white noise of the classroom. On three occasions I could hear my voice falter and break, but each time I paused, regained my composure, and resumed the reading. I thanked my students when I finished, and everyone quietly walked out of the classroom.

It was difficult to read aloud the eulogy, but afterwards I felt better, the way nearly all of my students feel after reading aloud their own emotionally charged essays. I felt exhausted and drained from the reading but also relieved, for no longer did I need to conceal Barbara's illness from my students. Finding the words to express my feelings, and then reading those words aloud, helped me to remain in control. My students' silence struck me as profoundly respectful, but I couldn't be sure how they felt unless I found a way to ask them. I knew from years of teaching experience that most students are reluctant to speak in class, even in self-disclosing courses. I wanted them to have the time to reflect on their feelings, which is why I ruled out an in-class essay, and I also wanted them to have the opportunity to remain anonymous so that they could be as truthful as possible. I did not want to require them to write about their feelings, and so I asked them to write an anonymous, optional essay describing how they felt about hearing the eulogy.

Fifteen of the 22 students who heard me read the eulogy turned in essays. They were generally well written, containing fewer grammatical and stylistic errors than in earlier writings. The disclosure of my wife's illness stunned all 15. They felt that the eulogy gave them insight into my personal life and that I was no longer simply a "teacher" to them. Twelve reported that they cried during the reading. Nearly all indicated that they could hear the emotionality of my voice as I read the eulogy and that it would not have been as powerful an experience if they had read the eulogy to themselves.

The eulogy was painful for all of the students. Many felt implicated in my story. I don't know whether it would be accurate to say that my students were traumatized, but some reported feeling physically as well as psychologically distressed during the reading. Some students reported that they found themselves thinking about class for the rest of the day: "My eyes precipitated and tears split my cheeks in halves as the sadness irrigated its way to the bottom of my face. I left class screaming at God in my head. Wondering why God does the things he does."

How did I feel about my students' tears? My purpose was not to move them to tears -- tearful responses are not necessarily "deeper" or more meaningful than dry-eyed responses. If we judged a story by the quantity of tears it produces in readers or viewers, any television soap opera would be a more profound aesthetic experience than King Lear. Nevertheless, I did want to "move" my students, and as the word implies, I sought to transport them to a different emotional realm, one that involved nothing less than the contemplation of life and death.

The adage, "mourners cry for themselves, not for the deceased," is a half-truth. We cry both for ourselves and for those whom we have lost. The most painful aspect of caring for Barbara was watching her suffer; after her death, the most painful part of mourning was imagining all that she will miss in life. My tears were as much for Barbara's sake as for my own; and my students' tears were as much for Barbara and me as for themselves.

Thirteen of the 15 students implied that the eulogy was appropriate for the writing class. They appreciated the trust I placed in them and implied that they would now place greater trust in me. They felt for the first time an equality in the teacher-student relationship: I had opened up to them just as they had opened up to me throughout the semester. "It has always bothered me when a teacher would read hundreds of essays, comment on them, and not read any of their own," wrote one student. "When a professor reads a work of their own they are putting themselves into the lake of vulnerability." Despite the fact that I had disclosed aspects of myself in Risky Writing, my new self-disclosure was different, and they now saw me differently. I was still their teacher, but I had now become another member of the class, one who was struggling, like everyone else, with a personal issue. I had never used the word "intersubjective" in class, but the classroom suddenly became a space where every person, including the teacher, was sharing aspects of his or her own subjectivity with each other.

The remaining two students were unsure whether they thought the eulogy was appropriate to read to the class. One wrote that the eulogy "put a damper on my day because it was so sad;" the other felt it was "a little too personal" to be read to a group of "mere students." I don't wish to invalidate the last response -- the cornerstone of an empathic classroom rests upon the principle that "feelings are feelings" and not to be disregarded -- but I never view members of my class as "mere students." They are not exactly "friends," to whom one may make an intimate self-disclosure, but after several weeks of the semester they are more than acquaintances. I don't believe that teachers should unburden themselves to students or seek psychological counseling from them, but I do believe that a teacher's careful self-disclosure of a real-life experience can become a profound educational experience for everyone in the classroom.

Family and friends became part of my support system as I cared for Barbara during her protracted illness. Without the help of my children, I could not have cared for her at home, especially toward the end, when she required around-the-clock care. Home-care hospice was also invaluable. Teaching was another support system. It afforded me not only a welcome distraction from the endless and exhausting problems confronted by a caregiver but also an opportunity to forget momentarily the crushing sadness I often felt at home. Throughout Barbara's illness I had my normal teaching load, but I had a compressed schedule so that I could be at home as much as possible. Perhaps what most surprised me about my response to Barbara's illness was the ability to compartmentalize my life. At home I fulfilled all my caregiving responsibilities. There were many times during the final weeks of her life when I felt emotionally and physically overwhelmed. Our hospice case supervisor suggested toward the end that it might be time to hospitalize Barbara, not for her sake but for our own. I remember feeling so exhausted from my caregiving responsibilities that I was indifferent to my own health and well-being. I felt that I was dying with Barbara. The early 20th-century Lithuanian-born Jewish philosopher Emmanuel Levinas would call this phenomenon, which I'm sure many caregivers feel, "dying for the other," when "worry over the death of the other comes before care for self." We all needed a break from the constant caregiving, yet we also felt that we had come this far and could manage a few more days. In the classroom, however, I felt cheerful, relaxed, and in control. Even during the last weeks of Barbara's life, I laughed and joked as usual in the classroom, which led my students to believe that nothing was wrong in my personal life.

The "teaching cure" enabled me to remain connected with the outer world of health; teaching served as a lifeline for me at a time when I was struggling to be a lifeline for my wife. Many colleagues offered generously to teach my classes during Barbara's illness, and they may have thought that my determination to continue teaching and meeting with students was a sign of strength. The truth was that teaching gave me the strength that otherwise I might not have had, for as much as I gave to my students, they gave to me and helped me through the crisis. As one person noted, "One thing that touched me as you read your eulogy was that teaching kept you sane. Hearing a professor say something like that made me think, 'That is one more reason why teaching is worth it after all.' It also made me wonder when in my life I will be at the point at which I will be able to say the same thing."

My eulogy was a bridge between the world of the healthy and the sick, the living and the dying; I wrote it when Barbara was gravely ill, but rather than distracting me from taking care of her, the eulogy enabled me to avoid succumbing to despair when she could no longer take care of herself. Writing the eulogy helped me to express the lifelong devotion that I have felt for her and she for me. My devotion to Barbara heightened my students' commitment to their teacher. "I found your reading to be very painful," one person wrote, "but I remained as strong as possible for you. I needed to give you my attention while you read it to the class. I knew this was important to you." They were not only a sympathetic audience but also a supportive group, reaching out to me in ways that seldom occur in the classroom. This supportive group takes on some of the characteristics of a "support" group but without offering the clinical advice that occurs in the latter.

My self-disclosure narrowed the distance between students and teacher, leading to a more equal classroom relationship based on reciprocity. There was, in Jessica Benjamin's words, mutual recognition: "the necessity of recognizing as well as being recognized by the other." There was nothing transgressive about this narrowed distance, nothing that would be considered unprofessional. Many past and present students attended the funeral and later came back to my home, along with perhaps a hundred other mourners. The main effect of the eulogy was internal. "I feel a greater sense of trust and respect now because you have shared your experience with us. By doing so, you have broken down the wall that is usually present in the classroom, separating the teacher from the students."

Jeffrey Berman

Jeffrey Berman is professor of English at the State University of New York at Albany. This essay is adapted and reprinted with permission from his latest book, Dying to Teach: A Memoir of Love, Loss, and Learning (State University of New York Press) ©2007. Berman's previous books include Empathic Teaching: Education for Life and Risky Writing: Self-Disclosure and Self-Transformation in the Classroom.

C.L.R. James Meets Tony Soprano

Half a century before "The Sopranos" hit its stride, the Caribbean historian and theorist C.L.R. James recorded some penetrating thoughts on the gangster -- or, more precisely, the gangster film -- as symbol and proxy for the deepest tensions in American society. His insights are worth revisiting now, while saying farewell to one of the richest works of popular culture ever created.

First, a little context. In 1938, shortly before James arrived in the United States, he had published The Black Jacobins, still one of the great accounts of the Haitian slave revolt. He would later write Beyond a Boundary (1963), a sensitive cultural and social history of cricket – an appreciation of it as both a sport and a value system. But in 1950, when he produced a long manuscript titled “Notes on American Civilization,” James was an illegal alien from Trinidad. I have in hand documents from his interrogation by FBI agents in the late 1940s, during which he was questioned in detail about his left-wing political ideas and associations. (He had been an associate of Leon Trotsky and a leader in his international movement for many years.)

In personal manner, James was, like W.E.B. DuBois, one of the eminent black Victorians -- a gentleman and a scholar, but also someone listening to what his friend Ralph Ellison called “the lower frequencies” of American life. The document James wrote in 1950 was a rough draft for a book he never finished. Four years after his death, it was published as American Civilization (Blackwell, 1993). A sui generis work of cultural and political analysis, it is the product of years of immersion in American literature and history, as well as James’s ambivalent first-hand observation of the society around him. His studies were interrupted in 1953 when he was expelled by the government. James was later readmitted during the late 1960s and taught for many years at what is now the University of the District of Columbia.

American Civilization's discussion of gangster films is part of James's larger argument about media and the arts. James focuses on the role they play in a mass society that promises democracy and equality while systematically frustrating those who take those promises too seriously. Traveling in the American South in 1939 on his way back from a meeting with Trotsky in Mexico, James had made the mistake of sitting in the wrong part of the bus. Fortunately an African-American rider explained the rules to him before things got out of hand. But that experience -- and others like it, no doubt -- left him with a keen sense of the country's contradictions.

While James's analysis of American society is deeply shaped by readings of Hegel and Marx, it also owes a great deal to Frederick Jackson Turner’s theory of “the closing of the frontier.” The world onscreen, as James interpreted it, gave the moviegoer an alternative to the everyday experience of a life “ordered and restricted at every turn, where there is no certainty of employment, far less of being able to rise by energy and ability by going West as in the old days.”

Such frustrations intensified after 1929, according to James’s analysis. The first era of gangster films coincided with the beginning of the Great Depression. “The gangster did not fall from the sky,” wrote James. “He is the persistent symbol of the national past which now has no meaning – the past in which energy, determination, bravery were sure to get a man somewhere in the line of opportunity. Now the man on the assembly line, the farmer, know that they are there for life; and the gangster who displays all the old heroic qualities, in the only way he can display them, is the derisive symbol of the contrast between ideals and reality.”

The language and the assumptions here are obviously quite male-centered. But other passages in James’s work make clear that he understood the frustrations to cross gender lines -- especially given the increasing role of women in mass society as workers, consumers, and audience members.

“In such a society,” writes James, “the individual demands an aesthetic compensation in the contemplation of free individuals who go out into the world and settle their problems by free activity and individualistic methods. In these perpetual isolated wars free individuals are pitted against free individuals, live grandly and boldly. What they want, they go for. Gangsters get what they want, trying it for a while, then are killed.”

The narratives onscreen are a compromise between frustrated desire and social control. “In the end ‘crime does not pay,’” continues James, “but for an hour and a half highly skilled actors and a huge organization of production and distribution have given to many millions a sense of active living....”

Being a good Victorian at heart, James might have preferred that the audience seek “aesthetic compensation” in the more orderly pleasures of cricket, instead. But as a historian and a revolutionary, he accepted what he found. In offering “the freedom from restraint to allow pent-up feelings free play,” gangster movies “have released the bitterness, hate, fear, and sadism which simmer just below the surface.” His theoretical framework for this analysis was strictly classical, by the way. James was deeply influenced by Aristotle’s idea that tragedy allowed an audience to “purge” itself of violent emotions. One day, he thought, they would emerge in a new form -- a wave of upheavals that would shake the country to its foundations.

In six seasons over 10 years, “The Sopranos” has confirmed again and again C.L.R. James’s point that the gangster is an archetypal figure of American society. But the creators have gone far beyond his early insights. I say that with all due respect to James’s memory – and with the firm certainty that he would have been a devoted fan and capable interpreter.

For James, analyzing gangster films in 1950, there is an intimate connection between the individual viewer and the figure on the screen. At the same time, there is a vast distance between them. Movies offered the audience something it could not find outside the theater. The gangster is individualism personified. He has escaped all the rules and roles of normal life. His very existence – doomed as it is – embodies a triumph of personal will over social obligation.

By contrast, when we first meet Tony Soprano, a boss in the New Jersey mob, he is in some ways all too well integrated into the world around him. So much so, in fact, that juggling the demands of the different roles he plays is giving him panic attacks. In addition to being pater of his own brood, residing in a suburban McMansion, he is the dutiful (if put-upon) son in a dysfunctional and sociopathic family.

And then there are the pressures that attend being the competent manager of a successful business with diversified holdings. Even the form taken by his psychic misery seems perfectly ordinary: anxiety and depression, the tag-team heart-breakers of everyday neurosis.

James treats the cinematic gangsters of yesteryear as radical individualists – their crimes, however violent, being a kind of Romantic refusal of social authority. But the extraordinary power of “The Sopranos” has often come from its portrayal of an almost seamless continuum between normality and monstrosity. Perhaps the most emblematic moment in this regard came in the episode entitled “College,” early in the show’s first year. We watch Tony, the proud and loving father, take his firstborn, Meadow, off to spend a day at the campus of one of her prospective colleges. Along the way, he notices a mobster who had informed to the government and gone into the witness protection program. Tony tracks the man down and strangles him to death.

At the college he sees an inscription from Hawthorne that reads, “No man ... can wear one face to himself and another to the multitude, without finally getting bewildered as to which one may be true." Earlier, we have seen Tony answer Meadow’s question about whether he is a member of the Mafia by admitting that, well, he does make a little money from illegal gambling, but no, he isn't a gangster. So the quotation from Hawthorne points to one source of Tony’s constant anxiety. But it also underscores part of the audience’s experience – an ambivalence that only grows more intense as “The Sopranos” unfolds.

For we are no more clear than Tony is which of his faces is “true.” To put it another way, all of them are. He really is a loving father and a good breadwinner (and no worse a husband, for all the compulsive philandering, than many) as well as a violent sociopath. The different sides of his life, while seemingly distinct, keep bleeding into one another.

Analyzing the gangster as American archetype in 1950, C.L.R. James found a figure whose rise and fall onscreen provided the audience with catharsis. With “The Sopranos,” we’ve seen a far more complex pattern of development than anything found in Little Caesar or High Sierra (among other films James had in mind).

With the finale, there will doubtless be a reminder – as in the days of the Hays Code – that “crime does not pay.” But an ironized reminder. After all, we’ve seen that it can pay pretty well. (As Balzac put it, “Behind every great fortune, a great crime.”) Closure won’t mean catharsis. Whatever happens to Tony or his family, the audience will be left with his ambivalence and anxiety, which, over time, we have come to make our own.

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Requiem for a Heavyweight

Word that Richard Rorty was on his deathbed – that he had pancreatic cancer, the same disease that killed Jacques Derrida almost three years ago – reached me last month via someone who more or less made me swear not to say anything about it in public. The promise was easy enough to keep. But the news made reading various recent books by and about Rorty an awfully complicated enterprise. The interviews in Take Care of Freedom and Truth Will Take Care of Itself (Stanford University Press, 2006) and the fourth volume of Rorty’s collected papers, Philosophy as Cultural Politics (Cambridge University Press, 2007) are so bracingly quick-witted that it was very hard to think of them as his final books.

But the experience was not as lugubrious as it may sound. I found myself laughing aloud, and more than once, at Rorty’s consistent indifference to certain pieties and protocols. He was prone to outrageous statements delivered with a deadpan matter-of-factness that could be quite breathtaking. The man had chutzpah.

It’s a “desirable situation,” he told an interviewer, “not to have to worry about whether you are writing philosophy or literature. But, in American academic culture, that’s not possible, because you have to worry about what department you are in.”

The last volume of his collected papers contains a piece called “Grandeur, Profundity, and Finitude.” It opens with a statement sweeping enough to merit that title: “Philosophy occupies an important place in culture only when things seem to be falling apart – when long-held and widely cherished beliefs are threatened. At such periods, intellectuals reinterpret the past in terms of an imagined future. They offer suggestions about what can be preserved and what must be discarded.”

Then, a few lines later, a paradoxical note of rude modesty interrupts all the grandeur and profundity. “In the course of the 20th century," writes Rorty, "there were no crises that called forth new philosophical ideas.”

It's not that the century was peaceful or crisis-free, by any means. But philosophers had less to do with responding to troubles than they once did. And that, for Rorty, is a good thing, or at least not a bad one – a sign that we are becoming less intoxicated by philosophy itself, more able to face crises at the level (social, economic, political, etc.) at which they actually present themselves. We may yet be able to accept, he writes, “that each generation will solve old problems only by creating new ones, that our descendants will look back on much that we have done with incredulous contempt, and that progress towards greater justice and freedom is neither inevitable nor impossible.”

Nothing in such statements is new, of course. They are the old familiar Rorty themes. The final books aren’t groundbreaking. But neither was there anything routine or merely contrarian about the way Rorty continued to challenge the boundaries within the humanities, or the frontier between theoretical discussion and public conversation. It is hard to imagine anyone taking his place.

An unexpected and unintentional sign of his influence recently came my way in the form of an old essay from The Journal of American History. It was there that David A. Hollinger, now chair of the department of history at the University of California at Berkeley, published a long essay called “The Problem of Pragmatism in American History.”

It appeared in 1980. And as of that year, Hollinger declared, it was obvious that “‘pragmatism’ is a concept most American historians have proved that they can get along without. Some non-historians may continue to believe that pragmatism is a distinctive contribution of America to modern civilization and somehow emblematic of America, but few scholarly energies are devoted to the exploration or even the assertion of this belief.”

Almost as an afterthought, Hollinger did mention that Richard Rorty had recently addressed the work of John Dewey from a “vividly contemporary” angle. But this seemed to be a marginal exception to the rule. “If pragmatism has a future,” concluded Hollinger in 1980, “it will probably look very different from the past, and the two may not even share a name.”

Seldom has a comment about the contemporary state of the humanities been overtaken by events so quickly and so thoroughly. Rorty’s Philosophy and the Mirror of Nature (Princeton University Press, 1979) had just been published, and he was finishing the last of the essays to appear in Consequences of Pragmatism (University of Minnesota Press, 1982).

It is not that the revival was purely Rorty's doing; some version of it might have unfolded even without his efforts. In such matters, the pendulum does tend to swing.

But Rorty's suggestion that John Dewey, Martin Heidegger, and Ludwig Wittgenstein were the three major philosophers of the century, and should be discussed together -- this was counterintuitive, to put it mildly. It created excitement that blazed across disciplinary boundaries, and even carried pragmatism out of the provinces and into international conversation. I'm not sure how long Hollinger's point that pragmatism was disappearing from textbooks on American intellectual history held true. But scholarship on the original pragmatists was growing within a few years, and anyone trying to catch up with the historiography now will soon find his or her eyeballs sorely tested.

In 1998, Morris Dickstein, a senior fellow at the City University of New York Graduate Center, edited a collection of papers called The Revival of Pragmatism: New Essays on Social Thought, Law, and Culture (Duke University Press) -- one of the contributors to it being, no surprise, Richard Rorty. “I’m really grieved,” Dickstein told me on Monday. “Rorty evolved from a philosopher into a mensch.... His respect for his critics, without yielding much ground to them, went well with his complete lack of pretension as a person.”

In an e-mail note, he offered an overview of Rorty that was sympathetic though not uncritical.

“To my mind," Dickstein wrote, "he was the only intellectual who gave postmodern relativism a plausible cast, and he was certainly the only one who combined it with Dissent-style social democratic politics. He admired Derrida and Davidson, Irving Howe and Harold Bloom, and told philosophers to start reading literary criticism. His turn from analytic philosophy to his own brand of pragmatism was a seminal moment in modern cultural discourse, especially because his neopragmatism was rooted in the 'linguistic turn' of analytic philosophy. His role in the Dewey revival was tremendously influential even though Dewey scholars universally felt that it was his own construction. His influence on younger intellectuals like Louis Menand and David Bromwich was very great and, to his credit, he earned the undying enmity of hard leftists who made him a bugaboo."

The philosopher "had a blind side when it came to religion," continued Dickstein, "and he tended to think of science as yet another religion, with its faith in empirical objectivity. But it's impossible to write about issues of truth or objectivity today without somehow bouncing off his work, as Simon Blackburn and Bernard Williams both did in their very good books on the subject. I liked him personally: he was generous with his time and always civil with opponents.”

A recent essay challenges the idea that Rorty “had a blind side when it came to religion.” Writing in Dissent, Casey Nelson Blake, a professor of history and American studies at Columbia University, notes that Rorty “in recent years stepped back from his early atheist pronouncements, describing his current position as ‘anti-clerical,’ and he has begun to explore, with increasing sympathy and insight, the social Christianity that his grandfather Walter Rauschenbusch championed a century ago.”

Blake quotes a comment by Rorty from The Future of Religion, an exchange with the Catholic philosopher Gianni Vattimo that Columbia University Press published in 2005. (It comes out in paperback this summer.)

“My sense of the holy,” wrote Rorty, “insofar as I have one, is bound up with the hope that someday, any millennium now, my remote descendants will live in a global civilization in which love is pretty much the only law. In such a society, communication would be domination-free, class and caste would be unknown, hierarchy would be a matter of temporary pragmatic convenience, and power would be entirely at the disposal of the free agreement of a literate and well-educated electorate.”

I'm not sure whether that counts as a religious vision, by most standards. But it certainly qualifies as something that requires a lot of faith.

Two items of great interest came to my attention too late to include in this column. One is the final interview with Rorty, conducted by Danny Postel just before the philosopher's death. The other is a tribute to Rorty by Jürgen Habermas.

Scott McLemee

An Anti-Progressive Syllabus

The first anthology of criticism I read in college was a low-budget volume edited by David Lodge entitled 20th-Century Literary Criticism. It was for an undergraduate class, the first one that spotlighted interpretation and opened a window onto graduate topics. A year later, this time an M.A. student at the University of California at Los Angeles, I took a required course on literary theory, with the anthology Critical Theory Since Plato (1971) edited by Hazard Adams. In a seminar not long after we toiled through Critical Theory Since 1965 (1986), edited by Adams and Leroy Searle, and another class selected Contemporary Literary Criticism: Literary and Cultural Studies (1989), edited by Ron Schleifer and Robert Con Davis. After I left graduate school, more literary/cultural criticism anthologies appeared along with various dictionaries and encyclopedias. The process seems to have culminated in The Norton Anthology of Theory and Criticism (ed. Vincent Leitch et al), whose publication in 2001 was momentous enough to merit a long story by Scott McLemee in The Chronicle of Higher Education that included the remark, “An anthology stamped with the Norton brand name is a sure sign of the field’s triumph in English departments.”

For McLemee to speak of “stamping” and “branding” was apt, more so than he intended, for every anthology assigned in class carries institutional weight. From the higher floors in the Ivory Tower, anthologies may look like mere teaching tools, and editing them amounts to service work, not scholarship. But while professors may overlook them except at course adoption time, for graduate students just entering the professional study of literature and culture, anthologies serve a crucial guiding function. Students apply to graduate school in the humanities because of their reading, the inspiration coming usually from primary texts, not critical works -- Swift not Barthes, Austen not Bhabha. They go into English because they like to read novels, or history because the past intrigues them, or philosophy because they want to figure out the great questions of life. Soon enough, they realize that joy, appreciation, moral musing, and basic erudition don’t cut it, and the first year or two entails an adjustment in aim and focus. The discourse is more advanced and specialized, critical and ironic. New and exotic terms emerge -- “hyperreal,” “hegemony,” “postcolonial” -- and differences between contemporary academic schools of thought matter more than differences between, say, one epoch and another.

Fresh students need help. What the anthologies do is supply them with a next-level reading list. The tables of contents provide the names to know, texts to scan, topics to absorb. In spite of the radical and provocative nature of many entries, the volumes mark a canon formation, a curriculum-building activity necessary for doctoral training. Plowing through them is not only a course of study but also a mode of professionalization, a way to join the conversation of advanced colleagues. As tutelage in up-to-date thinking, they strive for coverage, and to help young readers take it all in, they arrange the entries by chronology and by different categories. The Norton, for instance, contains an “Alternative Table of Contents” that divides contributors up by 42 classifications including “The Vernacular and Nationhood,” “Gay and Lesbian Criticism and Queer Theory,” and “The Body.”

As a poor and insecure 25-year-old in the mid-80s, I slogged through the selections one by one, and I thought that completing them would acquaint me with every respectable and serious current thread in literary and cultural thinking. But when I look back at them today, the anthologies look a lot less comprehensive. In fact, in one important aspect, they appear rather narrow and depleted. The problem lies in the sizable portion of the contributions that bear a polemical or political thrust. These pieces don’t pose a new model of interpretation, redefine terms, outline a theory, or sharpen disciplinary methods. Instead, they incorporate political themes into humanistic study, emphasize race/class/gender/sexuality topics, and challenge customary institutions of scholarly practice. When they do broach analytical methods, they do so with larger social and political goals in mind.

The problem isn’t the inclusion of sociopolitical forensics per se. Rather, it is that the selections fall squarely on the left side of the ideological spectrum. They are all more or less radically progressivist. They trade in group identities and dismantle bourgeois norms. They advocate feminist perspectives and race consciousness. They highlight the marginalized, the repressed, the counter-hegemonic. And they eagerly undo disciplinary structures that formed in the first half of the 20th century.

Reading through these titles (in the Norton: “On the Abolition of the English Department,” “Enforcing Normalcy,” “Talking Black,” “Compulsory Heterosexuality and Lesbian Existence,” etc.), one would think that all decent contemporary criticism stems from adversarial leftist impulses. There is nothing here to represent the conservative take on high/low distinctions, or its belief that without stable and limited cultural traditions a society turns vulgar and incoherent. Nothing from the libertarian side about how group identities threaten the moral health of individuals, or how revolutionary dreams lead to dystopic policies. The neoconservative analysis of the social and economic consequences of 1960s countercultural attitudes doesn’t even exist.

And yet, outside the anthologies and beyond the campus, these outlooks have influenced public policy at the highest levels. Their endurance in public life is a rebuke to the humanities reading list, and it recasts the putative sophistication of the curriculum into its opposite: campus parochialism. The damage it does to humanities students can last a lifetime, and I’ve run into far too many intelligent and active colleagues who can rattle off phrases from “What Is an Author?” and Gender Trouble, but who stare blankly at the mention of The Public Interest and A Nation at Risk.

This is a one-sided education, and the reading list needs to expand. To that end, here are a few texts to add to this fall’s syllabus. They reflect a mixture of liberal, libertarian, conservative, and neoconservative positions, and they serve an essential purpose: to broaden humanistic training and introduce students to the full range of commentary on cultural values and experience.

  • T.E. Hulme, “Romanticism and Classicism” (first published 1924). This essay remains a standard in Anglo-American modernist fields, but it seems to have disappeared from general surveys of criticism. Still, the distinctions Hulme draws illuminate fundamental fissures between conservative and progressive standpoints, even though he labels them romantic and classical. “Here is the root of romanticism: that man, the individual, is an infinite reservoir of possibilities; and if you can so rearrange society by the destruction of oppressive order then these possibilities will have a chance and you will get progress,” he says. The classicist believes the opposite: “Man is an extraordinarily fixed and limited animal whose nature is absolutely constant. It is only by tradition and organization that anything decent can be got out of him.” That distinction is a good start for any lecture on political criticism.
  • T.S. Eliot, “Tradition and the Individual Talent” (1919). Eliot’s little essay remains in all the anthologies, but its central point about the meaning of tradition often goes overlooked. Teachers need to expound why tradition matters so much to conservative thinkers before they explain why progressives regard it as suspect. Furthermore, their students need to understand it, for tradition is one of the few ideas that might help young people get a handle on the youth culture that bombards them daily and nightly. They need examples, too, and the most relevant traditionalist for them I’ve found so far is the Philip Seymour Hoffman character (“Lester Bangs”) in the popular film Almost Famous.
  • F.A. Hayek, The Counter-Revolution of Science (U.S. edition, 1952). Most people interested in Hayek go to The Road to Serfdom, but the chapters in Counter-Revolution lay out in more deliberate sequence the cardinal principles behind his philosophy. They include 1) the knowledge and information that producers and consumers bring to markets can never be collected and implemented by a single individual or “planning body”; and 2) local customs and creeds contain values and truths that are not entirely available to “conscious reason,” but should be respected nonetheless. Such conceptions explain why in 1979 Michel Foucault advised students to read Hayek and other “neoliberals” if they want to understand why people resist the will of the State. We should follow Foucault’s advice.
  • Leo Strauss, “What Is Liberal Education?” (1959). For introductory theory/criticism classes, forget Strauss and his relation to the neoconservatives. Assign this essay as both a reflection on mass culture and a tone-setter for academic labor. On mass culture and democracy, let the egalitarians respond to this: “Liberal education is the necessary endeavor to found an aristocracy within democratic mass society. Liberal education reminds those members of a mass democracy who have ears to hear, of human greatness.” And on tone, let the screen-obsessed minds of the students consider this: “life is too short to live with any but the greatest books.”
  • Raymond Aron, The Opium of the Intellectuals (English trans. 1957). Aron’s long diagnosis of the intellectual mindset remains almost as applicable today as it was during the Cold War. Why are Western intellectuals “merciless toward the failings of the democracies but ready to tolerate the worst crimes as long as they are committed in the name of the proper doctrines”? he asks, and the answers that emerge unveil some of the sources of resentment and elitism that haunt some quarters of the humanities today.
  • Francis Fukuyama, The End of History and the Last Man (1992). First formulated just as the Berlin Wall came tumbling down, Fukuyama’s thesis sparked enormous admiration and contention as the interpretation of the end of the Cold War. When I’ve urged colleagues to read it, though, they’ve scoffed in disdain. Perhaps they’ll listen to one of their heroes, Jean-Francois Lyotard, who informed people at Emory one afternoon that The End of History was the most significant work of political theory to come out of the United States in years.
  • Irving Kristol, Neoconservatism: The Autobiography of an Idea (1995). With the coming of the Bush administration, the term neoconservative has been tossed and served so promiscuously that reading Kristol’s essay is justified solely as an exercise in clarification. But his analyses of the counterculture, social justice, the “stupid party” (conservatives), and life as a Trotskyist undergraduate in the 1930s are so clear and antithetical to reigning campus ideals that they could be paired with any of a dozen entries in the anthologies to the students’ benefit. Not least of all, they might blunt the aggressive certitude of political culture critics and keep the students from adopting the same attitude.
  • David Horowitz, Radical Son: A Generational Odyssey (1997). Many people will recoil at this choice, which is unfortunate. They should not let their reaction to Horowitz’s campus activism prevent them from appreciating the many virtues of this memoir. It is a sober and moving account of America’s cultural revolution from the moral high points to the sociopathic low points. At the core lies the emotional and ethical toll it took on one of its participants, who displays in all nakedness the pain of abandoning causes that gave his life meaning from childhood to middle age. Students need an alternative to the triumphalist narrative of the Sixties, and this is one of the best.

Professors needn’t espouse a single idea in these books, but as a matter of preparing young people for intelligent discourse inside and outside the academy, they are worthy additions to the syllabus. Consider them, too, a way to spice up the classroom, to make the progressivist orthodoxies look a little less routine, self-assured, and unquestionable. Theory classes have become boring enough these days, and the succession of one progressivist voice after another deadens the brain. A Kristol here and a Hayek there might not only broaden the curriculum, but do something for Said, Sedgwick & Co. that they can’t do for themselves: make them sound interesting once again.

Mark Bauerlein

Mark Bauerlein is professor of English at Emory University.

Pottering Around

“I’m getting ready to work on Harry Potter for a month,” said Laurie Muchnick in late June. She edits the book section of Newsday, a newspaper based on Long Island. We’ve been friends for a decade now (as long as the Potter novels have been published, as coincidence has it) and the conversation was a completely casual one. So I half expected her to emit a sigh or a grumble, or to pause for a beat before adding, “Well, it’s going to feel like a month anyway.”

But no -- she meant it literally, and it didn’t sound like she minded. She’s been rereading the entire series. Since the start of July, Newsday has run one item on Pottermania per day, which is the sort of thing editors do only when firmly convinced that a significant share of the audience will want it. Not all of the paper’s cultural coverage has focused on Harry Potter, of course. But with the latest movie about the young wizard now in the theaters, and the seventh novel due out on July 21 -- and bookies no doubt offering odds on whether Harry lives or dies -- we are talking about a phenomenon now well beyond run-of-the-mill levels of public interest. According to the plan that J.K. Rowling drew up when she began the series, Harry Potter and the Deathly Hallows is supposed to be the very last volume, though skeptics wonder if the lure of a few million dollars more won't inspire some new adventure down the line.

In the years since the author introduced her characters to the public, they have become beloved and meaningful; and not to children only. At present, the catalog of the Library of Congress records 21 volumes of criticism and interpretation on the novels, in six languages. A collection called Harry Potter and International Relations, for example, published by Rowman and Littlefield in 2006, analyzes the significance of Hogwarts, the academy of magical arts at which Harry trains, with respect to the nation-state and geopolitical realism. It also contains an essay (and I swear this is true) called “Quidditch, Imperialism and the Sport-War Intertext.” At least 17 doctoral dissertations and seven master’s theses had been devoted to the Harry Potter books, at least in part, as of last year. Chances are good that all these figures are on the low side.

A confession: I have never read any of the Harry Potter novels nor seen even one of the movies. Aficionados should not take this personally, for it has not been a matter of cultural snobbery or high principle, or even of deliberate policy. It is simply an effect of the scarcity of time -- of hesitation before a body of work that will, in due course, run to some 4,000 pages and (by my estimate) more than 17 hours of film.

On the other hand, I’ve long been intrigued by how certain works of fiction create such powerful force-fields that readers go beyond enthusiasm, developing relationships with characters and their world that prove exceptionally intense, even life-changing. Examples would include C.S. Lewis, Thomas Pynchon, Ayn Rand, and J.R.R. Tolkien. (They are listed in alphabetical order, so no angry letters on slights implied by the sequence, please.)

And one regular product of such fascination is the desire not only to study the fiction ever more closely, but to create works of analysis that, so to speak, map and chronicle the imaginary world. In effect, the fiction creates its own nonfiction supplement.

So it was interesting, though by no means a surprise, to learn that there is an intensive course on Harry Potter at North Georgia College and State University this summer, taught by Brian Jay Corrigan, a professor of English whose more routine area of specialization is Renaissance literature. Students in the course are contributing to an encyclopedia that will cover -- as Corrigan puts it during an email interview -- “the geographic, historic, folkloric, mythic, and all other backgrounds informing the Harry Potter world.” He says an agent is shopping the project around to publishers in New York now.

One encyclopedia of Potteriana is already available. But with the appearance of the final novel, it will soon be out of date, and Corrigan’s effort will presumably have the advantages of closure and retrospective insight. It will also be enormous -- perhaps 250,000 words long, with hundreds of illustrations being prepared to go with the entries.

“After a year and a half in planning and five weeks of class,” Corrigan told me, “the ‘rough’ part of the project, collecting together all the grist, is about three quarters finished. We have already generated nearly 1,500 typed pages (650,000 words). There will be a polishing period that will whittle all of this into a usable format.” He expects that phase to last until the end of the fall semester.

After we discussed the work in progress a bit, I broached some reservations that kept crossing my mind about the whole idea of a course on Harry Potter. It’s not that it seemed like a terrible idea. But a mild ambivalence about it was hard to shake.

On the one hand, it’s hard to gainsay, let alone begrudge, the success of any work of fiction that made reading popular for a whole generation of kids. And it is not hard to appreciate the advantage of giving students a taste of literary scholarship through closer examination of work they already know. As someone admittedly ignorant of the primary materials in question, I picked up some sense of the case to be made for Harry Potter from an essay by Michael Bérubé in the latest issue of The Common Review, which conveys some appreciation for the structural intricacy of the books.

The tightly constructed plots and complex shadings of characterization in Rowling’s work have had a profoundly educational effect on Bérubé’s son Jamie, who has Down syndrome.

“Indeed,” writes Bérubé, “one of the complaints about Rowling’s creations [is] that they are too baroquely plotted, too ‘cloak and dagger and triple reversal with double axel’ ....But it’s astonishing to me that tens of millions of young readers are following Rowling through her five-, seven-, and even nine-hundred page elaborations on the themes of betrayal, bravery, and insupportable loss; it’s all the more astonishing that one of those tens of millions is my own ‘retarded’ child, who wasn’t expected to be capable of following a plot more complicated than that of Chicken Little. And here’s what’s really stunning: Jamie remembers plot details over thousands of pages even though I read the books to him at night, just before he goes to bed, six or seven pages at a time. Well, narrative has been a memory-enhancing device for some time now, ever since bards got paid to chant family genealogies and catalog the ships that laid siege to Troy. But this is just ridiculous.”

So yes, there is something to respect in what J.K. Rowling has achieved. At the same time, isn’t undergraduate education a potentially decisive moment when students ought to be introduced to a wider conception of culture -- something outside the familiar, the readily available, the comfortable?

Last week, reporters from CNN were on the North Georgia campus to film Corrigan’s students as they played a Quidditch match (that being a magical competition well-known to Potter aficionados). The segment will presumably air around the time the final volume of the series is released. It all sounds enjoyable for everyone involved. Yet as I think about it, the ghosts of Matthew Arnold and Theodor Adorno hover nearby. They look pained.

Now, Arnold was a Victorian sage; and Adorno, a relentless Marxian critic of mass culture; and I am guessing that neither of them is familiar with the particular educational challenges involved in teaching undergraduates at North Georgia College and State University during the era of high-speed wireless connectivity. They are out of touch. Still, it seems as if Arnold and Adorno would prefer that kids learn to appreciate forms of cultural creation that will not in any way ever come to the attention of a cable television network.

Corrigan heard me out as I channeled the spirits’ complaints.

“I am a Shakespearean,” he said, “and one of my greatest regrets in my field is the damage done to our historical understanding of his works occasioned by his having been viewed as "base, common, and popular" in his own day. If only some farsighted intellectual had taken that theatre in Southwark seriously and done in Shakespeare's day what we are attempting today, we would all be richer for the experience....Who is to say what is ‘best’ until we first explore, evaluate, and ascertain? Why not allow the culture that is generating the thought also engage in that exploration and evaluation? Surely that is the aim of pedagogy, instilling curiosity while guiding intellect towards informed opinion.”

Corrigan also notes something that is often palpable when people discuss the impending publication of the final Potter novel. The phenomenon began with the appearance of the first volume during the summer of 1997. Millions of kids and their parents have grown up with the series. It has in some sense been a generation-defining experience, the meaning of which is, as yet, impossible fully to unpack. The intense involvement of readers has in part been a matter of the narrative’s open-endedness; but soon that will change.

“It might be said,” as Corrigan puts it, “that we, as a class, are contributing to the scholarship of a future world. Undoubtedly ‘Potter-mania’ will cool, but the cultural phenomenon has been recorded and will be remembered. I am leading a group of people who are currently Harry's age (between 18 and 28 years old). Moreover, they are in Harry's age -- they grew up with Harry. We are creating a fly in amber. Never again will any scholar be able to approach Harry Potter from this perspective, not knowing how it will end.”

Next week, he says, “the story of Harry will have been published for the world to know, and no one will ever again be able to look at these books as we can today.... My students are doing far more than reading seven novels and writing essays on what they think. They are exploring backgrounds that inform this series and along the way are delving into many fields of study. As such, they are learning the interrelatedness of literature with the worlds of thought that inform the idea of a university.”

There is also the more prosaic sort of instruction that goes with preparing an encyclopedia. Corrigan says his class is acquiring “the practical skills of working to a real deadline, editing, and dealing with ‘real world’ situations such as slow contract negotiations and the minutiae of New York publishing houses. For a group of English students, many of whom [are] interested in publishing, this is invaluable internship experience.”

The specters listen quietly, but they look skeptical. Matthew Arnold wants to point out that, after all, we do not continue to read Shakespeare because he was once popular in his day. Theodor Adorno is annoyed by the expression “invaluable internship experience” (evidently it sounds really bad in German) and starts to mutter about preserving the difference between the liberal arts and vocational training.

On Corrigan’s behalf, I argue in defense of his points. Aware that this can only mean I am talking to myself again, I decide it is a good idea to check in with Laurie Muchnick at Newsday to find out how the “month of Potter” is going.

She mentions that she’s had a reporter looking into how bookstores have handled security, since nobody is supposed to have access to the books until midnight on July 21. “Nobody” includes reviewers. The publisher, Scholastic, doesn’t bother sending out the books, since kids don’t care what the critics think. So Muchnick expects to be at the store that night to pick up her reserved copy.

The newspaper has been publishing short pieces by readers on what they expect to happen in the final volume -- a matter, not of pure imagination, but of deduction from the previous six volumes. By next week, though, all mysteries will be resolved.

When I tell her about Corrigan’s course, Muchnick says she can see the possible pedagogical value, but wonders if the moment might not be passing soon.

“I’ve been rereading all of the books,” she says, “and it’s been really impressive to see how carefully Rowling has structured them. There are clues to things happening later that are embedded in the earlier volumes. I can see how they would merit a sort of close reading, the old New Critical approach of looking really closely at the text to see what is going on in it.”

That is, in effect, what fans have done with their time between books -- trying to figure out what comes next by reading between the available lines. Interpretation has been a way of continuing one’s involvement in the text while waiting for the next installment.

But the relationship between analysis and anticipation will soon change. “Will people still be as interested in hunting for clues once they know that the answers are actually available?” asks Muchnick. “I just don’t know. People will still enjoy the books, but probably not in the same way.”

Scott McLemee

Jane Austen, Yadda, Yadda, Yadda

Recently I was cornered by a university employee who knows I’m a scholar of British literature, specializing in Jane Austen.

“I started Pride and Prejudice last week,” he told me. “It’s one of those books I know I should have read, but I couldn’t get past the first few chapters.”

“Really,” I replied, eyebrows raised.

“Yeah, I just lost interest,” he went on. “I kept thinking to myself, ‘Oh, brother. I think I know where this is going.’”

Was this disarming honesty or throwing down the gauntlet? Was I being called out? Whatever it was, I shifted nervously as I listened to the rest of his monologue: “My theory is that the novel can be pretty much summed up as Elizabeth and Darcy meet, Elizabeth and Darcy hate each other, Elizabeth and Darcy fall in love, yadda, yadda, yadda.”

Reader, I stared at him blankly. Of course, I spent hours afterward constructing witty, cynical comebacks, such as “Yeah, I know what you mean. I have that response to episodes of VH1’s 'Behind the Music' and to reading the Bible.” But in the moment, all I managed to spit out was something clichéd and professorial resembling, “Hmm. That’s interesting. I think maybe it takes a few readings of Austen to really appreciate her fiction’s depth, humor, and irony.”

That’s also my stock answer to traditional-aged undergraduates on the first day of class -- 20-year-olds who confess that they’ve signed up for a literature class on Austen and her contemporaries because they absolutely love (or absolutely hate) her fiction -- or maybe just the film adaptations. Or Colin Firth or Keira Knightley or Clueless. The Austen-haters often claim to be taking the course because they want to understand what in the world is the big deal. A few of them end up seeing it by the end of the semester, a few more don’t, and that’s fine. But the yadda-yadda-yadda employee was a well-read, middle-aged guy with no sophomore excuse for being sophomoric. My gut reaction to his confession registered somewhere between crestfallen and incensed.

I'm having a similarly mixed reaction to the latest wave of Austen mania in the U.S. and U.K., shifting nervously as I approach it with a combination of anxiety and dread. I know that all English professors worth their salt should be constructing some theories and responses now, in advance of being cornered by colleagues and co-workers and co-eds, so as not to have to resort to the professorial and clichéd. What will we say when asked about Anne Hathaway’s Becoming Jane (2007); about the upcoming film of The Jane Austen Book Club, with its star-studded cast; or about PBS’s planned 10-week winter 2008 airing of the Complete Jane Austen on "Masterpiece Theatre"?

What’s the witty, cynical comeback to this cultural flowering of Austen-related stuff, I find myself wondering: “Can’t wait to see it!” “Wish I’d thought of it first!” “The Decline and Fall of Austen’s Empire.” “A tippet in the hand is worth two in the bush.” “A stitch in the huswife saves nine.” “Don’t look a gift pianoforte in the mouth”?

But along with such repartee, we’ll also need to ready weightier observations. First, I believe it’s imperative that we call a moratorium on starting sentences with “It is a truth universally acknowledged,” as in, “It is a truth universally acknowledged that this is the first time in television history Austen’s complete works have been aired in succession.” In the coming months we will no doubt suffer through dozens of newspaper and magazine articles beginning, “It is a truth universally acknowledged.” Best not to add to the collective torture.

In addition, when constructing our soundbites, we ought not to forget the sheer breadth of today’s Austen craze; it’s more than just films and television adaptations we’re in for. New books have appeared, too, like Confessions of a Jane Austen Addict (2007) and Jane Austen for Dummies (2006). Though I worry that these books make reading her fiction sound like something done at an Alcoholics Anonymous meeting for slow learners, surely it’s not too late for some well-placed damage control?

After all, the Austen-inspired publicity stunts are already in full swing. Perhaps you’ve heard about the kerfuffle that unfolded over the pond, “Jane Austen Rejected!” Thinly veiled versions of Austen’s fiction were sent out to British publishers as new work, under the name of Allison Laydee (a.k.a. David Lassman), and all were rejected. Even Harlequin Mills & Boon passed on publishing adulterated Jane Austen plots. The horror! The horror!

But isn’t this déjà vu all over again? Please raise your tussy mussy if you remember 10 or so years ago, when we were last inundated with Austen film and TV adaptations; with Bridget Jones novels and films; and with Austen board games, stationery, and editorial cartoons. Everyone then seemed to be asking, “Why Austen? Why now?”

The late 1990s were strange days for us longtime members of the Jane Austen Society of North America. It was as if we no longer had to apologize for indulging in our versions of wearing plastic Spock ears, whether quadrille, or quilling, or merely quizzing. Many of us became instant pundits among our friends, family, and the media, providing copy for everything from the Arkansas Democrat to The Wall Street Journal. Only a few periodicals continued to misspell Jane’s name as Austin, while many more managed to render correctly Bennet, Morland, and Love and Freindship. Oh, those were heady times.

If you were there, then you’ll no doubt recall that we came up with some pretty wild theories to explain the Jane train, too. Remember when Camille Paglia said Austen’s popularity could be explained as a cultural symptom in reaction to the O.J. trial, as people longed for stories in which no one was being butchered? That was a good one. Or how some claimed that the return to Austen was a result of the fin de siècle’s prompting us to take stock and return to works of past centuries? Seems pretty thin now. Others claimed that Austen’s resurgence happened because we needed to measure the worth of our male heroes, from Bill Clinton and Brad Pitt to Kurt Cobain and Ross Perot. (Jane Austen and Ross Perot?)

So here we are, circa 2007, finding ourselves in danger of being asked yet again, “Why Austen? Why now?” How delightful. How frightening. I’m determined not to be caught off guard, so I’ve constructed some all-purpose answers to explain the latest Austen craze, suitable for everything from The Nation to "Larry King Live" to Marie Claire. Anyone struggling for words is, of course, welcome to use these as conversational building blocks:

Option A: “Today’s Austen mania is a form of cultural compensation for the disaster of the Iraq War and for the genocide in Darfur. Her novels offer us a way to forget the world’s evils by allowing us to travel back to those halcyon post-French Revolutionary days of Napoleon.”

Option B: “Austen’s timeless narratives of women’s romantic searching provide a welcome distraction from the Supreme Court’s rolling back of abortion rights, as we yearn for an era when many women had the power to refuse a proposal of marriage.”

Option C: “Austen’s newfound popularity signals that empire-waist frocks are due for a fashion revival; that irony, having been shunned after 9/11, is back and better than ever; and that Wal-Mart will roll back prices on its imported teas.”

This list is just a draft of talking points. I still have a few more ideas to work out. For instance, can it be an accident that Austen’s popularity is surging, just as Jane magazine has gone defunct? There is certainly a quotable quip in the making there. Even if we don’t perfect our theories in the coming months, I don’t think there should be much cause for worry. Check back with me in 2013, the 200th anniversary of Pride and Prejudice’s publication. Oh, brother. I think I know where this is going.

Devoney Looser

Devoney Looser is associate professor of English at the University of Missouri at Columbia, and the author of British Women Writers and the Writing of History (Johns Hopkins University Press). She has just completed a book on British women writers and old age, to be published next year.

On Drivel

I recently received a draft of one of my dissertation chapters back from my advisor. As always, he provided copious comments -- advice on improving the coherence of my argument, smoothing out some ungainly syntax, and choosing more appropriate words. My advisor is scrupulous, perhaps excessively so. I have learned a great deal about how to think and write from his comments.

But my advisor is also a tough reader, and I find that after all these years of being a student I am still learning how to take criticism. To wit: in my recent draft, written in bold, red ink is one word that succinctly represents what he thinks of the passage -- “drivel.” I quickly forgot all of the good things he had said about my argument as I focused on this one word, brutally penned in the margin. My incisive points, my elegantly constructed sentences, all reduced to a one-word judgment.

I knew that drivel meant nonsense, but shame prompted me to consult a dictionary. I learned that its meaning was a metaphorical extension of its more literal definition: to let saliva dribble from the mouth. Nothing more vividly represents brazen stupidity than the image of someone drooling. There is something intrinsically repulsive about the act of drooling and as I thought about how that metaphor might apply to my writing, I literally gave a small shudder. Ouch! Was my prose the equivalent of drivel? Analogous to an unconscious trickle of spit?

Yes. My advisor was right. What I had written was drivel. The passage didn’t meaningfully contribute to the argument. In fact, it didn’t seem to be saying much of anything. When I looked at the passage more closely, I saw that it was largely composed of a loosely stitched together sequence of conventional phrases: “it is the fact that,” “of course,” “indeed, he goes on to argue,” and “on the one hand,” “on the other hand.” It was the utter conventionality of the writing that made it drivel. The passage represented writing on auto-pilot, requiring little to no consciousness on my part. I might as well have been slobbering onto the page. Somewhere behind all the nonsense, I had an idea, but what it was I could not say. Responding to the simple, severe remark felt something like going through the stages of grief. I moved from denial (“surely it’s not that bad”) through anger (“what nerve!”) and toward acceptance (“yup, it’s bad”).

I thought about this experience in the context of my own work. I teach writing and literature at Salt Lake Community College, and every semester I comment on student papers. I identify flaws in their reasoning, give advice on style and punctuation, and even point out when they’ve made an original point or turned a neat phrase. I have never written the word drivel in the margin of one of my student’s papers, but I have been tempted to do so on more than one occasion. I believe almost every writing teacher has felt the impulse to heap ferocious criticism on students. Those who haven’t are far more saintly than I.

I suppose that after I achieved acceptance, a feeling of admiration followed. By God, I wish I had the guts to write the word drivel in the margin of a student paper! Of course, I don’t include these temptations in the class of my finer instincts. The temptation is more on par, I think, with the cheap thrill I get when an action-hero utters a powerful one-liner. Sometimes I just want to be the Arnold Schwarzenegger of writing teachers. But I am not Arnold, nor was meant to be.

The experience prompted me to entertain some more serious lessons about how my time as a graduate student may translate into my work as a teacher. As a teacher of writing, it’s good to be put in the position of student writer, to experience all of the fear, anxiety, and hopefulness that go into producing a piece of writing that will be judged by an authority figure. It is both humbling and instructive to be told that you are wanting, that what you’ve produced isn’t up to par. Being both a student and a teacher has made me more sympathetic to my students. I know what it feels like to be criticized and I am more likely to consider the consequences of harsh feedback. In other words, it’s a way of inoculating myself against my adolescent, writing-teacher-as-action-hero fantasies. My experience speaks to the benefit of occasionally subjecting ourselves to the rituals of performance and assessment that we ask our students to undergo. We do this, of course, with conferences and papers. Becoming an active participant in disciplinary conversations not only helps me build on my knowledge in the field, it makes me feel like a student all over again.

Yet I am reminded that criticism is a form of praise. My advisor cares enough to call my writing drivel when he sees it, not because he thinks I’m stupid but rather because he believes I am capable of producing something better than drivel. I did not ultimately wilt at the word. I do not believe that I possess a special inner strength that makes me uniquely capable of withstanding severe criticism. Perhaps, then, we are not harsh enough with our students; perhaps, in our well-meaning effort to encourage them, we end up being less than honest with them.

But maybe there are no life lessons to be drawn from drivel. Drivel is irredeemable. One can’t turn around and reclaim drivel. Never, we can hope, will there be an endowed chair of Drivel Studies. And I don’t believe that drivel is one of those terms that one can, with a bit of vernacular judo, turn on its head. Can I imagine my son saying in his teenage years, “That’s so drivel! It’s wicked drivel!”

There is finally no way around drivel. I find that I am refreshed by the honesty of the term. It reminds me of the uncomfortable fact that my interaction with students will always be structured around criticism, though we sometimes attempt to disguise this basic fact. I think students sometimes understand this better than we do.

Jason Pickavance

Jason Pickavance is an assistant professor of English at Salt Lake Community College and a graduate student in American studies at the University of Utah. Despite his occasional lapses into drivel, he plans on defending his dissertation this year.

