This morning I received an e-mail from a new colleague of mine about some workshop topics on writing. I met her last week at the massive Conference on College Composition and Communication (CCCC), in New York City. We’re both new members of the executive board of the Assembly for Expanded Perspectives on Learning (AEPL), a National Council of Teachers of English (NCTE) affiliate organization that is interested in promoting teaching and learning beyond traditional disciplines and methodologies.
She’s the recently elected associate chair trying to brainstorm some ideas for upcoming workshops and conferences. I’m the new treasurer trying to get my Excel columns to add up right.
At our meeting in the escalatored bowels of the Manhattan Hilton, the board agreed that the 2008 workshop would be titled “The Rhetorical Art of Reflection,” but in her e-mail today to me and the other board members, she suggested that the 2009 workshop might be on a topic related to the connections between music and writing.
This e-mail popped up as I was sitting here at my laptop in my university office listening to Van Morrison's album “Tupelo Honey” and writing copy for a Web site for our recently approved general education program and curriculum.
I wrote back to her wondering if anyone else like me had this kind of continual digital soundtrack running through their media players while tapping along on their keyboards and wristing red laser mouse pods. I thought it would be interesting to find out what other folks listen to when they write, headphoned or not. I also recommended a new income-generating idea for our little AEPL assemblage: a CD collection of greatest hits for writing, recommended by the usual galaxy of comp/rhet stars. Hey, Peter Elbow! What are you listening to? Cheryl Glenn? Raul Sanchez?
My preferences for writing of course are situational, just like they should be for any good rhetorician. As I’m writing this essay, I’m listening to “Ethiopiques, Vol. 4: Ethio Jazz & Musique Instrumentale, 1969-1974” by musician-arranger Mulatu Astatqe. My daughter sent it to me last year, and I ripped it immediately into my playlists. Other writing favorites in jazz include “Consummation” by the Thad Jones & Mel Lewis Orchestra, passed on to me by my neighbor Bill, Lionel Hampton’s “Mostly Ballads” and “Mostly Blues,” and some other favorites from the early 70’s: Keith Jarrett’s “The Köln Concert,” and “The Colours of Chlöe” by Eberhard Weber.
Here at my desk with the tangle of wires running from the scanner, printer, PDA cradle, and leftover Gateway 2000 speakers, I start off the day usually with something to get the blood moving, like Los Pregoneros Del Puerto and their traditional music of Veracruz, Paco de Lucia’s “Anthologia Vol. 1,” or that dobro-infused live double play by Alison Krauss and Union Station.
Or if I’m particularly stressed out and need to write and relax, I click on “Union” or “Devotion” by Rasa, R. Carlos Nakai’s “Cycles, Vol. 2,” or Clannad’s “Landmarks.”
But if I’m just chugging along during the day, I go to the old faithfuls: the soundtrack from Ken Burns’ “Lewis and Clark: The Journey of the Corps of Discovery,” Mary Chapin Carpenter’s “Stones in the Road,” Dylan’s “Blood on the Tracks,” some Puccini or Neil Young’s “Comes a Time.”
Given the slice-and-dice, randomized nature of iTunes and Napster, I realize that speaking of music in terms of albums is very old school, but the extended play of 50 to 60 minutes, tune after tune, fits my writing rhythm pretty well. Once a playlist is over, I know it’s time to take a break, push away from my desk, stand up and lean back to stretch out my stiff back, wander out into the hallway of that other world, or walk downstairs and check my campus mailbox to see what junk I can toss into the recycling bins nearby.
When I was a longhaired college kid, I had Crosby, Stills, and Nash, Marvin Gaye, Cat Stevens, and Joni Mitchell in pretty much constant rotation on my scratchy stereo, one skewered vinyl dropping down on the next until it was time to flip the stack over again. In those days, I was listening for lyrics and rhyme as much as anything, thinking I was a writer in the company of writers who also happen to play music. These days I’m listening for melody and rhythm as much as anything, thinking I’m a writer in the company of musicians who also happen to keep me writing.
I guess I don’t know if a workshop on music and writing is such a good idea after all. Right now I’m thinking it would be just about as useful as any other workshop on the preferences folks have about writing: pencil vs. pen, medium vs. fine tip, black vs. blue, laptop vs. desktop, blank pad vs. college-ruled vs. yellow legal pad, at the desk vs. in bed, PC vs. Mac, Bach vs. Mozart. Seems all too personal, finicky, and idiosyncratic to me. Kind of like writing, if you know what I mean.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
Longtime readers of Intellectual Affairs may recall that this column occasionally indulges in reference-book nerdery. So it was a pleasant but appropriate surprise when the Bodleian Library of the University of Oxford provided a copy of its new edition of the very first dictionary of the English language. It has been out of print for almost 400 years, and the Bodleian is now home to the one known copy of it to have survived.
Available now as The First English Dictionary, 1604 (distributed by the University of Chicago Press), the work was originally published under the title A Table Alphabeticall. It was compiled in the late 16th century by one Robert Cawdrey. The book did not bring him fame or fortune, but it went through at least two revised editions within a decade. That suggests there must have been a market for Cawdrey’s guide to what the title page called the “hard usuall English wordes” that readers sometimes encountered “in Scripture, Sermons, or elswhere.”
Cawdrey had the misfortune, unlike fellow lexicographer Samuel Johnson, of never meeting his Boswell. Yet he had an eventful career – enough to allow for a small field of Cawdrey studies. An interesting introduction by John Simpson, the chief editor of the Oxford English Dictionary, sums up what is known about Cawdrey and suggests ways in which his dictionary may contain echoes of his life and times.
At the risk of being overly present-minded, there’s a sense in which Cawdrey was a pioneer in dealing with the effects of his era’s information explosion. Thanks to the printing press, the English language was undergoing a kind of mutation in the 16th century.
New words began to circulate in the uncharted zone between common usage and the cosmopolitan lingo of sophisticated urbanites who traveled widely. Learned gentlemen were traveling to France and Italy and coming back “to powder their talk with over-sea language,” as Cawdrey noted. Some kinds of “academicke” language (glossed by Cawdrey as “of the sect of wise and learned men”) were gaining wider usage. And readers were encountering unfamiliar words like “crocodile” and “akekorn.” Cawdrey’s terse definitions of them as “beast” and “fruit,” respectively, suggest he probably had seen neither.
Booksellers had offered lexicons of ancient and foreign languages. And there were handbooks explaining the meaning of specialized jargon, such as that used by lawyers. But it was Cawdrey’s bright idea that you might need to be able to translate new-fangled English into a more familiar set of “plaine English words.”
Cawdrey also found himself in the position of needing to explain his operating system. “To profit by this Table,” as he informed the “gentle Reader” in a note, “thou must learn the Alphabet, to wit, the order of the Letters as they stand ... and where every Letter standeth.” Furthermore, you really needed to have it down cold. A word beginning with the letters “ca,” he noted, would appear earlier than one starting with “cu.” After using the “Table” for a while, you probably got the hang of it.
Who was this orderly innovator? Cawdrey, born in the middle of England sometime in the final years of Henry VIII, seems not to have attended Oxford or Cambridge. But he was learned enough to teach and to preach, and came to enjoy the patronage of a minister to Queen Elizabeth. He married, and raised a brood of eight children. In a preface to the dictionary, Cawdrey acknowledges the assistance of “my sonne Thomas, who now is Schoolmaister in London.”
Cawdrey published volumes on religious instruction and on the proper way to run a household so that each person knew his or her proper place. He also compiled “A Treasurie or store-house of similies both pleasant, delightfull, and profitable, for all estates of men in generall.” (Such verbosity was quite typical of book titles at the time. The full title page for his dictionary runs to about two paragraphs.)
Whatever his chances for mobility and modest renown within the Elizabethan intelligentsia, however, they were severely limited by his strong religious convictions. For Cawdrey was a Puritan – that is, someone convinced that too many of the old Roman Catholic ways still clung to the Church of England.
Curious whether “Puritan” (a neologism with controversial overtones) appeared in the dictionary, I looked it up. It isn’t there. But Cawdrey does have “purifie,” meaning “purge, scoure, or make cleane” -- which is soon followed by “putrifie, to waxe rotten, or corrupted as a sore.” By the 1580s, Cawdrey had both words very much in mind when he spoke from the pulpit. When he was called before church authorities, one of the complaints was that he had given a sermon in which he had “depraved the Book of Common Prayer, saying, That the same was a Vile Book and Fy upon it.” He was stripped of his position as minister.
But Cawdrey did not give up without a fight. He appealed the sentence, making almost two dozen trips to London to argue that it was invalid under church law. All to no avail. He ignored hints from well-placed friends that he might get his job back by at least seeming to go along with the authorities on some points. For that matter, he continued to sign his letters as if he were the legitimate pastor of his town.
No doubt Cawdrey retained a following within the Puritan underground, but he presumably had to go back to teaching to earn a living. Details about his final years are few. It isn’t even clear when Cawdrey died. He would have been approaching 70 when his dictionary appeared, and references in reprints of his books a few years later imply that they were revised posthumously.
In his introductory essay, John Simpson points out that the OED now lists 60,000 words that are known to have been in use in English around the year 1600. Cawdrey defines about 2,500 of them. “We should probably assume that he was unable to include as many words as he would have liked,” writes Simpson, “in order to keep his book within bounds. It was, after all, an exploratory venture.”
But that makes the selection all the more interesting. It gives you a notion of what counted as a “hard word” at the time. Most of them are familiar now from ordinary usage, though not always in quite the sense that Cawdrey indicates. He gives the meaning of “decision” as “cutting away,” for example. Tones of the preacher can be heard in his slightly puzzling definition of “curiositie” as “picked diligence, greater carefulnes, then is seemly or necessarie.”
Given his Puritan leanings, it is interesting to see that the word “libertine” has no specifically erotic overtones for Cawdrey. He defines the word as applying to those “loose in religion, one that thinks he may doe as he listeth.” One of the longest entries is for “incest,” explained as “unlawfull copulation of man and woman within the degrees of kinred, or alliance, forbidden by Gods law, whether it be in marriage or otherwise.”
It is a commonplace of much recent scholarship that, prior to the mania for categorizing varieties of sexual desire that emerged in the 19th century, the word “sodomy” covered a wide range of non-procreative acts, heterosexual as well as homosexual. Cawdrey, it seems, didn’t get the memo. He defines “sodomitrie” as “when one man lyeth filthylie with another man.” Conversely, and rather more puzzling, is his definition of “buggerie” (which one might assume to be a slang term for a rather specific act) as “conjunction with one of the same kinde, or of men with beasts.”
In a few entries, one detects references to Cawdrey’s drawn-out legal struggle of the 1580s and '90s. He explains that a "rejoinder" is “a thing added afterwards, or is when the defendant maketh answere to the replication of the plaintife.” So a rejoinder is a response, perhaps, to “sophistikation” which Cawdrey defines as “a cavilling, deceitful speech.”
Especially pointed and poignant is the entry for “temporise,” meaning “to serve the time, or to follow the fashions and behaviour of the time.” Say what you will about Puritan crankiness, but Robert Cawdrey did not “temporise.”
Particularly interesting to note are entries hinting at how the “new information infrastructure” (circa 1600) was affecting language. The expense of producing and distributing literature was going down. “Literature,” by the way, is defined by Cawdrey here as “learning.” Cawdrey includes a bit of scholarly jargon, “abstract,” which he explains means “drawne away from another: a litle booke or volume prepared out of a greater.”
Some of the words starting to drift into the ken of ordinary readers were derived from Greek, such as “democracie, a common-wealth gouerned by the people” and “monopolie, a license that none shall buy and sell a thing, but one alone.” Likewise with terms from the learned art of rhetoric such as “metaphor,” defined as "similitude, or the putting over of a word from his proper and naturall signification, to a forraine or unproper signification.”
Cawdrey’s opening address “To the Reader” is a manifesto for the Puritan plain style. Anyone seeking “to speak publiquely before the ignorant people,” he insists, should “bee admonished that they never affect any strange inkhorne termes, but labour to speake so as is commonly received, and so as the most ignorant may well understand them.”
At the same time, some of the fancier words were catching on. The purpose of the dictionary was to fill in the gap between language that “Ladies, Gentlewomen, or any other unskilfull persons” might encounter in their reading and what they could readily understand. (At this point, one would certainly like to know whether Cawdrey taught his own three daughters how to read.) Apart from its importance to the history of lexicography, this pioneering reference work remains interesting as an early effort to strike a balance between innovation and accessibility in language use.
“Some men seek so far for outlandish English,” the old Puritan divine complains, “that they forget altogether their mothers language, so that if some of their mothers were alive, they were not able to tell, or understand what they say.” Oh Robert Cawdrey, that thou shouldst be alive at this hour!
This last fall I attended the 2006 TYCA-West conference. It was held in beautiful Park City, Utah, in October. About 60 two-year college English faculty, graduate students, and even some university professors gathered to discuss the study and practice of teaching English. It’s hard to imagine a more beautiful setting than Park City in the fall. The crisp mountain air and the burnt orange and red scrub oak painting the surrounding mountains lend a….
What? You mean to tell me that you’ve never heard of TYCA-West? TYCA is the Two-Year College English Association, a group within the National Council of Teachers of English. It comprises seven regions, and each region holds an annual conference. TYCA-West is the regional organization that includes Utah, Idaho, Nevada, Arizona, Wyoming, and (who would have guessed) Hawaii.
Let’s be honest, as far as conferences go, it’s difficult to imagine a less prestigious conference than a regional two-year college English conference. You aren’t likely to rub shoulders with star scholars in the field. Nor will you encounter presentations that will help you sort through the talked-about new book or intellectual movement of the year. For that, go to MLA. I’m not against the big conference. But I’ve come to appreciate the strengths of the small conference, and for professors dedicated to teaching, regional conferences may in fact be more valuable and more rewarding than higher profile conferences.
At the TYCA-West conference, we tend to focus on practical issues associated with teaching English. This last year, our keynote address was by Sharon Mitchler, the past TYCA-National chair. She addressed the larger economic and demographic trends associated with teaching English in the two-year college. I learned, for instance, that two-year colleges “teach an estimated 50 percent of all college-level composition and 70 percent of all developmental composition courses,” and I learned that “college participation rates among low-income students peaked in 1998 and have been falling since then.” Mitchler’s presentation had a refreshingly empirical cast, something I’m not accustomed to at humanities conferences. But she effectively embedded those facts within a larger argument about how these trends will ultimately determine what we do in the classroom, whether we realize it or not.

At the 2005 TYCA-West conference, we were treated to an excellent presentation by Kathleen Blake Yancey on the changing nature of literacy. It was followed by an engaging and pleasingly cant-free discussion about what we’re currently experiencing in the classroom. Many of the challenges associated with teaching writing persist. Instructors shared stories about how difficult it is to get students to become critical readers and writers. Many instructors, however, pointed to newer trends in writing instruction, like service learning, which offer students more authentic scenarios of composition.
I’ve also formed lasting friendships at TYCA-West. Since becoming involved in TYCA-West, I now know and correspond with faculty members from each of the states within my region, from places like Yavapai College of Arizona, Community College of Southern Nevada, Western Wyoming Community College, and Dixie State College. We share an identity as two-year college English faculty, joined in a common enterprise. As faculty members who share similar economic and demographic challenges, we have also formed a regional identity, something not typically encouraged by the larger conferences. I feel like I have developed an authentic network through my experience at TYCA-West. From Jeff Andelora who teaches at Mesa Community College, I’ve learned about the history of community college English. From Bradley Waltman at the Community College of Southern Nevada, I’ve learned about the challenges associated with placing students in writing courses.
Here’s what you won’t find at TYCA-West or most other smaller, regional conferences. You won’t be subjected to the name-badge-glance-and-turn, a move I’ve always for some reason viewed as akin to a basketball player’s expert pivot. (If only the Utah Jazz center could pivot like that.) Instead, you will encounter colleagues at peer institutions genuinely interested to meet you and hear what you have to say.
Neither will you attend presentations obviously constructed for the sole purpose of CV fodder. No counterintuitive readings of canonical texts that strain credulity. No impotent counter-hegemonic posturing. Presentations tilt toward the practical rather than the theoretical. Believe it or not, two-year college English professors are interested in theory, but we typically put theory in the service of practical considerations. In my experience, you are more likely to hear what Joseph Williams called the “So what?” question at smaller conferences. Taken together, the presentations at our TYCA conferences soberly address the perennial challenge of how we get our students to become more effective writers and readers.
Finally, regional conferences are cheap. I briefly considered attending this year’s Conference on College Composition and Communication in New York City. But rooms at the conference hotel are $300 a night and the flight would have cost me about $500 round trip. The total cost of the conference would have easily exceeded $1,500 and, though I am lucky enough to get support from my college to attend conferences, I decided that it just wasn’t worth it. For those faculty members who receive little or no support from their institutions, this year’s 4Cs conference is probably out of reach.
In contrast, let me present, Thoreau-like, the costs of my 2005 TYCA-West in Prescott, Arizona:
Travel $230 (round trip to Phoenix plus a shuttle to and from Prescott).
Hotel (shared room with a colleague) $75.
Conference Registration $140 (included breakfast and lunch on both days of the conference).
Food $75 (including a beer and scrumptious burger at The Saloon, which has a wall-sized painting of Steve McQueen worth seeing).
For around $500 I enjoyed a conference where I connected with professors from the region, went to Prescott for the first time -- a beautiful little college town in the mountains northwest of Phoenix -- and learned a little more about how to become a more effective English teacher. The 2006 TYCA-West conference in Park City was a 30-minute drive from my house.
Regional organizations can languish, though. Anyone who has been involved in the organization and promotion of a regional conference can tell you that it’s sometimes difficult to generate interest and attendance. Because the large, national conferences exert such a big influence over the discipline, it is often a challenge to persuade professors that small conferences are worth their time. After all, what will a presentation at TYCA-West do for your CV? But I am excited about next year’s TYCA-West conference in Las Vegas. (I suggested we adopt the line, “What happens at TYCA-West stays at TYCA-West,” in order to generate greater participation.)
Large conferences will always be important, and I still plan on attending them. But the academic work done by many college professors happens primarily in the classroom. The small conference provides an ideal forum for them to share this important work.
Jason Pickavance is an instructor in the English department at Salt Lake Community College, where he teaches courses in writing and American literature.
“It has been my experience with literary critics and academics in this country,” wrote Kurt Vonnegut in an essay published in 1981, “that clarity looks a lot like laziness and childishness and cheapness to them. Any idea which can be grasped immediately is for them, by definition, something they knew all the time. So it is with literary experimentation, too. If a literary experiment works like a dream, is easy to read and enjoy, the experimenter is a hack. The only way to get full credit as a fearless experimenter is to fail and fail.”
The anger in that statement had been building up for at least a couple of decades. Much of Vonnegut’s early work was classified as science fiction – a filing-cabinet drawer that, as he once put it, academics tended to confuse with a urinal. He was later discovered by people who didn't read science fiction, and most of his books stayed in print. But that just meant he had failed to fail, so the charge of being a hack was still in the air.
In some respects, though, his complaints were already out of date when he made them; for by the early 1980s, there was already a scholarly industry in Vonnegut criticism. It now runs to some three dozen books, not to mention more journal articles than anyone would want to count.
During the original wave of speculation on postmodernism during the 1960s and early ‘70s – when that notion was relatively untheoretical, a label applied to emergent literary tendencies more than the name for some vast cultural problematic – it was very often the work of Kurt Vonnegut that people had in mind as an exemplary instance. Parataxis, metafiction, blurring of the distinction between mass-culture genres and modernistic formal experimentation -- all of this, you found in Vonnegut. His novels were chemically pure samples of the postmodern condition.
And then came the definitive moment documenting Vonnegut’s place in the literary curriculum: the film "Back to School" (1986), in which the author had a cameo role.
In that landmark work, as you may recall, Rodney Dangerfield played Thornton Mellon, a millionaire who returns to college for the educational opportunities involved in partying with coeds in bikinis. When an English professor assigns a paper on Vonnegut’s fiction, Dangerfield hires the novelist himself to write the analysis. The paper receives a failing grade. (Someone in Hollywood must be a fan of Northrop Frye, who once said that whatever else one might say about Wordsworth’s preface to the Lyrical Ballads, as a piece of Wordsworth criticism it only merited a B plus.)
Given such clear evidence of canonization, it was a surprise to notice that a couple of friends responded to the news of Vonnegut’s death last week with slightly embarrassed sadness. Both are graduate students in the humanities. One called his novels a “guilty pleasure.” Another mentioned how much Vonnegut’s work had meant to him “even if he’s not considered that great or serious a writer.”
I suspect that such feelings about Vonnegut are pretty widespread -- that the shelves of secondary literature don’t really quell a certain ambivalence among readers who feel both deep affection for his work and a keen nervousness about his cultural status. Unfortunately Vonnegut did not make things any easier by publishing so many novels that devolved into self-parody. If he had quit after Cat’s Cradle and Slaughterhouse Five, the ratio of wheat to chaff in his fiction would be much more favorable.
But the ambivalence itself is not, I think, a response to the uneven quality of his work -- nor even the product of some misguided notion that a funny author can’t be taken seriously. Rather, the problem may be that Vonnegut is an author one tends to discover in adolescence. Defensiveness about the attachment one feels to his work is, in part, a matter of wanting to protect the part of oneself that seemed to come into being upon first reading him. “I deal with sophomoric questions that full adults regard as settled,” he told an interviewer once.
He had, for example, a large capacity for facing brute contingency as part of human existence. A great deal of life is chance. (The fact that you were born, for example. Think how arbitrary that is.) And much of the rest of life consists of learning to evade that truth – walling it off, away from consciousness, because otherwise the reality of it would be too hard to fathom. Instead, we throw ourselves into fictions of power and belonging: nationalism, militarism, religion, the acquisition of cool stuff. These are ways to contain both the vulnerability before chance and the terrors of loneliness. In Vonnegut’s understanding of the world, loneliness is a fundamental part of human experience that became much, much worse in the United States, somehow, during the second half of the twentieth century – with no particular reason to think it will get better anytime soon.
As contributions to the cultural history of mankind, such thoughts are pretty small beer. On the other hand, just try to escape their implications. To call a point simple is the cheapest and least effective means of gainsaying it.
On Monday, at about the time I sat writing that paragraph about chance and terror and helplessness, someone was walking around a university campus shooting people at random. This was a coincidence. It was chance. That thought is no comfort. As one of the Tralfamadorians says in Slaughterhouse Five: “Well, here we are, Mr. Pilgrim, trapped in the amber of this moment. There is no why.” So it goes.
Vonnegut (who once called himself “a Christ-worshiping agnostic”) drew from the ground truth of existential terror a moral conclusion that it made sense to try to love your neighbor as yourself – or at least to treat other people with radical decency. This sounds simplistic until you actually try doing it.
He was a socialist in the old Midwestern tradition best expressed in a famous statement by Eugene Debs that went: "Years ago I recognized my kinship with all living beings, and I made up my mind that I was not one bit better than the meanest on earth. I said then, and I say now, that while there is a lower class, I am in it, and while there is a criminal element I am of it, and while there is a soul in prison, I am not free." Quoting that was about as close to a theoretical statement as Vonnegut ever got. The rest of his outlook he regarded as common sense.
“Everything I believe,” he said, “I was taught in junior civics during the Great Depression – at School 43 in Indianapolis, with full approval of the school board. School 43 wasn’t a radical school. America was an idealistic, pacifistic nation at that time. I was taught in the sixth grade to be proud that we had a standing Army of just over a hundred thousand men and that the generals had nothing to say about what was done in Washington. I was taught to be proud of that and to pity Europe for having more than a million men under arms and spending all their money on airplanes and tanks. I simply never unlearned junior civics. I still believe in it. I got a very good grade.”
Someone with such attitudes must necessarily be an anachronism, of course, and anachronisms tend to be either funny or sad. His books, at their best, were both. A few of them will survive because they hold those qualities in such beautiful proportion. “Laughter,” as Vonnegut once put it, “is a response to frustration, just as tears are, and it solves nothing, just as tears solve nothing. Laughter or crying is what a human being does when there’s nothing else he can do.”
On April 19, after a day of teaching classes at Shippensburg University, I went out to my car and grabbed a box of old poetry manuscripts from the front seat of my little white beetle and carried it across the street and put it next to the trashcan outside Wright Hall. The poems were from poetry contests I had been judging and the box was heavy. I had previously left my recycling boxes there and they were always picked up and taken away by the trash department.
A young man from ROTC was watching me as I got into my car and drove away. I thought he was looking at my car, which has black flower decals and sometimes inspires strange looks. I later discovered that I, in my dark skin, am sometimes not even a person to the people who look at me. Instead, in spite of my peacefulness, my committed opposition to all aggression and war, I am a threat by my very existence, a threat just living in the world as a Muslim body.
Upon my departure, he called the local police department and told them a man of Middle Eastern descent driving a heavily decaled white Beetle with out-of-state plates and no campus parking sticker had just placed a box next to the trash can. My car has New York State plates, but he got the rest of it wrong. I have two stickers on my car. One is my highly visible faculty parking sticker and the other, which I just don't have the heart to take off these days, says "Kerry/Edwards: For a Stronger America."
Because of my recycling the bomb squad came, the state police came. Because of my recycling buildings were evacuated, classes were canceled, campus was closed. No. Not because of my recycling. Because of my dark body. No. Not because of my dark body. Because of his fear. Because of the way he saw me. Because of the culture of fear, mistrust, hatred, and suspicion that is carefully cultivated in the media, by the government, by people who claim to want to keep us "safe."
These are the days of orange alert, school lock-downs, and endless war. We are preparing for it, training for it, looking for it, and so of course, in the most innocuous of places -- a professor wanting to hurry home, hefting his box of discarded poetry -- we find it.
That man in the parking lot didn't even see me. He saw my darkness. He saw my Middle Eastern descent. Ironic because though my grandfathers came from Egypt, I am Indian, a South Asian, and could never be mistaken for a Middle Eastern man by anyone who'd ever met one.
One of those in the gathering crowd, trying to figure out what had happened, heard my description -- a Middle Eastern man driving a white Beetle with out-of-state plates -- and knew immediately they were talking about me and realized that the box must have been manuscripts I was discarding. When the police were told I was a professor, immediately the question came back about where I was from.
At some length several of my faculty colleagues were able to get through to the police and get me on a cell phone where I explained to the university president and then to the state police that the box contained old poetry manuscripts that needed to be recycled. The police officer told me that in the current climate I needed to be more careful about how I behaved. "When I recycle?" I asked.
The university president appreciated my distress about the situation but denied that the call had anything to do with my race or ethnic background. The spokesman for the university called it an "honest mistake," not referring to the young man from ROTC giving in to his worst instincts and calling the police but referring to me, who made the mistake of being dark-skinned and putting my recycling next to the trashcan.
The university's bizarrely minimal statement lets everyone know that the "suspicious package" beside the trashcan ended up being, indeed, trash. It goes on to say, "We appreciate your cooperation during the incident and remind everyone that safety is a joint effort by all members of the campus community."
What does that community mean to me, a person who has to walk by the ROTC offices every day on my way to my own office just down the hall -- who was watched, noted, and reported, all in a day's work? Today we gave in willingly and whole-heartedly to a culture of fear and blaming and profiling. It is deemed perfectly appropriate behavior to spy on one another and police one another and report on one another. Such behaviors exist most strongly in closed and undemocratic and fascist societies.
The university report does not mention the root cause of the alarm. That package became "suspicious" because of who was holding it, who put it down, who drove away. Me.
It was poetry, I kept insisting to the state policeman who was questioning me on the phone. It was poetry I was putting out to be recycled.
My body exists politically in a way I cannot prevent. For a moment today, without even knowing it, driving away from campus in my little Beetle, exhausted after a day of teaching, listening to Justin Timberlake on the radio, I ceased to be a person when a man I had never met looked straight through me and saw the violence in his own heart.
Kazim Ali is a poet and novelist. He teaches at Shippensburg University and at Stonecoast, the low-residency MFA program of the University of Southern Maine. Eyewitnesses confirmed his account of the scene after he left the university. A university spokesman declined to discuss specifics of the incident or who was involved, but told Inside Higher Ed that "the response was appropriate based on the circumstances," and that "just days after the [Virginia Tech] massacre, everybody is looking out for each other."
Jacques-Alain Miller has delivered unto us his thoughts on Google. In case the name does not signify, Jacques-Alain Miller is the son-in-law of the late Jacques Lacan and editor of his posthumously published works. He is not a Google enthusiast. The search engine follows “a totalitarian maxim,” he says. It is the new Big Brother. “It puts everything in its place,” Miller declares, “turning you into the sum of your clicks until the end of time.”
Powerful, then. And yet – hélas! – Google is also “stupid.” It can “scan all the books, plunder all the archives [of] cinema, television, the press, and beyond,” thereby subjecting the universe to “an omniscient gaze, traversing the world, lusting after every little last piece of information about everyone.” But it “is able to codify, but not to decode. It is the word in its brute materiality that it records.” (Read the whole thing here. And for another French complaint about Google, see this earlier column.)
When Miller pontificates, it is, verily, as a pontiff. Besides control of the enigmatic theorist’s literary estate, Miller has inherited Lacan’s mantle as leader of one international current in psychoanalysis. His influence spans several continents. Within the Lacanian movement, he is, so to speak, the analyst of the analysts’ analysts.
He was once also a student of Louis Althusser, whose seminar in Paris during the early 1960s taught apprentice Marxist philosophers not so much to analyze concepts as to “produce” them. Miller was the central figure in a moment of high drama during the era of high structuralism. During Althusser’s seminar, Miller complained that he had been busy producing something he called “metonymic causality” when another student stole it. He wanted his concept returned. (However this conflict was resolved, the real winner had to be any bemused bystander.)
Miller is, then, the past master of a certain mode of intellectual authority – one that has been deeply shaped by (and is ultimately inseparable from) tightly restricted fields of communication and exchange.
Someone once compared the Lacanian movement to a Masonic lodge. There were unpublished texts by the founder that remained more than usually esoteric: they were available in typescript editions of just a few copies, and then only to high-grade initiates.
It is hard to imagine a greater contrast to that digital flatland of relatively porous discursive borders about which Miller complains now. As well he might. (Resorting to Orwellian overkill is, in this context, probably a symptom of anxiety. There are plenty of reasons to worry and complain about Google, of course. But when you picture a cursor clicking a human face forever, it lacks something in the totalitarian-terror department.)
Yet closer examination of Miller’s pronouncement suggests another possibility. It isn’t just a document in which hierarchical intellectual authority comes to terms with the Web's numbskulled leveling. For the way Miller writes about the experience of using Google is quite revealing -- though not about the search engine itself.
“Our query is without syntax,” declares Miller, “minimal to the extreme; one click ... and bingo! It is a cascade -- the stark white of the query page is suddenly covered in words. The void flips into plenitude, concision to verbosity.... Finding the result that makes sense for you is therefore like looking for a needle in a haystack. Google would be intelligent if it could compute significations. But it can’t.”
In other words, Jacques-Alain Miller has no clue that algorithms determine the sequence of hits you get back from a search. (However intelligent Google might or might not be, the people behind it are quite clearly trying to “compute significations.”) He doesn’t grasp that you can shape a query – give it a syntax – to narrow its focus and heighten its precision. Miller’s complaints are a slightly more sophisticated version of someone typing “Whatever happened to Uncle Fred?” into Google and then feeling bewildered that the printout does not provide an answer.
For an informed contrast to Jacques-Alain Miller’s befuddled indignation, you might turn to Digital History Hacks, a very smart and rewarding blog maintained by William J. Turkel, an assistant professor of history at the University of Western Ontario. (As it happens, I first read about Miller in Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution by one Sherry Turkle. The coincidence is marred by a slip of the signifier: they spell their names differently.)
The mandarin complaint about the new digital order is that it lacks history and substance, existing in a chaotic eternal present – one with no memory and precious little attention span. But a bibliographical guide that Turkel posted in January demonstrates that there is now an extensive enough literature to speak of a field of digital history.
The term has a nice ambiguity to it – one that is worth thinking about. On the one hand, it can refer to the ways historians may use new media to do things they’ve always done – prepare archives, publish historiography, and so on. Daniel J. Cohen and Roy Rosenzweig’s Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web (University of Pennsylvania Press, 2006) is the one handbook that ought to be known to scholars even outside the field of history itself. The full text of it is available for free online from the Center for History and New Media at George Mason University, which also hosts a useful selection of essays on digital history.
But as some of the material gathered there shows, digitalization itself creates opportunities for new kinds of history – and new problems, especially when documents exist in formats that have fallen out of use.
Furthermore, as various forms of information technology become more and more pervasive, it makes sense to begin thinking of another kind of digital history: the history of digitality. Impressed by the bibliography that Turkel had prepared – and by the point that it now represented a body of work one would need to master in order to do graduate-level work in digital history – I contacted him by e-mail to get more of his thoughts on the field.
“Digital history begins,” he says, “with traditional historical sources represented in digital form on a computer, and with 'born-digital' sources like e-mail, text messages, computer code, video games and digital video. Once you have the proper equipment, these digital sources can be duplicated, stored, accessed, manipulated and transmitted at almost no cost. A box of archival documents can be stored in only one location, has to be consulted in person, can be used by only a few people at a time, and suffers wear as it is used. It is relatively vulnerable to various kinds of disaster. Digital copies of those documents, once created, aren't subject to any of those limitations. For some purposes you really need the originals (e.g., a chemical analysis of ink or paper). For many or most other purposes, you can use digital representations instead. And note that once the chemical analysis is completed, it too becomes a digital representation.”
But that’s just the initial phase, or foundation level, of digital history – the scanning substratum, in effect, in which documents become more readily available. A much more complex set of questions comes up as historians face the deeper changes in their work made possible by a wholly different sort of archival space – what Roy Rosenzweig calls the "culture of abundance" created by digitality.
“He asks us to consider what it would mean to try and write history with an essentially complete archival record,” Turkel told me. “I think that his question is quite deep because up until now we haven't really emphasized the degree to which our discipline has been shaped by information costs. It costs something (in terms of time, money, resources) to learn a language, read a book, visit an archive, take some notes, track down confirming evidence, etc. Not surprisingly, historians have tended to frame projects so that they could actually be completed in a reasonable amount of time, using the availability and accessibility of sources to set limits.”
Reducing information costs in turn changes the whole economy of research – especially during the first phase, when one is framing questions and trying to figure out if they are worth pursuing.
“If you're writing about a relatively famous person,” as Turkel put it, “other historians will expect you to be familiar with what that person wrote, and probably with their correspondence. Obviously, you should also know some of the secondary literature. But if you have access to a complete archival record, you can learn things that might have been almost impossible to discover before. How did your famous person figure in people's dreams, for example? People sometimes write about their dreams in diaries and letters, or even keep dream journals. But say you wanted to know how Darwin figured in the dreams of African people in the late 19th century. You couldn't read one diary at a time, hoping someone had had a dream about him and written it down. With a complete digital archive, you could easily do a keyword search for something like "Darwin NEAR dream" and then filter your results.” As it happens, I conducted this interview a few weeks before coming across Jacques-Alain Miller’s comments on Google. It seems like synchronicity that Turkel would mention the possibility of digital historians getting involved in the interpretation of dreams (normally a psychoanalyst’s preserve). But for now, it sounds as if most historians are only slightly more savvy about digitality than the Lacanian Freemasons.
“All professional historians have a very clear idea about how to make use of archival and library sources,” Turkel says, “and many work with material culture, too. But I think far fewer have much sense of how search engines work or how to construct queries. Few are familiar with the range of online sources and tools. Very few are able to do things like write scrapers, parsers or spiders.”
(It pays to increase your word power. For a quick look at scraping and parsing, start here. For the role of spiders on the Web, have a look at this.)
“I believe that these kind of techniques will be increasingly important,” says Turkel, “and someday will be taken for granted. I guess I would consider digital history to have arrived as a field when most departments have at least one person who can (and does) offer a course in the subject. Right now, many departments are lucky to have someone who knows how to digitize paper sources or put up web pages.”
Last week, Intellectual Affairs took up the topic of what might be called scandal-mania -- the never-ending search for shock, controversy, and gratifying indignation regarding our “master thinkers.” Unfortunately there haven’t been enough “shocking revelations” recently to keep up with the demand. So the old ones are brought out of mothballs, from time to time.
A slightly different kind of case has come up recently involving Zygmunt Bauman, who is emeritus professor of sociology at the University of Leeds and the University of Warsaw. Bauman is a prolific author with a broad range of interests in social theory, but is probably best known for a series of books and essays analyzing the emergence of the new, increasingly fluid and unstable forms of cultural and social order sometimes called “postmodernism.”
No doubt that fact alone will suffice to convince a certain part of the public that he must be guilty of something. Be that as it may, Bauman is not actually a pomo enthusiast. While rejecting various strands of communitarianism, he is quite ambivalent about the fragmentation and confusion in the postmodern condition. His book Liquid Times: Living in an Age of Uncertainty, just issued by Polity, is quite typical of his work over the past few years -- a mixture of social theory and cultural criticism, sweeping in its generalizations but also alert to the anxieties one sees reflected in the newspaper and on CNN.
In March, a paragraph concerning Bauman appeared at Sign and Sight, a Web site providing capsule summaries in English of the Feuilletons (topical cultural articles) appearing in German newspapers and magazines. It noted the recent publication in the Frankfurter Allgemeine Zeitung of an article by a Polish historian named Bogdan Musial. The piece “uncovers the Stalinist past of the world famous sociologist,” as Sign and Sight put it.
It also quoted a bit of the article. "The fact is that Bauman was deeply involved with the violent communist regime in Poland for more than 20 years,” in Musial’s words, “fighting real and supposed enemies of Stalinism with a weapon in his hand, shooting them in the back. His activities can hardly be passed off as the youthful transgressions of an intellectual seduced and led astray by communist ideology. And it is astonishing that Bauman, who so loves to point the finger, does not reflect on his own deeds."
A few weeks later, another discussion of the matter appeared in The Irish Times -- this one by Andreas Hess, a senior lecturer in sociology at the University of Dublin. The piece bore what seems, with hindsight, the almost inevitable title of “Postmodernism Made Me Do It: A World Without Blame.” (The article is not available except to subscribers, but I’ll quote from a copy passed along by a friend.)
Summing up the charges in the German article, Hess said that secret files recently declassified in Poland revealed that Bauman “participated in operations of political cleansing of alleged political opponents in Poland between 1944 and 1954. The Polish files also show Bauman was praised by his superiors for having been quite successful in completing the tasks assigned, although he seems, as at least one note suggests, not to have taken any major part in direct military operations because of his ‘Semitic background.’ However, to be promoted to the rank of major at the youthful age of 23 was quite an achievement. As the author of the article [in the German newspaper] pointed out, Bauman remained a faithful member of the party apparatus.”
Hess goes on to suggest that “Bauman’s hidden past” is the key to his work as “one of the prophets of postmodernism.” This is not really argued so much as asserted -- and in a somewhat contradictory way.
On the one hand, it is implied that Bauman has used postmodern relativism as a way to excuse his earlier Stalinist crimes. Unfortunately for this argument, Bauman is actually a critic of postmodernism. And so, on the other hand, the sociologist is also guilty of attacking Western society by denouncing postmodernity. Whether or not this is a coherent claim, it points to some of what is at issue in the drama over “Bauman’s secret Stalinism,” as it’s called.
Now, I do not read German or Polish -- a decided disadvantage in coming to any sense of how the controversy has unfolded in Europe. Throughout the former Soviet sphere of influence, a vast and agonizingly complex set of problems has emerged surrounding “lustration” -- the process of "purifying" public life by legally disqualifying those who collaborated with the old Communist regimes from serving in positions of authority.
But let’s just look at the matter purely in terms of the academic scandal we’ve been offered. I have read some of Bauman’s work, but not a lot. Under the circumstances that may be an advantage. I am not a disciple – and by no means feel committed to defending him, come what may.
If he has hidden his past, then its revelation is a necessary thing. But then, that is the real issue at stake. Everything turns on that “if.”
What did we know about Zygmunt Bauman before the opening of his files? What could we surmise about his life based on interviews, his bibliographical record, and books about him readily available at a decent university library?
One soon discovers that “Bauman’s hidden past” was very badly hidden indeed. He has never published a memoir about being a Stalinist -- nor about anything else, so far as I know -- but he has never concealed that part of his life either. The facts can be pieced together readily.
He was born in Poland in 1925 and emigrated to the Soviet Union with his family at the start of World War II. This was an altogether understandable decision, questions of ideology aside. Stalin’s regime was not averse to the occasional half-disguised outburst of anti-Semitism, but that was not the central point of its entire agenda, at least; so it is hardly surprising that a Jewish family might respond to the partition of Poland in 1939 by heading East.
Bauman studied physics and dreamed, he says, of becoming a scientist. He served as a member of the Polish equivalent of the Red Army during the war. He returned to his native country as a fervent young Communist, eager, he says, to rebuild Poland as a modern, egalitarian society – a “people’s democracy” as the Stalinist lingo had it. His wife Janina Bauman, in her memoir A Dream of Belonging: My Years in Postwar Poland (Virago, 1988) portrays him as a true believer in the late 1940s and early 1950s.
But there is no sense in overstressing his idealism. To have been a member of the Polish United Workers Party was not a matter of teaching Sunday school classes on Lenin to happy peasant children. Bauman would have participated in the usual rounds of denunciation, purge, “thought reform,” and rationalized brutality. He was also an officer in the Polish army. The recent revelations specify that he belonged to the military intelligence division -- making him, in effect, part of the secret police.
But the latter counts as a “revelation” only to someone with no sense of party/military relations in the Eastern bloc. Not every member of the military was a Communist cadre -- and an officer who was also a member of the party had a role in intelligence-gathering, more or less by definition.
But a Jewish party member was in a precarious position – again, almost by definition. In 1953, he was forced out of the army during one of the regime’s campaigns against “Zionists” and “cosmopolitans.” He enrolled in the University of Warsaw and retrained as a social scientist. He began research on the history of the British Labour Party and the development of contemporary Polish society.
One ought not to read too much dissidence into the simple fact of doing empirical sociology. Bauman himself says he wanted to reform the regime, to bring it into line with its professed egalitarian values. And yet, under the circumstances, becoming a sociologist was at least somewhat oppositional a move. He published articles on alienation, the problems of the younger generation, and the challenge of fostering innovation in a planned economy.
And so he remained loyal to the regime -- in his moderately oppositional fashion -- until another wave of official anti-Semitism in 1968 made this impossible. In her memoir, Janina Bauman recalls their final weeks in Poland as a time of threatening phone calls, hulking strangers loitering outside their apartment, and TV broadcasts that repeated her husband’s name in hateful tones. “A scholarly article appeared in a respectable magazine,” she writes. “It attacked [Zygmunt] and others for their dangerous influence on Polish youth. It was signed by a close friend.”
Bauman and his family emigrated that year, eventually settling in Leeds. (He never faced a language barrier, having for some years been editor of a Polish sociological journal published in English.) His writings continued to be critical of both the Soviet system and of capitalism, and to support the labor movement. When Solidarity emerged in 1980 to challenge the state, Bauman welcomed it as the force that would shape the future of Poland.
These facts are all part of the record -- put there, most of them, by Bauman himself. By no means is it a heroic tale. From time to time, he must have named names, and written things he didn’t believe, and forced himself to believe things that he knew, deep down, were not true.
And yet Bauman did not hide his past, either. It has always been available for anyone trying to come to some judgment of his work. He has been accused of failing to reflect upon his experience. But even that is a dubious reading of the evidence. A central point of his work on the “liquid” social structure of postmodernism is its contrast with the modernity that went before, which he says was “marked by the disciplinary power of the pastoral state.” He describes the Nazi and Stalinist regimes as the ultimate, extreme cases of that “disciplinary power.”
Let’s go out on a limb and at least consider the possibility that someone who admittedly spent years serving a social system that he now understands as issuing from the same matrix as Hitler’s regime may perhaps be telling us (in his own roundabout, sociologistic way) that he is morally culpable, no matter what his good intentions may have been.
Alas, this is not quite so exciting as “Postmodernist Conceals Sinister Past.” It doesn’t even have the satisfying denouement found in “The God That Failed,” that standard of ex-Communist disillusionment. Sorry about that.... It’s just a tale of a man getting older and – just possibly – wiser. I tend to think of that as a happy story, even so.
For several years I have been teaching personal writing courses in which students share with classmates self-disclosing essays on a wide variety of topics that are rarely discussed in the classroom, including eating disorders, sexual abuse, drug and alcohol addiction, depression, and suicide. Such "risky writing" -- the title of my 2001 book -- involves confronting painful or shameful emotions and requires a safe, empathic classroom atmosphere so that students who write about traumatic topics are not retraumatized. In March 2004, I decided to read aloud to my students the most personal writing of my life: a eulogy for my beloved wife.
Two years earlier it would have been unimaginable that Barbara, who had been in excellent health, and who could have been a poster child for living an active, fulfilling life, would soon be diagnosed with one of the most dreaded diseases: pancreatic cancer. She had none of the risk factors except being over the age of 50. She appeared decades younger than her age; when our daughters were in college, she looked like their oldest sister rather than their mother. She never abused her body: never smoked, never drank excessively, exercised regularly, had annual physical exams, and always maintained a healthy weight. Perhaps equally important, there was no history of cancer on either side of her family: nearly all of her relatives lived to their 80s or 90s, including her parents and their many siblings. And so when Barbara was diagnosed with metastatic pancreatic cancer -- a redundancy since nearly all pancreatic cancer is metastatic by the time it is detected -- on August 12, 2002, one day after our 34th wedding anniversary, she was given less than a year to live.
Fear, shock, and horror followed the grim diagnosis, and for the next several months we were in and out of the hospital, undergoing tests, consultations, and treatments. There is no cure for pancreatic cancer -- it is one of the most virulent cancers, with a 99 percent mortality rate, and the standard treatment, chemotherapy, works only for a few months, if that long. From the moment of her diagnosis we were on a roller coaster -- there is no avoiding this overused metaphor. Unlike amusement roller coasters, in which thrill-seekers know in advance that they are paying for the illusion of danger, we knew that this ride would plunge Barbara lower and lower until its final crash. There were, to be sure, a few unexpected highs, when the disease seemed to be retreating, thanks to an experimental pancreatic cancer vaccine that Barbara took for 18 weeks. The vaccine supercharged the chemotherapy that followed, giving her several additional months of life; but when she was forced to end the chemotherapy after six months, because her white blood cell count was dangerously low, the cancer spread with a vengeance throughout her pancreas, liver, and abdomen, and all hope of remission vanished. Slowly and imperceptibly our attitude toward death changed from that of a dreaded adversary, to be avoided at all costs, to that of a welcome ally, signaling the end of the nearly 20-month ordeal.
When our doctors told us that Barbara was close to the end, in January 2004, I decided to write a eulogy. I could have waited until her actual death, but I didn't know whether I could write a eulogy in two or three days, the time interval, according to Jewish custom, between death and burial. Besides, I wanted as much time as possible to write what would surely be the most important speech of my life. And so I wrote a first draft that I continued to revise until her funeral three months later. I wanted to memorialize the woman who had been the center of my universe for four decades. She was not only my wife but also my best friend and soulmate, the person who had transformed my life from the moment we began dating.
I wrote the eulogy to celebrate Barbara's life rather than dwell on the wrenching details of her death. I wanted to bring smiles to the mourners, but I knew that my words would inevitably bring tears to their eyes: striking the right tonal balance would be a challenge. And so I decided to begin with light reminiscences, which would allow me to maintain my composure, and then I would move slowly toward the final months of Barbara's life, a subject that would make greater emotional demands on everyone in the funeral chapel. Here is part of the eulogy:
Barbara and I met in the fall of 1963 in our freshman English class at the University of Buffalo. She was not yet 17 years old. For me, though not for her, it was love at first sight: I couldn't take my eyes off her long flowing hair, green eyes, high cheek bones, olive complexion, and delicate nose. She had a natural, unselfconscious beauty that never faded, not even after her illness. Two of the black-and-white photos I took of her in 1967 now hang on my office wall at the university; students who walk into my office invariably comment on her exotic features. Barbara and I could not have been more different in class: I spoke incessantly, enraptured by my own words, while she remained silent like the sphinx, which only increased her mystery to me.
Our relationship began inauspiciously. Our first date was November 22, 1963, a day that no one of our generation will ever forget. After classes were cancelled because of President Kennedy's assassination, we decided to see a movie; we were among a handful of people in the theater as we watched Laurence Olivier play Heathcliff in Emily Brontë's Wuthering Heights. On our third date I walked her back to the dormitory and asked if I could kiss her goodnight. "No" was her immediate reply. I turned around and left, vowing never to ask her out again. A few months later I broke that promise, and we began seeing each other. When I later told her how hurt I was by her rejection, she replied, "It was a stupid question: you should have just kissed me."
Everything Barbara made was a work of art, and she was meticulous to a fault. Her eye invariably spotted misweaves and imperfections, and she demanded of others what she expected of herself, which was nothing short of perfection. It is not easy living with a person whose standards are so high; she was as mechanically inclined as I am mechanically declined, and I became dependent upon her ability to fix anything. She could repair faulty wiring, broken toilets, temperamental boilers, cracked floor tile, and leaky faucets. By contrast, I was hopeless. Her favorite story about me was the time I spent two hours replacing a headlight in our car, only to discover that I had replaced the wrong light. Once in exasperation I said to her, "You're such a perfectionist that I don't understand why you married me." Without hesitation she replied, "I didn't think about it very much." Lucky for me that she didn't.
Barbara and I did not spend much time talking about the unfairness of her illness. We had no regrets about anything except that we did not have more time together. She felt little anger and no bitterness. She died during what would have been the best time in her life, when her children were grown up, happily married to wonderful men, successful in their careers, and beginning families of their own. She delighted in our new grandson Nate the Great, who filled her heart with joy.
Premature death always raises the most fundamental religious and existential questions, and each person will answer these questions differently. Amidst tragedy, those with strong religious faith may have emotional resources lacking in those without religious faith. I wish I could believe that Barbara is now in a better world, that there is a reason for her death, and that one day I will be reunited with her. What I do believe is that she will always be alive to those of us who were privileged to know her. I want to end by quoting a passage from a recent film based on Charles Dickens's novel Nicholas Nickleby: "In every life, no matter how full or empty one's purse, there is tragedy. It is the one promise life always fulfills. Happiness is a gift and the trick is not to expect it, but to delight in it when it comes and to add to other people's store of it." Barbara was one of those rare people who increased the store of happiness in the world.
With Barbara's approval, I decided to read the eulogy to my writing class, not only as a trial run, but, more importantly, as a way to offer my students my own example of risky writing. My students had been sharing their personal writings with me throughout the semester, and I wanted to reciprocate. I have discovered in every personal writing class that self-disclosure begets self-disclosure. Moreover, I believe in the adage that authors write best from their own experience. Here was a real-life experience in which all of my students would find themselves one day: confronting the specter of death from the point of view of the dying person or the caregiver. Here was an opportunity to describe, to others and myself, how and why Barbara has meant the world to me, and how my world would be forever changed by her death. Here was an opportunity to put into practice the adage I give to my writing students every class: show instead of tell, use concrete details, avoid clichés, compress your language, revise until every sentence is grammatically correct and stylistically graceful, and make the reader see your story. Above all, I wanted to write truthfully, which means not only engaging the minds and hearts of readers but also describing my wife without distorting or idealizing her life. It is always problematic when writers are too close to their subjects, when they are so much "in love with" their characters that they cannot see their human failings. I was indeed in love with my subject, but my challenge was to allow my students to see Barbara as I have seen her, and to convey to them, as I would later convey to the mourners attending her funeral, the special qualities about her life.
Most of my colleagues knew about Barbara's illness, but I hadn't informed anyone in the Expository Writing course I taught in the spring of 2004. I broke the news in early March, when our doctors told us that Barbara's death was imminent. I announced at the beginning of the class that I wanted to reserve the last 20 minutes of the hour for reading an essay that I had just written. The students seemed mildly puzzled, but no one said anything. The class proceeded as usual, and then my turn came. In a quiet, measured voice I revealed that my wife was terminally ill with cancer and close to death. I told them that I wanted to share with them the eulogy that I hoped I would be able to deliver at her funeral. Anyone who wished to leave class before hearing the eulogy could do so, I added. Finally I said that the class would be over when I finished reading the eulogy. And with that, I began.
During the reading I didn't lift my eyes from the copy of the eulogy on my desk. I didn't dare look up, fearful that I would be unable to continue reading if I saw anyone teary-eyed. I could hear several students from different sides of the classroom fighting back tears, but apart from that there was an eerie silence, quite different from the ubiquitous white noise of the classroom. On three occasions I could hear my voice falter and break, but each time I paused, regained my composure, and resumed the reading. I thanked my students when I finished, and everyone quietly walked out of the classroom.
It was difficult to read aloud the eulogy, but afterwards I felt better, the way nearly all of my students feel after reading aloud their own emotionally charged essays. I felt exhausted and drained from the reading but also relieved, for no longer did I need to conceal Barbara's illness from my students. Finding the words to express my feelings, and then reading those words aloud, helped me to remain in control. My students' silence struck me as profoundly respectful, but I couldn't be sure how they felt unless I found a way to ask them. I knew from years of teaching experience that most students are reluctant to speak in class, even in self-disclosing courses. I wanted them to have the time to reflect on their feelings, which is why I ruled out an in-class essay, and I also wanted them to have the opportunity to remain anonymous so that they could be as truthful as possible. I did not want to require them to write about their feelings, and so I asked them to write an anonymous, optional essay describing how they felt about hearing the eulogy.
Fifteen of the 22 students who heard me read the eulogy turned in essays. They were generally well written, containing fewer grammatical and stylistic errors than in earlier writings. The disclosure of my wife's illness stunned all 15. They felt that the eulogy gave them insight into my personal life and that I was no longer simply a "teacher" to them. Twelve reported that they cried during the reading. Nearly all indicated that they could hear the emotionality of my voice as I read the eulogy and that it would not have been as powerful an experience if they had read the eulogy to themselves.
The eulogy was painful for all of the students. Many felt implicated in my story. I don't know whether it would be accurate to say that my students were traumatized, but some reported feeling physically as well as psychologically distressed during the reading. Some students reported that they found themselves thinking about class for the rest of the day: "My eyes precipitated and tears split my cheeks in halves as the sadness irrigated its way to the bottom of my face. I left class screaming at God in my head. Wondering why God does the things he does."
How did I feel about my students' tears? My purpose was not to move them to tears -- tearful responses are not necessarily "deeper" or more meaningful than dry-eyed responses. If we judged a story by the quantity of tears it produces in readers or viewers, any television soap opera would be a more profound aesthetic experience than King Lear. Nevertheless, I did want to "move" my students, and as the word implies, I sought to transport them to a different emotional realm, one that involved nothing less than the contemplation of life and death.
The adage, "mourners cry for themselves, not for the deceased," is a half-truth. We cry both for ourselves and for those whom we have lost. The most painful aspect of caring for Barbara was watching her suffer; after her death, the most painful part of mourning was imagining all that she will miss in life. My tears were as much for Barbara's sake as for my own; and my students' tears were as much for Barbara and me as for themselves.
Thirteen of the 15 students implied that the eulogy was appropriate for the writing class. They appreciated the trust I placed in them and implied that they would now place greater trust in me. They felt for the first time an equality in the teacher-student relationship: I had opened up to them just as they had opened up to me throughout the semester. "It has always bothered me when a teacher would read hundreds of essays, comment on them, and not read any of their own," wrote one student. "When a professor reads a work of their own they are putting themselves into the lake of vulnerability." Despite the fact that I had disclosed aspects of myself in Risky Writing, my new self-disclosure was different, and they now saw me differently. I was still their teacher, but I had now become another member of the class, one who was struggling, like everyone else, with a personal issue. I had never used the word "intersubjective" in class, but the classroom suddenly became a space where every person, including the teacher, was sharing aspects of his or her own subjectivity with each other.
The remaining two students were unsure whether they thought the eulogy was appropriate to read to the class. One wrote that the eulogy "put a damper on my day because it was so sad"; the other felt it was "a little too personal" to be read to a group of "mere students." I don't wish to invalidate the last response -- the cornerstone of an empathic classroom rests upon the principle that "feelings are feelings" and not to be disregarded -- but I never view members of my class as "mere students." They are not exactly "friends," to whom one may make an intimate self-disclosure, but after several weeks of the semester they are more than acquaintances. I don't believe that teachers should unburden themselves to students or seek psychological counseling from them, but I do believe that a teacher's careful self-disclosure of a real-life experience can become a profound educational experience for everyone in the classroom.
Family and friends became part of my support system as I cared for Barbara during her protracted illness. Without the help of my children, I could not have cared for her at home, especially toward the end, when she required around-the-clock care. Home-care hospice was also invaluable. Teaching was another support system. It afforded me not only a welcome distraction from the endless and exhausting problems confronted by a caregiver but also an opportunity to forget momentarily the crushing sadness I often felt at home. Throughout Barbara's illness I had my normal teaching load, but I had a compressed schedule so that I could be at home as much as possible. Perhaps what most surprised me about my response to Barbara's illness was the ability to compartmentalize my life. At home I fulfilled all my caregiving responsibilities. There were many times during the final weeks of her life when I felt emotionally and physically overwhelmed. Our hospice case supervisor suggested toward the end that it might be time to hospitalize Barbara, not for her sake but for our own. I remember feeling so exhausted from my caregiving responsibilities that I was indifferent to my own health and well-being. I felt that I was dying with Barbara. The Lithuanian-born Jewish philosopher Emmanuel Levinas would call this phenomenon, which I'm sure many caregivers feel, "dying for the other," when "worry over the death of the other comes before care for self." We all needed a break from the constant caregiving, yet we also felt that we had come this far and could manage a few more days. In the classroom, however, I felt cheerful, relaxed, and in control. Even during the last weeks of Barbara's life, I laughed and joked as usual in the classroom, which led my students to believe that nothing was wrong in my personal life.
The "teaching cure" enabled me to remain connected with the outer world of health; teaching served as a lifeline for me at a time when I was struggling to be a lifeline for my wife. Many colleagues generously offered to teach my classes during Barbara's illness, and they may have thought that my determination to continue teaching and meeting with students was a sign of strength. The truth was that teaching gave me the strength that otherwise I might not have had, for as much as I gave to my students, they gave to me and helped me through the crisis. As one person noted, "One thing that touched me as you read your eulogy was that teaching kept you sane. Hearing a professor say something like that made me think, 'That is one more reason why teaching is worth it after all.' It also made me wonder when in my life I will be at the point at which I will be able to say the same thing."
My eulogy was a bridge between the world of the healthy and the sick, the living and the dying; I wrote it when Barbara was gravely ill, but rather than distracting me from taking care of her, the eulogy enabled me to avoid succumbing to despair when she could no longer take care of herself. Writing the eulogy helped me to express the lifelong devotion that I have felt for her and she for me. My devotion to Barbara heightened my students' commitment to their teacher. "I found your reading to be very painful," one person wrote, "but I remained as strong as possible for you. I needed to give you my attention while you read it to the class. I knew this was important to you." They were not only a sympathetic audience but also a supportive group, reaching out to me in ways that seldom occur in the classroom. This supportive group takes on some of the characteristics of a "support" group but without offering the clinical advice that occurs in the latter.
My self-disclosure narrowed the distance between students and teacher, leading to a more equal classroom relationship based on reciprocity. There was, in Jessica Benjamin's words, mutual recognition: "the necessity of recognizing as well as being recognized by the other." There was nothing transgressive about this narrowed distance, nothing that would be considered unprofessional. Many past and present students attended the funeral and later came back to my home, along with perhaps a hundred other mourners. The main effect of the eulogy was internal. "I feel a greater sense of trust and respect now because you have shared your experience with us. By doing so, you have broken down the wall that is usually present in the classroom, separating the teacher from the students."
Half a century before "The Sopranos" hit its stride, the Caribbean historian and theorist C.L.R. James recorded some penetrating thoughts on the gangster -- or, more precisely, the gangster film -- as symbol and proxy for the deepest tensions in American society. His insights are worth revisiting now, while saying farewell to one of the richest works of popular culture ever created.
First, a little context. In 1938, shortly before James arrived in the United States, he had published The Black Jacobins, still one of the great accounts of the Haitian slave revolt. He would later write Beyond a Boundary (1963), a sensitive cultural and social history of cricket – an appreciation of it as both a sport and a value system. But in 1950, when he produced a long manuscript titled “Notes on American Civilization,” James was an illegal alien from Trinidad. I have in hand documents from his interrogation by FBI agents in the late 1940s, during which he was questioned in detail about his left-wing political ideas and associations. (He had been an associate of Leon Trotsky and a leader in his international movement for many years.)
In personal manner, James was, like W.E.B. DuBois, one of the eminent black Victorians -- a gentleman and a scholar, but also someone listening to what his friend Ralph Ellison called “the lower frequencies” of American life. The document James wrote in 1950 was a rough draft for a book he never finished. Four years after his death, it was published as American Civilization (Blackwell, 1993). A sui generis work of cultural and political analysis, it is the product of years of immersion in American literature and history, as well as James’s ambivalent first-hand observation of the society around him. His studies were interrupted in 1953 when he was expelled by the government. James was later readmitted during the late 1960s and taught for many years at what is now the University of the District of Columbia.
American Civilization's discussion of gangster films is part of James's larger argument about media and the arts. James focuses on the role they play in a mass society that promises democracy and equality while systematically frustrating those who take those promises too seriously. Traveling in the American South in 1939 on his way back from a meeting with Trotsky in Mexico, James had made the mistake of sitting in the wrong part of the bus. Fortunately an African-American rider explained the rules to him before things got out of hand. But that experience -- and others like it, no doubt -- left him with a keen sense of the country's contradictions.
While James's analysis of American society is deeply shaped by readings of Hegel and Marx, it also owes a great deal to Frederick Jackson Turner’s theory of “the closing of the frontier.” The world onscreen, as James interpreted it, gave the moviegoer an alternative to the everyday experience of a life “ordered and restricted at every turn, where there is no certainty of employment, far less of being able to rise by energy and ability by going West as in the old days.”
Such frustrations intensified after 1929, according to James’s analysis. The first era of gangster films coincided with the beginning of the Great Depression. “The gangster did not fall from the sky,” wrote James. “He is the persistent symbol of the national past which now has no meaning – the past in which energy, determination, bravery were sure to get a man somewhere in the line of opportunity. Now the man on the assembly line, the farmer, know that they are there for life; and the gangster who displays all the old heroic qualities, in the only way he can display them, is the derisive symbol of the contrast between ideals and reality.”
The language and the assumptions here are obviously quite male-centered. But other passages in James’s work make clear that he understood the frustrations to cross gender lines -- especially given the increasing role of women in mass society as workers, consumers, and audience members.
“In such a society,” writes James, “the individual demands an aesthetic compensation in the contemplation of free individuals who go out into the world and settle their problems by free activity and individualistic methods. In these perpetual isolated wars free individuals are pitted against free individuals, live grandly and boldly. What they want, they go for. Gangsters get what they want, trying it for a while, then are killed.”
The narratives onscreen are a compromise between frustrated desire and social control. “In the end ‘crime does not pay,’” continues James, “but for an hour and a half highly skilled actors and a huge organization of production and distribution have given to many millions a sense of active living....”
Being a good Victorian at heart, James might have preferred that the audience seek “aesthetic compensation” in the more orderly pleasures of cricket, instead. But as a historian and a revolutionary, he accepted what he found. In offering “the freedom from restraint to allow pent-up feelings free play,” gangster movies “have released the bitterness, hate, fear, and sadism which simmer just below the surface.” His theoretical framework for this analysis was strictly classical, by the way. James was deeply influenced by Aristotle’s idea that tragedy allowed an audience to “purge” itself of violent emotions. One day, he thought, they would emerge in a new form -- a wave of upheavals that would shake the country to its foundations.
In six seasons over 10 years, “The Sopranos” has confirmed again and again C.L.R. James’s point that the gangster is an archetypal figure of American society. But the creators have gone far beyond his early insights. I say that with all due respect to James’s memory – and with the firm certainty that he would have been a devoted fan and capable interpreter.
For James, analyzing gangster films in 1950, there is an intimate connection between the individual viewer and the figure on the screen. At the same time, there is a vast distance between them. Movies offered the audience something it could not find outside the theater. The gangster is individualism personified. He has escaped all the rules and roles of normal life. His very existence – doomed as it is – embodies a triumph of personal will over social obligation.
By contrast, when we first meet Tony Soprano, a boss in the New Jersey mob, he is in some ways all too well integrated into the world around him. So much so, in fact, that juggling the demands of the different roles he plays is giving him panic attacks. In addition to being pater of his own brood, residing in a suburban McMansion, he is the dutiful (if put-upon) son in a dysfunctional and sociopathic family.
And then there are the pressures that attend being the competent manager of a successful business with diversified holdings. Even the form taken by his psychic misery seems perfectly ordinary: anxiety and depression, the tag-team heart-breakers of everyday neurosis.
James treats the cinematic gangsters of yesteryear as radical individualists – their crimes, however violent, being a kind of Romantic refusal of social authority. But the extraordinary power of “The Sopranos” has often come from its portrayal of an almost seamless continuum between normality and monstrosity. Perhaps the most emblematic moment in this regard came in the episode entitled “College,” early in the show’s first year. We watch Tony, the proud and loving father, take his firstborn, Meadow, off to spend a day at the campus of one of her prospective colleges. Along the way, he notices a mobster who had informed to the government and gone into the witness protection program. Tony tracks the man down and strangles him to death.
At the college he sees an inscription from Hawthorne that reads, “No man ... can wear one face to himself and another to the multitude, without finally getting bewildered as to which one may be true." Earlier, we have seen Tony answer Meadow’s question about whether he is a member of the Mafia by admitting that, well, he does make a little money from illegal gambling, but no, he isn't a gangster. So the quotation from Hawthorne points to one source of Tony’s constant anxiety. But it also underscores part of the audience’s experience – an ambivalence that only grows more intense as “The Sopranos” unfolds.
For we are no more clear than Tony is which of his faces is “true.” To put it another way, all of them are. He really is a loving father and a good breadwinner (and no worse a husband, for all the compulsive philandering, than many) as well as a violent sociopath. The different sides of his life, while seemingly distinct, keep bleeding into one another.
Analyzing the gangster as American archetype in 1950, C.L.R. James found a figure whose rise and fall onscreen provided the audience with catharsis. With “The Sopranos,” we’ve seen a far more complex pattern of development than anything found in Little Caesar or High Sierra (among other films James had in mind).
With the finale, there will doubtless be a reminder – as in the days of the Hays Code – that “crime does not pay.” But an ironized reminder. After all, we’ve seen that it can pay pretty well. (As Balzac put it, “Behind every great fortune, a great crime.”) Closure won’t mean catharsis. Whatever happens to Tony or his family, the audience will be left with his ambivalence and anxiety, which, over time, we have come to make our own.
Word that Richard Rorty was on his deathbed – that he had pancreatic cancer, the same disease that killed Jacques Derrida almost three years ago – reached me last month via someone who more or less made me swear not to say anything about it in public. The promise was easy enough to keep. But the news made reading various recent books by and about Rorty an awfully complicated enterprise. The interviews in Take Care of Freedom and Truth Will Take Care of Itself (Stanford University Press, 2006) and the fourth volume of Rorty’s collected papers, Philosophy as Cultural Politics (Cambridge University Press, 2007) are so bracingly quick-witted that it was very hard to think of them as his final books.
But the experience was not as lugubrious as it may sound. I found myself laughing aloud, and more than once, at Rorty’s consistent indifference to certain pieties and protocols. He was prone to outrageous statements delivered with a deadpan matter-of-factness that could be quite breathtaking. The man had chutzpah.
It’s a “desirable situation,” he told an interviewer, “not to have to worry about whether you are writing philosophy or literature. But, in American academic culture, that’s not possible, because you have to worry about what department you are in.”
The last volume of his collected papers contains a piece called “Grandeur, Profundity, and Finitude.” It opens with a statement sweeping enough to merit that title: “Philosophy occupies an important place in culture only when things seem to be falling apart – when long-held and widely cherished beliefs are threatened. At such periods, intellectuals reinterpret the past in terms of an imagined future. They offer suggestions about what can be preserved and what must be discarded.”
Then, a few lines later, a paradoxical note of rude modesty interrupts all the grandeur and profundity. “In the course of the 20th century," writes Rorty, "there were no crises that called forth new philosophical ideas.”
It's not that the century was peaceful or crisis-free, by any means. But philosophers had less to do with responding to troubles than they once did. And that, for Rorty, is a good thing, or at least not a bad one – a sign that we are becoming less intoxicated by philosophy itself, more able to face crises at the level (social, economic, political, etc.) at which they actually present themselves. We may yet be able to accept, he writes, “that each generation will solve old problems only by creating new ones, that our descendants will look back on much that we have done with incredulous contempt, and that progress towards greater justice and freedom is neither inevitable nor impossible.”
Nothing in such statements is new, of course. They are the old familiar Rorty themes. The final books aren’t groundbreaking. But neither was there anything routine or merely contrarian about the way Rorty continued to challenge the boundaries within the humanities, or the frontier between theoretical discussion and public conversation. It is hard to imagine anyone taking his place.
An unexpected and unintentional sign of his influence recently came my way in the form of an old essay from The Journal of American History. It was there that David A. Hollinger, now chair of the department of history at the University of California at Berkeley, published a long essay called “The Problem of Pragmatism in American History.”
It appeared in 1980. And as of that year, Hollinger declared, it was obvious that “‘pragmatism’ is a concept most American historians have proved that they can get along without. Some non-historians may continue to believe that pragmatism is a distinctive contribution of America to modern civilization and somehow emblematic of America, but few scholarly energies are devoted to the exploration or even the assertion of this belief.”
Almost as an afterthought, Hollinger did mention that Richard Rorty had recently addressed the work of John Dewey from a “vividly contemporary” angle. But this seemed to be a marginal exception to the rule. “If pragmatism has a future,” concluded Hollinger in 1980, “it will probably look very different from the past, and the two may not even share a name.”
Seldom has a comment about the contemporary state of the humanities ever been overtaken by events so quickly and so thoroughly. Rorty’s Philosophy and the Mirror of Nature (Princeton University Press, 1979) had just been published, and he was finishing the last of the essays to appear in Consequences of Pragmatism (University of Minnesota Press, 1982).
The revival was not purely Rorty's doing, of course; some version of it might have unfolded even without his efforts. In such matters, the pendulum does tend to swing.
But Rorty's suggestion that John Dewey, Martin Heidegger, and Ludwig Wittgenstein were the three major philosophers of the century, and should be discussed together -- this was counterintuitive, to put it mildly. It created excitement that blazed across disciplinary boundaries, and even carried pragmatism out of the provinces and into international conversation. I'm not sure how long Hollinger's point that pragmatism was disappearing from textbooks on American intellectual history held true. But scholarship on the original pragmatists was growing within a few years, and anyone trying to catch up with the historiography now will soon find his or her eyeballs sorely tested.
In 1998, Morris Dickstein, a senior fellow at the City University of New York Graduate Center, edited a collection of papers called The Revival of Pragmatism: New Essays on Social Thought, Law, and Culture (Duke University Press) -- one of the contributors to it being, no surprise, Richard Rorty. “I’m really grieved,” Dickstein told me on Monday. "Rorty evolved from a philosopher into a mensch.... His respect for his critics, without yielding much ground to them, went well with his complete lack of pretension as a person.”
In an e-mail note, he offered an overview of Rorty that was sympathetic though not uncritical.
“To my mind," Dickstein wrote, "he was the only intellectual who gave postmodern relativism a plausible cast, and he was certainly the only one who combined it with Dissent-style social democratic politics. He admired Derrida and Davidson, Irving Howe and Harold Bloom, and told philosophers to start reading literary criticism. His turn from analytic philosophy to his own brand of pragmatism was a seminal moment in modern cultural discourse, especially because his neopragmatism was rooted in the 'linguistic turn' of analytic philosophy. His role in the Dewey revival was tremendously influential even though Dewey scholars universally felt that it was his own construction. His influence on younger intellectuals like Louis Menand and David Bromwich was very great and, to his credit, he earned the undying enmity of hard leftists who made him a bugaboo."
The philosopher "had a blind side when it came to religion," continued Dickstein, "and he tended to think of science as yet another religion, with its faith in empirical objectivity. But it's impossible to write about issues of truth or objectivity today without somehow bouncing off his work, as Simon Blackburn and Bernard Williams both did in their very good books on the subject. I liked him personally: he was generous with his time and always civil with opponents.”
A recent essay challenges the idea that Rorty “had a blind side when it came to religion.” Writing in Dissent, Casey Nelson Blake, a professor of history and American studies at Columbia University, notes that Rorty “in recent years stepped back from his early atheist pronouncements, describing his current position as ‘anti-clerical,’ and he has begun to explore, with increasing sympathy and insight, the social Christianity that his grandfather Walter Rauschenbusch championed a century ago.”
Blake quotes a comment by Rorty from The Future of Religion, an exchange with the Catholic philosopher Gianni Vattimo that Columbia University Press published in 2005. (It comes out in paperback this summer.)
“My sense of the holy,” wrote Rorty, “insofar as I have one, is bound up with the hope that someday, any millennium now, my remote descendants will live in a global civilization in which love is pretty much the only law. In such a society, communication would be domination-free, class and caste would be unknown, hierarchy would be a matter of temporary pragmatic convenience, and power would be entirely at the disposal of the free agreement of a literate and well-educated electorate.”
I'm not sure whether that counts as a religious vision, by most standards. But it certainly qualifies as something that requires a lot of faith.