Apart from his preoccupation with race, class, and gender -- not to mention his interest in both cross-dressing and cannibalism -- the worst thing about William Shakespeare is, of course, his language. He coined the expression "the beast with two backs." Hamlet refers to the "country matters" that "lie between maids' legs." Characters in another play make penis jokes about how a certain word should be understood as a noun in "the focative case" -- thereby sneaking in a pun on a word that the Federal Communications Commission fines you for using.
I can't believe they teach this trash in schools. It's time for Fox News to do an expose.
And while they're at it, perhaps it is time to investigate another scandal: Oxford University Press (no less!) has just issued the new edition of The F Word, by Jesse Sheidlower, an editor at large for the Oxford English Dictionary. Random House published the first version of his study in 1995. But the word itself has only grown in its range of nuances in the meantime. It is often heard in punk rock and gangster rap, and has in recent years enlivened the discourse of the executive branch of the United States government.
The latest edition adds more than 100 variations on the word to its lexicon, draws on a variety of digital databases, and incorporates examples of usage from New Zealand, South Africa, and elsewhere. Terms once identified as belonging to one part of speech are analyzed in their full range of usages; "fugly," for example, is now treated as both noun and adjective. The nuances of words are now more finely parsed. While previous editions defined "fuckfaced" simply as "ugly," the new edition notes that it can also mean "tired" or "drunk." The military and civilian usages of "clusterfuck," whether as noun or verb, are cataloged.
Sheidlower's introduction undertakes a swift and no-nonsense debunking of some common myths about the word. It is not the acronym of "for unlawful carnal knowledge" (let alone the preposterously stilted "fornication under consent of the King"). More surprising to learn is that it isn't really an Anglo-Saxon word either, as it's usually called. The first known appearance in English is around 1475; its ancestry appears to be Germanic.
This is vulgarity at its most erudite, and vice versa. Although Sheidlower indicates he chose some illustrative quotations because he found them humorous, The F Word itself is a sober piece of scholarship. I asked the lexicographer a few questions about his project by e-mail; a transcript of the interview follows.
Q: This is the third edition of your book, and by far the most extensive. How did you come to make studying the word your life's work?
A: I think that all words are interesting, but especially slang terms, because slang is an area that had been ignored or treated with active hostility by academics for quite some time. Thus, there's still a lot of work to do on slang.
My specific interest in this word came about mostly by chance. I had been working on the Historical Dictionary of American Slang at Random House, and suggested in an editorial meeting that we publish the fuck material separately, for ease of access to what would be one of the most-looked-up words in the book, and this suggestion was taken up with an enthusiasm that surprised me. And that's how it all started.
Q: The word appears in an Italian-to-English dictionary in 1598 and returns in a guide to English etymology (written in Latin) in 1671. It pops up in other reference works over the following century -- then, after 1775, disappears from general dictionaries entirely for 170 years. How do you understand this deliberate lexicographic blind spot? Was it something that applied to most "swear words" or "vulgarities"? Or was it singled out for repression?
A: No, it was words of this kind in general. The same thing that made the Victorian era so (publicly, if not actually) repressive affected the view of the language as well.
In the Introduction I quote from a legal decision in the 1840s where the judge specifically notes that despite being absent from dictionaries, the word fuck was in common use, so we shouldn't use lexicographers' modesty as a guideline. In the 1890s, a printer refused to publish a volume of a (privately printed) slang dictionary because of its obscene content, and when the dictionary's author took the printer to court for breach of contract, the printer won the moment the jury saw what it was he didn't want to print. A few examples like that are all we need to see to learn about the kind of pressures that existed at the time.
Q: You document a wide range of uses of the word -- including scores of idioms, numerous cognates, a lot of abbreviations (the most famous being SNAFU), and several ways to write it down without quite violating the prohibition, such as "fug" or "f****" or even "XXXX." The variety is astounding. At the same time, the connotation of any given use tends to be hostile or aggressive, as often as it is sexual. How deep is that association? Did the word start out with that overtone, or did it acquire its hostile edge at some point along the way?
A: As far as we can tell, it's relatively recent. For the first several centuries, sexual uses were the only thing we had. Uses such as 'to harm; victimize' and 'to cheat or trick' aren't found until the late eighteenth and late nineteenth centuries respectively, and these are very rare until the twentieth. With that said, the association of hostility or aggression with sex is not a new development.
Q: This is the first reference book I've ever seen to cite Usenet as documentation. Would you say a bit about the value -- and the pitfalls -- of using digital resources for this project?
A: People often think that having access to big databases makes it easier to do research. Quite the contrary -- it makes the result better, but it very often makes it much harder. Instead of getting a moderate amount of evidence that you are able to handle, you get a vast amount of evidence that you have to struggle to process. And if you ignore it, someone else won't.
So you do end up with a much more thorough and comprehensive project, but at the cost of enormous time. There were several simple, one-sense entries that I started to work on thinking that I'd be done in ten minutes, and ended up hours later with a greatly expanded multi-sense entry.
It is, of course, great that all of this is available. And it's a great democratizer -- everyone at a university will have access to the same range of electronic resources, and that's wonderful. But it makes your job as a scholar more difficult when you know that anyone can find something that you missed. That's true for the Internet as a whole, not just in relation to language research.
Q: Last month, a guest on "Saturday Night Live" used the word by accident; she meant to say "freaking," it seems, but the uneuphemized version came out. Around the same time, the anchorman for a New York television station used the curious expression "keep fucking that chicken" while on the air. These incidents would have been a big deal, once upon a time. Now they barely register on public awareness. Do you think the word will ever be just ... a word?
A: I think it's unlikely that fuck will lose all of its power at any point in the foreseeable future. After all, even relatively mild expressions ("darn!" or "bastard," say) still maintain a certain amount of colloquial force. And because fuck is still viewed as the most extreme general word there is, its use on TV will continue to be surprising. So while I do think that the progression we've seen in the last 40 or so years in particular will keep going -- i.e. that it will become ever more acceptable -- it will be a very long time, if ever, before it's just a word.
Scattered through the Modern Language Association’s 2009 convention were telling sessions devoted to the state of higher education. Compelling testimony was offered in small and sometimes crowded rooms about the loss of long-standing central features of the discipline, from foreign language study to graduate student support to tenure track jobs for new Ph.D.'s. In many respects, the MLA’s annual meeting is more responsive to higher education’s grave crisis than those of the other humanities and social science disciplines that should also be part of the conversation, from anthropology to classics to history and sociology. There are simply more MLA sessions dealing with such issues than there are at other disciplinary meetings. Yet there was also throughout the MLA convention a strong sense of irrelevant business as usual, in the form of innumerable sessions devoted to traditional scholarship. There is a certain poignancy to the orchestra playing Mozart while the Titanic slips beneath the waves: We who are about to die salute our traditional high cultural commitments.
Of course we should sustain the values and the ongoing research that make humanities disciplines what they are. But the point is that the ship does not have to go down. There is action to be taken, work to be done, organizing and educating to do when faculty members and graduate students come together from around the country. Disciplinary organizations thus need to revise their priorities to confront what is proving to be a multi-year recession in higher education. As I argue in No University Is an Island, the recession is prompting destructive changes in governance, faculty status, and educational mission that will long outlast the current crisis. Because MLA’s members are already talking about these matters in scattered ways, it is time for the organization to take the lead in revising the format of its annual meeting to address the state of higher education -- and prepare its members to be effective agents -- in a much more focused, visible, and productive way. Then perhaps other disciplines will follow.
A generation ago, when the MLA’s Graduate Student Caucus sought to reform the organization, it circulated several posters at annual meetings. Most telling, I thought, was a photograph of the Titanic, captioned “Are you enjoying your assistant-ship?” It was no easy task back then convincing the average tenured MLA member that the large waves towering over our lifeboats would not be good for surfing. Now the average college teacher is no longer eligible for tenure, and the good ship humanities is already partly under water.
The MLA’s response to a changing profession was to increase the number and variety of sessions, to give convention space to both fantasy and reality. The MLA would cease to be exclusively a platform for privilege. The organization would become a big tent. Unfortunately, the big tent is looking more like a shroud. The humanities are drowning. It is time to rethink the annual meeting to make it serve a threatened profession’s needs.
Until we can secure the future of higher education, we need to be substantially focused on money and power. That, I would argue, should be the theme of the 2010 annual meeting, and the structure of the meeting should be revised to reflect that focus. Instead of simply offering incoherent variety, the MLA should emphasize large meetings on the current crisis and its implications. And I do not mean simply paper presentations, telling as local testimony can be.
Disciplinary organizations need to offer substantial training sessions -- typically running several hours each and perhaps returning for additional sessions over two or three days -- that teach their members the fundamentals of financial analysis and strategies for organizing resistance. The AAUP, for example, conducts summer workshops each year that show faculty members the difference between budgets, which are fundamentally planning documents riddled with assumptions, and financial statements, which report actual expenditures for the previous year. We work not with hypothetical budgets but with examples from a dozen universities. Attendees learn that there are virtually always pots of money not listed on a university budget at all. A budget, MLA members will benefit from learning, is essentially a narrative. It can and should be deconstructed. I expect the AAUP would be willing and able to conduct such training sessions at disciplinary meetings. Indeed we already have the PowerPoint presentations and detailed handouts we would need. We have faculty members who specialize in analyzing university finances ready to serve the MLA and other disciplinary organizations.
The AAUP could also join with the AFT and the NEA to offer workshops in the fundamentals of collective bargaining, explaining how faculty and graduate employees at a given school can create a union that meets their distinctive institutional needs and embodies their core values. We can stage scenarios that give faculty members and graduate student activists experience in negotiating contracts. And the MLA should schedule large sessions that help faculty in places where collective bargaining is impossible recognize that organizing to influence budget decisions and institutional priorities is also possible without a union. The organization should also invite the California Faculty Association to conduct a large workshop on ways to reach out to students, parents, alumni, and other citizens and rebuild public support for higher education. CFA has been running a terrific campaign toward that end. The point is to empower faculty members to be the equals, not the victims, of campus administrators.
I am urging an annual MLA meeting that promotes not only literary studies but also material empowerment, that equips the members of the profession with the skills they need to preserve an appropriate environment for teaching and research. If the MLA takes the lead in reshaping its annual meeting this way, other disciplines will follow.
The First of the Month is a cultural and intellectual publication that is singularly lively, and no less strange. It started out in 1998, in tabloid format, as a “newspaper of the radical imagination” published in Harlem. First has been compared to Partisan Review, the legendary magazine of the New York Intellectuals that began during the Depression. But honestly, that's just lazy. Any time a bunch of smart people start a magazine, somebody ends up comparing it to Partisan Review, especially if it is published in New York; but First took its name from a song by Bone Thugs-n-Harmony, and while I’d like to picture Delmore Schwartz doing a little freestyle rapping over scotch at the White Horse Tavern, it’s a stretch.
Following what has become the contemporary routine, the paper gave birth to a Web site; this then replaced the print edition. An anthology culled from its first decade appeared last year as The First of the Year: 2008, published by Transaction. On first approach, the book looks like a memorial service for the whole project. And an impressive one: the roster of contributors included (to give a very abbreviated and almost random roll-call) Amiri Baraka, Greil Marcus, Lawrence Goodwyn, Grace Lee Boggs, Adolph Reed, Russell Jacoby, Armond White, Kurt Vonnegut, Kate Millett, Richard Hoggart, and Ellen Willis.
I meant to give the volume a plug when it appeared; so much for the good intention. But happily, my initial impression was totally wrong. While continuing to function online (and to have its world headquarters in Harlem, where editorial collective member and impresario Benj DeMott lives) First has reinvented itself as an annual anthology. The First of the Year: 2009 has just been published, which seems worth noting here, in this first column of the year.
The viability of any small-scale, relatively unprofitable cultural initiative is a function of two forces. One is the good will of the people directly involved. The other is getting support from a public – or rather, creating one.
In this case, the process is made more difficult by the fact that First is sui generis. Which is putting it politely. My own response upon first encountering it about 10 years ago involved a little cartoon balloon forming over my forehead containing the letters “WTF?” It is not simply that it is hard to know what to expect next; sometimes it is hard to say what it was you just read. In First, political commentary, cultural analysis, and personal essays sit side-by-side. But at times, all three are going on at once, within the same piece. Kenneth Burke used to refer to such jostlings of the coordinate system as creating "perspective by incongruity." It signals a breakdown of familiar formats -- a scrambling of routine associations. This is stimulating, if perplexing. The confusion is not a bug but a feature.
One familiar description of the journal that I have come to distrust treats First as a bridge between popular culture and the ivory tower. An often-repeated blurb from some years ago calls it "the only leftist publication [one] could imagine being read at both Columbia University and Rikers.”
Good advertising, to be sure. But the better the ad, the more its presumptions need checking. The whole “building a bridge” trope implies that there is a distance to be spanned – a connection between enclaves to be made. (The ideas are over here, the masses over there.) But reading First involves getting oriented to a different geography. Some academics do write for it, but they do not have pride of place among the other contributors, who include poets and musicians and journalists, and people who might best just be called citizens. The implication is not that there is distance to be crossed, but that we're all on common ground, whether we know it, or like it, or not.
In the wake of 9/11, some writers for First (not all of them) rallied to the call for a war, and at least one endorsed George W. Bush during the 2004 campaign. Does that mean that First is actually “the only ‘neoconservative’ publication read in both academe and prisons”? Well, no, but funny you should ask, because it underscores the convenience made possible by pre-gummed ideological labels.
At times they are useful (I tend to think "social-imperialist" is a pretty good label for the idea that "shock and awe" was necessary for historical progress in Iraq) but not always.
The discussion of Obama in the new volume is a case in point. Both Paul Berman (a Clintonian liberal who supported the Iraq War) and Amiri Baraka (who takes his political bearings from Mao Tse-tung Thought) concur that the 2008 election was a transformative moment. This is, let's say, an unanticipated convergence. Meanwhile, Charles O’Brien (an editorial collective member who endorsed Bush in ‘04, on more or less populist grounds) treats Obama as a short-circuit in the creation of a vigorous radicalism-from-below needed for social change. “Of the Obama campaign, what endures?” he asks. “The new Pepsi ad.”
It would be wrong to see First as yet another wonk magazine with some cultural stuff in it. Nor is it one of those journals (edited on the bridge, so to speak) in which the latest reality-TV show provides the excuse for yet another tour of Foucault’s panopticon. Politics and culture come together at odd angles in the pages of First -- or rather, each spins out from some vital center that proves hard to pin down. Margin and mainstream are configured differently here.
I tried to get a handle on First's particularity by talking to Benj DeMott, who edited the two anthologies and is now working on the third. We spoke by phone. Taking notes did not seem like a plausible endeavor on my part, because DeMott's mind moves like greased lightning – the ideas and references coming out in arpeggios, rapid-fire and sometimes multitrack.
But one point he made did stick. It concerned holding together a project in which the contributors do not share a party line, and indeed sometimes only just barely agree to disagree. It sounds complicated and precarious. Often, he said, it comes down to sharing a passion for music -- for sensing that both democracy and dancing ought to be in the streets. Politics isn't about policy, it's about movement.
That does not mean celebration is always the order of the day. The indulgence of academic hiphop fans is legendary, but if you want to see what tough-minded cultural analysis looks like, check out the African-American film critic Armond White's reflections on white rapper Eminem in The First of the Year: 2009. The essay can be recommended even if its subject is now shrinking in pop culture’s rearview mirror.
“Rather than a symbol of cultural resistance,” writes White, “he’s the most egregious symbol of our era’s selfish trends. With his bootstrap crap and references to rugged individualism reminiscent of the 80s, he’s a heartless Reagan-baby – but without the old man’s politesse.... His three albums of obstinate rants culminate in the egocentric track ‘Without Me,’ making him the Ayn Rand of rap – a pop hack who refuses to look beyond himself.... Minus righteousness, angry rap is dismissible. Rap is exciting when it voices desire for social redress; the urge toward public and personal justice is what made it progressive. Eminem’s resurrected Great White Hope disempowers hip hop’s cultural movement by debasing it.”
Now, if you can imagine such thoughts ever appearing in an essay by Irving Howe -- let alone Irving Kristol -- then we can go ahead and describe First as inheriting the legacy of the New York Intellectuals.
Otherwise, it may be time to recognize and respect First for what it is in its own right: a journal of demotic intelligence, alive to its own times, with insights and errors appropriate to those times, making it worth the price of perplexity.
Only after talking to Benj DeMott did I read what seems, with hindsight, like the essay that best explains what is going on with the whole project. This is a long tribute -- far more analytical than sentimental -- to his father, the late Benjamin DeMott, who was a professor of English at Amherst College. He was a remarkable essayist and social critic.
It is time for someone to publish a volume of DeMott senior's selected writings. Meanwhile, his influence on First seems pervasive. The younger DeMott quotes a letter written in his father’s final years -- a piece of advice given to a friend. It offers a challenge to what we might call "the will to sophistication," and its hard clarity is bracing:
"Study humiliation. You have nothing ahead of you but that. You survive not by trusting old friends. Or by hoping for love from a child. You survive by realizing you have nothing whatever the world wants, and that therefore the one course open to you is to start over. Recognize your knowledge and experience are valueless. Realize the only possible role for you on earth is that of a student and a learner. Never think that your opinions – unless founded on hard work in a field that is totally new to you – are of interest to anyone. Treat nobody, friend, co-worker, child, whomever, as someone who knows less than you about any subject whatever. You are an Inferior for life. Whatever is left of it....This is the best that life can offer. And it’s better than it sounds.”
This amounts, finally, to a formulation of a democratic ethos for intellectual life. It bends the stick, hard, against the familiar warp. So, in its own way, does First, and I hope the website and the series of anthologies will continue and prosper as new readers and writers join its public.
I have been reading with sadness and horror about the murder of Don Belton, an assistant professor of English at Indiana University, whose body was found in his apartment in Bloomington on December 28. He had been stabbed repeatedly in the back and sides. A novelist and essayist, Belton had taught creative writing at a number of institutions and was the editor of Speak My Name: Black Men on Masculinity and the American Dream, a landmark anthology published by Beacon in the mid-1990s. He was also gay, which is not an incidental detail.
Around the time police were getting their bearings on the case, the girlfriend of a young ex-Marine named Michael Griffin contacted police to tell them she thought he was involved in Belton’s death. Griffin was soon taken into custody. According to a detective's affidavit available online, he said that Belton had sexually assaulted him on Christmas. Two days later, he went to Belton’s apartment to have a “conversation” which turned into a “scuffle,” resulting in the professor’s death.
These words, which sound so mild, sit oddly in the narrative. The affidavit then goes on to say that Griffin stated “that he took a knife, called a ‘Peace Keeper’ that he had purchased prior to going to Iraq while in the Marine Corps, with him....” He also thought to bring a change of clothes. The bloody ones went into a white trash bag. Griffin “then went about and ran several errands,” the report continues, “before he eventually discarded the bloody clothing into a dumpster.... Mr. Griffin then returned home where he stated that he told his girlfriend what he had done.”
I heard about the case from my friend Josh Lukin, a lecturer in the First Year Writing Program at Temple University -- where, as he used to say in the contributor's note for his publications, "he and novelist Don Belton occasionally bemuse the staff with their renditions of classic show tunes," back when they both taught there. Josh recalls his friend as a sweet-natured and brilliant colleague, but one whose many gifts did not include the ability to lift heavy objects.
Belton was 53 years old, while the man charged in his death is 25. The idea that Belton could violate an ex-Marine (and not once but twice, according to Griffin's statement to the police during interrogation) would be funny if it were not so grotesque.
In his affidavit, the Bloomington detective who investigated the case reports finding “a journal kept by the decedent ... in which he writes in the week prior to Christmas 2009 that he is very happy that an individual by the name of Michael has come into his life.” Belton had joined Griffin and his girlfriend for Christmas. Indeed -- and this is in some ways the most troubling thing about the story -- the relationship seems to have been very friendly until it turned vicious.
It is easy to speculate about what may have happened. In fact we do not know. But the circumstances track with a familiar pattern -- one common enough to have a name: “the ‘gay panic’ defense.” This rests on the idea that the wave of disgust created in a heterosexual person at exposure to gay sexuality can create a state of temporary psychosis. The panic-stricken victim loses responsibility for his (for some reason, it always turns out to be “his”) actions.
This is an idea that should be retired to the Museum of Deranged Rationalization as soon as possible. But it seems far-fetched to imagine that Griffin and his counsel will get through trial without invoking it. (Despite his confession, Griffin has pleaded not guilty to murder.)
On the other hand, the “panic” defense touches on an issue that was of vital interest to Belton himself. He wrote the introduction to a book edited by the late Eve Kosofsky Sedgwick. Her work on queer theory includes a sustained inquiry into the complicated and damaging way certain institutions have forged intense bonds among men while also obliging them to police one another for the slightest trace of homosexuality. This contradictory demand makes for paranoia and volatility.
In Epistemology of the Closet (University of California Press, 1990), Sedgwick writes, “The historical emphasis on enforcement of homophobic rules in the armed services in, for instance, England and the United States supports this analysis. In these institutions, where both men’s manipulability and their potential for violence are at the highest possible premium, the prescription of the most intimate male bonding and the proscription of (the remarkably cognate) ‘homosexuality’ are both stronger than in civilian society – are, in fact, close to absolute.”
As it happens, Belton had reflected on this ambivalent, anxious, crazy-making dimension of social reality in an essay that appeared in the journal Transition in 1998. Reviewing a book about gay Marines, Belton recalled his own very complicated effort to sort out mixed messages about race, sexuality, and violence when he was growing up in the 1960s. The machismo of Black Panther leader Eldridge Cleaver had been both appealing and problematic – given that it rested on a belief that, as Frantz Fanon had put it, “homosexuality is an attribute of the white race, Western civilization.” This was another version of the cultural logic that Sedgwick had identified: solidarity among African-American men being forged by excluding gays as race traitors.
Belton’s vision was broader. He had been friends with James Baldwin and lectured on him at the Sorbonne; the influence of the novelist and essayist on his own work was not small. One of his friends has quoted a passage from Baldwin that seems to epitomize Belton’s work: “Love takes off masks that we fear we cannot live without and know we cannot live within." Although I did not know the man himself, this touches the heart of his writing, which suggests a desire to go beyond, or beneath, the prescribed roles and rules governing “identity.”
This is easier said than done, of course. It is also dangerous; love can be dangerous. Belton wrote in his journal (to quote from the detective’s statement again) “that he is very happy that an individual by the name of Michael has come into his life.” It is not necessary to use pseudopsychological terms like “gay panic” to describe the response this created. Keep in mind that the killer brought his own special knife and a change of clothes. Arguably another vocabulary applies, in which it is necessary to speak of evil.
One of the remarkable things about the response to Belton's death is just how much of it there has been. Hundreds of people turned out for a vigil on New Year's Day. There is a website called Justice for Don Belton. An open letter from the chair of his department has appeared on the departmental Web site. A memorial service will be held in Bloomington.
And Josh Lukin tells me that he is proposing a session called “Remembering Don Belton” for the next MLA -- a panel "engaging his scholarship, art, journalism, and pedagogy." Possible topics might include "his writing and teaching on black masculinity, Baldwin, Brecht, Mapplethorpe, Morrison, Motown, jazz, cinema, abjection," to make the list no longer than that.
"The guy's range of interests was huge," Josh says, "and he kept surprising me with his knowledge of critical texts, both recent ('Bowlby, Just Looking? Great chapters on Dreiser.') and more traditional ('Why not talk about Morrison using R.W.B. Lewis, American Adam?')."
I have no idea how such proposals are decided. But this would be a good session to have on the schedule for next year. To move from sorrow to celebration is not easy; the effort should be encouraged.
All of this fuss over J.D. Salinger is yesterday’s dinner warmed over. Make no mistake: Salinger was a terrific writer, and at one time he was very famous as an artist, not a recluse. But he ravaged his own reputation. He threw a cloak over himself, and ensured that he’ll be unknown to tomorrow’s readers, and little more than a footnote in the next generation’s literary histories.
Salinger’s life may be divided evenly into two parts. For the first 45 years or so, he sought to become a well-known writer, and succeeded handsomely. For the second 45 years, he sought to erase the evidence of the first 45 by building a wall of silence around himself and his work. Sadly, he succeeded very well at that too. Salinger’s most decisive act, of course, was to stop publishing. After publishing one extraordinary novel, The Catcher in the Rye, in 1951, followed by several volumes of interlinked stories about a family named Glass, Salinger quit. At the height of his influence in the mid-1960s, with his creative powers flowing abundantly, he simply withdrew from the world of publishing, readers, and especially critics. The prurient interest in Salinger’s isolation may endure longer than Salinger’s writing. Word has it that a Salinger documentary, prepared in secret, is already in the works.
J.D. Salinger was once the voice of a generation. Millions of readers of a certain age saw in Holden Caulfield, the hero of The Catcher in the Rye, an eloquent expression of their own longings and frustrations. But that generation is now middle-aged. They're the ones writing Salinger’s admiring obituaries now, so they exaggerate his importance based on how they remember him. Salinger was once very important indeed, but he did his best to muffle that importance by refusing any and all entreaties from anthologies, critics, and filmmakers.
Salinger was of course entitled to his personal privacy, and he was likewise entitled to write for himself and not for publication. But it’s more than a pity that he expended so much effort to keep people from reading the work that he so eagerly sent into the world at a time when he was feeling more generous toward it. Salinger refused requests to republish his work in different formats, and when people tried to write about it — and about him — he made it as difficult as possible. His successful court effort to block a biographer from quoting from his unpublished letters not only ruined one book, but also chilled the ambitions of writers who might have followed in its wake.
Salinger’s best-known short story, “A Perfect Day for Bananafish,” is about a prodigiously bright young man — someone with seemingly everything to live for — who shoots himself in the head one day for no apparent reason. Upon reflection, we can read “Bananafish” as a kind of allegory of Salinger’s own career. No one will ever know exactly why he shut himself down, but many have wondered — as they continue to wonder about the suicide in the story.
Perhaps Salinger might have kept going if he had cared more about the connection that he made with his readers. Bruce Springsteen told an interviewer in 1984: “If the price of fame is that you have to be isolated from the people you write for, then that's too fuckin’ high a price to pay." Springsteen is in this respect the very antithesis of Salinger. Witnesses testify that Salinger continued to write in his New Hampshire hermitage, but he evidently had little desire to communicate with any reader but himself.
I imagine that Salinger’s unpublished work will be packaged and sold at some point. There’s too much money to be made for that not to happen. But the anticipation will surely exceed the actual event. In fact, I predict that Salinger’s significance will drop like a stone once that material comes out and gets digested, and that’s because of the anti-public life Salinger himself led.
Rebuffing the literary anthologies may prove to be Salinger’s most consequential decision in that regard, because it has kept his writing from the eyes of succeeding generations of readers. Most young readers encounter classic authors in the pages of such collections, and these encounters lay the foundation for their later reading. Salinger’s work is increasingly invisible to younger people now, so his reputation won't stay aloft once the brief, titillating pleasure of revealing what's in his writer's cupboard is satisfied.
That pleasure will also evaporate because the posthumous work is unlikely to be very good. Writers who refuse to communicate with their readers or with the larger world tend not to produce very good fiction because they’re no longer of the world that they’re writing about. Salinger effectively expatriated himself from the social world, but that world was changing around him through the decades of his isolation. We may expect stories encased in amber.
Salinger betrayed a great talent. Metabolically speaking, he died last week. But his passing really began decades ago.
Leonard Cassuto is professor of English at Fordham University and the general editor of the forthcoming Cambridge History of the American Novel.
In a recent New York Review article on Byron, Harold Bloom makes the following passing remark: “In the two centuries since Byron died in Greece [...] only Shakespeare has been translated and read more, first on the Continent and then worldwide.” Bloom does not cite any statistics, and one cannot help but wonder: Really? More than Homer and Dante, or, among the moderns, more than Sartre and Thomas Mann? Of course, what Bloom really means is that Byron was translated and read more than any other English writer, and he may well be correct on that count. Yet this omission is telling, as it highlights an unfortunate tendency (recently diagnosed by David Damrosch) among certain English professors to equate literature in general with literature written in English. This disciplinary bias, less prejudice than habit, can distort their scholarship – the authors that they admire tend to be far more catholic in their reading. But this pattern also raises a larger academic question: Why do we still partition the literary canon according to nationalist traditions? Is this really the most intellectually satisfying and authentic approach to literary studies?
For an example of how disciplinary blinders can affect scholars as well-read as Bloom, we need only turn back to his article, where we find Byron described as “the eternal archetype of the celebrity, the Napoleon of the realms of rhyme... the still unique celebrity of the modern world.” What such hyperbole masks is the fact that the model for such literary celebrity is in reality to be located in another author, who unfortunately did not have the good sense to be born in England. Indeed, anyone familiar with the inordinate fame of Jean-Jacques Rousseau knows that he was the first genuine literary celebrity, lionized and sought out across Europe, much to his growing despair and paranoia (as a brilliant study by the historian Antoine Lilti details). Byron himself was smitten by Rousseau, touring the Lac Léman with his friend Shelley to visit the sites from Julie, ou la nouvelle Héloïse. Rousseau may not have provided his public with the same devilish scandals as the naughty Lord, but his Confessions, with their admission of a fondness for spankings and exhibitionism, were sultry enough.
Bloom is certainly no provincial, and his own published version of The Western Canon includes German, Spanish, French, and Italian works – although this canon, too, is heavily tilted toward English authors. But can this be avoided? No doubt French scholars would produce a version of the canon equally tilted toward the French, just as scholars from other nations would privilege their own authors. To an extent, this literary patriotism is normal and understandable: every culture values its heritage, and will expend more energy and resources promoting it.
From the viewpoint of literary history, however, such patriotism is also intellectually wrongheaded. To be sure, writers are often marked most strongly by their compatriots: one must read Dante to understand Boccaccio, Corneille to understand Racine, or, as Bloom would have us believe, Whitman to understand T. S. Eliot. But such a vertical reading of literature (which Bloom himself mapped out in The Anxiety of Influence) overlooks the equally – sometimes far more – important horizontal ties that connect authors across national borders. T. S. Eliot may have been “hopelessly evasive about Whitman while endlessly revising him in [his] own major poems,” yet by Eliot’s own admission, the French school of symbolist poetry had a far greater impact on his work. Some of Eliot’s first published poems, in fact, were written in French. Conversely, the French novelist Claude Simon may have endlessly revised Proust, but his own major novels – such as La route des Flandres and L’herbe – owe far more to William Faulkner. Such examples could be multiplied ad infinitum: they are, in fact, the stuff that literary history is made of.
To this criticism, English professors have a ready-made answer: Go study comparative literature! But they have only half a point. Comp lit programs are designed to give students a great deal of flexibility: their degrees may impose quotas for the number of courses taken in foreign language departments, but rarely, if ever, do comp lit programs build curricular requirements around literary history. Yet that is precisely the point: Students wishing to study English Romanticism ought to have more than Wikipedia-level knowledge about German Idealist philosophy and Romantic poetry; students interested in the 18th-century English novel should be familiar with the Spanish picaresque tradition; and so on and so forth. Comp lit alone cannot break down the walls of literary protectionism.
The fact that we even have comp lit departments reveals our ingrained belief that “comparing” literary works or traditions is merely optional. Despite Bloom’s own defense of a “Western canon,” such a thing no longer exists for most academics. This is not because the feminists, post-colonialists, or post-modernists managed to deconstruct it, but rather because our institutions for literary studies have gerrymandered the canon, department by department. Is it not shocking that students can major in English at many colleges without ever having read a single book written in a foreign language? Even in translation? (Consider, by contrast, that history majors, even those who wish to study only the American Revolution, are routinely required to take courses on Asian, African, and/or European history, in many different time periods, to boot.) Given that English is the natural home for literary-minded students who are not proficient in another language, it is depressing that they can graduate from college with the implicit assumption that literature is the prerogative of the English-speaking peoples, a habeas corpus of the arts.
But wait a minute: how dare I criticize English curriculums for not including foreign works, when the major granted by my own department, French, is not exactly brimming with German, Russian, or Arabic texts, either? To the extent that French (or any other foreign language) is a literature major, this point is well taken. But there are differences, too. First, it is far more likely that our students will have read and studied English literature at some point in high school and college. They will thus already have had some exposure, at least, to another national canon. Second, and more importantly, a French, Spanish, or Chinese major is more than a literature major: it is to no small degree a foreign language major, meaning that the students must master an entirely separate set of linguistic skills. Finally, language departments are increasingly headed toward area studies. German departments routinely offer classes on Marx, Nietzsche, and Freud, none of whom are technically literary authors. Foreign language departments are sometimes the only places in a university where once-important scholarly traditions can still be studied: Lévi-Strauss’s Tristes tropiques probably features on reading exam lists more often in French than in anthropology departments. A model for such an interdisciplinary department already exists in Classics.
I do not wish to suggest that English professors are to blame for the Anglicization of literature in American universities: they reside, after all, in English departments, and can hardly be expected to teach courses on Russian writers. The larger problem is institutional, as well as methodological. But it bears emphasizing that this problem does not only affect undergraduates, and can lead to serious provincialism in the realm of research, as well. An English doctoral student who works on the Enlightenment once openly confessed to me that she had not read a single French text from that period. No Montesquieu, no Voltaire, no Rousseau, no Diderot, rien. Sadly, this tendency does not seem restricted to graduate students, either.
Literary scholars are not blind to this problem: a decade ago, Franco Moretti challenged his colleagues to study “world literature” rather than local, national, or comparative literatures. He also outlined the obvious difficulty: “I work on West European narrative between 1790 and 1930, and already feel like a charlatan outside of Britain or France. World literature?” While the study of world literature presents an opportunity for innovative methodologies (some of which were surveyed in a recent issue of New Literary History), students already struggling to master a single national literary history will no doubt find such global ambitions overwhelming.
What, then, is to be done? Rearranging the academic order of knowledge can be a revolutionary undertaking, in which ideals get trampled in administrative terror. And prescribing a dose of world literature may ultimately be too strong a medicine for the malady that ails literary studies, particularly at the undergraduate level. In fact, a number of smaller measures might improve matters considerably. To begin with, literature professors could make a greater effort to incorporate works from other national literatures in their courses. Where the funds are available, professors from neighboring literature departments could team-teach such hybrid reading lists. Second, language and literature majors could also require that a number of courses be taken in two or three other literature departments. A model for this arrangement already exists at Stanford, where the English department recently launched an “English Literature and Foreign Language Literature” major, which includes “a coherent program of four courses in the foreign literature, read in the original.” To fulfill this last condition, of course, colleges would have to become more serious about their foreign-language requirements. Finally, literature students would be better served if colleges and universities offered a literature major, as is notably the case at Yale, UC San Diego, and UC Santa Cruz. Within this field of study, students could specialize in a particular period, genre, author, or even language, all the while taking into account the larger international or even global context.
Will such measures suffice to pull down the iron curtain dividing the literary past? Unless they manage to infiltrate the scholarly mindset of national-literature professors, probably not. Then again, as many of us know firsthand, teaching often does transform (or at least inform) our research interests. A case could of course be made for more radical measures, such as the fusion of English and foreign language departments into a single “Literature Department,” as exists at UC San Diego. But enacting this sort of bureaucratic coup carries a steep intellectual (not to mention political) price. It would be unfortunate, for instance, to inhibit foreign literature departments from developing their area-studies breadth, and from building bridges with philosophy, history, anthropology, sociology, religious studies, political science, and international relations. English departments, moreover, are developing in similar, centrifugal directions: in addition to teaching their own majors, English departments contribute more widely to the instruction of writing (including creative writing), and have their own ties with Linguistics and Communications departments. This existing segmentation of the university may appear messy, but has the benefit of preventing new walls from being erected, this time between neighboring disciplines.
Dan Edelstein is assistant professor of French at Stanford University.
Yogi Berra is supposed to have said that people shouldn't write their autobiographies while they're still alive. Anyone who reads very many academic autobiographies will appreciate the sentiment. We have enough accounts, thanks, of how the path to tenure in the English department at Duke University was lit up by certain profound early life experiences. (The route now seems exceptionally well-mapped for one that not many people get to travel.)
But an exception might be made for a recent volume called A Taste for Language: Literacy, Class, and English Studies by James Ray Watkins, Jr., published by Southern Illinois University Press. It is not the work of an academic celebrity. I doubt anyone will turn to it for career advice; it doesn't offer any. But as a study of the examined life, it has its lessons.
The author is an online educator for the Art Institute of Pittsburgh and the Center for Talented Youth at Johns Hopkins University. He runs a blog called Writing in the Wild. So one learns from the back cover. But the lesson really starts with a photograph across from the title page. It shows the author's father and was taken circa 1944. He is wearing a tie and his hair is well-combed. The pose suggests that the portrait might be of a young soldier, taken as a memento for his parents -- except that he looks as if he may not yet be old enough to shave.
And it turns out all of this is true. The son of a tenant farmer in Mississippi, Watkins Sr. enlisted in the army at the age of 16. He claimed to be older, of course, and to have graduated high school, although his formal education actually ended in the fourth grade.
Thanks to the GI Bill, the adolescent tank-commander in that photograph later went to night school to get his equivalency degree, then attended Louisiana State University. This prepared him for a successful career as a utilities analyst for the city of Houston. He died in the early 1980s, not long after the author began his own higher education.
"No one in his immediate family had attended school much beyond the middle- or high-school level," writes Watkins Jr. "His family, my mother tells me, saw college as a kind of indulgence and thought that any young man could better spend his time earning a living. Before he entered LSU, then, it is likely that my father had only the roughest approximation of what a university education might entail.... My father left college with more than professional skills; he graduated with a larger sense of the purposes of education that made it imperative for his children as well."
This is a story of upward mobility, then, with economic security as its goal. But that is not all that was transmitted from father to son. To go from seeing education as a needless luxury to regarding it as an urgent necessity for one's children involves a deep change of ethos. Watkins tries to reconstruct this process through a close reading of any material he can find from his father's education -- in particular the textbooks for his courses on composition and literature at LSU during the late 1940s, which left him with the skills needed to produce the sort of expository prose required in the professional workplace.
As it happened, the English department at LSU was also then an epicenter of the New Criticism, whose practitioners tried to teach students to read literary works with an eye to how their language worked. "It seems reasonable to assume that my father's lack of previous education made the inculcation of this sensibility difficult at best....The academic triumph of New Critical literary education in the English department had strict limits, clearly marked in my father's transcript."
But the effort had its effect, even so. It meant that Watkins Sr. could recognize that there might be something worthwhile about the ability to read for pleasure. And so it is that -- two generations after functional literacy was the family norm, but anything beyond it regarded with misgivings -- the author could end up writing a master's thesis on Paul de Man, getting a Ph.D., and teaching at various institutions.
This, then, is not an academic autobiography so much as an educational genealogy. The author is tracing back to their sources the conditions of possibility for his own existence. But it is not particularly introspective. There are no prose-poetical arias. The writing is unsentimental.
Instruction in expository composition left Watkins Sr. in command of an efficient, objective, no-frills style: the equivalent of a professional demeanor that could zero in on facts, while keeping subjective expression to a bare minimum. The son honors that ability with a narrative voice that is so precise in conveying the man's likes and habits and expectations of life that you are left with a sense of having met him -- yet with only a hint at the depths of feeling that it must have stirred in him to tell the story.
In that sense, A Taste for Language is not memoiristic, either. Digging through his father's textbooks and situating them in the history of language study as a discipline, Watkins is doing scholarship. And his research has implications that are not strictly personal.
People who come from a long line of securely middle-class professionals can take a certain amount of inherited cultural capital for granted. In Watkins's case, that is not an option. He recognizes that he has been shaped, however indirectly, by educational influences that were being exercised on him before he was even born. His father's upward mobility was in part the product of the pedagogical labor of writing instructors. Talk about "the life of the mind" can get highfalutin and self-aggrandizing at times. There is something to say for grasping how much of it is the result of institutional processes that go largely unnoticed.
That, in turn, raises questions about how well the present arrangement works. "On the one hand," Watkins writes, "we must accept our students' vocational goals as legitimate expressions of their desire to maintain or strengthen their economic position; on the other, we must seek out ways to persuade them that the contemplative, reflective traditions of the academy are important to their professional and social futures. Indeed, our goals ought to be even larger: to convince students that in spite of their apparent impracticality, the critical methodologies of the school have immediate professional application. Alertness to injustice isn't simply helpful in 'society in general'; it is necessary in the immediate, specific context of the work site."
Whether this can be realized in practice is, of course, another matter. Back when Watkins's father went to college, composition and literature were part of the same discipline. But that has not been the case for some time. Most training in composition is done by part-time or adjunct instructors. That arrangement, in turn, reflects a set of priorities in which such training is treated as a necessary but (at best) secondary function of the university. Which, in turn, reinforces the tendency for the rewards of higher education to go to students who arrive with adequate stocks of inherited cultural capital. It is an arrangement that seems almost as if it were designed to sustain inequality rather than narrow it.
"A two-tiered system of a few well-paid and independent literary teachers and researchers working side-by-side with poorly compensated part-time composition teachers would hardly support the interests of our profession, our students, or our society," writes Watkins. He calls for unionization of instructors in composition and literature as a first step towards mitigating this situation.
This reader, at least, wanted to applaud. Without the ability to bargain collectively, it's hard to see how the casualization of academic labor will ever end. But it does rather raise the question of whether the expectation of upward mobility is not so ingrained in the professionalized middle class as to make solidarity an almost unimaginable ideal.
Since 2000, I've been the host of the Wimba Distinguished Lecture Series, shouting from the rooftops (well, desktops) about how to use modern educational technologies to teach effectively online. But now, after evangelizing for the last decade, I'm switching sides. I am teaching creative writing online as an adjunct professor for Holmes Community College, in Goodman, Mississippi. How the tables have turned.
I've probably led more webcasts than anyone on the planet. Seriously. I've hosted webcasts at least once a week for 10 years and I've also given thousands of other online presentations. From presentations about educational technologies and policies to effective instructional techniques, I've done it all. But now I'm tasked with teaching – online – creative writing, a topic that traditionally uses a workshop format, a format that is quite difficult to replicate in a virtual environment. Yet it's not the format that worries me.
You see, this is my first time teaching a college course. Though I've led writing workshops, collaborated with writers and journalists here in New York, contributed to numerous publications, and even penned my own book, I now fretfully ready myself to formally – and virtually – mold young (and a few moldier) minds at a college more than 1,000 miles away from my life here in New York. But I can’t wait. I can't wait to familiarize my students with exemplary works of poetry, fiction and nonfiction. I can't wait to answer my students' questions and hear their insights. I can't wait for my students to learn from me and for me to learn from them. I'm nervous. But I'm ready. I think. So, in the immortal inquiry of David Byrne: Well, how did I get here?
Let’s look at the Ed Tech industry first.
When wearing my Wimba hat, I often remind my audience that the modern format of online courses has been in place for only about a decade. The current configuration (combining course management systems, web conferencing, instant messaging, message boards, etc. to teach a class to students in a classroom and/or their pajamas) barely existed in the 20th century. When one stops to consider that collegiate courses had been taught in more or less the same manner since ancient Egypt, Greece, and Mesopotamia, it’s startling to see how quickly this transformation has transpired.
Obviously this format of modern courses is still being tweaked, but it certainly appears that much of the technological and pedagogical foundation is firmly in place. As of today, the dawn of the ‘10s, tens of thousands of postsecondary faculty, either because of or in spite of their ability and/or willingness, have already taken the plunge and incorporated technologies into their courses – often with a great deal of success.
I’ve written numerous research documents touting both the tangible and intangible benefits of technology-enabled courses. Countless examples of institutions around the globe that have seen benefits such as increased retention rates of students, increased enrollments, improved graduation rates, and dollars saved on time and travel, all thanks to technology in the classroom, fill the pages of these documents. In fact, I’ve seen so many positive examples of technology-enabled education over the years that I now have an extremely difficult time understanding why any institution wouldn’t beef up its current online offerings. The downside is just so negligible while the upside is so great.
But I digress. After all, I’ve now got my own class to worry about.
A couple of months ago I left my comfy big-city confines and headed south to tiny Goodman for an on-site orientation for new faculty. I didn’t really know what to expect. I knew I’d have a big leg-up in terms of my knowledge of online course technologies, but I also knew I’d have a big leg-down in terms of my knowledge of classroom instruction. Turns out I was dead-on.
My two Holmes Community College trainers that day explained the ins and outs of being an online instructor to me and the approximately 10 others in the room, all of whom had collegiate teaching experience. At least my tech savviness made up for the in-front-of-a-class savviness I lacked. But even though I was already familiar with the Blackboards and SunGards of the world, I didn’t realize how much about them I didn’t know. As my girlfriend always says, it’s hard to know what you don’t know.
My HCC trainers spent hours teaching me about Bb’s enrollment tools, grading and assessment functions, and how to withdraw students who need to drop out. Despite being around instructors for so much of my life, I guess I never truly grasped how much of teaching is actually administering. After a full day of technology training I left the campus very excited, but also very nervous. I kept picturing myself pushing the wrong button and accidentally unenrolling an eager student and then having to sheepishly write an email to the Holmes IT staffers informing them of my blunder.
But on the flipside, my nervousness also translated to eagerness. As I learned more about my prospective students – fervent 18-21-year-olds as well as working adults from around the country – I plotted the numerous ways in which I could engage them online. While driving from my orientation back to the Jackson airport I thought of at least 20 assignments that would combine best practices of teaching creative writing face-to-face with best practices of teaching online. In fact, by the time I reached the rental car return desk I could envision the thank-you letters I hoped to receive from my happy students who affably learned a few tips and tricks about writing with some flair.
Which brings me to today.
My lesson plans are done. My syllabus is up. My books are in the bookstore. But my mind still bursts with uncertainties (after all, I am a writer).
How well can they write? What do they already know? What don’t they know? From what kinds of experiences will they draw when they put pen to paper? Have they been to William Faulkner’s house up the road in Oxford? Will they mind if I occasionally swear? Will I understand them if they speak with thick drawls? Will their writing be better than mine?
The waiting is the hardest part. I wish I could invent time travel and get the first class over with.
The funny part is that I’m never this nervous when preparing and/or waiting to give presentations for Wimba, but I guess that’s because of my experience at the company. Hopefully I’ll read this op-ed a few years from now and laugh at how nervous I was. Man, I can’t wait to be a veteran writing teacher brimming with the confidence only gained from years of experience! Oh, how worn will the elbow pads of my tweed jackets be. Some day.
I discussed my trepidations with my family over the holidays, and my dad, drawing upon his 30 years of teaching experience, asked, “Do you have your opening speech ready?” I told him I did, but I lied; I guess I don’t really need one. And that already demonstrates the difference between online and face-to-face.
When my class is ready to begin, someone from HCC’s technology department will simply hit a button in Blackboard, and then, in an instant, the class will be active. It won’t be the same as the first class of a face-to-face course. I won’t write my name in big letters on the chalkboard and won’t give a big dramatic speech about the wonders of writing creatively. Instead, my students will receive a message in their inboxes notifying them to watch the archive of a lecture I’ll record later this week. Sure, they’ll still see my talking head and hear the inflection of my nervous-yet-excited voice, but the impact might not be as great as watching me forcefully pace back and forth in front of a full lecture hall. Then again, perhaps the impact will be even greater because they’ll be equally nervous as they embark on a new class in a new medium.
Stay tuned for more as I tell my tales from the other side….
Matt Wasowski is senior director of customer programs at Wimba.
What 10 books have most influenced you? That question has launched many a discussion online in recent days. I’ve been scribbling down my own list while reading the replies – but also wondering just how we assess the presence of influence, let alone its relative intensity.
With some of the lists, it's hard to tell what the word means. If a person names J.R.R. Tolkien’s Lord of the Rings as an influence, what does that imply? Did he or she become a scholar of Anglo-Saxon literature? Go on an epic quest that saved the world? Write hobbit stories? Record a heavy-metal album with runes on the cover?
To cite something as an influence can be a way to emphasize that it yielded much satisfaction. But the term properly implies something more consequential than that. You didn't just consume and digest; you were consumed and digested in turn.
I greatly enjoy the TV series "Breaking Bad" yet do not feel that it is transforming my existence. It has not inspired me to cook and sell methamphetamines, or even to imagine this as a possible solution to midlife anomie. Hence I would not claim it as an influence, just yet.
What counts, then? In mulling this over, it became clear that some authors were just too influential to claim as influences, if you will forgive the paradox. I read quite a lot of Marx, Freud, and Nietzsche at an impressionable age, and this certainly left its mark. But putting them on the list seemed unnecessary, for their power is pretty nearly inescapable. It would be like pointing out that I have breathed a lot of oxygen in my day.
Anyway, enough prolegomenous throat clearing. On to the list...
(1) Bertrand Russell, Why I am Not a Christian
Until shortly before my 14th birthday, in the opening months of the Carter administration, I was a Christian fundamentalist who fully expected the apocalyptic scenario of the Book of Revelation to be worked out in world events during my lifetime. Please understand that I do not say that with even the slightest sense of irony.
At the time, I was also very keen on Blaise Pascal, who was definitely not a Southern Baptist but who had undergone a mystical experience giving him a deep conviction of the existence of a divine order. Bertrand Russell’s book must have been on a library shelf near Pascal. I started reading it to arm myself against the enemy.
Things did not work out that way. In his urbane and relentlessly logical manner, Russell broke down everything that I had taken to be axiomatic about the existence of God -- and about the terrible consequences of not believing. He seemed to anticipate every counterargument. I spent days -- and a few late nights – running through it in my head.
The experience was painful and terrifying. It shook me to my core; even that seems like an understatement. Nothing remained the same afterward. To repeat: Influence and pleasure are entirely different things.
(2) Allen Ginsberg, Howl and Other Poems
At 14, I thought this was the greatest book of poetry ever. The length and the rhythm of Ginsberg’s lines, his juxtapositions of imagery (“the crack of doom on the hydrogen jukebox”), the way his diction shifted into the biblical or the street-level obscene ... all of this made the hair stand up on the back of my neck. Which, as someone once said, is how you can tell when poetry is working on you.
It also inspired many a page of my own literary efforts, now lost to posterity. Paper will burn, if you let it.
Today the Beat idea that suffering and madness and extremity bring wisdom does not strike me as quite so appealing and romantic. I have been exposed to quite enough miserable, crazy, extreme people for one lifetime. (They come to Washington a lot, especially these days.) But I still love this book. The shorter poems in the back – written when Ginsberg himself was under the influence of William Carlos Williams – still seem very moving.
(3) Jorge Luis Borges, Other Inquisitions
The best way to discover Borges is probably through the short stories in Ficciones, or the selection of prose and poetry in Labyrinths. As chance had it, I first came across him by way of this volume of essays, in which criticism becomes a form of imaginative writing. For Borges, all of literature forms one big interconnected structure in which the books are, in effect, reading you. Many an academic article on intertextuality consists of an unwitting and usually witless gloss on Other Inquisitions.
My favorite passage comes at the end of “Kafka and His Precursors,” an essay of three pages that subtly transforms the very idea of “influence” itself: “The fact is that every writer creates his own precursors. His work modifies our conception of the past, as it will modify the future.”
(4) Susan Sontag, Against Interpretation and Other Essays
Borges combined erudition with playfulness. Sontag, by contrast, was an erudite person who sometimes tried very hard to be playful, more or less out of a sense of duty. I don’t think this worked out very well, and certainly not over the long run.
But in the early 1960s, she wrote a series of essays on literature, film, art, and ideas that remain exceptional and definitive. In them you feel a mind trying to open itself to as many possibilities as it can, sort of like Matthew Arnold dealing with being trapped in Andy Warhol’s Factory for a while. This book was the syllabus for my own reading and moviegoing for a few years after I first discovered it, and I go back to visit it from time to time, like a favorite neighborhood.
(5-6) Jean-Paul Sartre, pretty much anything in English translation as of the early 1980s
OK, admittedly this is cheating, since it would include dozens of volumes of philosophy, fiction, plays, and journalism. I would need to wedge the pertinent volumes of Simone de Beauvoir’s memoirs in there, as well. You do what you have to do. I feel sufficiently uneasy about this to let it claim two spots on the list, rather than just one.
Sartre embodied the writer as intellectual and activist. There is nobody even remotely comparable these days; don't accept cheesy knock-offs. The question of Sartre's legacy is too complicated to go into here, and I am ambivalent about much of it, now, in any case. But his work still provokes me – through inspiration or irritation or both – in a way that no living author’s work does.
Narrowing things down a bit: Two volumes of interviews and articles from his final decade or so, Life/Situations and Between Marxism and Existentialism, seem like quintessential books. The latter has recently been reissued by Verso.
(7) Norman Podhoretz, Making It
Published in late 1967, while Podhoretz still considered himself a liberal (his transformation into neoconservative ideologue would take a few more years), Making It is the story of one man’s relentless climb to eminence in the world of the New York literary-intellectual establishment. “One of the longest journeys in the world,” its opening sentence begins, “is the journey from Brooklyn to Manhattan....”
Reading this in Texas at the age of 19, I was not yet in a position to appreciate its full, rich ridiculousness, and instead studied the book as carefully as I once had any account of the act of love – preparing for the day when detailed information might prove useful, rather than just frustrating.
In a hurry to brush off the hayseeds, I managed to confuse cynicism with sophistication. Over time, this did a certain amount of damage -- some of it, fortunately, remediable. It is embarrassing to include this book on my list. That is why I am doing so.
(8) Richard Hofstadter, The Paranoid Style in American Politics
There are serious problems with Hofstadter’s analysis of the People’s Party of the 1890s. We can talk about the failings of the consensus school of U.S. historiography until the cows come home. I acknowledge these things without reservation. And yet this book is indispensable.
I first read it in the early 1980s and have revisited it at least once each decade since then. I know of no better description of the typical qualities and standard features of our public discourse in its barking-at-the-moon episodes. It reminds us that such upsurges do not come out of nowhere. This is not exactly a comfort, but it does help make the batshit insane seem at least somewhat intelligible.
One American television network has evidently adopted the book as the basis for its business model. But you can’t blame Hofstadter for that.
(9) Richard Wright, American Hunger
In the early 1940s, Richard Wright produced an autobiographical manuscript covering his life up to 1937. Most of it appeared in 1945 as Black Boy, but the final section, covering his years as a member of the Communist Party, was published as a separate book in 1977.
I was very taken with it not simply for its account of the radical movement during the Depression but for Wright's account of his own struggle to become a writer. And all the more so given something the author's estate included in the original printing of the book. It was a facsimile reproduction of one page of the typescript, covered with his handwritten revisions of the text -- lines crossed out, words changed, sentences rewritten, etc.
This came as a revelation. My assumption had been that once you learned how to write, well, you just wrote. (The struggle was just to get to that point.) I stared at the page for a long time, trying to figure out how Wright had known that a given phrase or sentence might be improved, especially since what he had down often looked fine.
Another form of influence: When a book teaches you how much you don't know about how much you don't know, and how much you need to know it.
(10) Richard Lanham, Revising Prose
Finding this volume in a secondhand bookstore was not, perhaps, the answer to a prayer. But the deep perplexity left by American Hunger certainly left me ready for it.
Half of learning to write is knowing how to recognize when a sentence or paragraph is bad, and why, and what can be done about this. Lanham teaches a handful of very basic skills necessary to begin reworking a draft. His manual is now in its fifth edition. I have no idea what changes may have been introduced in the past 25 years or so. But if a textbook ever changed my life, this one did.
For a long while now I have planned to write an essay about the habit of keeping a notebook, and have even, from time to time, started to take notes on the topic. By now there have accumulated more passages hectoring myself to settle down to work on it than pages containing actual insights. It seems the project has a short circuit.
But it may be that this reflects a basic tension within the notebook itself, considered as a genre of writing. On the one hand, it is turned towards the outside world; it is absorptive and assimilative, a tool for recording information, ideas, impressions. On the other, it is the ideal venue for self-consciousness to run amok. Even when a notebook is integral to a specific project, the writing always seems to be lacking something. Thoughts remain unfinished or provisional. You are moving but you aren't there yet. This can be frustrating. But then a notebook can also be where you can dig in your heels -- summoning up the confidence, or the vital reserves of energy, needed to continue.
Sometimes the notebook provides escape from the work in progress, rather than contributing to it. This is not necessarily a matter of procrastination.
The best essay on the notebook as workshop is probably “On Intellectual Craftsmanship” by C. Wright Mills. (See this column on it.) But an important supplement comes from Elias Canetti, who won the Nobel Prize for Literature in 1981. After spending decades on his idiosyncratic and sui generis work of scholarship Crowds and Power (1960), Canetti published a mordant essay on notebook-keeping called “Dialogue with the Cruel Partner.”
“One cannot avoid the fact,” he writes, “that a work being continued daily through the years may occasionally strike one as clumsy, hopeless, or belated. One loathes it, one feels besieged by it, it cuts off one’s breath. Suddenly, everything in the world seems more important, and one feels like a bungler... Every outside sound seems to come from a forbidden paradise; whereas every word one joins to the labor one has been continuing for so long, every such word, in its pliant adjustment, its servility, has the color of a banal and permitted hell.”
From such dark moods, the notebook offers a reprieve. When the writer “views himself as the slave of his goal, only one thing can help: he has to yield to the diversity of his faculties and promiscuously record whatever comes to his mind.... The same writer, normally keeping a strict discipline, briefly becomes the voluntary plaything of his chance ideas. He writes down things that he would never have expected in himself, that go against his background, his convictions, his modesty, his pride, and even his otherwise stubbornly defended truth.”
There is a third modality of the notebook habit – a matter of treating it, neither as the warehouse and workshop for a project nor as an escape from its demands, but as something like its own form of writing, imposing its own peculiar demands.
Joan Didion’s essay “On Keeping a Notebook” is astute on how this third mode is a function of temperament: “The impulse to write things down is a peculiarly compulsive one, inexplicable to those who do not share it, useful only accidentally, only secondarily, in the way that any compulsion tries to justify itself.” The fragments jotted down are “bits of the mind's string too short to use, an indiscriminate and erratic assemblage with meaning only for its own maker.”
The resulting collages of stray data and random insights are a way to keep track of one’s earlier incarnations, the personalities adopted and left behind in the course of a lifetime. “I think we are well advised to keep on nodding terms with people we used to be,” Didion writes, “whether we find them attractive company or not.”
As it happens, Canetti made much the same point. “The mechanisms one uses to make life easy are far too well-developed,” he writes. “First a man says, somewhat timidly: ‘I really couldn’t help it.’ And then, in the twinkling of an eye, the matter is forgotten. To escape this unworthiness, one ought to write the thing down, and then, much later, perhaps years later, when self-complacence is dripping out of all of one’s pores, when one least expects it, one is suddenly, and to one’s horror, confronted with it. ‘I was once capable of that, I did that.’ ”
On this account, then, notebooks are, in effect, an annex of the superego. My own notebooks play that role at times. They document opinions or enthusiasms that sometimes prove embarrassing, after a few years have passed. But they are also full of injunctions – usually to work harder, or to finish some project now gathering dust in one of the more workshop-like volumes, or to start studying X in a systematic fashion (and here’s the syllabus...).
Recently the text of Didion’s essay was posted at an online venue called The New Inquiry, which is something of a cross between a group blog and a salon (it sponsors face-to-face meetings in New York between readers and contributors) and seems to be in transition towards becoming a magazine. Its three founders are recent graduates of Columbia University and Barnard College.
The site itself is a kind of collective notebook. It made me wonder how the proprietors understood notebook-keeping – and whether digital technology influenced how they practiced it. My own habits are irremediably old-fashioned. A netbook is not a notebook, to my mind anyway, and I still do a lot of writing with pen in hand, even while exhorting myself to be more productive and efficient (a performative contradiction, if ever there were one). But being stuck in one’s own habits does not preempt curiosity about those of other people, so I asked the New Inquirists how they saw “notebooking.”
While she prefers to read from paper, Jennifer Bernstein, a New York-based writer, finds that reflecting on what she reads is another matter: “I often create a Word document in which to jot down the best ideas and quotations from a book. Then I end up reading commentary on the book and articles related to its theme, excerpts from which I also paste into the document, usually with my own thoughts. The document becomes a kind of mini-scrapbook, the record of my exploration of a concept (for example, one I did recently was conservatism in the 20th century). This isn’t a perfect method. It’s led to a proliferation of strangely titled documents on my hard drive that at some point I should probably sort through and systematize. On the other hand, the chaos reflects how my mind really works.”
Rachel Rosenfelt, a cultural critic living in Brooklyn, told me: “I've never been a paper-notebook keeper in the sense Didion means it. Or in most senses, really. When I moved out of my last apartment I unearthed a pocket notebook that I had bought years earlier to track my expenses. On the first page was written: ‘notebook- $3.14.’ That was the only entry.”
Instead, she uses whatever book she is reading as a recording surface. They end up “profaned,” as she puts it, “filled with unrelated scribblings in the front and back pages, marked up with underlines, stars and notes.... The notes I take within the texts and margins of books work like a diary for me in that sense, and often have a second life online in the form of the ideas I formulate and write about on TNI and elsewhere.” One consequence is that Rosenfelt can never part with a book when she is done with it. After all, you don’t sell a diary.
The attitude of Mary Borkowski, an arts programmer for the Columbia University radio station WKCR, sounds closest to my own. “I'm a bit eccentric in that I rarely write anything initially on the computer,” she told me. “I compose most essays, letters, short stories, poems, even emails, in long hand and then transcribe them onto the screen. I do realize that writing in longhand is, well, time-consuming, but there is something about writing in longhand that is always more surreptitious, more crafty, almost silent -- the least painful way to wrench a thought from my mind.”
The exact format of “notebooking” matters less, Borkowski says, than the impulse to find “a canvas for the mind” – a place for “the spurts of thoughts and memes, blurps from the brain stems that have no order yet.” The notebook is “the outline before the outline.”
I sensed that The New Inquiry serves as a place to record (the preferred term now is “curate”) things its participants had read, and to gloss them if the spirit so moves. Jennifer Bernstein confirmed this: “I usually just post several cultural artifacts that I see as closely related, without comment (see this, for example). This format allows me to maintain the loose, associative connection between them (and to suggest that connection to others). Websites can accommodate all kinds of media, including audio and video, which allows juxtapositions that weren’t instinctive or even possible before.”
Besides “collective notebook-keeping in the form of group blogs,” Bernstein noted the potential of formats such as Google Documents, “where people can edit the very same text, or Wave, which supports all forms of media. Basic software innovations like Word’s Track Changes and Google Wave have multiplied the forms that commentary can take.”
But part of what I value about The New Inquiry is that its participants always seem at least somewhat ambivalent about the technologies they have grown up with – and this comes through in Mary Borkowski’s comments.
“We create tools for living,” she told me, “and they became objects that totally dominate us or we dominate them. Notebooking is then one of the last personal stands against the individual mind being dominated by outside forces, or having to 'think inside the box,' if you will. It's a 'secret,' 'private' outlet that used to exist in ledger or diary form but now, especially when we're so inundated by the busy-ness of technology, notebooking is a state of mind expressed in the time we are separated from our palm devices, or laptops, or phones. The notebooking state of mind comes up when we can think minus the chatter, when ideas clarify. Notebooking facilitates the spontaneity of creativity, thoughts that could occur at any moment, or random time -- the unaccounted for in our over-accounted for, micromanaged, lifehacking world. Notebooking is the place to process your thinking in a world that seems to only value the end product.”