The First of the Month is a cultural and intellectual publication that is singularly lively, and no less strange. It started out in 1998, in tabloid format, as a “newspaper of the radical imagination” published in Harlem. First has been compared to Partisan Review, the legendary magazine of the New York Intellectuals that began during the Depression. But honestly, that's just lazy. Any time a bunch of smart people start a magazine, somebody ends up comparing it to Partisan Review, especially if it is published in New York; but First took its name from a song by Bone Thugs-n-Harmony, and while I’d like to picture Delmore Schwartz doing a little freestyle rapping over scotch at the White Horse Tavern, it’s a stretch.
Following what has become the contemporary routine, the paper gave birth to a Web site; this then replaced the print edition. An anthology culled from its first decade appeared last year as The First of the Year: 2008, published by Transaction. On first approach, the book looked like a memorial service for the whole project. And an impressive one: the roster of contributors included (to give a very abbreviated and almost random roll-call) Amiri Baraka, Greil Marcus, Lawrence Goodwyn, Grace Lee Boggs, Adolph Reed, Russell Jacoby, Armond White, Kurt Vonnegut, Kate Millett, Richard Hoggart, and Ellen Willis.
I meant to give the volume a plug when it appeared; so much for the good intention. But happily, my initial impression was totally wrong. While continuing to function online (and to have its world headquarters in Harlem, where editorial collective member and impresario Benj DeMott lives), First has reinvented itself as an annual anthology. First of the Year: 2009 has just been published, which seems worth noting here, in this first column of the year.
The viability of any small-scale, relatively unprofitable cultural initiative is a function of two forces. One is the good will of the people directly involved. The other is winning support from a public – or rather, creating one.
In this case, the process is made more difficult by the fact that First is sui generis. Which is putting it politely. My own response upon first encountering it about 10 years ago involved a little cartoon balloon forming over my forehead containing the letters “WTF?” It is not simply that it is hard to know what to expect next; sometimes it is hard to say what it was you just read. In First, political commentary, cultural analysis, and personal essays sit side-by-side. But at times, all three are going on at once, within the same piece. Kenneth Burke used to refer to such jostlings of the coordinate system as creating "perspective by incongruity." It signals a breakdown of familiar formats -- a scrambling of routine associations. This is stimulating, if perplexing. The confusion is not a bug but a feature.
One familiar description of the journal that I have come to distrust treats First as a bridge between popular culture and the ivory tower. An often-repeated blurb from some years ago calls it "the only leftist publication [one] could imagine being read at both Columbia University and Rikers.”
Good advertising, to be sure. But the better the ad, the more its presumptions need checking. The whole “building a bridge” trope implies that there is a distance to be spanned – a connection between enclaves to be made. (The ideas are over here, the masses over there.) But reading First involves getting oriented to a different geography. Some academics do write for it, but they do not have pride of place among the other contributors, who include poets and musicians and journalists, and people who might best just be called citizens. The implication is not that there is distance to be crossed, but that we're all on common ground, whether we know it, or like it, or not.
In the wake of 9/11, some writers for First (not all of them) rallied to the call for a war, and at least one endorsed George W. Bush during the 2004 campaign. Does that mean that First is actually “the only ‘neoconservative’ publication read in both academe and prisons”? Well, no, but funny you should ask, because it underscores the convenience made possible by pre-gummed ideological labels.
At times they are useful (I tend to think "social-imperialist" is a pretty good label for the idea that "shock and awe" was necessary for historical progress in Iraq) but not always.
The discussion of Obama in the new volume is a case in point. Both Paul Berman (a Clintonian liberal who supported the Iraq War) and Amiri Baraka (who takes his political bearings from Mao Tsetung Thought) concur that the 2008 election was a transformative moment. This is, let's say, an unanticipated convergence. Meanwhile, Charles O’Brien (an editorial collective member who endorsed Bush in ‘04, on more or less populist grounds) treats Obama as a short-circuit in the creation of a vigorous radicalism-from-below needed for social change. “Of the Obama campaign, what endures?” he asks. “The new Pepsi ad.”
It would be wrong to see First as yet another wonk magazine with some cultural stuff in it. Nor is it one of those journals (edited on the bridge, so to speak) in which the latest reality-TV show provides the excuse for yet another tour of Foucault’s panopticon. Politics and culture come together at odd angles in the pages of First -- or rather, each spins out from some vital center that proves hard to pin down. Margin and mainstream are configured differently here.
I tried to get a handle on First's particularity by talking to Benj DeMott, who edited the two anthologies and is now working on the third. We spoke by phone. Taking notes did not seem like a plausible endeavor on my part, because DeMott's mind moves like greased lightning – the ideas and references coming out in arpeggios, rapid-fire and sometimes multitrack.
But one point he made did stick. It was a consideration on holding together a project in which the contributors do not share a party line, and indeed sometimes only just barely agree to disagree. It sounds complicated and precarious. Often, he said, it comes down to sharing a passion for music -- for sensing that both democracy and dancing ought to be in the streets. Politics isn't about policy, it's about movement.
That does not mean celebration is always the order of the day. The indulgence of academic hiphop fans is legendary, but if you want to see what tough-minded cultural analysis looks like, check out the African-American film critic Armond White's reflections on white rapper Eminem in The First of the Year: 2009. The essay can be recommended even if its subject is now shrinking in pop culture’s rearview mirror.
“Rather than a symbol of cultural resistance,” writes White, “he’s the most egregious symbol of our era’s selfish trends. With his bootstrap crap and references to rugged individualism reminiscent of the 80s, he’s a heartless Reagan-baby – but without the old man’s politesse.... His three albums of obstinate rants culminate in the egocentric track ‘Without Me,’ making him the Ayn Rand of rap – a pop hack who refuses to look beyond himself.... Minus righteousness, angry rap is dismissible. Rap is exciting when it voices desire for social redress; the urge toward public and personal justice is what made it progressive. Eminem’s resurrected Great White Hope disempowers hip hop’s cultural movement by debasing it.”
Now, if you can imagine such thoughts ever appearing in an essay by Irving Howe -- let alone Irving Kristol -- then we can go ahead and describe First as inheriting the legacy of the New York Intellectuals.
Otherwise, it may be time to recognize and respect First for what it is in its own right: a journal of demotic intelligence, alive to its own times, with insights and errors appropriate to those times, making it worth the price of perplexity.
Only after talking to Benj DeMott did I read what seems, with hindsight, like the essay that best explains what is going on with the whole project. This is a long tribute -- far more analytical than sentimental -- to his father, the late Benjamin DeMott, who was a professor of English at Amherst College. He was a remarkable essayist and social critic.
It is time someone published a volume of DeMott senior's selected writings. Meanwhile, his influence on First seems pervasive. The younger DeMott quotes a letter written in his father’s final years -- a piece of advice given to a friend. It offers a challenge to what we might call "the will to sophistication," and its hard clarity is bracing:
"Study humiliation. You have nothing ahead of you but that. You survive not by trusting old friends. Or by hoping for love from a child. You survive by realizing you have nothing whatever the world wants, and that therefore the one course open to you is to start over. Recognize your knowledge and experience are valueless. Realize the only possible role for you on earth is that of a student and a learner. Never think that your opinions – unless founded on hard work in a field that is totally new to you – are of interest to anyone. Treat nobody, friend, co-worker, child, whomever, as someone who knows less than you about any subject whatever. You are an Inferior for life. Whatever is left of it....This is the best that life can offer. And it’s better than it sounds.”
This amounts, finally, to a formulation of a democratic ethos for intellectual life. It bends the stick, hard, against the familiar warp. So, in its own way, does First, and I hope the website and the series of anthologies will continue and prosper as new readers and writers join its public.
I have been reading with sadness and horror about the murder of Don Belton, an assistant professor of English at Indiana University, whose body was found in his apartment in Bloomington on December 28. He had been stabbed repeatedly in the back and sides. A novelist and essayist, Belton had taught creative writing at a number of institutions and was the editor of Speak My Name: Black Men on Masculinity and the American Dream, a landmark anthology published by Beacon in the mid-1990s. He was also gay, which is not an incidental detail.
Around the time police were getting their bearings on the case, the girlfriend of a young ex-Marine named Michael Griffin contacted police to tell them she thought he was involved in Belton’s death. Griffin was soon taken into custody. According to a detective's affidavit available online, he said that Belton had sexually assaulted him on Christmas. Two days later, he went to Belton’s apartment to have a “conversation” which turned into a “scuffle,” resulting in the professor’s death.
These words, which sound so mild, sit oddly in the narrative. The affidavit then goes on to say that Griffin stated “that he took a knife, called a ‘Peace Keeper’ that he had purchased prior to going to Iraq while in the Marine Corps, with him....” He also thought to bring a change of clothes. The bloody ones went into a white trash bag. Griffin “then went about and ran several errands,” the report continues, “before he eventually discarded the bloody clothing into a dumpster.... Mr. Griffin then returned home where he stated that he told his girlfriend what he had done.”
I heard about the case from my friend Josh Lukin, a lecturer in the First Year Writing Program at Temple University -- where, as he used to say in the contributor's note for his publications, "he and novelist Don Belton occasionally bemuse the staff with their renditions of classic show tunes," back when they both taught there. Josh recalls his friend as a sweet-natured and brilliant colleague, but one whose many gifts did not include the ability to lift heavy objects.
Belton was 53 years old, while the man charged in his death is 25. The idea that Belton could have violated an ex-Marine (and not once but twice, according to Griffin's statement to the police during interrogation) would be funny if it were not so grotesque.
In his affidavit, the Bloomington detective who investigated the case reports finding “a journal kept by the decedent ... in which he writes in the week prior to Christmas 2009 that he is very happy that an individual by the name of Michael has come into his life.” Belton had joined Griffin and his girlfriend for Christmas. Indeed -- and this is in some ways the most troubling thing about the story -- the relationship seems to have been very friendly until it turned vicious.
It is easy to speculate about what may have happened. In fact we do not know. But the circumstances track with a familiar pattern -- one common enough to have a name: “the ‘gay panic’ defense.” This rests on the idea that the wave of disgust created in a heterosexual person at exposure to gay sexuality can create a state of temporary psychosis. The panic-stricken victim loses responsibility for his (for some reason, it always turns out to be “his”) actions.
This is an idea that should be retired to the Museum of Deranged Rationalization as soon as possible. But it seems far-fetched to imagine that Griffin and his counsel will get through trial without invoking it. (Despite his confession, Griffin has pleaded not guilty to murder.)
On the other hand, the “panic” defense touches on an issue that was of vital interest to Belton himself. He wrote the introduction to a book edited by the late Eve Kosofsky Sedgwick. Her work on queer theory includes a sustained inquiry into the complicated and damaging way certain institutions have forged intense bonds among men while also obliging them to police one another for the slightest trace of homosexuality. This contradictory demand makes for paranoia and volatility.
In Epistemology of the Closet (University of California Press, 1990), Sedgwick writes, “The historical emphasis on enforcement of homophobic rules in the armed services in, for instance, England and the United States supports this analysis. In these institutions, where both men’s manipulability and their potential for violence are at the highest possible premium, the prescription of the most intimate male bonding and the proscription of (the remarkably cognate) ‘homosexuality’ are both stronger than in civilian society – are, in fact, close to absolute.”
As it happens, Belton had reflected on this ambivalent, anxious, crazy-making dimension of social reality in an essay that appeared in the journal Transition in 1998. Reviewing a book about gay Marines, Belton reflected on his own very complicated effort to sort out mixed messages about race, sexuality, and violence when he was growing up in the 1960s. The machismo of Black Panther leader Eldridge Cleaver had been both appealing and problematic – given that it rested on a belief that, as Frantz Fanon had put it, “homosexuality is an attribute of the white race, Western civilization.” This was another version of the cultural logic that Sedgwick had identified: solidarity among African-American men being forged by excluding gays as race traitors.
Belton’s vision was broader. He had been friends with James Baldwin and lectured on him at the Sorbonne; the influence of the novelist and essayist on his own work was not small. One of his friends has quoted a passage from Baldwin that seems to epitomize Belton’s work: “Love takes off masks that we fear we cannot live without and know we cannot live within." Although I did not know the man himself, this touches the heart of his writing, which suggests a desire to go beyond, or beneath, the prescribed roles and rules governing “identity.”
This is easier said than done, of course. It is also dangerous; love can be dangerous. Belton wrote in his journal (to quote from the detective’s statement again) “that he is very happy that an individual by the name of Michael has come into his life.” It is not necessary to use pseudopsychological terms like “gay panic” to describe the response this created. Keep in mind that the killer brought his own special knife and a change of clothes. Arguably another vocabulary applies, in which it is necessary to speak of evil.
One of the remarkable things about the response to Belton's death is just how much of it there has been. Hundreds of people turned out for a vigil on New Year's Day. There is a website called Justice for Don Belton. An open letter from the chair of his department has appeared on the departmental Web site. A memorial service will be held in Bloomington.
And Josh Lukin tells me that he is proposing a session called “Remembering Don Belton” for the next MLA -- a panel "engaging his scholarship, art, journalism, and pedagogy." Possible topics might include "his writing and teaching on black masculinity, Baldwin, Brecht, Mapplethorpe, Morrison, Motown, jazz, cinema, abjection," to make the list no longer than that.
"The guy's range of interests was huge," Josh says, "and he kept surprising me with his knowledge of critical texts, both recent ('Bowlby, Just Looking? Great chapters on Dreiser.') and more traditional ('Why not talk about Morrison using R.W.B. Lewis, American Adam?')."
I have no idea how decisions about such proposals are made. But this would be a good session to have on the schedule for next year. To move from sorrow to celebration is not easy; the effort should be encouraged.
All of this fuss over J.D. Salinger is yesterday’s dinner warmed over. Make no mistake: Salinger was a terrific writer, and at one time he was very famous as an artist, not a recluse. But he ravaged his own reputation. He threw a cloak over himself, and ensured that he’ll be unknown to tomorrow’s readers, and little more than a footnote in the next generation’s literary histories.
Salinger’s life may be divided evenly into two parts. For the first 45 years or so, he sought to become a well-known writer, and succeeded handsomely. For the second 45 years, he sought to erase the evidence of the first 45 by building a wall of silence around himself and his work. Sadly, he succeeded very well at that too. Salinger’s most decisive act, of course, was to stop publishing. After publishing one extraordinary novel, The Catcher in the Rye, in 1951, followed by several volumes of interlinked stories about a family named Glass, Salinger quit. At the height of his influence in the mid-1960s, with his creative powers flowing abundantly, he simply withdrew from the world of publishing, readers, and especially critics. The prurient interest in Salinger’s isolation may endure longer than Salinger’s writing. Word has it that a Salinger documentary, prepared in secret, is already in the works.
J.D. Salinger was once the voice of a generation. Millions of readers of a certain age saw in Holden Caulfield, the hero of The Catcher in the Rye, an eloquent expression of their own longings and frustrations. But that generation is now middle-aged. They're the ones writing Salinger’s admiring obituaries now, so they exaggerate his importance based on how they remember him. Salinger was once very important indeed, but he did his best to muffle that importance by refusing any and all entreaties from anthologies, critics, and filmmakers.
Salinger was of course entitled to his personal privacy, and he was likewise entitled to write for himself and not for publication. But it’s more than a pity that he expended so much effort to keep people from reading the work that he so eagerly released into the world at a time when he was feeling more generous toward it. Salinger refused requests to republish his work in different formats, and when people tried to write about it — and about him — he made it as difficult as possible. His successful court effort to block a biographer from quoting from his unpublished letters not only ruined one book, but also chilled the ambitions of writers who might have followed in its wake.
Salinger’s best-known short story, “A Perfect Day for Bananafish,” is about a prodigiously bright young man — someone with seemingly everything to live for — who shoots himself in the head one day for no apparent reason. Upon reflection, we can read “Bananafish” as a kind of allegory of Salinger’s own career. No one will ever know exactly why he shut himself down, but many have wondered — as they continue to wonder about the suicide in the story.
Perhaps Salinger might have kept going had he cared more about the connection that he made with his readers. Bruce Springsteen told an interviewer in 1984 that, “If the price of fame is that you have to be isolated from the people you write for, then that's too fuckin’ high a price to pay." Springsteen is in this respect the very antithesis of Salinger. Witnesses testify that Salinger continued to write in his New Hampshire hermitage, but he evidently had little desire to communicate with any reader but himself.
I imagine that Salinger’s unpublished work will be packaged and sold at some point. There’s too much money to be made for that not to happen. But the anticipation will surely exceed the actual event. In fact, I predict that Salinger’s significance will drop like a stone once that material comes out and gets digested, and that’s because of the anti-public life Salinger himself led.
Rebuffing the literary anthologies may prove to be Salinger’s most consequential decision in that regard, because it has kept his writing from the eyes of succeeding generations of readers. Most young readers encounter classic authors in the pages of such collections, and these encounters lay the foundation for their later reading. Salinger’s work is increasingly invisible to younger people now, so his reputation won't stay aloft once the brief, titillating pleasure of revealing what's in his writer's cupboard is satisfied.
That pleasure will also evaporate because the posthumous work is unlikely to be very good. Writers who refuse to communicate with their readers or with the larger world tend not to produce very good fiction because they’re no longer of the world that they’re writing about. Salinger effectively expatriated himself from the social world, but that world was changing around him through the decades of his isolation. We may expect stories encased in amber.
Salinger betrayed a great talent. Metabolically speaking, he died last week. But his passing really began decades ago.
Leonard Cassuto is professor of English at Fordham University and the general editor of the forthcoming Cambridge History of the American Novel.
In a recent New York Review article on Byron, Harold Bloom makes the following passing remark: “In the two centuries since Byron died in Greece [...] only Shakespeare has been translated and read more, first on the Continent and then worldwide.” Bloom does not cite any statistics, and one cannot help but wonder: Really? More than Homer and Dante, or, among the moderns, more than Sartre and Thomas Mann? Of course, what Bloom really means is that Byron was translated and read more than any other English writer, and he may well be correct on that count. Yet this omission is telling, as it highlights an unfortunate tendency (recently diagnosed by David Damrosch) among certain English professors to equate literature in general with literature written in English. This disciplinary bias, less prejudice than habit, can distort their scholarship – the authors that they admire tend to be far more catholic in their reading. But this pattern also raises a larger academic question: Why do we still partition the literary canon according to nationalist traditions? Is this really the most intellectually satisfying and authentic approach to literary studies?
For an example of how disciplinary blinders can affect scholars as well-read as Bloom, we need only turn back to his article, where we find Byron described as “the eternal archetype of the celebrity, the Napoleon of the realms of rhyme... the still unique celebrity of the modern world.” What such hyperbole masks is the fact that the model for such literary celebrity is in reality to be located in another author, who unfortunately did not have the good sense to be born in England. Indeed, anyone familiar with the inordinate fame of Jean-Jacques Rousseau knows that he was the first genuine literary celebrity, lionized and sought out across Europe, much to his growing despair and paranoia (as this brilliant study by the historian Antoine Lilti details). Byron himself was smitten by Rousseau, touring the Lac Léman with his friend Shelley to visit the sites from Julie, ou la nouvelle Héloïse. Rousseau may not have provided his public with the same devilish scandals as the naughty Lord, but his Confessions, with their admission of a fondness for spankings and exhibitionism, were sultry enough.
Bloom is certainly no provincial, and his own, published version of The Western Canon includes German, Spanish, French, and Italian works – although this canon, too, is heavily tilted toward English authors. But can this be avoided? No doubt French scholars would produce a version of the canon equally tilted toward the French, just as scholars from other nations would privilege their own authors. To an extent, this literary patriotism is normal and understandable: every culture values its heritage, and will expend more energy and resources promoting it.
From the viewpoint of literary history, however, such patriotism is also intellectually wrongheaded. To be sure, writers are often marked most strongly by their compatriots: one must read Dante to understand Boccaccio, Corneille to understand Racine, or, as Bloom would have us believe, Whitman to understand T. S. Eliot. But such a vertical reading of literature (which Bloom himself mapped out in The Anxiety of Influence) overlooks the equally – sometimes far more – important horizontal ties that connect authors across national borders. T. S. Eliot may have been “hopelessly evasive about Whitman while endlessly revising him in [his] own major poems,” yet by Eliot’s own admission, the French school of symbolist poetry had a far greater impact on his work. Some of Eliot’s first published poems, in fact, were written in French. Conversely, the French novelist Claude Simon may have endlessly revised Proust, but his own major novels – such as La route des Flandres and L’herbe – owe far more to William Faulkner. Such examples could be multiplied ad infinitum: they are, in fact, the stuff that literary history is made of.
To this criticism, English professors have a ready-made answer: Go study comparative literature! But they have only half a point. Comp lit programs are designed to give students a great deal of flexibility: their degrees may impose quotas for the number of courses taken in foreign language departments, but rarely, if ever, do comp lit programs build curricular requirements around literary history. Yet that is precisely the point: students wishing to study English Romanticism ought to have more than Wikipedia-level knowledge about German Idealist philosophy and Romantic poetry; students interested in the 18th-century English novel should be familiar with the Spanish picaresque tradition; and so on and so forth. Comp lit alone cannot break down the walls of literary protectionism.
The fact that we even have comp lit departments reveals our ingrained belief that “comparing” literary works or traditions is merely optional. Despite Bloom’s own defense of a “Western canon,” such a thing no longer exists for most academics. This is not because the feminists, post-colonialists, or post-modernists managed to deconstruct it, but rather because our institutions for literary studies have gerrymandered the canon, department by department. Is it not shocking that students can major in English at many colleges without ever having read a single book written in a foreign language? Even in translation? (Consider, by contrast, that history majors, even those who wish to study only the American Revolution, are routinely required to take courses on Asian, African, and/or European history, in many different time periods, to boot.) Given that English is the natural home for literary-minded students who are not proficient in another language, it is depressing that they can graduate from college with the implicit assumption that literature is the prerogative of the English-speaking peoples, an habeas corpus of the arts.
But wait a minute: how dare I criticize English curriculums for not including foreign works, when the major granted by my own department, French, is not exactly brimming with German, Russian, or Arabic texts, either? To the extent that French (or any other foreign language) is a literature major, this point is well taken. But there are differences, too. First, it is far more likely that our students will have read and studied English literature at some point in high school and college. They will thus already have had some exposure, at least, to another national canon. Second, and more importantly, a French, Spanish, or Chinese major is more than a literature major: it is to no small degree a foreign language major, meaning that the students must master an entire other set of linguistic skills. Finally, language departments are increasingly headed toward area studies. German departments routinely offer classes on Marx, Nietzsche, and Freud, none of whom are technically literary authors. Foreign language departments are sometimes the only places in a university where once-important scholarly traditions can still be studied: Lévi-Strauss’s Tristes tropiques probably features on reading exam lists more often in French than in anthropology departments. A model for such an interdisciplinary department already exists in Classics.
I do not wish to suggest that English professors are to blame for the Anglicization of literature in American universities: they reside, after all, in English departments, and can hardly be expected to teach courses on Russian writers. The larger problem is institutional, as well as methodological. But it bears emphasizing that this problem does not only affect undergraduates, and can lead to serious provincialism in the realm of research, as well. An English doctoral student who works on the Enlightenment once openly confessed to me that she had not read a single French text from that period. No Montesquieu, no Voltaire, no Rousseau, no Diderot, rien. Sadly, this tendency does not seem restricted to graduate students, either.
Literary scholars are not blind to this problem: a decade ago, Franco Moretti challenged his colleagues to study “world literature” rather than local, national, or comparative literatures. He also outlined the obvious difficulty: “I work on West European narrative between 1790 and 1930, and already feel like a charlatan outside of Britain or France. World literature?” While the study of world literature presents an opportunity for innovative methodologies (some of which were surveyed in a recent issue of New Literary History), students already struggling to master a single national literary history will no doubt find such global ambitions overwhelming.
What, then, is to be done? Rearranging the academic order of knowledge can be a revolutionary undertaking, in which ideals get trampled in administrative terror. And prescribing a dose of world literature may ultimately be too strong a medicine for the malady that ails literary studies, particularly at the undergraduate level. In fact, a number of smaller measures might improve matters considerably. To begin with, literature professors could make a greater effort to incorporate works from other national literatures in their courses. Where the funds are available, professors from neighboring literature departments could team-teach such hybrid reading lists. Second, language and literature majors could also require that a number of courses be taken in two or three other literature departments. A model for this arrangement already exists at Stanford, where the English department recently launched an “English Literature and Foreign Language Literature” major, which includes “a coherent program of four courses in the foreign literature, read in the original.” To fulfill this last condition, of course, colleges would have to become more serious about their foreign-language requirements. Finally, literature students would be better served if colleges and universities offered a literature major, as is notably the case at Yale, UC San Diego, and UC Santa Cruz. Within this field of study, students could specialize in a particular period, genre, author, or even language, all the while taking into account the larger international or even global context.
Will such measures suffice to pull down the iron curtain dividing the literary past? Unless they manage to infiltrate the scholarly mindset of national-literature professors, probably not. Then again, as many of us know firsthand, teaching often does transform (or at least inform) our research interests. A case could of course be made for more radical measures, such as the fusion of English and foreign language departments into a single “Literature Department,” as exists at UC San Diego. But enacting this sort of bureaucratic coup carries a steep intellectual (not to mention political) price. It would be unfortunate, for instance, to inhibit foreign literature departments from developing their area-studies breadth, and from building bridges with philosophy, history, anthropology, sociology, religious studies, political science, and international relations. English departments, moreover, are developing in similar, centrifugal directions: in addition to teaching their own majors, English departments contribute more widely to the instruction of writing (including creative writing), and have their own ties with Linguistics and Communications departments. This existing segmentation of the university may appear messy, but has the benefit of preventing new walls from being erected, this time between neighboring disciplines.
Dan Edelstein is assistant professor of French at Stanford University.
Yogi Berra is supposed to have said that people shouldn't write their autobiographies while they're still alive. Anyone who reads very many academic autobiographies will appreciate the sentiment. We have enough accounts, thanks, of how the path to tenure in the English department at Duke University was lit up by certain profound early life experiences. (The route now seems exceptionally well-mapped for one that not many people get to travel.)
But an exception might be made for a recent volume called A Taste for Language: Literacy, Class, and English Studies by James Ray Watkins, Jr., published by Southern Illinois University Press. It is not the work of an academic celebrity. I doubt anyone will turn to it for career advice; it doesn't offer any. But as a study of the examined life, it has its lessons.
The author is an online educator for the Art Institute of Pittsburgh and the Center for Talented Youth at Johns Hopkins University. He runs a blog called Writing in the Wild. So one learns from the back cover. But the lesson really starts with a photograph across from the title page. It shows the author's father and was taken circa 1944. He is wearing a tie and his hair is well-combed. The pose suggests that the portrait might be of a young soldier, taken as a memento for his parents -- except that he looks as if he may not yet be old enough to shave.
And it turns out all of this is true. The son of a tenant farmer in Mississippi, Watkins Sr. enlisted in the army at the age of 16. He claimed to be older, of course, and to have graduated high school, although his formal education actually ended in the fourth grade.
Thanks to the GI Bill, the adolescent tank-commander in that photograph later went to night school to get his equivalency degree, then attended Louisiana State University. This prepared him for a successful career as a utilities analyst for the city of Houston. He died in the early 1980s, not long after the author began his own higher education.
"No one in his immediate family had attended school much beyond the middle- or high-school level," writes Watkins Jr.. "His family, my mother tells me, saw college as a kind of indulgence and thought that any young man could better spend his time earning a living. Before he entered LSU, then, it is likely that my father had only the roughest approximation of what a university education might entail.... My father left college with more than professional skills; he graduated with a larger sense of the purposes of education that made it imperative for his children as well."
This is a story of upward mobility, then, with economic security as its goal. But that is not all that was transmitted from father to son. To go from seeing education as a needless luxury to regarding it as an urgent necessity for one's children involves a deep change of ethos. Watkins tries to reconstruct this process through a close reading of any material he can find from his father's education -- in particular the textbooks for his courses on composition and literature at LSU during the late 1940s, which left him with the skills needed to produce the sort of expository prose required in the professional workplace.
As it happened, the English department at LSU was also then an epicenter of the New Criticism, whose practitioners tried to teach students to read literary works with an eye to how their language worked. "It seems reasonable to assume that my father's lack of previous education made the inculcation of this sensibility difficult at best....The academic triumph of New Critical literary education in the English department had strict limits, clearly marked in my father's transcript."
But the effort had its effect, even so. It meant that Watkins Sr. could recognize that there might be something worthwhile about the ability to read for pleasure. And so it is that -- two generations after functional literacy was the family norm, but anything beyond it regarded with misgivings -- the author could end up writing a master's thesis on Paul de Man, getting a Ph.D., and teaching at various institutions.
This, then, is not an academic autobiography so much as an educational genealogy. The author is tracing back to their sources the conditions of possibility for his own existence. But it is not particularly introspective. There are no prose-poetical arias. The writing is unsentimental.
Instruction in expository composition left Watkins Sr. in command of an efficient, objective, no-frills style: the equivalent of a professional demeanor that could zero in on facts, while keeping subjective expression to a bare minimum. The son honors that ability with a narrative voice that is so precise in conveying the man's likes and habits and expectations of life that you are left with a sense of having met him -- yet with only a hint at the depths of feeling that it must have stirred in him to tell the story.
In that sense, A Taste for Language is not memoiristic, either. Digging through his father's textbooks and situating them in the history of language study as a discipline, Watkins is doing scholarship. And his research has implications that are not strictly personal.
People who come from a long line of securely middle-class professionals can take a certain amount of inherited cultural capital for granted. In Watkins's case, that is not an option. He recognizes that he has been shaped, however indirectly, by educational influences that were being exercised on him before he was even born. His father's upward mobility was in part the product of the pedagogical labor of writing instructors. Talk about "the life of the mind" can get highfalutin and self-aggrandizing at times. There is something to say for grasping how much of it is the result of institutional processes that go largely unnoticed.
That, in turn, raises questions about how well the present arrangement works. "On the one hand," Watkins writes, "we must accept our students' vocational goals as legitimate expressions of their desire to maintain or strengthen their economic position; on the other, we must seek out ways to persuade them that the contemplative, reflective traditions of the academy are important to their professional and social futures. Indeed, our goals ought to be even larger: to convince students that in spite of their apparent impracticality, the critical methodologies of the school have immediate professional application. Alertness to injustice isn't simply helpful in 'society in general'; it is necessary in the immediate, specific context of the work site."
Whether this can be realized in practice is, of course, another matter. Back when Watkins's father went to college, composition and literature were part of the same discipline. But that has not been the case for some time. Most training in composition is done by part-time or adjunct instructors. That arrangement, in turn, reflects a set of priorities in which such training is treated as a necessary but (at best) secondary function of the university. Which, in turn, reinforces the tendency for the rewards of higher education to go to students who arrive with adequate stocks of inherited cultural capital. It is an arrangement that seems almost as if it were designed to sustain inequality, rather than narrow it.
"A two-tiered system of a few well-paid and independent literary teachers and researchers working side-by-side with poorly compensated part-time composition teachers would hardly support the interests of our profession, our students, or our society," writes Watkins. He calls for unionization of instructors in composition and literature as a first step towards mitigating this situation.
This reader, at least, wanted to applaud. Without the ability to bargain collectively, it's hard to see how the casualization of academic labor will ever end. But it does rather raise the question of whether the expectation of upward mobility is not so ingrained in the professionalized middle class as to make solidarity an almost unimaginable ideal.
Since 2000, I've been the host of the Wimba Distinguished Lecture Series, shouting from the rooftops (well, desktops) about how to use modern educational technologies to teach effectively online. But now, after evangelizing for the last decade, I'm switching sides. I am teaching creative writing online as an adjunct professor for Holmes Community College, in Goodman, Mississippi. How the tables have turned.
I've probably led more webcasts than anyone on the planet. Seriously. I've hosted webcasts at least once a week for 10 years and I've also given thousands of other online presentations. From presentations about educational technologies and policies, to effective instructional techniques, I've done it. But now I'm tasked with teaching – online – creative writing, a topic that traditionally uses a workshop format, a format that is quite difficult to replicate in a virtual environment. Yet it's not the format that worries me.
You see, this is my first time teaching a college course. Though I've led writing workshops, collaborated with writers and journalists here in New York, contributed to numerous publications, and even penned my own book, I now fretfully ready myself to formally – and virtually – mold young (and a few moldier) minds at a college more than 1,000 miles away from my life here in New York. But I can’t wait. I can't wait to familiarize my students with exemplary works of poetry, fiction and nonfiction. I can't wait to answer my students' questions and hear their insights. I can't wait for my students to learn from me and for me to learn from them. I'm nervous. But I'm ready. I think. So in the immortal inquiry asked by David Byrne: Well, how did I get here?
Let’s start by looking at the Ed Tech industry.
When wearing my Wimba hat, I often remind my audience that it’s only been about a decade since the modern format of online courses was put into place. The current configuration of combining course management systems, web conferencing, instant messaging, message boards, etc. to teach a class to students in a classroom and/or their pajamas barely existed in the 20th century. When one stops to consider that collegiate courses had been taught (more or less) in the exact same manner since ancient Egypt, Greece, and Mesopotamia, it’s quite startling to see how quickly this transformation has transpired.
Obviously this format of modern courses is still being tweaked, but it certainly appears that much of the technological and pedagogical foundation is firmly in place. As of today, the dawn of the ‘10s, tens of thousands of postsecondary faculty, either because of or in spite of their ability and/or willingness, have already taken the plunge and incorporated technologies into their courses – often with a great deal of success.
I’ve written numerous research documents boasting both the tangible and intangible benefits of technology-enabled courses. Countless examples of institutions around the globe that have seen benefits such as increased retention rates of students, increased enrollments, improved graduation rates, and dollars saved on time and travel, all thanks to technology in the classroom, fill the pages of these documents. In fact, I’ve seen so many positive examples of technology-enabled education over the years that I now have an extremely difficult time understanding why any institution wouldn’t beef up its current online offerings. The downside is just so negligible while the upside is so great.
But I digress. After all, I’ve now got my own class to worry about.
A couple of months ago I left my comfy big-city confines and headed south to tiny Goodman for an on-site orientation for new faculty. I didn’t really know what to expect. I knew I’d have a big leg-up in terms of my knowledge of online course technologies, but I also knew I’d have a big leg-down in terms of my knowledge of classroom instruction. Turns out I was dead-on.
My two Holmes Community College trainers that day explained the ins and outs of being an online instructor to me and the approximately 10 others in the room, all of whom had collegiate teaching experience. At least my tech savviness made up for the in-front-of-a-class savviness I lacked. But even though I was already familiar with the Blackboards and SunGards of the world, I didn’t realize how much about them I didn’t know. As my girlfriend always says, it’s hard to know what you don’t know.
My HCC trainers spent hours teaching me about Bb’s enrollment tools, grading and assessment functions, and how to withdraw students who need to drop out. Despite being around instructors for so much of my life, I guess I never truly grasped how much of teaching is actually administering. After a full day of technology training I left the campus very excited, but also very nervous. I kept picturing myself pushing the wrong button and accidentally unenrolling an eager student and then having to sheepishly write an email to the Holmes IT staffers informing them of my blunder.
But on the flipside, my nervousness also translated to eagerness. As I learned more about my prospective students – fervent 18-21-year-olds as well as working adults from around the country – I plotted the numerous ways in which I could engage them online. While driving from my orientation back to the Jackson airport I thought of at least 20 assignments that would combine best practices of teaching creative writing face-to-face with best practices of teaching online. In fact, by the time I reached the rental car return desk I could envision the thank-you letters I hoped to receive from my happy students who affably learned a few tips and tricks about writing with some flair.
Which brings me to today.
My lesson plans are done. My syllabus is up. My books are in the bookstore. But my mind still bursts with uncertainties (after all, I am a writer).
How well can they write? What do they already know? What don’t they know? From what kinds of experiences will they draw when they put pen to paper? Have they been to William Faulkner’s house up the road in Oxford? Will they mind if I occasionally swear? Will I understand them if they speak with thick drawls? Will their writing be better than mine?
The waiting is the hardest part. I wish I could invent time travel and get the first class over with.
The funny part is that I’m never this nervous when preparing and/or waiting to give presentations for Wimba, but I guess that’s because of my experience at the company. Hopefully I’ll read this op-ed a few years from now and laugh at how nervous I was. Man, I can’t wait to be a veteran writing teacher brimming with the confidence only gained from years of experience! Oh, how worn will the elbow pads of my tweed jackets be. Some day.
I discussed my trepidations with my family over the holidays, and my dad, drawing upon his 30 years of teaching experience, asked, “Do you have your opening speech ready?” I told him I did, but I lied, I guess because I don’t really need one. And this already demonstrates the difference between online and face-to-face.
When my class is ready to begin, someone from HCC’s technology department will simply hit a button in Blackboard, and then, in an instant, the class will be active. It won’t be the same as the first class of a face-to-face course. I won’t write my name in big letters on the chalkboard and won’t give a big dramatic speech about the wonders of writing creatively. Instead, my students will receive a message in their inboxes notifying them to watch the archive of a lecture I’ll record later this week. Sure, they’ll still see my talking head and hear the inflection of my nervous-yet-excited voice, but the impact might not be as great as watching me forcefully pace back-and-forth in front of a full lecture hall. Then again, perhaps the impact will be even greater because they’ll be equally nervous as they embark on a new class in a new medium.
Stay tuned for more as I tell my tales from the other side….
Matt Wasowski is senior director of customer programs at Wimba.
What 10 books have most influenced you? That question has launched many a discussion online in recent days. I’ve been scribbling down my own list while reading the replies – but also wondering just how we assess the presence of influence, let alone its relative intensity.
With some of the lists, it's hard to tell what the word means. If a person names J.R.R. Tolkien’s Lord of the Rings as an influence, what does that imply? Did he or she become a scholar of Anglo-Saxon literature? Go on an epic quest that saved the world? Write hobbit stories? Record a heavy-metal album with runes on the cover?
To cite something as an influence can be a way to emphasize that it yielded much satisfaction. But the term properly implies something more consequential than that. You didn't just consume and digest; you were consumed and digested in turn.
I greatly enjoy the TV series "Breaking Bad" yet do not feel that it is transforming my existence. It has not inspired me to cook and sell methamphetamines, or even to imagine this as a possible solution to midlife anomie. Hence I would not claim it as an influence, just yet.
What counts, then? In mulling this over, I realized that some authors were just too influential to claim as influences, if you will forgive the paradox. I read quite a lot of Marx, Freud, and Nietzsche at an impressionable age, and this certainly left its mark. But putting them on the list seemed unnecessary, for their power is pretty nearly inescapable. It would be like pointing out that I have breathed a lot of oxygen in my day.
Anyway, enough prolegomenous throat clearing. On to the list...
(1) Bertrand Russell, Why I am Not a Christian
Until shortly before my 14th birthday, in the opening months of the Carter administration, I was a Christian fundamentalist who fully expected the apocalyptic scenario of the Book of Revelation to be worked out in world events during my lifetime. Please understand that I do not say that with even the slightest sense of irony.
At the time, I was also very keen on Blaise Pascal, who was definitely not a Southern Baptist but who had undergone a mystical experience giving him a deep conviction of the existence of a divine order. Bertrand Russell’s book must have been on a library shelf near Pascal. I started reading it to arm myself against the enemy.
Things did not work out that way. In his urbane and relentlessly logical manner, Russell broke down everything that I had taken to be axiomatic about the existence of God -- and about the terrible consequences of not believing. He seemed to anticipate every counterargument. I spent days -- and a few late nights – running through it in my head.
The experience was painful and terrifying. It shook me to my core; even that seems like an understatement. Nothing remained the same afterward. To repeat: Influence and pleasure are entirely different things.
(2) Allen Ginsberg, Howl and Other Poems
At 14, I thought this was the greatest book of poetry ever. The length and the rhythm of Ginsberg’s lines, his juxtapositions of imagery (“the crack of doom on the hydrogen jukebox”), the way his diction shifted into the biblical or the street-level obscene ... all of this made the hair stand up on the back of my neck. Which, as someone once said, is how you can tell when poetry is working on you.
It also inspired many a page of my own literary efforts, now lost to posterity. Paper will burn, if you let it.
Today the Beat idea that suffering and madness and extremity bring wisdom does not strike me as quite so appealing and romantic. I have been exposed to quite enough miserable, crazy, extreme people for one lifetime. (They come to Washington a lot, especially these days.) But I still love this book. The shorter poems in the back – written when Ginsberg himself was under the influence of William Carlos Williams – still seem very moving.
(3) Jorge Luis Borges, Other Inquisitions
The best way to discover Borges is probably through the short stories in Ficciones, or the selection of prose and poetry in Labyrinths. As chance had it, I first came across him by way of this volume of essays, in which criticism becomes a form of imaginative writing. For Borges, all of literature forms one big interconnected structure in which the books are, in effect, reading you. Many an academic article on intertextuality consists of an unwitting and usually witless gloss on Other Inquisitions.
My favorite passage comes at the end of “Kafka and His Precursors,” an essay of three pages that subtly transforms the very idea of “influence” itself: “The fact is that every writer creates his own precursors. His work modifies our conception of the past, as it will modify the future.”
(4) Susan Sontag, Against Interpretation and Other Essays
Borges combined erudition with playfulness. Sontag, by contrast, was an erudite person who sometimes tried very hard to be playful, more or less out of a sense of duty. I don’t think this worked out very well, and certainly not over the long run.
But in the early 1960s, she wrote a series of essays on literature, film, art, and ideas that remain exceptional and definitive. In them you feel a mind trying to open itself to as many possibilities as it can, sort of like Matthew Arnold dealing with being trapped in Andy Warhol’s Factory for a while. This book was the syllabus for my own reading and moviegoing for a few years after I first discovered it, and I go back to visit it from time to time, like a favorite neighborhood.
(5-6) Jean-Paul Sartre, pretty much anything in English translation as of the early 1980s
OK, admittedly this is cheating, since it would include dozens of volumes of philosophy, fiction, plays, and journalism. I would need to wedge the pertinent volumes of Simone de Beauvoir’s memoirs in there, as well. You do what you have to do. I feel sufficiently uneasy about this to let it claim two spots on the list, rather than just one.
Sartre embodied the writer as intellectual and activist. There is nobody even remotely comparable these days; don't accept cheesy knock-offs. The question of Sartre's legacy is too complicated to go into here, and I am ambivalent about much of it, now, in any case. But his work still provokes me – through inspiration or irritation or both – in a way that no living author’s work does.
Narrowing things down a bit: Two volumes of interviews and articles from his final decade or so, Life/Situations and Between Marxism and Existentialism, seem like quintessential books. The latter has recently been reissued by Verso.
(7) Norman Podhoretz, Making It
Published in late 1967, while Podhoretz still considered himself a liberal (his transformation into a neoconservative ideologue would take a few more years), Making It is the story of one man’s relentless climb to eminence in the world of the New York literary-intellectual establishment. “One of the longest journeys in the world,” its opening sentence begins, “is the journey from Brooklyn to Manhattan....”
Reading this in Texas at the age of 19, I was not yet in a position to appreciate its full, rich ridiculousness, and instead studied the book as carefully as I once had any account of the act of love – preparing for the day when detailed information might prove useful, rather than just frustrating.
In a hurry to brush off the hayseeds, I managed to confuse cynicism with sophistication. Over time, this did a certain amount of damage -- some of it, fortunately, remediable. It is embarrassing to include this book on my list. That is why I am doing so.
(8) Richard Hofstadter, The Paranoid Style in American Politics
There are serious problems with Hofstadter’s analysis of the People’s Party of the 1890s. We can talk about the failings of the consensus school of U.S. historiography until the cows come home. I acknowledge these things without reservation. And yet this book is indispensable.
I first read it in the early 1980s and have revisited it at least once each decade since then. I know of no better description of the typical qualities and standard features of our public discourse in its barking-at-the-moon episodes. It reminds us that such upsurges do not come out of nowhere. This is not exactly a comfort, but it does help make the batshit insane seem at least somewhat intelligible.
One American television network has evidently adopted the book as the basis for its business model. But you can’t blame Hofstadter for that.
(9) Richard Wright, American Hunger
In the early 1940s, Richard Wright produced an autobiographical manuscript covering his life up to 1937. Most of it appeared in 1945 as Black Boy, but the final section, covering his years as a member of the Communist Party, was published as a separate book in 1977.
I was very taken with it not simply for its account of the radical movement during the Depression but for Wright's account of his own struggle to become a writer. And all the more so given something the author's estate included in the original printing of the book. It was a facsimile reproduction of one page of the typescript, covered with his handwritten revisions of the text -- lines crossed out, words changed, sentences rewritten, etc.
This came as a revelation. My assumption had been that once you learned how to write, well, you just wrote. (The struggle was just to get to that point.) I stared at the page for a long time, trying to figure out how Wright had known that a given phrase or sentence might be improved, especially since what he had down often looked fine.
Another form of influence: When a book teaches you how much you don't know about how much you don't know, and how much you need to know it.
(10) Richard Lanham, Revising Prose
Finding this volume in a secondhand bookstore was not, perhaps, the answer to a prayer. But the deep perplexity left by American Hunger certainly left me ready for it.
Half of learning to write is knowing how to recognize when a sentence or paragraph is bad, and why, and what can be done about this. Lanham teaches a handful of very basic skills necessary to begin reworking a draft. His manual is now in its fifth edition. I have no idea what changes may have been introduced in the past 25 years or so. But if a textbook ever changed my life, this one did.
For a long while now I have planned to write an essay about the habit of keeping a notebook, and have even, from time to time, started to take notes on the topic. By now there have accumulated more passages hectoring myself to settle down to work on it than pages containing actual insights. It seems the project has a short circuit.
But it may be that this reflects a basic tension within the notebook itself, considered as a genre of writing. On the one hand, it is turned towards the outside world; it is absorptive and assimilative, a tool for recording information, ideas, impressions. On the other, it is the ideal venue for self-consciousness to run amok. Even when a notebook is integral to a specific project, the writing always seems to be lacking something. Thoughts remain unfinished or provisional. You are moving but you aren't there yet. This can be frustrating. But then a notebook can also be where you can dig in your heels -- summoning up the confidence, or the vital reserves of energy, needed to continue.
Sometimes the notebook provides escape from the work in progress, rather than contributing to it. This is not necessarily a matter of procrastination.
The best essay on the notebook as workshop is probably “On Intellectual Craftsmanship” by C. Wright Mills. (See this column on it.) But an important supplement comes from Elias Canetti, who won the Nobel Prize for Literature in 1981. After spending decades on his idiosyncratic and sui generis work of scholarship Crowds and Power (1960), Canetti published a mordant essay on notebook-keeping called “Dialogue with the Cruel Partner.”
“One cannot avoid the fact,” he writes, “that a work being continued daily through the years may occasionally strike one as clumsy, hopeless, or belated. One loathes it, one feels besieged by it, it cuts off one’s breath. Suddenly, everything in the world seems more important, and one feels like a bungler... Every outside sound seems to come from a forbidden paradise; whereas every word one joins to the labor one has been continuing for so long, every such word, in its pliant adjustment, its servility, has the color of a banal and permitted hell.”
From such dark moods, the notebook offers a reprieve. When the writer “views himself as the slave of his goal, only one thing can help: he has to yield to the diversity of his faculties and promiscuously record whatever comes to his mind.... The same writer, normally keeping a strict discipline, briefly becomes the voluntary plaything of his chance ideas. He writes down things that he would never have expected in himself, that go against his background, his convictions, his modesty, his pride, and even his otherwise stubbornly defended truth.”
There is a third modality of the notebook habit – a matter of treating it, neither as the warehouse and workshop for a project nor as an escape from its demands, but as something like its own form of writing, imposing its own peculiar demands.
Joan Didion’s essay “On Keeping a Notebook” is astute on how this third mode is a function of temperament: “The impulse to write things down is a peculiarly compulsive one, inexplicable to those who do not share it, useful only accidentally, only secondarily, in the way that any compulsion tries to justify itself.” The fragments jotted down are “bits of the mind's string too short to use, an indiscriminate and erratic assemblage with meaning only for its own maker.”
The resulting collages of stray data and random insights are a way to keep track of one’s earlier incarnations, the personalities adopted and left behind in the course of a lifetime. “I think we are well advised to keep on nodding terms with people we used to be,” Didion writes, “whether we find them attractive company or not.”
As it happens, Canetti made much the same point. “The mechanisms one uses to make life easy are far too well-developed,” he writes. “First a man says, somewhat timidly: ‘I really couldn’t help it.’ And then, in the twinkling of an eye, the matter is forgotten. To escape this unworthiness, one ought to write the thing down, and then, much later, perhaps years later, when self-complacence is dripping out of all of one’s pores, when one least expects it, one is suddenly, and to one’s horror, confronted with it. ‘I was once capable of that, I did that.’ ”
On this account, then, notebooks are, in effect, an annex of the superego. My own notebooks play that role at times. They document opinions or enthusiasms that sometimes prove embarrassing after a few years have passed. But they are also full of injunctions – usually to work harder, or to finish some project now gathering dust in one of the more workshop-like volumes, or to start studying X in a systematic fashion (and here’s the syllabus...).
Recently the text of Didion’s essay was posted at an online venue called The New Inquiry, which is something of a cross between a group blog and a salon (it sponsors face-to-face meetings in New York between readers and contributors) and seems to be in transition towards becoming a magazine. Its three founders are recent graduates of Columbia University and Barnard College.
The site itself is a kind of collective notebook. It made me wonder how the proprietors understood notebook-keeping – and whether digital technology influenced how they practiced it. My own habits are irremediably old-fashioned. A netbook is not a notebook, to my mind anyway, and I still do a lot of writing with pen in hand, even while exhorting myself to be more productive and efficient (a performative contradiction, if ever there were one). But being stuck in one’s own habits does not preempt curiosity about those of other people, so I asked the New Inquirists how they saw “notebooking.”
While she prefers to read from paper, Jennifer Bernstein, a New York-based writer, finds that reflecting on what she reads is another matter: “I often create a Word document in which to jot down the best ideas and quotations from a book. Then I end up reading commentary on the book and articles related to its theme, excerpts from which I also paste into the document, usually with my own thoughts. The document becomes a kind of mini-scrapbook, the record of my exploration of a concept (for example, one I did recently was conservatism in the 20th century). This isn’t a perfect method. It’s led to a proliferation of strangely titled documents on my hard drive that at some point I should probably sort through and systematize. On the other hand, the chaos reflects how my mind really works.”
Rachel Rosenfelt, a cultural critic living in Brooklyn, told me: “I've never been a paper-notebook keeper in the sense Didion means it. Or in most senses, really. When I moved out of my last apartment I unearthed a pocket notebook that I had bought years earlier to track my expenses. On the first page was written: ‘notebook- $3.14.’ That was the only entry.”
Instead, she uses whatever book she is reading as a recording surface. They end up “profaned,” as she puts it, “filled with unrelated scribblings in the front and back pages, marked up with underlines, stars and notes.... The notes I take within the texts and margins of books work like a diary for me in that sense, and often have a second life online in the form of the ideas I formulate and write about on TNI and elsewhere.” One consequence is that Rosenfelt can never part with a book when she is done with it. After all, you don’t sell a diary.
The attitude of Mary Borkowski, an arts programmer for the Columbia University radio station WKCR, sounds closest to my own. “I'm a bit eccentric in that I rarely write anything initially on the computer,” she told me. “I compose most essays, letters, short stories, poems, even emails, in long hand and then transcribe them onto the screen. I do realize that writing in longhand is, well, time-consuming, but there is something about writing in longhand that is always more surreptitious, more crafty, almost silent -- the least painful way to wrench a thought from my mind.”
The exact format of “notebooking” matters less, Borkowski says, than the impulse to find “a canvas for the mind” – a place for “the spurts of thoughts and memes, blurps from the brain stems that have no order yet.” The notebook is “the outline before the outline.”
I sensed that The New Inquiry serves as a place to record (the preferred term now is “curate”) things its participants had read, and to gloss them if the spirit so moves. Jennifer Bernstein confirmed this: “I usually just post several cultural artifacts that I see as closely related, without comment (see this, for example). This format allows me to maintain the loose, associative connection between them (and to suggest that connection to others). Websites can accommodate all kinds of media, including audio and video, which allows juxtapositions that weren’t instinctive or even possible before.”
Besides “collective notebook-keeping in the form of group blogs,” Bernstein noted the potential of formats such as Google Documents, “where people can edit the very same text, or Wave, which supports all forms of media. Basic software innovations like Word’s Track Changes and Google Wave have multiplied the forms that commentary can take.”
But part of what I value about The New Inquiry is that its participants always seem at least somewhat ambivalent about the technologies they have grown up with – and this comes through in Mary Borkowski’s comments.
“We create tools for living,” she told me, “and they became objects that totally dominate us or we dominate them. Notebooking is then one of the last personal stands against the individual mind being dominated by outside forces, or having to 'think inside the box,' if you will. It's a 'secret,' 'private' outlet that used to exist in ledger or diary form but now, especially when we're so inundated by the busy-ness of technology, notebooking is a state of mind expressed in the time we are separated from our palm devices, or laptops, or phones. The notebooking state of mind comes up when we can think minus the chatter, when ideas clarify. Notebooking facilitates the spontaneity of creativity, thoughts that could occur at any moment, or random time -- the unaccounted for in our over-accounted for, micromanaged, lifehacking world. Notebooking is the place to process your thinking in a world that seems to only value the end product.”
About 20 years ago, while I was working in the Manuscript Division of the Library of Congress, one of my fellow archival technicians was a recently graduated Yalie who had been employed at one point by the Beinecke Rare Book and Manuscript Library. Yale University is home to, among other things, the Ezra Pound papers. "After a while,” my friend said, “you started to notice something about the Ezra Pound scholars. They looked like Ezra Pound. Not all of them, of course, but a lot of them did. You could tell when there was a conference because all these guys who looked like Ezra Pound were in the reading room.”
This raised questions, of course, about influence and causality: Did you start imitating Ezra Pound after studying him for a while, or was it that guys who already looked a little bit like the poet were more likely to specialize in him? Did people in other specialties or fields of study do this? We did not have people in powdered wigs showing up at the LC asking to see the papers of the Founding Fathers. Did they maybe have powdered wigs in their backpacks but thought better of it when they saw the security guards?
And so the conversation progressed after work, after beers. I forget what conclusions we reached, but that may be for the best.
Some of it came to mind a couple of weeks ago while I was back at my own alma mater, the University of Texas at Austin, standing in the lobby of the Harry Ransom Center, where there is a small display of a few items from the recently acquired papers of David Foster Wallace. I had made inquiries about having a look at the collection. It is still being processed, and I was told that doing so would only be possible during a return trip this fall. I imagined coming back in November to a reading room full of David Foster Wallace scholars -- unshaven guys in bandannas, presumably. So much for stealing a march on them....
The glass case in the Ransom Center lobby contains a few pages of the typescript of his novel Infinite Jest, and the page proofs of a biography of Borges that he wrote about for The New York Times (full of the marginal and inside-the-cover notes a reviewer makes along the way), and also a poem about Vikings that he had written at the age of 7.
The display was a modest concession to public curiosity. While no amount of staring at it could spark in my brain any new insight into DFW's work, it had the virtue of being unsensationalistic. A writer who kills himself runs the risk -- and he must have known this -- of having his life and work turned into one long suicide note. That is both ghoulish and dumb, but perhaps understandable, given that the act of writing itself tends to be lacking in overt drama. It is easier to focus on the big exit than the steady application of backside to chair.
One small element of the display did have an emotional charge, at least for this viewer. Inside the cover of the Borges biography (which he ended up finding disappointing) Wallace recorded the word count and deadline his editor had given him when assigning the piece. There is absolutely nothing remarkable about either the note or its location; it is the kind of thing a reasonably efficient working writer jots down as a matter of course.
But there is a complex double-take involved in seeing Wallace in those terms: a genius, yes, but also, among other things, a reasonably efficient working writer, immersed in the everyday routines of that particular mode of being in the world.
Returning last week to the familiar clutter of my Inside Higher Ed cubicle -- a scene less of reasonable efficiency than entropic squalor -- I found that Broadway Books had sent a copy of David Lipsky’s new volume Although Of Course You End Up Becoming Yourself: A Road Trip With David Foster Wallace. It consists of the transcript of five days’ worth of conversations that Lipsky, a novelist and Rolling Stone contributing editor, had with Wallace in early 1996, when Infinite Jest had just appeared. There are a few pages of introductory material by Lipsky himself. They overlap a bit with the memorable article Lipsky published following Wallace’s death, but not that much, and anyone who has read the one should also check out the other.
Becoming Yourself is not that long a book (just over 300 pages, most of them well-ventilated with white space) but I found it a slow read, because something about the whole thing felt disquieting. Lipsky was accompanying Wallace on part of his book tour. Their discussions, recorded on tape, were meant to be raw material for a Rolling Stone profile that, for one reason or another, never quite came together. Although a sort of intimacy emerges, the whole thing is marked by the strained dynamic of self-consciousness squared -- for each of them is alert to the Goffmanian undercurrents of each step of the whole encounter, the way that each element of self-disclosure (whether by interviewer or by subject) is at least potentially a form of manipulation.
That tension will not come as a surprise to any reader of Wallace. The ratcheting-up of self-awareness, particularly as provoked and channeled by the mass media, is the vital pulse of his writing, whether fiction or non-. He never left you with the sense that he was exempt from it in its most inexorable and on-autopilot forms; on the contrary. But with pen in hand, he could, if not exactly regulate the pace and intensity of hyperlucidly self-conscious frames of mind, then at least do something with them, creatively.
Not so here. At times Wallace finds himself at sea, treading water, going in circles. His comments, made between stints of promoting Infinite Jest, are riddled with a sense of complicity in something he understands as both necessary and dubious. (Commercially necessary; existentially dubious.)
He has, he says, “written a book about how seductive image is, and how very many ways there are to get seduced off any kind of meaningful path, because of the way the culture is now. But what if I become this grotesque parody of just what the book is about? And of course, this stuff drives me nuts.... So the next level of complication is, do I congratulate myself on my worry and concern about all this stuff, because it is a sign that I’ve not been seduced about it? And then of course, if I get happy about that, then I’ve lost the edge – I mean, there’s just no end to the little French curls of craziness you can go through about it.”
True, that. But these are remarks, and remarks are not literature. Over time it becomes obvious that Wallace is in perfect earnest about the fear of distraction from work – from making literature, that is, rather than being part of the culture industry, with its ambient sound (as he puts it) of “this enormous hiss of egos at various stages of inflation and deflation.” Wallace is most eloquent in the passages where, no mistake about it, you can hear his desire to stop talking.
Best to end, then, with one of them:
“What writers have is a license and also the freedom to sit – to sit, clench their fists, and make themselves excruciatingly aware of the stuff that we’re mostly aware of only on a certain level. And that if the writer does his job right, what he basically does is remind the reader of how smart the reader is. Is to wake the reader up to stuff that the reader’s been aware of all the time. And it’s not a question of the writer having more capacity than the average person. It’s that the writer is willing I think to cut off, cut himself off from certain stuff, and develop ... and just, and think really hard. Which not everybody has the luxury to do. But I gotta tell you, I just think to look across the room and automatically assume that somebody else is less aware than me, or that somehow their interior life is less rich, and complicated, and acutely perceived than mine, makes me not as good a writer. Because that means I’m going to be performing for a faceless audience, instead of trying to have a conversation with a person.”
Usually the reader imitates the author -- hoping to absorb that state of grace or genius, or at least to share in its aura. Here the roles have shifted, the polarities reversed. This is why his death seems such a loss. Reading him, there was the sense that he understood the way we live now. He would tell us what we knew about it. Almost knew, but not yet.
What's in a name? that which we call a rose
By any other name would smell as sweet.
These lines from Romeo and Juliet are often quoted to indicate the triviality of naming. But anyone who has read or seen the play through to its end knows that the names Montague and Capulet indicate a complex web of family relationships and enmities that end up bringing about the tragic deaths of our protagonists.
Lore also has it that Shakespeare's lines were perhaps a coy slam against the Rose Theatre, a rival of his own Globe Theatre, and that with these lines he was poking fun at the stench caused by less-than-sanitary arrangements at the Rose.
I write now in response to the naming of a newly created department at my large state university called "the Department of Writing and Rhetoric." This new department is being split off from the English department and given the mandate to install a new Writing Across the Curriculum program, convert adjunct positions to "permanent" instructor positions, and establish a related B.A. degree.
While the acronym WAR may seem appropriate to some of my colleagues, many of them think we have more important things to worry about than a name right now. We have also been repeatedly told in the face of previous protests that referring to Composition as Writing is a trend nationwide. Nonetheless, I believe that this title is an indication of bad faith and a negative harbinger for the work of the new department and programs like it elsewhere.
Shortly after the announcement of this change, I attended a tenure party for a colleague in another department. Every single person I spoke with at this party assumed from the title of the new department that "all" writing would be taught there, including my field of Creative Writing. People repeatedly asked me what I thought about being in a new department, and I repeatedly corrected them as confusion spread over their faces. They couldn't understand how the Department of Writing and Rhetoric would not include the writing of fiction, poetry, and so on. I repeatedly had to say that “Writing” in this usage means Composition. They repeatedly asked me why, then, the department will be using the title Writing.
That's a very good question, and one that indicates something disturbing, not just here, but in that nationwide naming trend mentioned above and so often cited. Referring to programs in Composition by the title "Writing" implies that this field is the authority over all meaningful types of writing – in all other fields. It further implies that no other type of writing but what Composition Studies teaches is valid or important – or even exists. Both of these claims are demonstrably false, although they are the silent assumptions that often underlie Composition's use of the term Writing to describe itself.
Perhaps even more disturbing is that using the name Department of Writing and Rhetoric indicates a willingness to write badly in order to empire-build. Good writing is always about clarity and insight, precision and accuracy. Therefore, this confusing name calls into question the very quality of the writing instruction that will be given in the new department. If the department cannot and will not name itself accurately, then what does that bode for the students to be educated there?
Don't get me wrong. I also differ from some of my colleagues in that I am happy about the creation of the new department. Composition is an upstart field that, like my own of Creative Writing, has often not gotten its due. Partly this is because it stems from a remedial function -- Composition became necessary when the sons and daughters of the working class began attending colleges and universities and were not adequately prepared in the finer points of belles lettres.
Naturally, because the background -- and the goals -- of these individuals differed from those of the upper classes that had established belles lettres, Composition began to explore and defend less artistic, more practical forms of writing. This evolution differs from that of such programs in mathematics, for instance, where remedial algebra still focuses on the same formulas as those used in advanced courses. In Composition Studies and Writing Across the Curriculum programs, there has been a focus on supplanting the literary scholarly essay as the gold standard of writing. In the past few decades, Composition as a field has worked hard to establish the legitimacy and importance of other forms of writing and their teaching. Much of this effort I admire.
I am also happy that Composition will be given resources long absent. Having taught Composition courses myself for several years, I understand the need for acknowledgment and support, even if the specifics of the plan at my university have not been widely shared or discussed and seem to me based on suspect methods. I wish the new department nothing but the best in its attempts to improve basic writing instruction for our students.
However, many in the field of Composition have also brought the resentment of old wounds and insults to bear, claiming that the field is foundational and expert in all types of writing. Advocates have accomplished this by theorizing what they do and by selling it to those in other fields as the answer to literacy. Among other things, they have also tried to change the field's name to something less associated with its remedial roots and more grandiose in its scope. Yet it remains the case that Composition Studies does not represent a universal approach to literacy, critical thinking, or writing.
In my own field of Creative Writing, for instance, we have far different assumptions about what constitutes effective writing instruction. Admittedly, we have somewhat different purposes. But let me also point out that the rise of Composition Studies over the past 30 or 40 years does not seem to have led to a populace that writes better.
In fact, it has coincided with a time when literacy rates have dropped and complaints about the poor writing skills of college and university graduates (especially of large public universities) have continued to rise. Obviously many complex social factors contribute to this. It is also debatable whether universities have contributed to this state of affairs because the changing methods of teaching Composition are misguided or because there simply haven't been enough resources. I'm all for giving Composition the resources it needs, respecting its right to self-determination in its field, and letting us see what happens. I am all for the general population writing better, even if it is in an instrumental and limited form disconnected from the literary traditions that have fed most love of and respect for the written word in our culture.
Beyond the details of these various professional debates, my negative reaction to the new departmental name stems from the corruption of language that is so prevalent in our society today, where advertisers and politicians and many others lie through exaggeration, omission and indirection. The best analysis of this is perhaps Toni Morrison's 1993 Nobel Lecture in Literature. In it she talks about uses of language that are destructive, about language that obscures rather than clarifies, and how so often such language "tucks its fascist boots under crinolines of respectability and patriotism as it moves relentlessly toward the bottom line and the bottomed-out mind."
If we put the writerly education of our students into the hands of people who insist on rejecting the accurate term Composition for the grandiose and unclear one Writing, what will they learn? They will learn, I am afraid, that they can say whatever they want, even if it is sloppy, confusing, manipulative, or a knowing lie.
Misnaming this department also evokes the negative definition of the title's other half: Rhetoric. In academe we know that rhetoric can be "the study of effective use of language," but most of the world is more familiar with rhetoric defined as "the undue use of exaggeration and display; bombast." This latter definition seems apt when combined with Writing in this name.
I, for one, will never call it the Department of Writing and Rhetoric. I will call it what it actually is: the Department of Composition and Rhetoric. If its practitioners truly respected their own history, they would call it that, too. A "rose" sometimes can smell not so sweet, especially if it turns out not to be a flower at all.
Lisa Roney is associate professor of English and coordinator for the undergraduate Creative Writing program at the University of Central Florida.