
C.L.R. James Meets Tony Soprano

Half a century before "The Sopranos" hit its stride, the Caribbean historian and theorist C.L.R. James recorded some penetrating thoughts on the gangster -- or, more precisely, the gangster film -- as symbol and proxy for the deepest tensions in American society. His insights are worth revisiting now, while saying farewell to one of the richest works of popular culture ever created.

First, a little context. In 1938, shortly before James arrived in the United States, he had published The Black Jacobins, still one of the great accounts of the Haitian slave revolt. He would later write Beyond a Boundary (1963), a sensitive cultural and social history of cricket – an appreciation of it as both a sport and a value system. But in 1950, when he produced a long manuscript titled “Notes on American Civilization,” James was an illegal alien from Trinidad. I have in hand documents from his interrogation by FBI agents in the late 1940s, during which he was questioned in detail about his left-wing political ideas and associations. (He had been an associate of Leon Trotsky and a leader in his international movement for many years.)

In personal manner, James was, like W.E.B. DuBois, one of the eminent black Victorians -- a gentleman and a scholar, but also someone listening to what his friend Ralph Ellison called “the lower frequencies” of American life. The document James wrote in 1950 was a rough draft for a book he never finished. Four years after his death, it was published as American Civilization (Blackwell, 1993). A sui generis work of cultural and political analysis, it is the product of years of immersion in American literature and history, as well as James’s ambivalent first-hand observation of the society around him. His studies were interrupted in 1953 when he was expelled by the government. James was later readmitted during the late 1960s and taught for many years at what is now the University of the District of Columbia.

American Civilization's discussion of gangster films is part of James's larger argument about media and the arts. James focuses on the role they play in a mass society that promises democracy and equality while systematically frustrating those who take those promises too seriously. Traveling in the American South in 1939 on his way back from a meeting with Trotsky in Mexico, James had made the mistake of sitting in the wrong part of the bus. Fortunately an African-American rider explained the rules to him before things got out of hand. But that experience -- and others like it, no doubt -- left him with a keen sense of the country's contradictions.

While James's analysis of American society is deeply shaped by readings of Hegel and Marx, it also owes a great deal to Frederick Jackson Turner’s theory of “the closing of the frontier.” The world onscreen, as James interpreted it, gave the moviegoer an alternative to the everyday experience of a life “ordered and restricted at every turn, where there is no certainty of employment, far less of being able to rise by energy and ability by going West as in the old days.”

Such frustrations intensified after 1929, according to James’s analysis. The first era of gangster films coincided with the beginning of the Great Depression. “The gangster did not fall from the sky,” wrote James. “He is the persistent symbol of the national past which now has no meaning – the past in which energy, determination, bravery were sure to get a man somewhere in the line of opportunity. Now the man on the assembly line, the farmer, know that they are there for life; and the gangster who displays all the old heroic qualities, in the only way he can display them, is the derisive symbol of the contrast between ideals and reality.”

The language and the assumptions here are obviously quite male-centered. But other passages in James’s work make clear that he understood the frustrations to cross gender lines -- especially given the increasing role of women in mass society as workers, consumers, and audience members.

“In such a society,” writes James, “the individual demands an aesthetic compensation in the contemplation of free individuals who go out into the world and settle their problems by free activity and individualistic methods. In these perpetual isolated wars free individuals are pitted against free individuals, live grandly and boldly. What they want, they go for. Gangsters get what they want, trying it for a while, then are killed.”

The narratives onscreen are a compromise between frustrated desire and social control. “In the end ‘crime does not pay,’” continues James, “but for an hour and a half highly skilled actors and a huge organization of production and distribution have given to many millions a sense of active living....”

Being a good Victorian at heart, James might have preferred that the audience seek “aesthetic compensation” in the more orderly pleasures of cricket, instead. But as a historian and a revolutionary, he accepted what he found. In offering “the freedom from restraint to allow pent-up feelings free play,” gangster movies “have released the bitterness, hate, fear, and sadism which simmer just below the surface.” His theoretical framework for this analysis was strictly classical, by the way. James was deeply influenced by Aristotle’s idea that tragedy allowed an audience to “purge” itself of violent emotions. One day, he thought, they would emerge in a new form -- a wave of upheavals that would shake the country to its foundations.

In six seasons over 10 years, “The Sopranos” has confirmed again and again C.L.R. James’s point that the gangster is an archetypal figure of American society. But the creators have gone far beyond his early insights. I say that with all due respect to James’s memory – and with the firm certainty that he would have been a devoted fan and capable interpreter.

For James, analyzing gangster films in 1950, there is an intimate connection between the individual viewer and the figure on the screen. At the same time, there is a vast distance between them. Movies offered the audience something it could not find outside the theater. The gangster is individualism personified. He has escaped all the rules and roles of normal life. His very existence – doomed as it is – embodies a triumph of personal will over social obligation.

By contrast, when we first meet Tony Soprano, a boss in the New Jersey mob, he is in some ways all too well integrated into the world around him. So much so, in fact, that juggling the demands of the different roles he plays is giving him panic attacks. In addition to being pater of his own brood, residing in a suburban McMansion, he is the dutiful (if put-upon) son in a dysfunctional and sociopathic family.

And then there are the pressures that attend being the competent manager of a successful business with diversified holdings. Even the form taken by his psychic misery seems perfectly ordinary: anxiety and depression, the tag-team heart-breakers of everyday neurosis.

James treats the cinematic gangsters of yesteryear as radical individualists – their crimes, however violent, being a kind of Romantic refusal of social authority. But the extraordinary power of “The Sopranos” has often come from its portrayal of an almost seamless continuum between normality and monstrosity. Perhaps the most emblematic moment in this regard came in the episode entitled “College,” early in the show’s first year. We watch Tony, the proud and loving father, take his firstborn, Meadow, off to spend a day at the campus of one of her prospective colleges. Along the way, he notices a mobster who had informed to the government and gone into the witness protection program. Tony tracks the man down and strangles him to death.

At the college he sees an inscription from Hawthorne that reads, “No man ... can wear one face to himself and another to the multitude, without finally getting bewildered as to which one may be true." Earlier, we have seen Tony answer Meadow’s question about whether he is a member of the Mafia by admitting that, well, he does make a little money from illegal gambling, but no, he isn't a gangster. So the quotation from Hawthorne points to one source of Tony’s constant anxiety. But it also underscores part of the audience’s experience – an ambivalence that only grows more intense as “The Sopranos” unfolds.

For we are no more clear than Tony is which of his faces is “true.” To put it another way, all of them are. He really is a loving father and a good breadwinner (and no worse a husband, for all the compulsive philandering, than many) as well as a violent sociopath. The different sides of his life, while seemingly distinct, keep bleeding into one another.

Analyzing the gangster as American archetype in 1950, C.L.R. James found a figure whose rise and fall onscreen provided the audience with catharsis. With “The Sopranos,” we’ve seen a far more complex pattern of development than anything found in Little Caesar or High Sierra (among other films James had in mind).

With the finale, there will doubtless be a reminder – as in the days of the Hays Code – that “crime does not pay.” But an ironized reminder. After all, we’ve seen that it can pay pretty well. (As Balzac put it, “Behind every great fortune, a great crime.”) Closure won’t mean catharsis. Whatever happens to Tony or his family, the audience will be left with his ambivalence and anxiety, which, over time, we have come to make our own.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Requiem for a Heavyweight

Word that Richard Rorty was on his deathbed – that he had pancreatic cancer, the same disease that killed Jacques Derrida almost three years ago – reached me last month via someone who more or less made me swear not to say anything about it in public. The promise was easy enough to keep. But the news made reading various recent books by and about Rorty an awfully complicated enterprise. The interviews in Take Care of Freedom and Truth Will Take Care of Itself (Stanford University Press, 2006) and the fourth volume of Rorty’s collected papers, Philosophy as Cultural Politics (Cambridge University Press, 2007) are so bracingly quick-witted that it was very hard to think of them as his final books.

But the experience was not as lugubrious as it may sound. I found myself laughing aloud, and more than once, at Rorty’s consistent indifference to certain pieties and protocols. He was prone to outrageous statements delivered with a deadpan matter-of-factness that could be quite breathtaking. The man had chutzpah.

It’s a “desirable situation,” he told an interviewer, “not to have to worry about whether you are writing philosophy or literature. But, in American academic culture, that’s not possible, because you have to worry about what department you are in.”

The last volume of his collected papers contains a piece called “Grandeur, Profundity, and Finitude.” It opens with a statement sweeping enough to merit that title: “Philosophy occupies an important place in culture only when things seem to be falling apart – when long-held and widely cherished beliefs are threatened. At such periods, intellectuals reinterpret the past in terms of an imagined future. They offer suggestions about what can be preserved and what must be discarded.”

Then, a few lines later, a paradoxical note of rude modesty interrupts all the grandeur and profundity. “In the course of the 20th century," writes Rorty, "there were no crises that called forth new philosophical ideas.”

It's not that the century was peaceful or crisis-free, by any means. But philosophers had less to do with responding to troubles than they once did. And that, for Rorty, is a good thing, or at least not a bad one – a sign that we are becoming less intoxicated by philosophy itself, more able to face crises at the level (social, economic, political, etc.) at which they actually present themselves. We may yet be able to accept, he writes, “that each generation will solve old problems only by creating new ones, that our descendants will look back on much that we have done with incredulous contempt, and that progress towards greater justice and freedom is neither inevitable nor impossible.”

Nothing in such statements is new, of course. They are the old familiar Rorty themes. The final books aren’t groundbreaking. But neither was there anything routine or merely contrarian about the way Rorty continued to challenge the boundaries within the humanities, or the frontier between theoretical discussion and public conversation. It is hard to imagine anyone taking his place.

An unexpected and unintentional sign of his influence recently came my way in the form of an old essay from The Journal of American History. It was there that David A. Hollinger, now chair of the department of history at the University of California at Berkeley, published a long essay called “The Problem of Pragmatism in American History.”

It appeared in 1980. And as of that year, Hollinger declared, it was obvious that “‘pragmatism’ is a concept most American historians have proved that they can get along without. Some non-historians may continue to believe that pragmatism is a distinctive contribution of America to modern civilization and somehow emblematic of America, but few scholarly energies are devoted to the exploration or even the assertion of this belief.”

Almost as an afterthought, Hollinger did mention that Richard Rorty had recently addressed the work of John Dewey from a “vividly contemporary” angle. But this seemed to be a marginal exception to the rule. “If pragmatism has a future,” concluded Hollinger in 1980, “it will probably look very different from the past, and the two may not even share a name.”

Seldom has a comment about the contemporary state of the humanities ever been overtaken by events so quickly and so thoroughly. Rorty’s Philosophy and the Mirror of Nature (Princeton University Press, 1979) had just been published, and he was finishing the last of the essays to appear in Consequences of Pragmatism (University of Minnesota Press, 1982).

It is not that the revival was purely Rorty's doing; some version of it might have unfolded even without his efforts. In such matters, the pendulum does tend to swing.

But Rorty's suggestion that John Dewey, Martin Heidegger, and Ludwig Wittgenstein were the three major philosophers of the century, and should be discussed together -- this was counterintuitive, to put it mildly. It created excitement that blazed across disciplinary boundaries, and even carried pragmatism out of the provinces and into international conversation. I'm not sure how long Hollinger's point that pragmatism was disappearing from textbooks on American intellectual history held true. But scholarship on the original pragmatists was growing within a few years, and anyone trying to catch up with the historiography now will soon find his or her eyeballs sorely tested.

In 1998, Morris Dickstein, a senior fellow at the City University of New York Graduate Center, edited a collection of papers called The Revival of Pragmatism: New Essays on Social Thought, Law, and Culture (Duke University Press) -- one of the contributors to it being, no surprise, Richard Rorty. “I’m really grieved,” he told me on Monday. "Rorty evolved from a philosopher into a mensch.... His respect for his critics, without yielding much ground to them, went well with his complete lack of pretension as a person.”

In an e-mail note, he offered an overview of Rorty that was sympathetic though not uncritical.

“To my mind," Dickstein wrote, "he was the only intellectual who gave postmodern relativism a plausible cast, and he was certainly the only one who combined it with Dissent-style social democratic politics. He admired Derrida and Davidson, Irving Howe and Harold Bloom, and told philosophers to start reading literary criticism. His turn from analytic philosophy to his own brand of pragmatism was a seminal moment in modern cultural discourse, especially because his neopragmatism was rooted in the 'linguistic turn' of analytic philosophy. His role in the Dewey revival was tremendously influential even though Dewey scholars universally felt that it was his own construction. His influence on younger intellectuals like Louis Menand and David Bromwich was very great and, to his credit, he earned the undying enmity of hard leftists who made him a bugaboo."

The philosopher "had a blind side when it came to religion," continued Dickstein, "and he tended to think of science as yet another religion, with its faith in empirical objectivity. But it's impossible to write about issues of truth or objectivity today without somehow bouncing off his work, as Simon Blackburn and Bernard Williams both did in their very good books on the subject. I liked him personally: he was generous with his time and always civil with opponents.”

A recent essay challenges the idea that Rorty “had a blind side when it came to religion.” Writing in Dissent, Casey Nelson Blake, a professor of history and American studies at Columbia University, notes that Rorty “in recent years stepped back from his early atheist pronouncements, describing his current position as ‘anti-clerical,’ and he has begun to explore, with increasing sympathy and insight, the social Christianity that his grandfather Walter Rauschenbusch championed a century ago.”

Blake quotes a comment by Rorty from The Future of Religion, an exchange with the Catholic philosopher Gianni Vattimo that Columbia University Press published in 2005. (It comes out in paperback this summer.)

“My sense of the holy,” wrote Rorty, “insofar as I have one, is bound up with the hope that someday, any millennium now, my remote descendants will live in a global civilization in which love is pretty much the only law. In such a society, communication would be domination-free, class and caste would be unknown, hierarchy would be a matter of temporary pragmatic convenience, and power would be entirely at the disposal of the free agreement of a literate and well-educated electorate.”

I'm not sure whether that counts as a religious vision, by most standards. But it certainly qualifies as something that requires a lot of faith.

Two items of great interest came to my attention too late to include in this column. One is the final interview with Rorty, conducted by Danny Postel just before the philosopher's death. The other is a tribute to Rorty by Jürgen Habermas.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

An Anti-Progressive Syllabus

The first anthology of criticism I read in college was a low-budget volume edited by David Lodge entitled 20th-Century Literary Criticism. It was for an undergraduate class, the first one that spotlighted interpretation and opened a window onto graduate topics. A year later, this time an M.A. student at the University of California at Los Angeles, I took a required course on literary theory, with the anthology Critical Theory Since Plato (1971) edited by Hazard Adams. In a seminar not long after, we toiled through Critical Theory Since 1965 (1986), edited by Adams and Leroy Searle, and another class selected Contemporary Literary Criticism: Literary and Cultural Studies (1989), edited by Ron Schleifer and Robert Con Davis. After I left graduate school, more literary/cultural criticism anthologies appeared along with various dictionaries and encyclopedias. The process seems to have culminated in The Norton Anthology of Theory and Criticism (edited by Vincent Leitch et al.), whose publication in 2001 was momentous enough to merit a long story by Scott McLemee in The Chronicle of Higher Education that included the remark, “An anthology stamped with the Norton brand name is a sure sign of the field’s triumph in English departments.”

For McLemee to speak of “stamping” and “branding” was apt, more so than he intended, for every anthology assigned in class carries institutional weight. From the higher floors in the Ivory Tower, anthologies may look like mere teaching tools, and editing them amounts to service work, not scholarship. But while professors may overlook them except at course adoption time, for graduate students just entering the professional study of literature and culture, anthologies serve a crucial guiding function. Students apply to graduate school in the humanities because of their reading, the inspiration coming usually from primary texts, not critical works -- Swift not Barthes, Austen not Bhabha. They go into English because they like to read novels, or history because the past intrigues them, or philosophy because they want to figure out the great questions of life. Soon enough, they realize that joy, appreciation, moral musing, and basic erudition don’t cut it, and the first year or two entails an adjustment in aim and focus. The discourse is more advanced and specialized, critical and ironic. New and exotic terms emerge -- “hyperreal,” “hegemony,” “postcolonial” -- and differences between contemporary academic schools of thought matter more than differences between, say, one epoch and another.

Fresh students need help. What the anthologies do is supply them with a next-level reading list. The tables of contents provide the names to know, texts to scan, topics to absorb. In spite of the radical and provocative nature of many entries, the volumes mark a canon formation, a curriculum-building activity necessary for doctoral training. Plowing through them is not only a course of study but also a mode of professionalization, a way to join the conversation of advanced colleagues. As tutelage in up-to-date thinking, they strive for coverage, and to help young readers take it all in, they arrange the entries by chronology and by different categories. The Norton, for instance, contains an “Alternative Table of Contents” that divides contributors up by 42 classifications including “The Vernacular and Nationhood,” “Gay and Lesbian Criticism and Queer Theory,” and “The Body.”

As a poor and insecure 25-year-old in the mid-80s, I slogged through the selections one by one, and I thought that completing them would acquaint me with every respectable and serious current thread in literary and cultural thinking. But when I look back at them today, the anthologies look a lot less comprehensive. In fact, in one important aspect, they appear rather narrow and depleted. The problem lies in the sizable portion of the contributions that bear a polemical or political thrust. These pieces don’t pose a new model of interpretation, redefine terms, outline a theory, or sharpen disciplinary methods. Instead, they incorporate political themes into humanistic study, emphasize race/class/gender/sexuality topics, and challenge customary institutions of scholarly practice. When they do broach analytical methods, they do so with larger social and political goals in mind.

The problem isn’t the inclusion of sociopolitical forensics per se. Rather, it is that the selections fall squarely on the left side of the ideological spectrum. They are all more or less radically progressivist. They trade in group identities and dismantle bourgeois norms. They advocate feminist perspectives and race consciousness. They highlight the marginalized, the repressed, the counter-hegemonic. And they eagerly undo disciplinary structures that formed in the first half of the 20th century.

Reading through these titles (in the Norton: “On the Abolition of the English Department,” “Enforcing Normalcy,” “Talking Black,” “Compulsory Heterosexuality and Lesbian Existence,” etc.), one would think that all decent contemporary criticism stems from adversarial leftist impulses. There is nothing here to represent the conservative take on high/low distinctions, or its belief that without stable and limited cultural traditions a society turns vulgar and incoherent. Nothing from the libertarian side about how group identities threaten the moral health of individuals, or how revolutionary dreams lead to dystopic policies. The neoconservative analysis of the social and economic consequences of 1960s countercultural attitudes doesn’t even exist.

And yet, outside the anthologies and beyond the campus, these outlooks have influenced public policy at the highest levels. Their endurance in public life is a rebuke to the humanities reading list, and it recasts the putative sophistication of the curriculum into its opposite: campus parochialism. The damage it does to humanities students can last a lifetime, and I’ve run into far too many intelligent and active colleagues who can rattle off phrases from “What Is an Author?” and Gender Trouble, but who stare blankly at the mention of The Public Interest and A Nation at Risk.

This is a one-sided education, and the reading list needs to expand. To that end, here are a few texts to add to this fall’s syllabus. They reflect a mixture of liberal, libertarian, conservative, and neoconservative positions, and they serve an essential purpose: to broaden humanistic training and introduce students to the full range of commentary on cultural values and experience.

  • T.E. Hulme, “Romanticism and Classicism” (first published 1924). This essay remains a standard in Anglo-American modernist fields, but it seems to have disappeared from general surveys of criticism. Still, the distinctions Hulme draws illuminate fundamental fissures between conservative and progressive standpoints, even though he labels them romantic and classical. “Here is the root of romanticism: that man, the individual, is an infinite reservoir of possibilities; and if you can so rearrange society by the destruction of oppressive order then these possibilities will have a chance and you will get progress,” he says. The classicist believes the opposite: “Man is an extraordinarily fixed and limited animal whose nature is absolutely constant. It is only by tradition and organization that anything decent can be got out of him.” That distinction is a good start for any lecture on political criticism.
  • T.S. Eliot, “Tradition and the Individual Talent” (1919). Eliot’s little essay remains in all the anthologies, but its central point about the meaning of tradition often goes overlooked. Teachers need to expound why tradition matters so much to conservative thinkers before they explain why progressives regard it as suspect. Furthermore, their students need to understand it, for tradition is one of the few ideas that might help young people get a handle on the youth culture that bombards them daily and nightly. They need examples, too, and the most relevant traditionalist for them I’ve found so far is the Philip Seymour Hoffman character (“Lester Bangs”) in the popular film Almost Famous.
  • F.A. Hayek, The Counter-Revolution of Science (U.S. edition, 1952). Most people interested in Hayek go to The Road to Serfdom, but the chapters in Counter-Revolution lay out in more deliberate sequence the cardinal principles behind his philosophy. They include 1) the knowledge and information that producers and consumers bring to markets can never be collected and implemented by a single individual or “planning body”; and 2) local customs and creeds contain values and truths that are not entirely available to “conscious reason,” but should be respected nonetheless. Such conceptions explain why in 1979 Michel Foucault advised students to read Hayek and other “neoliberals” if they want to understand why people resist the will of the State. We should follow Foucault’s advice.
  • Leo Strauss, “What Is Liberal Education?” (1959). For introductory theory/criticism classes, forget Strauss and his relation to the neoconservatives. Assign this essay as both a reflection on mass culture and a tone-setter for academic labor. On mass culture and democracy, let the egalitarians respond to this: “Liberal education is the necessary endeavor to found an aristocracy within democratic mass society. Liberal education reminds those members of a mass democracy who have ears to hear, of human greatness.” And on tone, let the screen-obsessed minds of the students consider this: “life is too short to live with any but the greatest books.”
  • Raymond Aron, The Opium of the Intellectuals (English trans. 1957). Aron’s long diagnosis of the intellectual mindset remains almost as applicable today as it was during the Cold War. Why are Western intellectuals “merciless toward the failings of the democracies but ready to tolerate the worst crimes as long as they are committed in the name of the proper doctrines”? he asks, and the answers that emerge unveil some of the sources of resentment and elitism that haunt some quarters of the humanities today.
  • Francis Fukuyama, The End of History and the Last Man (1992). First formulated just as the Berlin Wall came tumbling down, Fukuyama’s thesis sparked enormous admiration and contention as the interpretation of the end of the Cold War. When I’ve urged colleagues to read it, though, they’ve scoffed in disdain. Perhaps they’ll listen to one of their heroes, Jean-François Lyotard, who informed people at Emory one afternoon that The End of History was the most significant work of political theory to come out of the United States in years.
  • Irving Kristol, Neoconservatism: The Autobiography of an Idea (1995). With the coming of the Bush administration, the term neoconservative has been tossed and served so promiscuously that reading Kristol’s essay is justified solely as an exercise in clarification. But his analyses of the counterculture, social justice, the “stupid party” (conservatives), and life as a Trotskyist undergraduate in the 1930s are so clear and antithetical to reigning campus ideals that they could be paired with any of a dozen entries in the anthologies to the students’ benefit. Not least of all, they might blunt the aggressive certitude of political culture critics and keep the students from adopting the same attitude.
  • David Horowitz, Radical Son: A Generational Odyssey (1997). Many people will recoil at this choice, which is unfortunate. They should not let their reaction to Horowitz’s campus activism prevent them from appreciating the many virtues of this memoir. It is a sober and moving account of America’s cultural revolution from the moral high points to the sociopathic low points. At the core lies the emotional and ethical toll it took on one of its participants, who displays in all nakedness the pain of abandoning causes that gave his life meaning from childhood to middle age. Students need an alternative to the triumphalist narrative of the Sixties, and this is one of the best.

Professors needn’t espouse a single idea in these books, but as a matter of preparing young people for intelligent discourse inside and outside the academy, they are worthy additions to the syllabus. Consider them, too, a way to spice up the classroom, to make the progressivist orthodoxies look a little less routine, self-assured, and unquestionable. Theory classes have become boring enough these days, and the succession of one progressivist voice after another deadens the brain. A Kristol here and a Hayek there might not only broaden the curriculum, but do something for Said, Sedgwick & Co. that they can’t do for themselves: make them sound interesting once again.

Author/s: 
Mark Bauerlein
Author's email: 
info@insidehighered.com

Mark Bauerlein is professor of English at Emory University.

Pottering Around

“I’m getting ready to work on Harry Potter for a month,” said Laurie Muchnick in late June. She edits the book section of Newsday, a newspaper based on Long Island. We’ve been friends for a decade now (as long as the Potter novels have been published, as coincidence would have it) and the conversation was a completely casual one. So I half expected her to emit a sigh or a grumble, or to pause for a beat before adding, “Well, it’s going to feel like a month anyway.”

But no -- she meant it literally, and it didn’t sound like she minded. She’s been rereading the entire series. Since the start of July, Newsday has run one item on Pottermania per day, which is the sort of thing editors do only when firmly convinced that a significant share of the audience will want it. Not all of the paper’s cultural coverage has focused on Harry Potter, of course. But with the latest movie about the young wizard now in the theaters, and the seventh novel due out on July 21 -- and bookies no doubt offering odds on whether Harry lives or dies -- we are talking about a phenomenon now well beyond run-of-the-mill levels of public interest. According to the plan that J.K. Rowling drew up when she began the series, Harry Potter and the Deathly Hallows is supposed to be the very last volume, though skeptics wonder if the lure of a few million dollars more won't inspire some new adventure down the line.

In the years since the author introduced her characters to the public, they have become beloved and meaningful; and not to children only. At present, the catalog of the Library of Congress records 21 volumes of criticism and interpretation on the novels, in six languages. A collection called Harry Potter and International Relations, for example, published by Rowman and Littlefield in 2006, analyzes the significance of Hogwarts, the academy of magical arts at which Harry trains, with respect to the nation-state and geopolitical realism. It also contains an essay (and I swear this is true) called “Quidditch, Imperialism and the Sport-War Intertext.” At least 17 doctoral dissertations and seven master’s theses had been devoted to the Harry Potter books, at least in part, as of last year. Chances are good that all these figures are on the low side.

A confession: I have never read any of the Harry Potter novels nor seen even one of the movies. Aficionados should not take this personally, for it has not been a matter of cultural snobbery or high principle, or even of deliberate policy. It is simply an effect of the scarcity of time -- of hesitation before a body of work that will, in due course, run to some 4,000 pages and (by my estimate) more than 17 hours of film.

On the other hand, I’ve long been intrigued by how certain works of fiction create such powerful force-fields that readers go beyond enthusiasm, developing relationships with characters and their world that prove exceptionally intense, even life-changing. Examples would include C.S. Lewis, Thomas Pynchon, Ayn Rand, and J.R.R. Tolkien. (They are listed in alphabetical order, so no angry letters on slights implied by the sequence, please.)

And one regular product of such fascination is the desire not only to study the fiction ever more closely, but to create works of analysis that, so to speak, map and chronicle the imaginary world. In effect, the fiction creates its own nonfiction supplement.

So it was interesting, though by no means a surprise, to learn that there is an intensive course on Harry Potter at North Georgia College and State University this summer, taught by Brian Jay Corrigan, a professor of English whose more routine area of specialization is Renaissance literature. Students in the course are contributing to an encyclopedia that will cover -- as Corrigan puts it during an email interview -- “the geographic, historic, folkloric, mythic, and all other backgrounds informing the Harry Potter world.” He says an agent is shopping the project around to publishers in New York now.

One encyclopedia of Potteriana is already available. But with the appearance of the final novel, it will soon be out of date, and Corrigan’s effort will presumably have the advantages of closure and retrospective insight. It will also be enormous -- perhaps 250,000 words long, with hundreds of illustrations being prepared to go with the entries.

“After a year and a half in planning and five weeks of class,” Corrigan told me, “the ‘rough’ part of the project, collecting together all the grist, is about three quarters finished. We have already generated nearly 1,500 typed pages (650,000 words). There will be a polishing period that will whittle all of this into a usable format.” He expects that phase to last until the end of the fall semester.

After we discussed the work in progress a bit, I broached some reservations that kept crossing my mind about the whole idea of a course on Harry Potter. It’s not that it seems like a terrible idea. But mild ambivalence about it seemed hard to shake.

On the one hand, it’s hard to gainsay, let alone begrudge, the success of any work of fiction that made reading popular for a whole generation of kids. And it is not hard to appreciate the advantage of giving students a taste of literary scholarship through closer examination of work they already know. As someone admittedly ignorant of the primary materials in question, I picked up some sense of the case to be made for Harry Potter from an essay by Michael Bérubé in the latest issue of The Common Review, which conveys some appreciation for the structural intricacy of the books.

The tightly constructed plots and complex shadings of characterization in Rowling’s work have had a profoundly educational effect for Bérubé’s son Jamie, who has Down syndrome.

“Indeed,” writes Bérubé, “one of the complaints about Rowling’s creations [is] that they are too baroquely plotted, too ‘cloak and dagger and triple reversal with double axel’ ....But it’s astonishing to me that tens of millions of young readers are following Rowling through her five-, seven-, and even nine-hundred page elaborations on the themes of betrayal, bravery, and insupportable loss; it’s all the more astonishing that one of those tens of millions is my own ‘retarded’ child, who wasn’t expected to be capable of following a plot more complicated than that of Chicken Little. And here’s what’s really stunning: Jamie remembers plot details over thousands of pages even though I read the books to him at night, just before he goes to bed, six or seven pages at a time. Well, narrative has been a memory-enhancing device for some time now, ever since bards got paid to chant family genealogies and catalog the ships that laid siege to Troy. But this is just ridiculous.”

So yes, there is something to respect in what J.K. Rowling has achieved. At the same time, isn’t undergraduate education a potentially decisive moment when students ought to be introduced to a wider conception of culture -- something beyond the readily available, the comfortingly familiar?

Last week, reporters from CNN were on the North Georgia campus to film Corrigan’s students as they played a Quidditch match (that being a magical competition well-known to Potter aficionados). The segment will presumably air around the time the final volume of the series is released. It all sounds enjoyable for everyone involved. Yet as I think about it, the ghosts of Matthew Arnold and Theodor Adorno hover nearby. They look pained.

Now, Arnold was a Victorian sage; and Adorno, a relentless Marxian critic of mass culture; and I am guessing that neither of them is familiar with the particular educational challenges involved in teaching undergraduates at North Georgia College and State University during the era of high-speed wireless connectivity. They are out of touch. Still, it seems that Arnold and Adorno would prefer that kids learn to appreciate forms of cultural creation that will not in any way ever come to the attention of a cable television network.

Corrigan heard me out as the spirits channeled their complaints.

“I am a Shakespearean,” he said, “and one of my greatest regrets in my field is the damage done to our historical understanding of his works occasioned by his having been viewed as ‘base, common, and popular’ in his own day. If only some farsighted intellectual had taken that theatre in Southwark seriously and done in Shakespeare's day what we are attempting today, we would all be richer for the experience.... Who is to say what is ‘best’ until we first explore, evaluate, and ascertain? Why not allow the culture that is generating the thought also engage in that exploration and evaluation? Surely that is the aim of pedagogy, instilling curiosity while guiding intellect towards informed opinion.”

Corrigan also notes something that is often palpable when people discuss the impending publication of the final Potter novel. The phenomenon began with the appearance of the first volume during the summer of 1997. Millions of kids and their parents have grown up with the series. It has in some sense been a generation-defining experience, the meaning of which is, as yet, impossible fully to unpack. The intense involvement of readers has in part been a matter of the narrative’s open-endedness; but soon that will change.

“It might be said,” as Corrigan puts it, “ that we, as a class, are contributing to the scholarship of a future world. Undoubtedly ‘Potter-mania’ will cool, but the cultural phenomenon has been recorded and will be remembered. I am leading a group of people who are currently Harry's age (between 18 and 28 years old). Moreover, they are in Harry's age -- they grew up with Harry. We are creating a fly in amber. Never again will any scholar be able to approach Harry Potter from this perspective, not knowing how it will end.”

Next week, he says, “the story of Harry will have been published for the world to know, and no one will ever again be able to look at these books as we can today.... My students are doing far more than reading seven novels and writing essays on what they think. They are exploring backgrounds that inform this series and along the way are delving into many fields of study. As such, they are learning the interrelatedness of literature with the worlds of thought that inform the idea of a university.”

There is also the more prosaic sort of instruction that goes with preparing an encyclopedia. Corrigan says his class is acquiring “the practical skills of working to a real deadline, editing, and dealing with ‘real world’ situations such as slow contract negotiations and the minutiae of New York publishing houses. For a group of English students, many of whom are interested in publishing, this is invaluable internship experience.”

The specters listen quietly, but they look skeptical. Matthew Arnold wants to point out that, after all, we do not continue to read Shakespeare because he was once popular in his day. Theodor Adorno is annoyed by the expression “invaluable internship experience” (evidently it sounds really bad in German) and starts to mutter about preserving the difference between the liberal arts and vocational training.

On Corrigan’s behalf, I argue in defense of his points. Aware that this can only mean I am talking to myself again, I decide it is a good idea to check in with Laurie Muchnick at Newsday to find out how the “month of Potter” is going.

She mentions that she’s had a reporter looking into how bookstores have handled security, since nobody is supposed to have access to the books until midnight on July 21. “Nobody” includes reviewers. The publisher, Scholastic, doesn’t bother sending out the books, since kids don’t care what the critics think. So Muchnick expects to be at the store that night to pick up her reserved copy.

The newspaper has been publishing short pieces by readers on what they expect to happen in the final volume -- a matter, not of pure imagination, but of deduction from the previous six volumes. By next week, though, all mysteries will be resolved.

When I tell her about Corrigan’s course, Muchnick says she can see the possible pedagogical value, but wonders if the moment might not be passing soon.

“I’ve been rereading all of the books,” she says, “and it’s been really impressive to see how carefully Rowling has structured them. There are clues to things happening later that are embedded in the earlier volumes. I can see how they would merit a sort of close reading, the old New Critical approach of looking really closely at the text to see what is going on in it.”

That is, in effect, what fans have done with their time between books -- trying to figure out what comes next by reading between the available lines. Interpretation has been a way of continuing one’s involvement in the text while waiting for the next installment.

But the relationship between analysis and anticipation will soon change. “Will people still be as interested in hunting for clues once they know that the answers are actually available?” asks Muchnick. “I just don’t know. People will still enjoy the books, but probably not in the same way.”

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Jane Austen, Yadda, Yadda, Yadda

Recently I was cornered by a university employee who knows I’m a scholar of British literature, specializing in Jane Austen.

“I started Pride and Prejudice last week,” he told me. “It’s one of those books I know I should have read, but I couldn’t get past the first few chapters.”

“Really,” I replied, eyebrows raised.

“Yeah, I just lost interest,” he went on. “I kept thinking to myself, ‘Oh, brother. I think I know where this is going.’”

Was this disarming honesty or throwing down the gauntlet? Was I being called out? Whatever it was, I shifted nervously as I listened to the rest of his monologue: “My theory is that the novel can be pretty much summed up as Elizabeth and Darcy meet, Elizabeth and Darcy hate each other, Elizabeth and Darcy fall in love, yadda, yadda, yadda.”

Reader, I stared at him blankly. Of course, I spent hours afterward constructing witty, cynical comebacks, such as “Yeah, I know what you mean. I have that response to episodes of VH1’s 'Behind the Music' and to reading the Bible.” But in the moment, all I managed to spit out was something clichéd and professorial resembling, “Hmm. That’s interesting. I think maybe it takes a few readings of Austen to really appreciate her fiction’s depth, humor, and irony.”

That’s also my stock answer to traditional-aged undergraduates on the first day of class -- 20-year-olds who confess that they’ve signed up for a literature class on Austen and her contemporaries because they absolutely love (or absolutely hate) her fiction -- or maybe just the film adaptations. Or Colin Firth or Keira Knightley or Clueless. The Austen-haters often claim to be taking the course because they want to understand what in the world is the big deal. A few of them end up seeing it by the end of the semester, a few more don’t, and that’s fine. But the yadda-yadda-yadda employee was a well-read, middle-aged guy with no sophomore excuse for being sophomoric. My gut reaction to his confession registered somewhere between crestfallen and incensed.

I'm having a similarly mixed reaction to the latest wave of Austen mania in the U.S. and U.K., shifting nervously, while approaching it with a combination of anxiety and dread. I know that all English professors worth their salt should be constructing some theories and responses now, in advance of being cornered by colleagues and co-workers and co-eds, so as not to have to resort to the professorial and clichéd. What will we say when asked about Anne Hathaway’s Becoming Jane (2007); about the upcoming film The Jane Austen Book Club, with its star-studded cast; or about PBS’s planned 10-week winter 2008 airing of the Complete Jane Austen on "Masterpiece Theatre"?

What’s the witty, cynical comeback to this cultural flowering of Austen-related stuff, I find myself wondering: “Can’t wait to see it!” “Wish I’d thought of it first!” “The Decline and Fall of Austen’s Empire.” “A tippet in the hand is worth two in the bush.” “A stitch in the huswife saves nine.” “Don’t look a gift pianoforte in the mouth”?

But along with such repartee, we’ll also need to ready weightier observations. First, I believe it’s imperative that we call a moratorium on starting sentences with “It is a truth universally acknowledged,” as in, “It is a truth universally acknowledged that this is the first time in television history Austen’s complete works have been aired in succession.” In the coming months we will no doubt suffer through dozens of newspaper and magazine articles beginning, “It is a truth universally acknowledged.” Best not to add to the collective torture.

In addition, when constructing our soundbites, we ought not to forget the sheer breadth of today’s Austen craze; it’s more than just films and television adaptations we’re in for. New books have appeared, too, like Confessions of a Jane Austen Addict (2007) and Jane Austen for Dummies (2006). Though I worry that these books make reading her fiction sound like something done at an Alcoholics Anonymous meeting for slow learners, surely it’s not too late for some well-placed damage control?

After all, the Austen-inspired publicity stunts are already in full swing. Perhaps you’ve heard about the kerfuffle that unfolded across the pond, “Jane Austen Rejected!” Thinly veiled versions of Austen’s fiction were sent out to British publishers as new work, under the name of Allison Laydee (a.k.a. David Lassman), and all were rejected. Even Harlequin Mills & Boon passed on publishing adulterated Jane Austen plots. The horror! The horror!

But isn’t this déjà vu all over again? Please raise your tussy mussy if you remember 10 or so years ago, when we were last inundated with Austen film and TV adaptations; with Bridget Jones novels and films; and with Austen board games, stationery, and editorial cartoons. Everyone then seemed to be asking, “Why Austen? Why now?”

The late 1990s were strange days for us longtime members of the Jane Austen Society of North America. It was as if we no longer had to apologize for indulging in our versions of wearing plastic Spock ears, whether quadrille, or quilling, or merely quizzing. Many of us became instant pundits among our friends, family, and the media, providing copy for everything from the Arkansas Democrat to The Wall Street Journal. Only a few periodicals continued to misspell Jane’s name as Austin, while many more managed to render correctly Bennet, Morland, and Love and Freindship. Oh, those were heady times.

If you were there, then you’ll no doubt recall that we came up with some pretty wild theories to explain the Jane train, too. Remember when Camille Paglia said Austen’s popularity could be explained as a cultural symptom in reaction to the O.J. Trial, as people longed for stories in which no one was being butchered? That was a good one. Or how some claimed that the return to Austen was a result of the fin de siècle’s prompting us to take stock and return to works of past centuries? Seems pretty thin now. Others claimed that Austen’s resurgence happened because we needed to measure the worth of our male heroes, from Bill Clinton and Brad Pitt to Kurt Cobain and Ross Perot. (Jane Austen and Ross Perot?)

So here we are, circa 2007, finding ourselves in danger of being asked yet again, “Why Austen? Why now?” How delightful. How frightening. I’m determined not to be caught off guard, so I’ve constructed some all-purpose answers to explain the latest Austen craze, suitable for everything from The Nation to "Larry King Live" to Marie Claire. Anyone struggling for words is, of course, welcome to use these as conversational building blocks:

Option A: “Today’s Austen mania is a form of cultural compensation for the disaster of the Iraq War and for the genocide in Darfur. Her novels offer us a way to forget the world’s evils by allowing us to travel back to those halcyon post-French Revolutionary days of Napoleon.”

Option B: “Austen’s timeless narratives of women’s romantic searching provide a welcome distraction from the Supreme Court’s rolling back of abortion rights, as we yearn for an era when many women had the power to refuse a proposal of marriage.”

Option C: “Austen’s newfound popularity signals that empire-waist frocks are due for a fashion revival; that irony, having been shunned after 9/11, is back and better than ever; and that Wal-Mart will roll back prices on its imported teas.”

This list is just a draft of talking points. I still have a few more ideas to work out. For instance, can it be an accident that Austen’s popularity is surging, just as Jane magazine has gone defunct? There is certainly a quotable quip in the making there. Even if we don’t perfect our theories in the coming months, I don’t think there should be much cause for worry. Check back with me in 2013, the 200th anniversary of Pride and Prejudice’s publication. Oh, brother. I think I know where this is going.

Author/s: 
Devoney Looser
Author's email: 
info@insidehighered.com

Devoney Looser is associate professor of English at the University of Missouri at Columbia, and the author of British Women Writers and the Writing of History (Johns Hopkins University Press). She has just completed a book on British women writers and old age, to be published next year.

On Drivel

I recently received a draft of one of my dissertation chapters back from my advisor. As always, he provided copious comments -- advice on improving the coherence of my argument, smoothing out some ungainly syntax, and choosing more appropriate words. My advisor is scrupulous, perhaps excessively so. I have learned a great deal about how to think and write from his comments.

But my advisor is also a tough reader, and I find that after all these years of being a student I am still learning how to take criticism. To wit: in my recent draft, written in bold, red ink is one word that succinctly represents what he thinks of the passage -- “drivel.” I quickly forgot all of the good things he had said about my argument as I focused on this one word, brutally penned in the margin. My incisive points, my elegantly constructed sentences, all reduced to a one-word judgment.

I knew that drivel meant nonsense, but shame prompted me to consult a dictionary. I learned that its meaning was a metaphorical extension of its more literal definition: to let saliva dribble from the mouth. Nothing more vividly represents brazen stupidity than the image of someone drooling. There is something intrinsically repulsive about the act of drooling and as I thought about how that metaphor might apply to my writing, I literally gave a small shudder. Ouch! Was my prose the equivalent of drivel? Analogous to an unconscious trickle of spit?

Yes. My advisor was right. What I had written was drivel. The passage didn’t meaningfully contribute to the argument. In fact, it didn’t seem to be saying much of anything. When I looked at the passage more closely, I saw that it consisted largely of a loosely stitched together sequence of conventional phrases: “it is the fact that,” “of course,” “indeed, he goes on to argue,” and “on the one hand,” “on the other hand.” It was the utter conventionality of the writing that made it drivel. The passage represented writing on auto-pilot, requiring little to no consciousness on my part. I might as well have been slobbering onto the page. Somewhere behind all the nonsense, I had an idea, but what it was I could not say. Responding to the simple, severe remark felt something like going through the stages of grief. I moved from denial (“surely it’s not that bad”) through anger (“what nerve!”) and toward acceptance (“yup, it’s bad”).

I thought about this experience in the context of my own work. I teach writing and literature at Salt Lake Community College, and every semester I comment on student papers. I identify flaws in their reasoning, give advice on style and punctuation, and even point out when they’ve made an original point or turned a neat phrase. I have never written the word drivel in the margin of one of my student’s papers, but I have been tempted to do so on more than one occasion. I believe almost every writing teacher has felt the impulse to heap ferocious criticism on students. Those who haven’t are far more saintly than I.

I suppose after I achieved acceptance came a feeling of admiration. By God, I wish I had the guts to write the word drivel in the margin of a student paper! Of course, I don’t include these temptations in the class of my finer instincts. The temptation is more on par, I think, with the cheap thrill I get when an action-hero utters a powerful one-liner. Sometimes I just want to be the Arnold Schwarzenegger of writing teachers. But I am not Arnold, nor was meant to be.

The episode prompted me to entertain some more serious lessons about how my experience as a graduate student may translate into my work as a teacher. As a teacher of writing, it’s good to be put in the position of student writer, to experience all of the fear, anxiety, and hopefulness that goes into producing a piece of writing that will be judged by an authority figure. It is both humbling and instructive to be told that you are wanting, that what you’ve produced isn’t up to par. Being both a student and a teacher has made me more sympathetic to my students. I know what it feels like to be criticized and I am more likely to consider the consequences of harsh feedback. In other words, it’s a way of inoculating myself against my adolescent, writing-teacher-as-action-hero fantasies. My experience speaks to the benefit of occasionally subjecting ourselves to the rituals of performance and assessment that we ask our students to undergo. We do this, of course, with conferences and papers. Becoming an active participant in disciplinary conversations not only helps me build on my knowledge in the field, it makes me feel like a student all over again.

Yet I am reminded that criticism is a form of praise. My advisor cares enough to call my writing drivel when he sees it, not because he thinks I’m stupid but rather because he believes I am capable of producing something better than drivel. I did not ultimately wilt at the word. I do not believe that I possess a special inner strength that makes me uniquely capable of withstanding severe criticism. Perhaps, then, we are not harsh enough with our students; perhaps, in our well-meaning effort to encourage them, we end up being less than honest with them.

But maybe there are no life lessons to be drawn from drivel. Drivel is irredeemable. One can’t turn around and reclaim drivel. Never, we can hope, will there be an endowed chair of Drivel Studies. And I don’t believe that drivel is one of those terms that one can, with a bit of vernacular judo, turn on its head. Can I imagine my son saying in his teenage years, “That’s so drivel! It’s wicked drivel!”

There is finally no way around drivel. I find that I am refreshed by the honesty of the term. It reminds me of the uncomfortable fact that my interaction with students will always be structured around criticism, though we sometimes attempt to disguise this basic fact. I think students sometimes understand this better than we do.


Jason Pickavance is an assistant professor of English at Salt Lake Community College and a graduate student in American studies at the University of Utah. Despite his occasional lapses into drivel, he plans on defending his dissertation this year.

Parallel to Shore

As I sit here starting this essay, I am waiting for an important call from my publisher. Today, the editorial board meets. All day I will be waiting to hear if the project I have been working on for over 10 years, and one for which the press patiently waited for 5, will finally find its way into print. I do not teach today, so I will fill my time with other work and regular routine: morning exercise, mowing the lawn, cleaning house. I will be trying not to stare at the phone, or check e-mail 47 times an hour. I will wait, but I will be thinking about this project no matter what I do -- the book is always with me.

That is how it has been with this project from the moment I found the Embree diaries in the American History Center at the University of Texas at Austin, during my preliminary research for my dissertation on women’s diaries. I’d had three body chapters planned, had already found the primary sources I’d need for the chapters on the overland journey, and was searching for the diaries I’d need for my chapter on the personal impact of the Civil War. I’d planned a chapter on something else, but when I found these diaries, I knew I’d be writing on them instead. The diaries of Henrietta Baker Embree, the first wife of Dr. John Embree, and Tennessee Keys Embree, who married the widower Embree, covered 28 years of life in mid-19th-century Texas and related the experiences of two women who struggled with a difficult marriage to their mercurial and abusive husband. After I wrote about these diaries in my dissertation, I knew I wanted to bring the complete diaries to publication. This edition was the project I wanted to complete after my dissertation, which I defended over 10 years ago, and was my answer to the question “what next?” that my committee posed at the time.

I took this project with me into my visiting professorships, trying to complete the work while struggling with four classes a semester and one or more in the summer sessions, always worrying about whether the job would be renewed. During those years, I scraped together time, resources and the last scraps of creative energy I had, to put the pieces of this edition together. I received a tiny grant to reproduce the photographs from the archives. I found graduate students to help me transcribe the typescripts. I held onto this project despite the crushing weight of a six-year job search that netted only two interviews and no tenure-track job offers. I published my first book even as I was thinking of this one. I held onto this project as I entered the "adjunct track" and tried to balance part-time work at my community college with part-time work at a university 30 miles away, so I could make a living wage.

There were many, many times when I doubted my ability to bring this project to a published close. As teaching and earning enough income consumed more and more of my energies, the idea that I could devote any time to the project at all seemed a fantasy. As I assumed more and more responsibilities as the caregiver of my aging parents and the daily tasks of their household management, the time and peace I needed for writing evaporated. I knew the project was valuable, but I despaired that I would never be able to complete the work. Other important needs took precedence over the writing. And there was always the question of tangible worth. What was the point of this edition? If it sold as well as my first book, I could count on a three-figure income in royalties. I was no longer applying for tenure-track positions, so it would not be (nor had it ever been) the key to that elusive academic prize. Oprah was unlikely to call.

The one thing I knew, however, was that when I worked on the edition, when I re-read the lives of these diarists, I found the value unquestionable. I could not allow myself to not continue the work, no matter how sporadic my time on the project became, no matter how little financial reward there might be for the project in the end.

And so, when, the night before classes began in January 2006, one of my two classes was withdrawn and given instead to a tenure-track faculty member whose class had not made for the semester, I was left with panic, time, and a little savings from the previous semester when I’d taught at two institutions. I also had a query from my press: “Are you still working on this project? We’re still interested.” I’m surprised they didn’t ask if I was still alive.

Yes, I replied. I am still working. I gave myself a final deadline: Finish before the year is out. Finish or let the idea go. Finish or let some other editor take the project to publication. Finish, or be done. There were no graduate students, there was no stipend, there was no grant to complete the work, there was no tenure decision pending. There was just a universe-enforced semester course reduction and a book waiting to be completed. Still, it took the entire year.

My parents helped review the manuscript. As former teachers and avid genealogists, they found the work compelling. My father’s first career as a Methodist minister enabled him to shed light on many of my diarists’ theological dilemmas and church experiences. My mother’s careful reading of the original typescript helped me to correct many transcription errors. As I worked, I remembered all my students who found this story compelling, who loved it when I talked about my work, whose own grandparents and great-grandparents and great-great-grandparents kept diaries. I thought of every single person who brightened with interest when I talked about my work. And I thought about the two women who wrote their own words a century before, who struggled with their own ideas of success. The project took on new purpose in that last year, and when I finished my work -- writing the final section, purposefully, on December 31st, 2006 -- I knew I had done my part. If the project didn’t go any further, I had at least done what I could do. Now, I am down to the waiting.

To escape a rip current, we are told, “Swim parallel to shore. Do not swim toward the shore or fight the current because it is a fight you cannot win.” The danger, though real, is easy to escape if you remain calm. One of my current clients, whom I am coaching as she works on her master’s thesis, hopes to finish “before the next disaster.” I understand her need and also the unlikelihood of that accomplishment. Having only just returned safely to shore myself, all I can tell her is to keep swimming parallel to shore.


Amy L. Wink has taught at five different universities as a visiting professor and a part-time professor, and is currently an adjunct professor at Austin Community College. She is the author of She Left Nothing In Particular: The Autobiographical Legacy of Nineteenth Century Women’s Diaries (University of Tennessee Press, 2001), and the editor of Their Hearts’ Confidante: The Diaries of Henrietta Baker Embree and Tennessee Keys Embree, 1856-1884 (forthcoming, University of Tennessee Press). She is currently working on her third book, a collection of personal essays, entitled A Seat at the Window.

Fame

I know who I want to be when I grow up. I want to be Stephen Colbert. I even know the title of the book I’ll publish: I Am Academic (And So Can You!). Unfortunately, I’ve already had, I think, my 15 minutes of fame.

Part I: I Get a Thrill

The experience was distinctly postmodern: minimalist, ironic, and as deflating as it was exhilarating. I could say that I had spent my entire life up until that point preparing for my moment of celebrity; on the other hand, it wasn’t quite what I had dreamed of.

I can still recall how many conversations my friends and I, standing around on the playground outside St. Patrick’s Grammar School, had on the subject of being famous. Blame it on the post-World War II atmosphere of fear and longing, blame it on the space race or Tiger Beat magazine, but my baby-boomer generation was obsessed with fame -- or at least being noticed. (Current celebrities, take note: You’re not even original in wanting to be celebrities.) My earliest plans had to do with receiving an Oscar for a dramatic tap-dancing role in a film that combined the most poignant moments of The Five Pennies and The Nun’s Story. This fantasy was followed by a phase in which I spent long hours in my pink bedroom writing variations on Frost’s “Death of the Hired Man”; perhaps a president-elect would invite me to share his inaugural stage.

By the time Maya Angelou read “On the Pulse of Morning” at President Clinton’s invitation in 1993, I had earned several degrees, given birth to several children, taught writing at several colleges, and published a number of poems and essays, but fame had eluded me. There had been a number of indirect links: I knew a few poets with national reputations; I knew professors who were either reputable critics or who peppered their lectures with references to their reputable mentors. My first college roommate went on to become the editor-in-chief of two glossy women’s magazines. The lead articles in her publications, which I surreptitiously scanned while on line at the supermarket, were depressingly like the surveys she had conducted in our dorm room after “lights out.” I thought she was just making conversation, not planning how she would achieve fame.

Oh, there were moments when fame seemed close at hand. Once, another mother at my children’s bus stop asked me to autograph her copy of my article “The Two-Year-Old’s Guide to Dressing, Dining and Shopping,” which she had discovered in a free parenting magazine in the waiting room of her ob/gyn. And there were the three odd cases of mistaken identity. In an earnest discussion with her pre-school teacher, my oldest child incorrectly attributed the authorship of “Hickory, Dickory Dock” to me; a very tiny, very old woman followed me into the women’s room after the opening of Love Serenade in Manhattan, insisting that I was Shirley Barrett, the film’s Australian director; and once, in the elevator of a conference hotel, an overeager graduate student mistook me for Joyce Carol Oates.

And then it happened. I inadvertently stumbled upon what is apparently the universal subject in academe. It does not involve politics, theory, or tenure; it does cut across gender, race, academic majors, and all levels of faculty and staff. My topic was the student excuse.

Written in a white heat, the morning after a student attempted to justify missing the first two sessions of a class that met only once a week, "The Dog Ate My Disk and Other Tales of Academic Woe" was a simple classification piece, the sort of exercise I used to assign in Basic Composition. My thesis appeared at the end of the introductory paragraph. Excuses from college students, I explained, fell into five broad categories: the family, the best friend, the evils of dorm life, the evils of technology, and the totally bizarre.

I was pleased that the editors liked my essay, although I thought it was slighter than my first Chronicle of Higher Education piece, published several months earlier. That one had garnered a few congratulatory notes and several comments in the halls of my building, along with a single request for reprinting, and then it subsided into a line on my résumé, which often seems more alive than I am. I knew the second piece was scheduled for August 2000, but by the time it appeared, I was immersed in syllabi for the fall. And, truthfully, the earliest indications on the home front weren’t promising. My two younger children looked at the illustration and said “cool,” but declined my offers to let them actually hold the paper and read the piece. My oldest child, now in college but possibly still cautious since the pre-school fiasco, said, “I think your piece on Don DeLillo and your horoscope is much funnier. You know, the one where you freak out just because some phony astrologer said ‘Good day for industrial secrets’ under your sign. Why didn’t you send that one?” My mother said she didn’t think “throwing up blood” was a nice thing to write about.

The first e-mail message was equally disheartening. Its author delivered a lengthy lecture on compassion, liberally laced with insulting epithets for me. The rest of the mail, fortunately for my fragile ego, was positive. I heard from administrators as well as from adjuncts, lecturers, and full-time tenured professors at public and private universities, small liberal arts schools like mine, and community colleges. I received requests for reprints and an invitation to be on talk radio “to discuss this national problem.”

Everyone, it seemed, had a story; that everyone has a story is a maxim I tell my students, and it was heartening to find such evidence. Deans wrote fondly about professors from 20 and 30 years ago who had called their undergraduate bluffs; professors relayed stories involving plots that rivaled those of Oprah’s book club selections; and I received enough tales involving body parts and organs to lead me to conclude that there should be a separate category of excuses under the heading “Mutilations” or “Excuses Inspired by American Psycho.” No one ever questioned the veracity of my anecdotes involving dangerous machinery or the pope. In fact, the chair of a math department at a private university wrote to me “on behalf of several colleagues” to check the initials of the student involved in what I had referred to as “The Pennsylvania Chain Saw Episode.” They were certain she had matriculated at their school before moving on to mine. I was grateful to the chair of an education department who offered (unsolicited) verification of the phrase that had troubled my mother. I was less certain how to respond to the reader who said he “particularly enjoyed the bloody parts.”

Marvelously inventive stories poured in for months. While all those who contacted me had had dealings with students on one level or another, there was one writer who had a personal interest. From her office across our (small) campus, she e-mailed me to ask if her daughter, whom I’d had in several classes, was responsible for any of the stories. (The daughter was innocent.)

Part II: The Afterlife

As I opened that last message, it occurred to me that my celebrity was largely electronic and ultimately solitary -- much like the process of writing itself. This was 21st-century virtual fame. In fact, the next phase was something of a virtual nightmare, involving my e-mailing institutions and individuals who had posted my essay on their Web sites without permission. The copyright offenders included a Southern church; a professor of communications who, according to her home page, had a doctorate in journalism ethics; and a sociology professor who explained that his “[W]ebsite [was] intended to . . . introduce you to the many ways which [sic] you can utilize the World Wide Web in your sociological endeavors” -- those endeavors beginning, apparently, with piracy.

The essay’s inclusion in composition texts was another mixed -- and humbling -- experience. The promotional material for one of the earliest texts featured the title of my piece, referring to me as “a lesser known writer.” (There is something worse than being a lesser-known writer -- it’s seeing that fact announced in print.) In the tables of contents of anthologies, my name hovers, Zelig-like, alongside those of Amy Tan and Shakespeare. And then there are the instructors’ manuals (which I secretly scan the way I used to read magazines in the supermarket), where David Sedaris rates the adjective “hilarious,” while I am described as merely “very funny.” As for the suggested essay question, “Do you think Segal is being unfair?” I want to write my own 500-word answer. It is one thing to be a misunderstood poet; I’m not certain that I can bear going through the rest of my life as a lesser known and misunderstood essayist.

I will admit to one glorious moment of pleasure early on, when I sat (alone) at my desk and thought, “They like me. They really like me.” The teaching load at a small liberal arts college, however, does not leave much time for basking in the limelight. Moreover, as the semesters progressed, the excuses began to mount. I realized that I had thought my essay was a sort of talisman: I mistakenly believed that having articulated my -- and my students’ -- griefs and grievances, I had put an end to all excuses. But they continued to come, as varied, creative, and astounding as ever. There was the Hemingway-esque “Something tragic happened,” stunning in its brevity and stoicism, and the Zen-like explanation of “I know you allow only two cuts and this was my third, but I was with you in spirit.”

As for “The Dog Ate My Disk,” it lives on, in its final incarnation -- you can purchase, for a very small fee, analytical essays about my (famous) piece at various plagiarism sites.


Carolyn F. Segal is associate professor of English at Cedar Crest College.

Regrets, I've Had a Few

The editors of the cultural magazine N+1 are publishing a booklet called What We Should Have Known: Two Discussions that they have prepared for undergraduates. Copies have only just come back from the printer, it seems, but I’ve had a look at a prepublication PDF and now feel a certain evangelizing fervor for the whole project.

Its topic, in brief, is the relationship between education and regret – how each one creates the conditions for the other. The books you read at a certain age can put you on the wrong path, even though you don’t recognize it at the time. You are too naively ambitious to get much out of them -- or too naive, perhaps, not that it makes much difference either way. And by the time you realize what you should have read, it’s too late. You would understand things differently, and probably better, had you made different choices. You would be a different person. Instead, you wasted a lot of time. (I know I did. There are nights when I recall all the time spent on the literary criticism of J. Hillis Miller and weep softly to myself.)

The booklet consists of transcripts of two meetings of N+1 contributors (a mixture of writers and academics, most in their 20's and 30's) as they discuss what they regret about their educations. Each contributor also submits a list of eight “Books That Changed My Life.”

The structure here seems to involve a rather intricate bit of irony. There is an explicit address to smart people in their teens, or barely out of them, offering suggestions on what to read, and how. It can be taken as a guide to how to avoid regret. The reflections and checklists are all well-considered. You could do a lot worse for an advice manual.

But the task is impossible. Avoiding regret is not an option, whether in your formal education or your love life; and it’s the price of the ticket that you must learn this the hard way. There are no shortcuts between naivete and sophistication. Or rather, there are a lot of shortcuts – but all of them will lead you astray.

Among the approaches tried and found wanting by participants in the discussion are:

  • The Dartmouth Review’s list of timeless classics by dead white European males.
  • The cultural studies templates for subverting DWEM hegemony.
  • Extremely intense close reading of the finest works of literature ever written.
  • Extremely intense close reading of the densest works of theory ever written.
  • Becoming so immersed in the works of a particular master-thinker (for example, Foucault) or author (Emerson, maybe) that you end up quoting them all the time.
  • Just trying to keep up with whatever is on the syllabus as you move from semester to semester in a contemporary American university’s smorgasbord of electives.

Whichever path you follow, then, is bound to involve the risk of ending up someplace you might have qualms about, later. You just have to strike out and take your chances anyway. Regret will come, and you'll have to learn from it, too.

This candor is remarkable. And so is the hard edge of respect for the intellectual seriousness of young people. It reminded me, at several points, of a wonderful passage in an essay by Adorno:

“The naivete of the student who finds difficult and formidable things good enough for him has more wisdom in it than a grown-up pedantry that shakes its finger at thought, warning it that it should understand the simple things before it tackles the complex ones, which, however, are the only ones that tempt it. Postponing knowledge in this way only obstructs it.”

This booklet is a reflection on the difference between education and Bildung. That is, between the experience of moving through a given social institution, on the one hand, and the process of being inwardly “formed” by what you’ve learned, on the other.

It’s not an attempt to recast the curriculum, then. Or a polemic in the culture wars. Or a blueprint for reforming the vast multi-billion dollar research-and-entertainment complex known as “higher ed.” In some respects, it is much broader in focus than that; in others, it addresses the particularity of individual experience.

The emphasis falls on how books can influence a reader in ways having little to do with career, and everything to do with a sense of life. (Not that the participants are terribly solemn about this. One of them says, deadpan: “It’s like after I read Crime and Punishment in high school, I wanted to kill an old lady.”)

But there is also an undercurrent of disappointment with the university running throughout the discussion. “Our educations take place in institutions that are divided up in these ways that may not bear idealistic close inspection,” says Meghan Falvey, a graduate student in sociology at New York University. “You can really end up studying the wrong thing, sitting around a table with the wrong people, whose concerns are not your own. Almost inevitably it seems like you won’t know what your concerns are until you’re older or better read or something.”

Perhaps that is inevitable – a human problem, rather than the failing of any pedagogical arrangement that could be reformed. But other comments in What We Should Have Known suggest deep reservations about the university as an institution.

“I realized, the further I went on,” says Marco Roth, a doctoral candidate in literature at Yale, “that almost everyone in academia feels like an outsider, nobody knows what’s going on. Academia’s an empty vessel, but the ones who don’t realize it end up going all the way and end up in charge....They believe in the system. That there’s something they can conform to and master. And the proof is that they’ve stuck it out while so many others drop by the wayside into ‘obscurity.’”

An empty vessel is not worthless, of course. (It has its uses.) The complaint here, rather, is about the routinized and often rather vacuous cult of “professionalization” in the humanities. William James worried about this more than a century ago. But really, he could never have imagined how far things would go. In the more inane extremities of the process, any expression of doubt about the effects of professionalization will now immediately be denounced as “anti-intellectual” -- a tendency reflecting an incredibly impoverished conception of the life of the mind.

The participants in the discussions presented in What We Should Have Known are smart enough to know better; and none of them sounds timid enough to give a damn. The combination of seriousness and playfulness here is inspiring. My only regret is that I did not read this pamphlet a long time ago.

(Information about ordering What We Should Have Known is available here.)


Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Blind, Deaf and Dumb

"But publishers said their biggest hope was that Kindle would expand sales of books to a new generation of gadget lovers."
--The New York Times

When I ask my English students to set learning goals for themselves at the beginning of each term, I’m interested in finding out how they want to improve as readers and writers. I am even more interested in getting at what they think it means to be good readers and good writers.

The most common "reading" goal my students set for themselves is to read faster. Because they have so much going on in their young lives, they are looking for the most efficient ways of getting their reading finished. They also mention that they want to improve their comprehension of what they read, but more importantly they want to learn how to do it quickly.

They soon learn I don’t teach speed reading. I teach slow reading. I teach slow, concentrated, finger-on-the-page reading. I read to my students in my slow Texas drawl. I crawl with them through the passages and passageways. We mosey. We copy down sentences. We write paraphrases. We imitate sentences. We read a couple of lines. We ask questions. We pause. We read those lines again. I dedicate entire classes to silent, sustained, shared reading. We call it “reading lab.”

The relationship we have with text is called reading. The quality of that relationship depends upon what we bring to it. Improving the relationships college students have with text is our primary responsibility. And it chiefly occurs when we read to and for them. I know many professors think that students already know how to read when they come to college (or should), but there is a real difference between knowing how to read words on the page and having a productive relationship with those words.

Having a productive relationship with text is also dependent upon hearing the text. Many of my students cannot hear what they read. Perhaps it is because they were not read to as children. Whatever the cause, they often cannot hear the voice of the text. Their eyes may be working, but their ears aren’t. Nothing on the lips and tongue either. What they can’t taste, they can’t consume. That’s why I read to my students. I’m their hearing aid. Their sommelier. Given my experience with the text, I help them learn the lay of the land. I help them find the right narrative path so they can follow it page after page. I’m an English instructor who also teaches voice.

I also know that many students sometimes go blind when they see text. It’s a shameful state of cultural affairs. Poetry-blindness is particularly tragic. Poetry unsettles the eye. It can make us dizzy, all this reading back and forth, up and down the page. But students easily go blind in the face of other texts, too. Lost and wandering aimlessly, they might as well give up, shut their eyes, and fall asleep for good.

So it shouldn’t be surprising that many students should go silent in the company of text. That they are unresponsive in class. That they should go dumb after going deaf and blind. That they have no sense and sensation of what they’ve read. That they look to their professors for short cuts, quick reads, and knowledge patches.

It also shouldn’t be surprising that the solution is to teach students to hear and see and speak the words we assign them. To accomplish this, we not only have to slow down our students, we have to slow down ourselves. Do more with less. At its best, reading should be a sort of textual genuflection, the sign of the cross we make between our eyes, ears, mouth, and mind to enliven the soul.

However, the current frantic pace of school work is not conducive to learning how to read the variety of texts students are assigned across the curriculum. Learning to read well is also dependent on reflection -- time to weigh, consider, accommodate, connect, synthesize, incorporate, sort the wheat from the chaff. If reflection is rarely available (or if there’s rarely time to help students learn how to reflect), then learning to read is rare.

Our learning culture is awash in technology so that information can be delivered in the blink of an eye at any time of the day or night. It's true that more information is flowing, but it doesn't always result in more knowing. In this hypersphere, it may be that students are reading and writing more than ever before. But practice doesn't make perfect. It could just as easily wear us down as lift us up.

This is all a prelude to my short take on Amazon’s new product, Kindle, a wireless and portable handheld device designed to make books instantly accessible. It’s actually a graven image. A false gadget god engineered in the service of efficient data transfer and consumer credit. Don’t be fooled, Kindle is no innocent tool. It's not a gift that keeps on giving. It holds a charge so it can keep on charging.

My dear colleague, you will soon be expected to try it. And then you will be expected to buy it. To embrace its efficiencies and remarkable cost savings. To order your textbooks through it. To order your students to use it.

Someone will put it in your hands. Don’t ask where it came from. Or who made it. Just raise it and praise it, dummy. Look how lightweight and lovely. See how quickly you can turn the page!


Laurence Musgrove is an associate professor of English at Saint Xavier University, in Chicago.
