Entertainment is in the eye of the beholder. Consider the case of what are usually called “beach novels” -- bulky sagas of lust, money, and adventure, page-turning epics of escapism that are (it’s said) addictive. I’ve never been able to work up the appetite to read one, even while bored on vacation in a seafront town. Clive James characterized the dialogue of one such novelist as resembling “an argument between two not-very-bright drunks.”
Which might be fun to witness in real life, actually, depending on the subject of the dispute. But reading the transcript seems like an invitation to a bad headache.
Diversion doesn’t have to be mind-numbing, let alone painful. With the end of the semester at hand, then, a few recommendations of recent books and DVDs that are smarter than your average bar fight -- and more entertaining.
The two dozen or so contributors to When I Was a Loser: True Stories of (Barely) Surviving High School managed to wear the entire range of unfortunate hair styles available throughout the 1970s and ‘80s. This collection -- edited by John McNally, who spent last semester as a visiting writer at Columbia College Chicago -- is one of the less solemn works of “creative nonfiction” (as the term of art now has it) currently on offer. Published by the Free Press, it is available in both paperback and e-book formats.
Most of the mortified authors are novelists and poets, ranging in age from their early 30s through their late 40s. It’s not that their memoirs are devoted to mullets or feathering, as such. But the stories they have to tell are all about the pressure to fit in, to be cool -- failure to do so bringing various penalties, as you may recall. There, on the cusp of adulthood, one has the first opportunity to create a new self. And hair is where it tends to happen first. Sex, religion, and first-job experiences also have their place.
With the benefit of hindsight, of course, the whole effort can seem embarrassing. The essays in When I Was a Loser are all about the different grades of self-consciousness and awkwardness. A few are lushly overwritten (adolescence is a purple thing) and one or two seem more than a little fictionalized. But most have the feel of authentically remembered humiliation, now rendered bearable by time and the cultivation of talent.
Several are well-known, including Dean Bakopoulos, whose novel Please Don't Come Back from the Moon was named by The New York Times as one of the notable books of 2005, and the prominent literary blogger Maud Newton. In the spirit of full disclosure, it bears mentioning that Maud is a friend, and her essay "Confessions of a Cradle Robber" (revealing the dark shame of having once been a fourteen-year-old girl with a boyfriend who was twelve) was the first thing I read. My other favorite piece here was "How to Kill the Boy that Nobody Likes" by Will Clarke, a novelist who recalls being the most despised kid in junior high -- one nicknamed "The Will-tard" for his admittedly peculiar comportment. Clarke's rise to the status and celebrity of Student Council treasurer is a tribute to the power of a very silly 1970s paperback about the secret techniques of subliminal advertising. The author's name didn't ring a bell when I picked the book up, but it certainly will in the future.
Adolescence isn’t just for teenagers anymore. "Twitch City," an absurdist sitcom that premiered on Canadian television in 1998, offers one of the funniest portraits around of someone determined to avoid the demands of adult life. It ran for 13 episodes before ending in 2000. The recent DVD release doesn’t provide many features. Still, it’s good to have the whole series available to those of us who weren’t part of its original cult following.
Its central character, Curtis (played by Don McKellar), is a man in his 20s who spends nearly every waking hour watching television. Among his few distractions from distraction is the effort to sublet more and more of his grungy apartment to anyone who can help him make the rent. His girlfriend Hope (played by the luminous Molly Parker) works at a variety of low-paying jobs. She can never quite figure out why she’s attracted to someone not just utterly lacking in ambition but unwilling even to leave the couch.
Part of the pleasure of "Twitch City" comes from seeing just how many stories can be generated around such a constrained, even claustrophobic premise. It is minimalist without being repetitive, and plausible, somehow, in spite of being preposterous.
When a chain of odd circumstances makes Curtis a media celebrity, he is visited by a woman (Jennifer Jason Leigh) claiming to be a graduate student in semiotics. She interviews him about his habits and outlook, and he delivers an analysis of the aesthetics of “Gilligan’s Island” that is a real tour de force -- a great moment of meta-TV. "Twitch City" is set in a neighborhood of Toronto, which occasionally made me wonder what Marshall McLuhan (who taught at U of T) would have made of it.
Another product of Canada worth a look is "Slings and Arrows," an ensemble comedy/drama that just finished its third and final season on the Sundance Channel. The first two (each consisting of six one-hour episodes) are now available on DVD.
Set at a repertory theater best known for its Shakespeare productions, "Slings and Arrows" is in some ways a show about trying to keep viable routines from turning into a rut of mediocrity. The theater’s regular audience is aging. It buys its season tickets out of force of habit, mostly. But box office sales aren’t what they could be, and it’s hard to find corporate sponsors who won’t try to meddle with how the place is run. And in any case, the troupe’s creative spark has diminished over time.
Revitalization isn’t impossible, but it takes some doing. Each season tracks the production of a different Shakespeare play (Hamlet, Macbeth, and King Lear) with a keen eye and ear for the way the artistic director and the actors work out the staging. At the same time, plenty of drama and farce takes place behind the scenes.
People who have worked in theater tell me that the situations and backstage dynamics in "Slings and Arrows" are absolutely typical of professional productions. As much as I enjoyed the first season, it was hard to believe that the second would be anything beyond a repetition -- reducing success to a formula. But those misgivings were completely off track. The third season carried things to a natural close.
Nowadays there are sessions at the Modern Language Association meeting devoted to the great German literary theorist Walter Benjamin, whose selected writings have appeared in English in four hefty volumes from Harvard University Press. But if the man himself showed up and wandered the corridors, I doubt he would survive the usual quick and dismissive nametag-check. After all, he wrote mostly for magazines and newspapers. He’d be wearing the wrong kind of nametag to be worth anybody’s time.
Whether or not Howard Hampton is actually the reincarnation of Walter Benjamin, they have the same extraterritorial position vis-à-vis academic criticism. (Hampton writes for The Village Voice, Film Comment, and The Boston Globe, among other endnote-free zones.) And now they share the same publisher, with the recent appearance of Born in Flames: Termite Dreams, Dialectical Fairy Tales, and Pop Apocalypses (Harvard University Press).
Drawn from 15 years’ worth of running commentary on film, music (mostly rock), and books, Hampton’s selected essays transcend “mere reviewing” (as it’s called) to become examples of a fully engaged critical intelligence responding to the mass-media surround. Some of the best pieces are compact but sweeping analyses of changes in sensibility, amounting to miniature works of cultural history.
One example is “Reification Blues: The Persistence of the Seventies,” which listens to how the pop soundtrack of that decade left its mark on later music despite (or maybe because of) artists’ best efforts to forget it. Another case is “Whatever You Desire: Movieland and Pornotopia” -- an analysis of how mainstream Hollywood and pornography have shaped one another over the years, whether through mimicry or rejection of one another’s examples.
The curse of a lot of pop-culture commentary is its tendency to move too quickly toward big sociocultural statements -- ignoring questions of form and texture, instead using the film, album, etc., as pretext for generalized pontifications. That’s not a problem with Born in Flames. It’s a book that helps you pay attention, even to the nuances of Elvis’s performance in "Viva Las Vegas." Perhaps especially to the nuances of Elvis’s performance in "Viva Las Vegas"....
"It's an alternate universe governed by sheer whim," writes Hampton about the King's cinematic ouevre, "untouched by any sense of the outside world." Sounds like the perfect vacation spot.
Half a century before "The Sopranos" hit its stride, the Caribbean historian and theorist C.L.R. James recorded some penetrating thoughts on the gangster -- or, more precisely, the gangster film -- as symbol and proxy for the deepest tensions in American society. His insights are worth revisiting now, while saying farewell to one of the richest works of popular culture ever created.
First, a little context. In 1938, shortly before James arrived in the United States, he had published The Black Jacobins, still one of the great accounts of the Haitian slave revolt. He would later write Beyond a Boundary (1963), a sensitive cultural and social history of cricket – an appreciation of it as both a sport and a value system. But in 1950, when he produced a long manuscript titled “Notes on American Civilization,” James was an illegal alien from Trinidad. I have in hand documents from his interrogation by FBI agents in the late 1940s, during which he was questioned in detail about his left-wing political ideas and associations. (He had been an associate of Leon Trotsky and a leader in his international movement for many years.)
In personal manner, James was, like W.E.B. Du Bois, one of the eminent black Victorians -- a gentleman and a scholar, but also someone listening to what his friend Ralph Ellison called “the lower frequencies” of American life. The document James wrote in 1950 was a rough draft for a book he never finished. Four years after his death, it was published as American Civilization (Blackwell, 1993). A sui generis work of cultural and political analysis, it is the product of years of immersion in American literature and history, as well as James’s ambivalent first-hand observation of the society around him. His studies were interrupted in 1953 when he was expelled by the government. James was later readmitted during the late 1960s and taught for many years at what is now the University of the District of Columbia.
American Civilization's discussion of gangster films is part of James's larger argument about media and the arts. James focuses on the role they play in a mass society that promises democracy and equality while systematically frustrating those who take those promises too seriously. Traveling in the American South in 1939 on his way back from a meeting with Trotsky in Mexico, James had made the mistake of sitting in the wrong part of the bus. Fortunately an African-American rider explained the rules to him before things got out of hand. But that experience -- and others like it, no doubt -- left him with a keen sense of the country's contradictions.
While James's analysis of American society is deeply shaped by readings of Hegel and Marx, it also owes a great deal to Frederick Jackson Turner’s theory of “the closing of the frontier.” The world onscreen, as James interpreted it, gave the moviegoer an alternative to the everyday experience of a life “ordered and restricted at every turn, where there is no certainty of employment, far less of being able to rise by energy and ability by going West as in the old days.”
Such frustrations intensified after 1929, according to James’s analysis. The first era of gangster films coincided with the beginning of the Great Depression. “The gangster did not fall from the sky,” wrote James. “He is the persistent symbol of the national past which now has no meaning – the past in which energy, determination, bravery were sure to get a man somewhere in the line of opportunity. Now the man on the assembly line, the farmer, know that they are there for life; and the gangster who displays all the old heroic qualities, in the only way he can display them, is the derisive symbol of the contrast between ideals and reality.”
The language and the assumptions here are obviously quite male-centered. But other passages in James’s work make clear that he understood the frustrations to cross gender lines -- especially given the increasing role of women in mass society as workers, consumers, and audience members.
“In such a society,” writes James, “the individual demands an aesthetic compensation in the contemplation of free individuals who go out into the world and settle their problems by free activity and individualistic methods. In these perpetual isolated wars free individuals are pitted against free individuals, live grandly and boldly. What they want, they go for. Gangsters get what they want, trying it for a while, then are killed.”
The narratives onscreen are a compromise between frustrated desire and social control. “In the end ‘crime does not pay,’” continues James, “but for an hour and a half highly skilled actors and a huge organization of production and distribution have given to many millions a sense of active living....”
Being a good Victorian at heart, James might have preferred that the audience seek “aesthetic compensation” in the more orderly pleasures of cricket, instead. But as a historian and a revolutionary, he accepted what he found. In offering “the freedom from restraint to allow pent-up feelings free play,” gangster movies “have released the bitterness, hate, fear, and sadism which simmer just below the surface.” His theoretical framework for this analysis was strictly classical, by the way. James was deeply influenced by Aristotle’s idea that tragedy allowed an audience to “purge” itself of violent emotions. One day, he thought, they would emerge in a new form -- a wave of upheavals that would shake the country to its foundations.
In 6 seasons over 10 years, “The Sopranos” has confirmed again and again C.L.R. James’s point that the gangster is an archetypal figure of American society. But the creators have gone far beyond his early insights. I say that with all due respect to James’s memory – and with the firm certainty that he would have been a devoted fan and capable interpreter.
For James, analyzing gangster films in 1950, there is an intimate connection between the individual viewer and the figure on the screen. At the same time, there is a vast distance between them. Movies offered the audience something it could not find outside the theater. The gangster is individualism personified. He has escaped all the rules and roles of normal life. His very existence – doomed as it is – embodies a triumph of personal will over social obligation.
By contrast, when we first meet Tony Soprano, a boss in the New Jersey mob, he is in some ways all too well integrated into the world around him. So much so, in fact, that juggling the demands of the different roles he plays is giving him panic attacks. In addition to being pater of his own brood, residing in a suburban McMansion, he is the dutiful (if put-upon) son in a dysfunctional and sociopathic family.
And then there are the pressures that attend being the competent manager of a successful business with diversified holdings. Even the form taken by his psychic misery seems perfectly ordinary: anxiety and depression, the tag-team heart-breakers of everyday neurosis.
James treats the cinematic gangsters of yesteryear as radical individualists – their crimes, however violent, being a kind of Romantic refusal of social authority. But the extraordinary power of “The Sopranos” has often come from its portrayal of an almost seamless continuum between normality and monstrosity. Perhaps the most emblematic moment in this regard came in the episode entitled “College,” early in the show’s first year. We watch Tony, the proud and loving father, take his firstborn, Meadow, off to spend a day at the campus of one of her prospective colleges. Along the way, he notices a mobster who had informed to the government and gone into the witness protection program. Tony tracks the man down and strangles him to death.
At the college he sees an inscription from Hawthorne that reads, “No man ... can wear one face to himself and another to the multitude, without finally getting bewildered as to which one may be true." Earlier, we have seen Tony answer Meadow’s question about whether he is a member of the Mafia by admitting that, well, he does make a little money from illegal gambling, but no, he isn't a gangster. So the quotation from Hawthorne points to one source of Tony’s constant anxiety. But it also underscores part of the audience’s experience – an ambivalence that only grows more intense as “The Sopranos” unfolds.
For we are no more clear than Tony is which of his faces is “true.” To put it another way, all of them are. He really is a loving father and a good breadwinner (and no worse a husband, for all the compulsive philandering, than many) as well as a violent sociopath. The different sides of his life, while seemingly distinct, keep bleeding into one another.
Analyzing the gangster as American archetype in 1950, C.L.R. James found a figure whose rise and fall onscreen provided the audience with catharsis. With “The Sopranos,” we’ve seen a far more complex pattern of development than anything found in Little Caesar or High Sierra (among other films James had in mind).
With the finale, there will doubtless be a reminder – as in the days of the Hays Code – that “crime does not pay.” But an ironized reminder. After all, we’ve seen that it can pay pretty well. (As Balzac put it, “Behind every great fortune, a great crime.”) Closure won’t mean catharsis. Whatever happens to Tony or his family, the audience will be left with his ambivalence and anxiety, which, over time, we have come to make our own.
One generation’s faculty gossip is sometimes another’s cultural history. At the University of Chicago in the early 1950s, a professor stopped a teenage student leaving one of his classes. She was not properly enrolled in the course, but bureaucratic proprieties really did not have anything to do with it. She was stunning. He was smitten. They had lunch. And 10 days later, give or take, Philip Rieff was joined in marriage to a young woman who never actually did change her name to Susan Rieff, instead always being known as Susan Sontag.
They did not live happily ever after. The opening pages of Sontag’s last novel, In America, are written in a first-person voice that sounds very much like the author’s. The narrator mentions reading George Eliot as a young bride and bursting into tears at the realization she had, like Dorothea in Middlemarch, married Casaubon.
As you may recall, Dorothea is at first transfixed by the learning and gravitas of Casaubon, a scholar who is many years her senior. It soon dawns on her (as it does perhaps more quickly for the reader) that he is a bloodless pedant, joyless except when venting spleen against other bloodless pedants. And there are hints, as clear as Victorian propriety will allow, that Dorothea’s honeymoon has been disappointing in other ways as well.
Sontag’s allusion must rank as one of the more subtly devastating acts of revenge ever performed by an ex-wife. At the same time, it is in keeping with some durable and rather less literary attitudes towards professors -- the stereotype that treats them as being not just other-worldly, but also rather desexed by all the sublimation their work requires. This view really took hold in the 19th century, according to the analysis presented by A.D. Nuttall in Dead From the Waist Down: Scholars and Scholarship in Literature and the Popular Imagination (Yale University Press, 2003).
But a different cliché is emerging from Hollywood lately. The summer issue of The American Scholar contains an essay by William Deresiewicz called “Love on Campus” that identifies a “new academic stereotype” visible in popular culture. The sexually underachieving Casaubon’s day is over. The new stereotype of the professor has some notches in his bedpost (this character is almost always a male) and for the most part demonstrates his priapic prowess with students.
Universities in real life are “the most anxiously self-patrolled workplace in American society,” writes Deresiewicz, “especially when it comes to relations between professors and students. This is not to suggest that sexual contact between college students and professors, welcome or unwelcome, never takes place, but the belief that it is the norm is the product of fantasy, not fact.”
Yet the fantasy is played out in numerous contemporary films. It merits examination for what it implies about how academe is perceived and (mis)understood.
The stereotyped character in question is often a professor of English or creative writing, as in "Wonder Boys" or "The Squid and the Whale." But sometimes he teaches philosophy ("The Life of David Gale") or French ("Little Miss Sunshine"). He is consumed with ambition. But he is also a loser. Those conditions -- academic ambition, abject failure -- are identical, at least given the implicit logic of the stereotype.
“In the popular imagination,” writes Deresiewicz, “humanities professors don’t have anything to be ambitious about. No one really knows what they do, and to the extent that people do know, they don’t think it’s worth doing.... It may be simply because academics don’t pursue wealth, power, or, to any real extent, fame, that they are vulnerable to such [criticism]. In our culture, the willingness to settle for something less than these Luciferian goals is itself seen as emasculating.”
So he neglects his family, or drinks, or both. Above all, he seduces his students. The latter is not so much an abuse of power as a symptom of having no real power at all. He is “a figure of creative sterility,” writes Deresiewicz, “and he is creatively sterile because he loves only himself. Hence his vanity, pomposity, and selfishness; his self-pity, passivity, and resentment. Hence his ambition and failure. And thence his lechery, for sleeping with his students is a sign not of virility but of impotence: he can only hit the easy targets; he feeds on his students’ vitality; he can’t succeed in growing up.”
At one level, this new character may look like the negation of earlier clichés about absent-minded and asexual professors. But that appearance is, in some ways, misleading. These more recent fictional figures are, so to speak, Casaubon on Viagra. Like his ancestor, the contemporary on-screen professor is empty and vain, and going nowhere fast. But he has another way to vent. “In both ‘Terms of Endearment’ and ‘We Don’t Live Here Anymore,’” notes Deresiewicz, “ ‘going to the library’ becomes a euphemism for ‘going to sleep with a student.’ ”
Deresiewicz offers a cogent analysis of how this stereotype may reflect the changing place of academe in American society and the contradictory attitudes it evinces. He also presents some thoughts on a dimension of education that popular culture for the most part ignores: the eros of learning, the way a student can fall in love with a teacher for reasons having nothing to do with sexuality. Combining them, as Sontag tried to do with Rieff, seems like a bad idea.
It is a remarkable essay -- cogent on many points, and adventurous in making some of them, given the inescapable risk of being misunderstood. (I half expect to see Deresiewicz on a cable program with the words "Professor Advocating 'Brain Sex' " at the bottom of the screen.) Rather than quote or paraphrase any more of it, let me simply recommend that you read the whole thing.
Recently I was cornered by a university employee who knows I’m a scholar of British literature, specializing in Jane Austen.
“I started Pride and Prejudice last week,” he told me. “It’s one of those books I know I should have read, but I couldn’t get past the first few chapters.”
“Really,” I replied, eyebrows raised.
“Yeah, I just lost interest,” he went on. “I kept thinking to myself, ‘Oh, brother. I think I know where this is going.’”
Was this disarming honesty or throwing down the gauntlet? Was I being called out? Whatever it was, I shifted nervously as I listened to the rest of his monologue: “My theory is that the novel can be pretty much summed up as Elizabeth and Darcy meet, Elizabeth and Darcy hate each other, Elizabeth and Darcy fall in love, yadda, yadda, yadda.”
Reader, I stared at him blankly. Of course, I spent hours afterward constructing witty, cynical comebacks, such as “Yeah, I know what you mean. I have that response to episodes of VH1’s 'Behind the Music' and to reading the Bible.” But in the moment, all I managed to spit out was something clichéd and professorial, along the lines of: “Hmm. That’s interesting. I think maybe it takes a few readings of Austen to really appreciate her fiction’s depth, humor, and irony.”
That’s also my stock answer to traditional-aged undergraduates on the first day of class -- 20-year-olds who confess that they’ve signed up for a literature class on Austen and her contemporaries because they absolutely love (or absolutely hate) her fiction -- or maybe just the film adaptations. Or Colin Firth or Keira Knightley or Clueless. The Austen-haters often claim to be taking the course because they want to understand what in the world is the big deal. A few of them end up seeing it by the end of the semester, a few more don’t, and that’s fine. But the yadda-yadda-yadda employee was a well-read, middle-aged guy with no sophomore excuse for being sophomoric. My gut reaction to his confession registered somewhere between crestfallen and incensed.
I'm having a similarly mixed reaction to the latest wave of Austen mania in the U.S. and U.K., shifting nervously as I approach it with a combination of anxiety and dread. I know that all English professors worth their salt should be constructing some theories and responses now, in advance of being cornered by colleagues and co-workers and co-eds, so as not to have to resort to the professorial and clichéd. What will we say when asked about Anne Hathaway’s Becoming Jane (2007); about the upcoming film of The Jane Austen Book Club, with its star-studded cast; or about PBS’s planned 10-week winter 2008 airing of the Complete Jane Austen on "Masterpiece Theatre"?
What’s the witty, cynical comeback to this cultural flowering of Austen-related stuff, I find myself wondering: “Can’t wait to see it!” “Wish I’d thought of it first!” “The Decline and Fall of Austen’s Empire.” “A tippet in the hand is worth two in the bush.” “A stitch in the huswife saves nine.” “Don’t look a gift pianoforte in the mouth”?
But along with such repartee, we’ll also need to ready weightier observations. First, I believe it’s imperative that we call a moratorium on starting sentences with “It is a truth universally acknowledged,” as in, “It is a truth universally acknowledged that this is the first time in television history Austen’s complete works have been aired in succession.” In the coming months we will no doubt suffer through dozens of newspaper and magazine articles beginning, “It is a truth universally acknowledged.” Best not to add to the collective torture.
In addition, when constructing our soundbites, we ought not to forget the sheer breadth of today’s Austen craze; it’s more than just films and television adaptations we’re in for. New books have appeared, too, like Confessions of a Jane Austen Addict (2007) and Jane Austen for Dummies (2006). Though I worry that these books make reading her fiction sound like something done at an Alcoholics Anonymous meeting for slow learners, surely it’s not too late for some well-placed damage control?
After all, the Austen-inspired publicity stunts are already in full swing. Perhaps you’ve heard about the kerfuffle that unfolded over the pond, “Jane Austen Rejected!” Thinly veiled versions of Austen’s fiction were sent out to British publishers as new work, under the name of Allison Laydee (a.k.a. David Lassman), and all were rejected. Even Harlequin Mills & Boon passed on publishing adulterated Jane Austen plots. The horror! The horror!
But isn’t this déjà vu all over again? Please raise your tussy mussy if you remember 10 or so years ago, when we were last inundated with Austen film and TV adaptations; with Bridget Jones novels and films; and with Austen board games, stationery, and editorial cartoons. Everyone then seemed to be asking, “Why Austen? Why now?”
The late 1990s were strange days for us longtime members of the Jane Austen Society of North America. It was as if we no longer had to apologize for indulging in our versions of wearing plastic Spock ears, whether quadrille, or quilling, or merely quizzing. Many of us became instant pundits among our friends, family, and the media, providing copy for everything from the Arkansas Democrat to The Wall Street Journal. Only a few periodicals continued to misspell Jane’s name as Austin, while many more managed to render correctly Bennet, Morland, and Love and Freindship. Oh, those were heady times.
If you were there, then you’ll no doubt recall that we came up with some pretty wild theories to explain the Jane train, too. Remember when Camille Paglia said Austen’s popularity could be explained as a cultural symptom in reaction to the O.J. Simpson trial, as people longed for stories in which no one was being butchered? That was a good one. Or how some claimed that the return to Austen was a result of the fin de siècle’s prompting us to take stock and return to works of past centuries? Seems pretty thin now. Others claimed that Austen’s resurgence happened because we needed to measure the worth of our male heroes, from Bill Clinton and Brad Pitt to Kurt Cobain and Ross Perot. (Jane Austen and Ross Perot?)
So here we are, circa 2007, finding ourselves in danger of being asked yet again, “Why Austen? Why now?” How delightful. How frightening. I’m determined not to be caught off guard, so I’ve constructed some all-purpose answers to explain the latest Austen craze, suitable for everything from The Nation to "Larry King Live" to Marie Claire. Anyone struggling for words is, of course, welcome to use these as conversational building blocks:
Option A: “Today’s Austen mania is a form of cultural compensation for the disaster of the Iraq War and for the genocide in Darfur. Her novels offer us a way to forget the world’s evils by allowing us to travel back to those halcyon post-French Revolutionary days of Napoleon.”
Option B: “Austen’s timeless narratives of women’s romantic searching provide a welcome distraction from the Supreme Court’s rolling back of abortion rights, as we yearn for an era when many women had the power to refuse a proposal of marriage.”
Option C: “Austen’s newfound popularity signals that empire-waist frocks are due for a fashion revival; that irony, having been shunned after 9/11, is back and better than ever; and that Wal-Mart will roll back prices on its imported teas.”
This list is just a draft of talking points. I still have a few more ideas to work out. For instance, can it be an accident that Austen’s popularity is surging, just as Jane magazine has gone defunct? There is certainly a quotable quip in the making there. Even if we don’t perfect our theories in the coming months, I don’t think there should be much cause for worry. Check back with me in 2013, the 200th anniversary of Pride and Prejudice’s publication. Oh, brother. I think I know where this is going.
Devoney Looser is associate professor of English at the University of Missouri at Columbia, and the author of British Women Writers and the Writing of History (Johns Hopkins University Press). She has just completed a book on British women writers and old age, to be published next year.
A few weeks ago, a new edition of the selected works of Edmund Wilson appeared. Another monumental book this season is David Michaelis’s Schulz and Peanuts: A Biography (HarperCollins). The critic and the cartoonist never crossed paths, so far as anyone knows. But there is some overlap between these publications, it seems to me. The biography of Charles M. Schulz, who died in 2000, calls to mind Wilson’s The Wound and the Bow, a collection of essays published in 1941 and reprinted in the second of the two Library of America volumes.
The connection is indirect but insistent. In the essay that lent The Wound and the Bow its title, Wilson revisits one of the lesser-known plays by Sophocles -- a telling of the story of Philoctetes, who also appears in the Iliad. Philoctetes is a skilled and powerful archer, but he is also a man in exile through no fault of his own. A snakebite has left him with a wound that not only festers but reeks. Unable to bear the stench or his groans, the Greeks abandon him on a desert island. And there he stays until Odysseus is forced to bring him back into service as the only man able to bend the bow of Heracles.
Wilson (who had started using psychoanalysis as a means of interpreting literary works well before this was required by law) saw in the figure of Philoctetes something like an allegorical emblem for the artist’s inner life. Neurosis is the agonizing wound that leaves the sufferer isolated and bitter, while genius is the ability to bend the bow, to do what others cannot. Creativity and psychic pain, “like strength and mutilation,” as Wilson put it, “may be inextricably bound up together."
Not such a novel idea, after all this time. And one prone to abuse -- reducing artistic creativity to symptomatology. (Or, worse, elevating symptomatology into art: a phenomenon some of us first encounter while dating.)
In Wilson’s hands, though, it was a way through the labyrinth of a writer’s work, of finding hidden passages within it. The two longest studies in The Wound and the Bow were interpretations of Charles Dickens and Rudyard Kipling: two authors whose critical reputations had been nearly done in by their commercial success. Wilson’s criticism, while biographical in method, did not take the debunking route. If he documented the wound, he also showed the strength with which each figure could draw the bow.
Now, I’m not really sure that the archer serves all that well as a model of the artist. (The myths of Daedalus or Orpheus work better, for a variety of reasons, and cover much of the same analogical ground.) On the other hand, Philoctetes did tend to complain a lot -- as did Charles Schulz, it seems. The cartoonist emerges from his biographer’s pages as a man of numerous griefs and grievances. His life was shaped by an upbringing that was economically secure but emotionally complex. His childhood was spent among relatives who expressed affection through joking insults (to give things the most positive construction possible).
Michaelis, who has also written about the life of the painter N.C. Wyeth, offers numerous well-framed appreciations of Schulz’s artistry. The book is Wilsonian, in that sense. But any revaluation of “Peanuts” as cultural artifact is bound to be less a topic for conversation than the unveiling of details about his melancholia and his resentments.
An episode of the documentary series "American Masters" on PBS airing later this month will be tied to the book, which should reach stores any day now. Soon it will be common knowledge that everyone who met the cartoonist’s first wife had a pretty good idea where Lucy originated. Numerous “Peanuts” strips are embedded throughout the book -- each of them echoing events or situations in Schulz’s life or some aspect of his personality and relationships. (Members of his family are complaining about the biography, a development to be expected.)
The cartoons themselves -- however telling as illustrations of things the biographer has discovered about Schulz -- are rich works in their own right. They fall somewhere between art and literature; but those categories really don't matter very much, because they create their own little world. The biography derives its meaning from the cartoons and not vice versa.
So in an effort to restore some balance, I’d like to recommend some supplementary reading about “Peanuts” -- an essay that says very little about Schulz himself. It focuses instead on what he created. How an artist becomes capable of bending the bow is difficult to understand. Biography is one approach, but it does not exhaust the topic. (In a way it only begins to pose the riddle.)
The piece in question is “The World of Charlie Brown” by Umberto Eco. It appeared in his collection Apocalittici e integrati, a volume that became rather notorious when first published in 1964. Parts of the collection were translated, along with some later pieces, as Apocalypse Postponed (Indiana University Press, 1994).
Like other essays in the book, the analysis of “Peanuts” is part of Eco’s challenge to familiar arguments about “mass culture,” whether framed in Marxist or conservative terms. Either way, the theorists who wrote about the topic tended to be denunciatory. Eco, who was 32 when Apocalittici appeared, had published a couple of monographs on medieval intellectual history and was also working on semiotics and the philosophy of language. Aside from teaching, he paid the bills by working for a television network and a trade publisher. All the quasi-sociological hand-wringing about the media struck Eco as rather obtuse, and he did not hesitate to say so.
From the vantage point of someone who had written about the aesthetic theory of Thomas Aquinas, it was not self-evident that “mass culture” was the fresh horror that worried his contemporaries. He saw it beginning with the cathedrals -- or at least no later than the printing press. The fact that Eco wrote about Superman and television worried some of the reviewers.
One of them complained that treating “Plato and Elvis Presley” as both “equally worthy of consideration” was bound to have grave consequences: “In a few years the majority of Italian intellectuals will be producing films, songs, and comic strips....while in the university chairs, young dons will be analyzing the phenomena of mass culture.” It would be the closing of the Italian mind, I guess.
“The World of Charlie Brown” is evidence that Eco meant to do more than stir up argument. It originally appeared as the preface to the first collection of Schulz’s strips to appear in Italy. It is the work of a critic determined to win “Peanuts” a hearing as a serious work of art.
Eco seems unable to resist a certain amount of elitist chain-yanking. He says that the translators lavished on their work “the meticulous passion that Max Brod devoted to the manuscripts of Kafka...and Father Van Breda to the shorthand notes of Edmund Husserl.” The round-headed Charlie Brown embodies “a moment of the Universal Consciousness,” he writes, “the suburban Philoctetes of the paperbacks.” (I confess that I did not remember that part of the essay until rereading it just now.)
But the tongue soon comes out of his cheek. Eco reveals himself as a devoted student of the history of the American comic strip. He triangulates “Peanuts” with respect to Jules Feiffer’s satirical cartoons and “the lyric vein of Krazy Kat” -- comparisons that are so brilliantly apt that they immediately seem obvious, which they aren’t.
And Eco warns the Italian reader that appreciating the strip involves learning Schulz’s rhythm of theme and variation. “You must thoroughly understand the characters and the situations,” he writes, “for the grace, tenderness, and laughter are born only from the infinitely shifting repetition of the patterns....”
At this point, it is tempting to quote at length from Eco’s quick analysis of the essence of Schulz's characters. Each one embodies or resists some part of the human condition -- even, and perhaps especially, Snoopy.
In the world of “Peanuts,” writes Eco, “we find everything: Freud, mass-cult, digest culture, frustrated struggle for success, craving for affection, loneliness, passive acquiescence, and neurotic protest. But all these elements do not blossom directly, as we know them, from the mouths of a group of children: they are conceived and spoken after passing through the filter of innocence.” The strip is “a little human comedy for the innocent reader and for the sophisticated.” A child can enjoy them, and so can the reader who is tempted to draw analogies to Samuel Beckett.
The sophisticated part of Eco’s sensibility can recognize in Schulz’s art a depth that is full of shadows: “These children affect us because in a certain sense they are monsters: they are the monstrous infantile reductions of all the neuroses of a modern citizen of industrial civilization.” But the depths aren’t an abyss. The little monsters, while sometimes cruel, never become unspeakable. They “are capable suddenly of an innocence and a sincerity which calls everything into question....”
Charles Schulz was a neurotic, no doubt; but most neurotics aren’t Charles Schulz. He was something else. And it may be that we need an Italian semiotician to remind us just what: "If poetry means the capacity of carrying tenderness, pity, [and] wickedness to moments of extreme transparence, as if things passed through a light and there were no telling any more what substance they are made of,” as Eco wrote, “then Schulz is a poet.”
For countless dead bodies to become reanimated and swarm through the streets as cannibalistic ghouls would count as an apocalyptic development, by most people's standards. Then again, it is not one that we have to worry about all that much. Other possibilities of destruction tend to weigh more heavily on the mind. But if you combine extreme improbability with gruesome realism, the effect is a cinematic nightmare that won't go away -- one of the most durable and resonant forms of what Susan Sontag once described as "the imagination of disaster."
It all began with the release of George Romero's Night of the Living Dead in 1968: a low-budget independent film that more or less instituted the conventions of the cannibalistic zombie movie, as further developed in his Dawn of the Dead (1978) and Day of the Dead (1985). Other directors have played variations on his themes, but Romero remains the definitive zombie auteur -- not simply for founding the subgenre, but for making it apocalyptic in the richest sense. For the root meaning of "apocalypse," in Greek, is "an uncovering." Romero's zombies expose the dark underside of American culture: racism, consumerism, militarism, and so on.
His most recent addition to the zombie cycle, Diary of the Dead, which opened last Friday, returns viewers to the opening moments of the undead's onslaught. But while his first film, Night, was set in a world where radio and television were the only sources of information for panicking human refugees, Diary is a zombie film for the age of new media. Romero's band of survivors this time consists of a bunch of college students (and their alcoholic professor) who are busy making a film for class when the end of the world hits. One of them becomes obsessed with posting footage of the catastrophe online -- a chance for Romero to explore the ways that digital technology makes zombies of its users.
As an enthusiast for Romero's apocalyptic satire, I was somehow not terribly surprised to learn last year that Baylor University Press had published a book called Gospel of the Living Dead: George Romero's Visions of Hell on Earth. The author, Kim Paffenroth, is an associate professor of religious studies at Iona College in New Rochelle, New York.
Romero's zombie apocalypse brings "the complete breakdown of the natural world of food chains, social order, respect for life, and respect for death," writes Paffenroth, "because all those categories are meaningless and impossible to maintain in a world where one of the most fundamental limen, the threshold between alive and dead, has become a threshold that no one really crosses all the way over, but on which everyone lives suspended all the time." And in this moment of revelation, all the deadly sins stand fully revealed (and terribly rapacious).
The release of Diary of the Dead seemed a perfect excuse finally to interview Paffenroth. He answered questions by e-mail; the full transcript follows.
Q: You mention in your book that George Romero's work has literally given you nightmares. How did you go from watching his films to writing about them, and even publishing zombie fiction of your own?
A: Well, I was fascinated with the original Dawn when I was still a teen, but I'm afraid my level of commentary seldom got beyond -- "Zombies! Cool!" And then, to be honest, I didn't think of or watch any zombie films from the time Day came out until the Dawn remake was released. But during those years, I was just reading everything I could -- especially ancient and medieval literature, philosophy, and theology. So when I saw the Dawn remake, things clicked and I could give a more thorough and complicated response than I had when I was a youth, because I could then see how Romero was building on Dante and the Bible.
And to be frank, at that point I'd written a lot of books about the Bible and other theological topics, and no one read them. To an author, that's probably the worst disappointment imaginable. So I took a chance that if people didn't want to read about these theological subjects directly, maybe through the filter of their favorite monster genre, they'd be more open to the discussion and analysis. And it seems that they are.
As for making the transition to fiction writing, that's just crazy hubris that strikes all of us at some point -- the idea that anyone would want to read the tales we write -- and some of us are dogged and patient and lucky enough that it actually amounts to something. I never get over it, when I realize that there are some people who like my fiction and look forward to what I'll write next. That's a huge rush and I want to keep it going as long as I can.
Q: In the New Testament, Jesus dies, then comes back to life. His followers gather to eat his flesh and drink his blood. I am probably going to hell for this, but .... Is Christianity a zombie religion?
A: I think zombie movies want to portray the state of zombification as a monstrous perversion of the idea of Christian resurrection. Christians believe in a resurrection to a new, perfect state where there will be no pain or disease or violence. Zombies, on the other hand, are risen, but exist in a state where only the basest, most destructive human drive is left -- the insatiable urge to consume, both as voracious gluttons of their fellow humans, and as mindless shoppers after petty, useless, meaningless objects. It's both a profoundly cynical look at human nature, and a sobering indictment of modern, American consumer culture.
Q: The human beings in Romero's world are living through an experience of "hell on earth," as your subtitle says. There are nods towards some possible naturalistic explanation for the dead within the films (that a virus or "space radiation" somehow brought corpses back to life) but the cause is never very useful or important to any of the characters. And some characters do think mankind is finally being punished. Is the apocalyptic dimension just more or less inevitable in this kind of disaster, or is it deliberate? To what degree is Romero's social satire consciously influenced by Christian themes? Or are those themes just inevitably built into the scenario and imagery?
A: I think "apocalyptic" has just come to mean "end of civilization," so of course, any movie or book with that as its premise is, by definition, "apocalyptic." And even if we throw in the interpretation "God's mad at us -- that big, mean God!" I still don't think that's very close to real, biblical apocalyptic.
Romero's view is a lot closer to biblical apocalyptic or prophetic literature, for he seems to make it clear, over and over, that humanity deserves this horror, and the humans in the films go to great lengths to make the situation even worse than it is already -- by their cruelty, greed, racism, and selfishness. Whether this is conscious or accidental, I really can't address with certainty: I only note that his prophetic vision is compatible with a Christian worldview, not that it stems from that.
Q: The fifth movie in George Romero's zombie cycle, Diary of the Dead, opened over the weekend. Does it seem like a progression or development in his vision, or does it simply revisit his earlier concerns in a new setting?
A: I think each film in the series has a special target that is the particular focus of Romero's disgust at the moment. The media has always been at the periphery in each of the previous films -- cooperating with government ineptitude and coverup in the first two until the plug's pulled and there is no more media -- but now it's the main subject of this installment.
Romero does a great job capturing the sick voyeurism of addiction to cell-phone cameras and the Internet -- there are so many shots in this one where you just want to shout at the characters, "Put down the camera and HELP HER! SHE'S BEING EATEN ALIVE YOU IDIOT!" It is surely no accident that the two people who most help our protagonists are either cut off from the media (the Amish man) or they themselves have been the target of unfair representation in the media (black men who are called "looters" when white people in Katrina were said to be "salvaging" or "gathering" supplies). And the one time a crime is committed by one group of humans against another, the camera is forced off.
With that being said, I think in many ways it does return to the vision of Night of the Living Dead with its overwhelming cynicism and despair. Certainly the last shot is meant to evoke the same feeling of finality and doom as the first film, the gripping doubt that there's anything left in human society worth saving.
Q: It feels as if Romero is suggesting that Jason, the character holding the digital camera, is himself almost a zombie. There's something creepy about his detachment -- his appetite for just consuming what is going on around him, rather than acting to help anyone. But there are also indications that the cameraman does have a kind of moral commitment to what he is doing. He's trying to capture and transmit the truth of what is going on, because doing so might save lives. What did you make of that ambiguity? Is something redemptive going on here with behavior that otherwise seems quite inhuman?
A: I'd have to think about it in detail, once I have the DVD "text" to study. My initial reaction is that that interpretation mostly comes from the voice-over by Deb, his girlfriend and the narrator of Diary. The exact motives of Jason remain hazy to me. He says he doesn't want fame (what would it mean in their world?), yet he's obsessed with the 72,000 hits in 9 minutes. But he doesn't exactly explain why in that scene. I don't think he said that maybe some of the 72k people were saved or that he's doing a public service or helping save the world.
He just seems addicted and intoxicated by the 72k number itself -- like even if it's not fame, it's a junkie's fix, it's a validation of his value, as indeed is the chilling (and slightly comical) act of handing the camera to Deb at the end. As she keeps accusing him: if it doesn't happen on camera, it's like it doesn't happen.
So the camera is not reflecting reality, it's creating it. And Jason's version of reality is better than the government's falsified version of the first attack, because it's more accurate, but it's no less addictive or exploitive or inhumane by the end.
Q: Good points, but I still think there's some ambiguity about Jason's role, because this is a problem that comes up in debates over journalistic ethics -- whether the responsibility to report accurately and as a disengaged observer becomes, at some point, irresponsibility to any other standard of civilized behavior. Arguably Romero is having it both ways: criticizing Jason while simultaneously using the narrative format to ask whether or not his behavior might have some justification (however ex post facto or deluded).
A: Perhaps artists can have it both ways in a way journalists can't. Artists deal in ambiguities, journalists (supposedly) deal in facts. But with cell phones and the internet, suddenly everyone is a potential "journalist" and the facts are even more malleable and volatile than they ever were.
Q: You note that this subgenre has proven itself to be both popular with audiences and marginal to Hollywood. "Zombie movies," you write in your book, "just offend too many people on too many levels to be conventional and part of the status quo." And while not quite as gory as some of Romero's earlier work, Diary ends with an image calculated to shock and disgust. Is this a matter of keeping the element of humor under control? While a spoof like Shaun of the Dead was an affectionate homage to Romero, the element of social satire there didn't really have much, well, bite....
A: That's a great way to put it -- that humorous homages use humor to offset the gore (look at the really over-the-top squashing scene in Hot Fuzz for an example of just how much gore you can offset, if the movie's funny enough!). But it also works the other way -- that biting social criticism needs some bite, needs to be a little out of control and not tamed or staid. I like that idea.
That being said, Romero makes my job a lot harder. The gore hounds sometimes put their hands over their ears and chant "LALALALA! I can't hear you!" if I say that some image they love on an aesthetic level might *mean* something -- while I think a lot of readers or viewers who might be receptive to criticism of our society just can't make it past the first disemboweling.
I would suppose it's an artistic judgment, and for me at least, Romero has been hitting the right balance for a long time, and is continuing to do so.
In 1997, Oxford University Press published Between God and Gangsta Rap: Bearing Witness to Black Culture, by Michael Eric Dyson, who at that point was a professor of communications at the University of North Carolina at Chapel Hill. He has since gone on to bigger things; last summer, Dyson was named by Georgetown University as one of its University Professors. God and Gangsta arrived bearing glowing endorsements, including one by Houston A. Baker Jr., a former president of the Modern Language Association. (Two years ago, Baker left the English department at Duke University and joined the faculty at Vanderbilt University as Distinguished University Professor.)
In his new book, Betrayal: How Black Intellectuals Have Abandoned the Ideals of the Civil Rights Era (Columbia University Press), Baker recalls being stirred by his “hope for the black intellectual future to produce a supportive blurb invoking comparisons of Dyson with geniuses of times past.” This, Baker now says, was “a grievous mistake.” Some tort lawyer should look into whether or not Baker is obliged to reimburse readers for the price of Dyson’s book.
Either way, it seems that Baker has now carefully read what he once so hastily blurbed, and found it wanting. “Dyson’s black public intellectual mode," he says, "is a Sugar Ray Robinson-style duck and cover strategy. It intermixes metaphors, and dodges and skips evasively with the light drama of nonce formulations. There are no intellectual knockouts. Further, there is virtually no irony whatsoever.” About a subsequent work, Baker says that the main factor “at work in Dyson’s text -- especially when he devotes lavish textual space to his own public appearances on ‘Meet the Press’ -- is authorial self-promotion.... This is the stuff of tabloid journalism. It is not worthy work for a true black public intellectual.”
A complex set of transactions is under way among those three adjectives, even beyond their relationship with the noun they qualify. Some black public intellectuals, it seems, aren’t truly intellectuals. And other black public intellectuals aren’t truly black. The whole domain must be policed by someone who manifests all three qualities in perfect harmony. Said gatekeeper must be willing and able to represent what the author calls “the black majority.” For the true black public intellectual, the interests, intentions, and aspirations of his community prove wonderfully apodictic. Guess who qualifies?
Not, to be sure, Shelby Steele or Stephen Carter or John McWhorter -- each of them a critic of affirmative action and of black popular culture. Baker treats these adherents of middle-class African-American assimilationism as so many fellow-travelers of the neoconservative ideology that emerged among Jewish intellectuals during the 1960s and ‘70s. Nor does Baker have much use for Henry Louis Gates or Cornel West. And his retroactive dis of the exceptionally telegenic Michael Eric Dyson has already been noted.
Betrayal takes on each of these figures through a mixture of critical analysis and personal insult -- blended in portions of roughly one part to three, respectively. This is an extraordinarily repetitious book. The range of ways to suggest that one’s targets are the contemporary equivalent of those African-American performers of the 19th and early 20th centuries who “blacked up” for the minstrel shows is, after all, finite and soon exhausted. Even the more substantial element of the book -- its critique of the emergence of a middle-class and centrist cohort of African-American intellectuals -- proves redundant. The late Harold Cruse anticipated the trend in The Crisis of the Negro Intellectual more than 40 years ago, and Adolph Reed Jr. brought it up to date in 1995 in his blistering essay, “What Are the Drums Saying, Booker? The Current Crisis of the Black Intellectual.” (It can be found in his book Class Notes: Posing as Politics and Other Thoughts on the American Scene, published almost 10 years ago but still an exemplary model of polemic as product of brain rather than spleen.)
What Betrayal offers, primarily, is repetition of arguments others have made, spiced up with denunciations of motive (everybody loves money and going on TV) and passages that ventriloquize what Baker's opponents are really saying. Thus, Shelby Steele tells white America: “You should have known the majority of these power-hungry, searching-for-weakness ‘minorities’ out there have no merit, excellence, or cultural treasure to add to the world’s store. It probably would have been better for American morality and its capital reserves had white supremacy never ended.”
So one reads Steele as quoted in Betrayal -- followed by Baker's quick, glossing addendum: “Again, my words.” For Steele never actually said it. ("Again, my words" indeed: Baker likes the method enough to use it every so often.) In a war of words, this qualifies less as a weapon of mass destruction than as a labor-saving device.
Baker assures readers that he, at least, is using the best tools available to the true black public intellectual. “I am,” he writes, “a confident, certified, and practiced reader of textual argument, implicit textual values and implications, and the ever-varying significations of written words in their multiple contexts of reception.... I forgo ad hominem sensationalism, generalized condemnation, and scintillating innuendo where black neoconservatives and centrists are concerned. The following pages represent a rigorous, scholarly reading practice seasoned with wit.”
After reading some two hundred pages of "ad hominem sensationalism, generalized condemnation, and scintillating innuendo," one wonders if this passage, at least, may be a case of the "irony" that one of the blurbs for Betrayal attributes to it. I am not quite sure. But one moment in the book certainly had a profound effect on my grasp of just how seriously it must be taken. That moment came when Baker discussed the affinity of certain contemporary black public intellectuals (the non-true kind) for neoconservatism.
Baker points out that in the 1940s, Irving Kristol, the founding father of that neoconservatism, abandoned the constricted world of left-wing politics “in search of a more expansive field of intellectual and associational commerce (one in which he would be ‘permitted’ to read Max Faber)....”
That parenthetical reference stopped me cold. I have a certain familiarity with the history of Kristol and his cohort, but somehow the role of Max Faber in their Bildung had escaped my notice. Indeed, the name itself was totally unfamiliar. And having been informed that this book was the product of “a rigorous, scholarly reading practice” -- one “seasoned with wit,” mind you, and published by Columbia University Press -- I felt quite embarrassed by this gap in my knowledge.
Off to the library, then, to unearth the works of Max Faber! But before I could get out the door, a little light bulb went off. Baker (who assures us that he is a capable judge of social-scientific discussions of African-American life) was actually referring to Max Weber.
It's a good thing the author of this book is "a confident, certified, and practiced reader of textual argument, implicit textual values and implications, and the ever-varying significations of written words in their multiple contexts of reception.” Otherwise one would have to feel embarrassed for him, and for the press that published it. And not just for its copy editors, by any means.
Last week, Intellectual Affairs gave the recent cable TV miniseries “Sex: The Revolution” a nod of recognition, however qualified, for its possible educational value. The idea that sex has a history is not, as such, self-evident. The series covers the changes in attitudes and norms between roughly 1950 and 1990 through interviews and archival footage. Most of this flies past at breakneck speed, alas. The past becomes a hostage of the audience’s presumably diminished attention span.
Then again, why be ungrateful? Watching the series, I kept thinking of a friend who teaches history at Sisyphus University, a not-very-distinguished institution in the American heartland. For every student in his classroom who seems promising, there are dozens who barely qualify as sentient. (It sounds like Professor X, whose article “In the Basement of the Ivory Tower” appears in the latest issue of The Atlantic, teaches in the English department there.) Anything, absolutely anything, that might help stimulate curiosity about the past would be a godsend for the history faculty at Sisyphus U.
With that consideration in mind, you tend to watch “Sex: The Revolution” with a certain indulgence -- as entertainment with benefits, so to speak. Unfortunately, the makers stopped short. They neglected to interview scholars who could have provided more insight than a viewer gleans from soundbites by demi-celebrities. And so we end up with a version of history not too different from the one presented by Philip Larkin in the poem “Annus Mirabilis” --
Sexual intercourse began
In nineteen sixty-three
(Which was rather late for me) -
Between the end of the Chatterley ban
And the Beatles' first LP.
-- except without the irony. A belief that people in the old days must have been repressed is taken for granted. Was this a good thing or not? Phyllis Schlafly and reasonable people may disagree; but the idea itself is common coin of public discourse.
But suppose a television network made a different sort of program -- one incorporating parts of what one might learn from reading the scholarship on the history of sex. What sense of the past might then emerge?
We might as well start with the Puritans. Everybody knows how uptight they were -- hostile to sex, scared of it, prone to thinking of it as one of the Devil’s wiles. The very word “Puritan” now suggests an inability to regard pleasure as a good thing.
A case in point being Michael Wigglesworth -- early Harvard graduate, Puritan cleric, and author of the first American best-seller, The Day of Doom (1662), an exciting poem about the apocalypse. Reverend Wigglesworth found the laughter of children to be unbearable. He said it made him think of the agonies of the damned in hell. You can just imagine how he would respond to the sound of moaning. Somehow it is not altogether surprising to learn that the Rev’s journal contains encrypted entries mentioning the “filthy lust” he felt while tutoring male students.
In short, a typical Puritan -- right? Well, not according to Edmund Morgan, the prominent early-Americanist, whose many contributions to scholarship over the years included cracking the Wigglesworth code. (He is now professor emeritus of history at Yale.)
Far from being typical, Wigglesworth, it seems, was pretty high-strung even by the standards of the day. In a classic paper called “The Puritans and Sex,” published in 1942, Morgan assessed the evidence about how ordinary believers regarded the libido in early New England. He found that, clichés notwithstanding, the Puritans tended to be rather matter-of-fact about it.
Sermons and casual references in letters and diaries reveal that the Puritans took sexual pleasure for granted and even celebrated it -- so long, at least, as it was enjoyed within holy wedlock. Of course, the early colonies attracted many people of both sexes who were either too young to marry or in such tight economic circumstances that marriage was not practical. This naturally meant a fair bit of random carrying on, even in those un-Craigslist-ed days. All such activity was displeasing unto the Lord, not to mention His earthly enforcers; but the court records show none of the squeamishness about it that one might expect, given the Puritans’ reputation. Transgressions were punished, but the hungers of the flesh were taken for granted.
And Puritan enthusiasm for pleasures of the marriage bed was not quite so phallocentric as you might suppose. As a more recent study notes, New Englanders believed that both partners had to reach orgasm in order for conception to occur. Many Puritan women must have had their doubts on that score. Still, the currency of that particular bit of misinformation would tend to undermine the assumption that everybody was a walking bundle of dammed-up desire -- finding satisfaction only vicariously, through witch trials and the like.
Our imagined revisionist documentary would be full of such surprises. Recent scholarship suggests that American mores were pretty wild long before Alfred Kinsey quantified things in his famous reports.
Richard Godbeer’s Sexual Revolution in Early America (Johns Hopkins University Press, 2002) shows that abstinence education was not exactly the norm in the colonial period. Illegitimate births were commonplace; so was the arrival of children six or seven months after the wedding day. For that matter, cohabitation without benefit of clergy was the norm in some places. And while there were statutes on the books against sodomy -- understood as nonprocreative sexual activity in general -- it’s clear that many early Americans preferred to mind their own business.
Enforcing prohibitions on “unnatural acts” between members of the same sex was a remarkably low priority. “For the entire colonial period,” noted historians in a brief filed a few years ago when Lawrence v. Texas went to the U.S. Supreme Court, “we have reports of only two cases involving two women engaged in acts with one another.... The trial of Nicholas Sension, a married man living in Wethersfield, Connecticut, in 1677, revealed that he had been widely known for soliciting sexual contacts with the town’s men and youth for almost forty years but remained widely liked. Likewise, a Baptist minister in New London, Connecticut, was temporarily suspended from the pulpit in 1757 because of his repeatedly soliciting sex with men, but the congregation voted to restore him to the ministry after he publicly repented.”
History really comes alive, given details like that -- and we’ve barely reached the Continental Congress. The point is not that the country was engaged in one big orgy from Plymouth Rock onwards. But common attitudes and public policies were a lot more ambivalent and contradictory in the past than we’re usually prone to imagine.
There was certainly repression. In four or five cases from the colonial era, sodomy was punished by death. But in a society where things tend to be fluid -- where relocation is an option, and where money talks -- there will always be a significant share of the populace that lives and acts by its own lights, and places where the old rules don't much matter. And so every attempt to enforce inhibition is apt to seem like too little, too late (especially to those making the effort).
You catch some of that frantic sense of moral breakdown in the literature of anti-Mormonism cited by Sarah Barringer Gordon in her study The Mormon Question: Polygamy and Constitutional Conflict in Nineteenth-Century America, published by the University of North Carolina Press in 2002. Novels about polygamous life in Utah were full of dark fascination with the lascivious excess being practiced in the name of freedom of religion -- combined with fear that the very social and political order of the United States was being undermined. It was all very worrying, but also titillating. (Funny how often those qualities go together.)
The makers of “Sex: The Revolution” enjoyed the advantage of telling stories from recent history, which meant an abundance of film and video footage to document the past. Telling a revisionist story of American sexual history would suffer by visual comparison, tending either toward History Channel-style historical reenactments or Ken Burns-ish readings of documents over sepia-toned imagery.
But now, thanks to the efforts of phonographic archivists, we can at least listen to one part of the sexual discourse of long ago. A set of wax recordings from the 1890s -- released last year on a CD called “Actionable Offenses” -- preserves the kind of lewd entertainment enjoyed by some of the less respectable Americans of the Victorian era. And by “lewd,” I do not mean “somewhat racy.” The storytelling in dialect tends to be far coarser than anything that can be paraphrased in a family publication such as Inside Higher Ed. A performance called “Learning a City Gal How to Milk” is by no means the most obscene.
Anthony Comstock -- whose life’s work it was to preserve virtue by suppressing vice -- made every effort to wipe out such filth. It’s a small miracle that these recordings survived. The fact that they did gives us a hint at just how much of a challenge Comstock and associates must have faced.
When a popular program such as “Sex: The Revolution” recalls the past, it is usually an account of the struggle to free desire from inhibition. Or you can tell the same tale in a conservative vein: the good old days of restraint, followed by a decline into contemporary decadence.
Both versions are sentimental; both condescend to the past.
In the documentary I’d like to see, the forces of repression would be neither villains nor heroes. They would be hapless, helpless, confused -- and sinking fast in quicksand, pretty much from the start. It would be an eye-opening film. Not to mention commercially viable. After all, there would be a lot of sex in it.
"WALL-E," the latest animated production from Pixar Studios, is a heartwarming children’s film about ecological disaster. Its title character is a sturdy little trash-compacting robot whose name is the abbreviation for Waste Allocation Load-Lifter, Earth-class. He has been programmed to clear the vast junkpile left behind by mankind, which has long since absconded to live on a space station. His only companion -- at least as the film begins -- is a cockroach. Through plot developments it would spoil things to describe, WALL-E is transported to the human colony in deep space. In eight hundred years, it seems, our civilization will be a fusion of Wal-Mart, Club Med, and the World Wide Web.
Lots of kids will get their first taste of social satire from this film -- and chances are, they are going to enjoy it. Yet there is more to what Pixar has done than that. Some of the images are breathtaking. It turns out that robots have their romantic side, or at least WALL-E does; and the sight of him rescuing mementos from the wreckage (fragments shored up amidst human ruin) is perhaps more touching than the love story that later emerges.
I had heard almost nothing about the film before attending, so was not at all prepared for a strange surprise: It kept reminding me of Kenneth Burke’s writings about a grim future world he called Helhaven.
Burke, who died 15 years ago at the age of 96, was a poet, novelist, and critic who belonged to a cohort of modernist writers that included Hart Crane, Djuna Barnes, and William Carlos Williams. His name is not exactly a household word. It does not seem very likely that anyone at Pixar was counting on someone in the audience thinking, “Hey, this is a little bit like the essays that Kenneth Burke published in a couple of literary magazines in the early 1970s.” And I sure don’t mean to start an intellectual-property lawsuit here. The margin of overlap between Pixar and KB (as admirers tend to call him) is not a matter of direct influence. Rather, it’s a matter of each drawing out the most worrying implications of the way we live now.
Burke’s fiction and poetry tend to be overlooked by chroniclers of American literary history. But his experimental novel Towards a Better Life has exercised a strong influence on other writers -- especially Ralph Ellison, whose Invisible Man was deeply shaped by it. He also had a knack for being in interesting places at the right time. For example, he discovered and made the first English translation of Thomas Mann’s Death in Venice; and in the course of his day job as editor for The Dial, Burke helped prepare for its initial American publication a poem called “The Waste Land,” by one T.S. Eliot.
By the early 1930s, his occasional writings on aesthetic questions began to give shape to an increasingly systematic effort to analyze the full range of what Burke called “symbolic action,” a term that subsumed the entire range of human culture. His books were all over the disciplinary map -- part philosophy, part sociology, dashes of anthropology, plus elements from literature in various languages thrown in for good measure -- all tied together through his own idiosyncratic idioms.
Alas, given the vagaries of translation, Burke seems to have gone largely unnoticed by his theoretical peers in Europe; but it is fair to say that Burke’s method of “dramatism” is a kind of rough-hewn Yankee structuralism. His later speculations on “logology” have certain semi-Lacanian implications, even though KB was unaware of the French psychoanalyst’s work until very late in the game.
Along the way, Burke seems to have pioneered something that has only been given a name in more recent decades: the field of ecocriticism. In a book from 1937 called Attitudes Toward History, he noted that, among the recently emerging fields of study, “there is one little fellow called Ecology, and in time we shall pay him more attention.”
Burke often used the first-person plural -- so it is easy to read this as saying he meant to get back to the subject eventually. But his wording also implied that everyone would need to do so, sooner or later. Ecology teaches us “that the total economy of the planet cannot be guided by an efficient rationale of exploitation alone,” wrote Burke more than 70 years ago, “but that the exploiting part must eventually suffer if it too greatly disturbs the balance of the whole.”
In the early 1970s, Burke returned to this theme in a couple of texts that now seem more prophetic than ever. The Helhaven writings first appeared in The Sewanee Review and The Michigan Quarterly Review, and have been reprinted in the posthumous collection On Human Nature: A Gathering While Everything Flows, 1967-1984, published five years ago by the University of California Press.
The Helhaven writings -- a blend of science fiction and critical theory, with some of KB’s own poetry mixed in -- fall outside the familiar categories for labeling either creative or scholarly prose. In them, Burke imagined a future in which everyone who could escape from Earth did, relocating to a new, paradise-like home on the lunar surface he called Helhaven. The name was a pun combining “haven,” “heaven,” and “hell.”
The immediate context for Burke’s vision bears remembering: The Apollo missions were in progress, the first Earth Day was celebrated in 1970, and the release of the Pentagon Papers was making “technocratic rationality” sound like an oxymoron. And comments in the Helhaven writings make it clear all of these circumstances were on the author’s mind.
But just as important, it seems, was Burke’s realization that American life had completely trumped his previous effort to satirize it. At the very start of the Great Depression, Burke published a Jonathan Swift-like essay in The New Republic calling for his fellow citizens to destroy more of their natural resources. This was, he wrote, the key to prosperity. The old Protestant ethic of self-control and delayed gratification was a brake on the economy. “For though there is a limit to what a man can use,” he wrote, “there is no limit to what he can waste. The amount of production possible to a properly wasteful society is thus seen to be enormous.”
And if garbage was good, war was better. “If people simply won’t throw things out fast enough to create new needs in keeping with the increased output under improved methods of manufacture,” suggested Burke, “we can always have recourse to the still more thoroughgoing wastage of war. An intelligently managed war can leave whole nations to be rebuilt, thus providing peak productivity for millions of the surviving population.”
Not everyone understood that Burke’s tongue was in cheek. A newspaper columnist expressed outrage, and the letters of indignation came pouring in. Burke’s editor at The New Republic told him that this invariably happened with satire. Some readers always took it seriously and got mad.
Four decades later, though, Burke saw an even greater problem. The joking recommendation he made in the 1930s to stimulate the economy via waste was, by the 1970s, a policy known as “planned obsolescence.” The idea of war as economic stimulus package evidently has its enthusiasts, too.
Furthermore, Burke now thought that the wasteful imperative was subsumed under what he called hypertechnologism -- the tendency for technology to develop its own momentum, and to reshape the world on its own terms. We had created machines to control and transform nature. But now they were controlling and transforming us. Our desires and attitudes tended to be the products of the latest innovations, rather than vice versa. (And to think that Burke died well before the rise of today’s market in consumer electronics.)
This wasn’t just a function of the economic system. It seemed to be part of the unfolding of our destiny as human beings. Borrowing a term from Aristotle, Burke referred to it as a manifestation of entelechy -- the tendency of a potential to realize itself. “Once human genius got implemented, or channelized, in terms of technological proliferation,” wrote Burke in 1974, “how [could we] turn back? Spontaneously what men hope for is more. And what realistic politician could ever hope to win on a platform that promised less?”
We were in “a self-perpetuating cycle,” he mused, “quite beyond our ability to adopt any major reforms in our ways of doing things.” Besides, failure to trust in progress is un-American. And so Burke tried to carry his speculations to their most extreme conclusion.
Suppose a beautiful lake were being turned into a chemical waste dump. Why try to figure out how to fix it? “That would be to turn back,” wrote Burke, “and we must fare ever forward. Hence with your eyes fixed on the beacon of the future, rather ask yourselves how, if you but polluted the lake ten times as much, you might convert it into some new source of energy ... a new fuel.”
By further extrapolation, Burke proposed letting the whole planet turn into a vast toxic cesspool as we built a new home -- a “gigantic womb-like Culture-Bubble, as it were” -- on the moon. The beautiful landscapes of Old Earth could be simulated on gigantic screens. Presumably there would be artificial gravity. Everything natural could be simulated by purely technological means.
We would have to take occasional trips back to be replenished by “the placenta of the Mother Earth,” our source for raw materials. Or rather, polluted materials. (Scientists on Helhaven would need to figure out how to purify them for human use.) Burke imagined a chapel on the lunar surface with telescopes pointed toward the Earth and a passage from the Summa Theologica of Thomas Aquinas inscribed on the wall: “And the blessed in heaven shall look upon the sufferings of the damned, that they may love their blessedness more.”
The Helhaven writings seem darker -- and, well, battier -- than "WALL-E." Burke’s late work can get awfully wild, woolly, and self-referential; and these texts are a case in point. His imaginative streak is constantly disrupted by his theoretical glossolalia. He can barely sketch an image before his critical intelligence interrupts to begin picking it apart. The Helhaven texts, as such, can only appeal to someone already preoccupied with Burke's whole body of thought. You won't ever find in them the charm of watching a little robot struggle with a ping-pong paddle-ball.
But the similarities between KB’s perspective and that of the Pixar film are more striking than the differences. Both are warnings -- in each case, with a clear implication that the warning may have come much too late. For the point of such visions is not to picture how things might turn out. The planet-wide trash dump is not part of the future. Nor is the culture-bubble to be found in outer space. They are closer to us than that.
“Think of the many places in our country where the local drinking water is on the swill side, distastefully chlorinated, with traces of various contaminants,” he wrote almost four decades ago. “If, instead of putting up with that, you invest in bottled springwater, to that extent and by the same token you are already infused with the spirit of Helhaven. Even now, the kingdom of Helhaven is within you.”
Whatever happened to cinephilia? Does it still exist? I mean, in particular, the devotion of otherwise bookish souls to the screen. (The big screen, that is, not the kind you are looking at now.) Do they still go to movies the way they once did? With anything like the passion, that is -- the connoisseurship, the sheer appetite for seeing and comparing and discussing films?
I don’t think so. At least very few people that I know do. And certainly not in the way documented in Reborn (Farrar, Straus and Giroux), the recently published edition of Susan Sontag’s journals, which includes a few pages from a notebook listing the dozens of films the author saw over just three weeks in early 1961. An editorial comment provides more detail about Sontag’s record of her moviegoing that year: “On no occasion is there a break of more than four days between films seen; most often, SS notes having seen at least one, and not infrequently two or three per day.”
This was not just one writer’s personal quirk. It was clearly a generational phenomenon. In a memoir of his days as a student of philosophy at the Sorbonne in the late fifties and early sixties, the French political theorist Regis Debray describes how he and his friends would go from seminars to the cinema as often as their stipends allowed.
“We could afford to enjoy it several times a week,” he writes. “And that is not counting those crisis days when our satisfied and yet insatiable desire made us spend whole afternoons in its darkness. No sooner had we come out, scarcely had we left its embrace, our eyes still half-blind, than we would sit round a café table going over every detail.... Determinedly we discussed the montage, tracking shots, lighting, rhythms. There were directors, unknown to the wider public, whose names I have now forgotten, who let slip these passwords to the in-group of film enthusiasts. Are they still remembered, these names we went such distances to see? .... It may well be the case that our best and most sincere moments were those spent in front of the screen.”
Debray wrote this account of cinemania in the late spring of 1967, while imprisoned in Bolivia following his capture by the military. He had gone there on a mission to see Che Guevara. An actor bearing a striking resemblance to the young Debray appears in the second part of Steven Soderbergh’s Che, now in theaters.
That passage from his Prison Writings (published by Random House in the early 1970s and long out of print; some university press might want to look into this) came to mind on a recent weekday afternoon.
After a marathon course of reading for several days, I was sick of print, let alone writing, and had snuck off to see Soderbergh’s film while it was still in the theater, on the assumption that it would lose something on the video screen. There was mild guilt: a feeling that, after all, I really ought to be doing some work. Debray ended up feeling a bit of guilt as well. Between trips to the cinema and arguing over concepts in Louis Althusser’s classroom, he found himself craving a more immediate sense of life -- which was, in part, how he ended up in the jungles of Bolivia, and then in its prisons.
Be that as it may, there was something appealing about this recollection of his younger self, which he composed at the ripe old age of 26. The same spirit comes through in the early pages of Richard Brody's Everything Is Cinema: The Working Life of Jean-Luc Godard (Metropolitan Books), now a finalist for one of the National Book Critics Circle awards. Brody evokes the world of cinema clubs in Paris that Godard fell into after dropping out of school -- from which there emerged a clique of Left Bank intellectuals (including Francois Truffaut, Claude Chabrol, and Eric Rohmer) who first wrote critical essays on film for small magazines and then began directing their own.
They got their education by way of mania – which was communicable: Debray and Sontag were examples of writers who caught it from the New Wave directors. Another would be the novelist, poet, and linguist Pier Paolo Pasolini, who also started making films in the early sixties.
It’s not clear who the contemporary equivalents would be. In the mid-1990s you heard a lot about how Quentin Tarantino had worked in a video store and immersed himself in the history of film in much the same way that the French directors had. But the resemblance is limited at best. Godard engaged in a sustained (if oblique) dialogue with literature and philosophy in his films -- while Tarantino seems to have acquired a formidable command of cinematic technique without ever having anything resembling a thought in his head. Apart, of course, from “violence is cool,” which doesn’t really count.
These stray musings come via my own reading and limited experience. They are impressions, nothing more -- and I put them down in full awareness that others may know better. My own sense of cinephilia's decline may reflect the fact that all of the movie theaters in my neighborhood (there used to be six within about a 15-minute walk) have gone out of business over the past ten years.
But over the same period cable television, Netflix, and the Internet have made it easier to see films than ever before. It is not that hard to get access to even fairly obscure work now. Coming across descriptions of Godard’s pre-Breathless short films, I found that they were readily available via YouTube. And while Godard ended up committing a good deal of petty crime to fund those early exercises, few aspiring directors would need to do so now: the tools for moviemaking are readily available.
So have I just gotten trapped (imprisoned, like Debray in Bolivia) by secondhand nostalgia? It wouldn't be the first time. Is cinephilia actually alive and well? Is there an underground renaissance, an alternative scene of digital cine clubs that I’m just not hearing about? Are you framing shots to take your mind off grad school or the job market? It would be good to think so -- to imagine a new New Wave, about to break.