For countless dead bodies to become reanimated and swarm through the streets as cannibalistic ghouls would count as an apocalyptic development, by most people's standards. Then again, it is not one that we have to worry about all that much. Other possibilities of destruction tend to weigh more heavily on the mind. But if you combine extreme improbability with gruesome realism, the effect is a cinematic nightmare that won't go away -- one of the most durable and resonant forms of what Susan Sontag once described as "the imagination of disaster."
It all began with the release of George Romero's Night of the Living Dead in 1968: a low-budget independent film that more or less instituted the conventions of the cannibalistic zombie movie, as further developed in his Dawn of the Dead (1978) and Day of the Dead (1985). Other directors have played variations on his themes, but Romero remains the definitive zombie auteur -- not simply for founding the subgenre, but for making it apocalyptic in the richest sense. For the root meaning of "apocalypse," in Greek, is "an uncovering." Romero's zombies expose the dark underside of American culture: racism, consumerism, militarism, and so on.
His most recent addition to the zombie cycle, Diary of the Dead, which opened last Friday, returns viewers to the opening moments of the undead's onslaught. But while his first film, Night, was set in a world where radio and television were the only sources of information for panicking human refugees, Diary is a zombie film for the age of new media. Romero's band of survivors this time consists of a bunch of college students (and their alcoholic professor) who are busy making a film for class when the end of the world hits. One of them becomes obsessed with posting footage of the catastrophe online -- a chance for Romero to explore the ways that digital technology makes zombies of its users.
As an enthusiast for Romero's apocalyptic satire, I was somehow not terribly surprised to learn last year that Baylor University Press had published a book called Gospel of the Living Dead: George Romero's Visions of Hell on Earth. The author, Kim Paffenroth, is an associate professor of religious studies at Iona College in New Rochelle, New York.
Romero's zombie apocalypse brings "the complete breakdown of the natural world of food chains, social order, respect for life, and respect for death," writes Paffenroth, "because all those categories are meaningless and impossible to maintain in a world where one of the most fundamental limen, the threshold between alive and dead, has become a threshold that no one really crosses all the way over, but on which everyone lives suspended all the time." And in this moment of revelation, all the deadly sins stand fully revealed (and terribly rapacious).
The release of Diary of the Dead seemed a perfect excuse finally to interview Paffenroth. He answered questions by e-mail; the full transcript follows.
Q: You mention in your book that George Romero's work has literally given you nightmares. How did you go from watching his films to writing about them, and even publishing zombie fiction of your own?
A: Well, I was fascinated with the original Dawn when I was still a teen, but I'm afraid my level of commentary seldom got beyond -- "Zombies! Cool!" And then, to be honest, I didn't think of or watch any zombie films from the time Day came out until the Dawn remake was released. But during those years, I was just reading everything I could -- especially ancient and medieval literature, philosophy, and theology. So when I saw the Dawn remake, things clicked and I could give a more thorough and complicated response than I had when I was a youth, because I could then see how Romero was building on Dante and the Bible.
And to be frank, at that point I'd written a lot of books about the Bible and other theological topics, and no one read them. To an author, that's probably the worst disappointment imaginable. So I took a chance that if people didn't want to read about these theological subjects directly, maybe through the filter of their favorite monster genre, they'd be more open to the discussion and analysis. And it seems that they are.
As for making the transition to fiction writing, that's just crazy hubris that strikes all of us at some point -- the idea that anyone would want to read the tales we write -- and some of us are dogged and patient and lucky enough that it actually amounts to something. I never get over it, when I realize that there are some people who like my fiction and look forward to what I'll write next. That's a huge rush and I want to keep it going as long as I can.
Q: In the New Testament, Jesus dies, then comes back to life. His followers gather to eat his flesh and drink his blood. I am probably going to hell for this, but .... Is Christianity a zombie religion?
A: I think zombie movies want to portray the state of zombification as a monstrous perversion of the idea of Christian resurrection. Christians believe in a resurrection to a new, perfect state where there will be no pain or disease or violence. Zombies, on the other hand, are risen, but exist in a state where only the basest, most destructive human drive is left -- the insatiable urge to consume, both as voracious gluttons of their fellow humans, and as mindless shoppers after petty, useless, meaningless objects. It's both a profoundly cynical look at human nature, and a sobering indictment of modern, American consumer culture.
Q: The human beings in Romero's world are living through an experience of "hell on earth," as your subtitle says. There are nods within the films toward some possible naturalistic explanation for the dead (that a virus or "space radiation" somehow brought corpses back to life), but the cause is never very useful or important to any of the characters. And some characters do think mankind is finally being punished. Is the apocalyptic dimension just more or less inevitable in this kind of disaster, or is it deliberate? To what degree is Romero's social satire consciously influenced by Christian themes? Or are those themes just inevitably built into the scenario and imagery?
A: I think "apocalyptic" has just come to mean "end of civilization," so of course, any movie or book with that as its premise is, by definition, "apocalyptic." And even if we throw in the interpretation "God's mad at us -- that big, mean God!" I still don't think that's very close to real, biblical apocalyptic.
Romero's view is a lot closer to biblical apocalyptic or prophetic literature, for he seems to make it clear, over and over, that humanity deserves this horror, and the humans in the films go to great lengths to make the situation even worse than it is already -- by their cruelty, greed, racism, and selfishness. Whether this is conscious or accidental, I really can't address with certainty: I only note that his prophetic vision is compatible with a Christian worldview, not that it stems from that.
Q: The fifth movie in George Romero's zombie cycle, Diary of the Dead, opened over the weekend. Does it seem like a progression or development in his vision, or does it simply revisit his earlier concerns in a new setting?
A: I think each film in the series has a special target that is the particular focus of Romero's disgust at the moment. The media has always been at the periphery in each of the previous films -- cooperating with government ineptitude and coverup in the first two until the plug's pulled and there is no more media -- but now it's the main subject of this installment.
Romero does a great job capturing the sick voyeurism of addiction to cell-phone cameras and the Internet -- there are so many shots in this one where you just want to shout at the characters, "Put down the camera and HELP HER! SHE'S BEING EATEN ALIVE, YOU IDIOT!" It is surely no accident that the two people who most help our protagonists are either cut off from the media (the Amish man) or have themselves been the target of unfair representation in the media (black men who were called "looters" during Katrina, when white people were said to be "salvaging" or "gathering" supplies). And the one time a crime is committed by one group of humans against another, the camera is forced off.
With that being said, I think in many ways it does return to the vision of Night of the Living Dead with its overwhelming cynicism and despair. Certainly the last shot is meant to evoke the same feeling of finality and doom as the first film, the gripping doubt that there's anything left in human society worth saving.
Q: It feels as if Romero is suggesting that Jason, the character holding the digital camera, is himself almost a zombie. There's something creepy about his detachment -- his appetite for just consuming what is going on around him, rather than acting to help anyone. But there are also indications that the cameraman does have a kind of moral commitment to what he is doing. He's trying to capture and transmit the truth of what is going on, because doing so might save lives. What did you make of that ambiguity? Is something redemptive going on here with behavior that otherwise seems quite inhuman?
A: I'd have to think about it in detail, once I have the DVD "text" to study. My initial reaction is that that interpretation mostly comes from the voice-over by Deb, his girlfriend and the narrator of Diary. The exact motives of Jason remain hazy to me. He says he doesn't want fame (what would it mean in their world?), yet he's obsessed with the 72,000 hits in 9 minutes. But he doesn't exactly explain why in that scene. I don't think he said that maybe some of the 72k people were saved or that he's doing a public service or helping save the world.
He just seems addicted and intoxicated by the 72k number itself -- like even if it's not fame, it's a junkie's fix, it's a validation of his value, as indeed is the chilling (and slightly comical) act of handing the camera to Deb at the end. As she keeps accusing him: if it doesn't happen on camera, it's like it doesn't happen.
So the camera is not reflecting reality, it's creating it. And Jason's version of reality is better than the government's falsified version of the first attack, because it's more accurate, but it's no less addictive or exploitive or inhumane by the end.
Q: Good points, but I still think there's some ambiguity about Jason's role, because this is a problem that comes up in debates over journalistic ethics -- whether the responsibility to report accurately and as a disengaged observer becomes, at some point, irresponsibility to any other standard of civilized behavior. Arguably Romero is having it both ways: criticizing Jason while simultaneously using the narrative format to ask whether or not his behavior might have some justification (however ex post facto or deluded).
A: Perhaps artists can have it both ways in a way journalists can't. Artists deal in ambiguities, journalists (supposedly) deal in facts. But with cell phones and the internet, suddenly everyone is a potential "journalist" and the facts are even more malleable and volatile than they ever were.
Q: You note that this subgenre has proven itself to be both popular with audiences and marginal to Hollywood. "Zombie movies," you write in your book, "just offend too many people on too many levels to be conventional and part of the status quo." And while not quite as gory as some of Romero's earlier work, Diary ends with an image calculated to shock and disgust. Is this a matter of keeping the element of humor under control? While a spoof like Shaun of the Dead was an affectionate homage to Romero, the element of social satire there didn't really have much, well, bite....
A: That's a great way to put it -- that humorous homages use humor to offset the gore (look at the really over-the-top squashing scene in Hot Fuzz for an example of just how much gore you can offset, if the movie's funny enough!). But it also works the other way -- that biting social criticism needs some bite, needs to be a little out of control and not tamed or staid. I like that idea.
That being said, Romero makes my job a lot harder. The gore hounds sometimes put their hands over their ears and chant "LALALALA! I can't hear you!" if I say that some image they love on an aesthetic level might *mean* something -- while I think a lot of readers or viewers who might be receptive to criticism of our society just can't make it past the first disemboweling.
I would suppose it's an artistic judgment, and for me at least, Romero has been hitting the right balance for a long time, and is continuing to do so.
In 1997, Oxford University Press published Between God and Gangsta Rap: Bearing Witness to Black Culture, by Michael Eric Dyson, who at that point was a professor of communications at the University of North Carolina at Chapel Hill. He has since gone on to bigger things; last summer, Dyson was named by Georgetown University as one of its University Professors. God and Gangsta arrived bearing glowing endorsements, including one by Houston A. Baker Jr., a former president of the Modern Language Association. (Two years ago, Baker left the English department at Duke University and joined the faculty at Vanderbilt University as Distinguished University Professor.)
In his new book, Betrayal: How Black Intellectuals Have Abandoned the Ideals of the Civil Rights Era (Columbia University Press), Baker recalls being stirred by his “hope for the black intellectual future” to produce a supportive blurb invoking comparisons of Dyson with geniuses of times past. This, Baker now says, was “a grievous mistake.” Some tort lawyer should look into whether or not Baker is obliged to reimburse readers for the price of Dyson’s book.
Either way, it seems that Baker has now carefully read what he once so hastily blurbed, and found it wanting. “Dyson’s black public intellectual mode," he says, "is a Sugar Ray Robinson-style duck and cover strategy. It intermixes metaphors, and dodges and skips evasively with the light drama of nonce formulations. There are no intellectual knockouts. Further, there is virtually no irony whatsoever.” About a subsequent work, Baker says that the main factor “at work in Dyson’s text -- especially when he devotes lavish textual space to his own public appearances on ‘Meet the Press’ -- is authorial self-promotion.... This is the stuff of tabloid journalism. It is not worthy work for a true black public intellectual.”
A complex set of transactions is under way among those three adjectives, even beyond their relationship with the noun they qualify. Some black public intellectuals, it seems, aren’t truly intellectuals. And other black public intellectuals aren’t truly black. The whole domain must be policed by someone who manifests all three qualities in perfect harmony. Said gatekeeper must be willing and able to represent what the author calls “the black majority.” For the true black public intellectual, the interests, intentions, and aspirations of his community prove wonderfully apodictic. Guess who qualifies?
Not, to be sure, Shelby Steele or Stephen Carter or John McWhorter -- each of them a critic of affirmative action and of black popular culture. Baker treats these adherents of middle-class African-American assimilationism as so many fellow-travelers of the neoconservative ideology that emerged among Jewish intellectuals during the 1960s and ‘70s. Nor does Baker have much use for Henry Louis Gates or Cornel West. And his retroactive dis of the exceptionally telegenic Michael Eric Dyson has already been noted.
Betrayal takes on each of these figures through a mixture of critical analysis and personal insult -- blended in portions of roughly one part to three, respectively. This is an extraordinarily repetitious book. The range of ways to suggest that one’s targets are the contemporary equivalent of those African-American performers of the 19th and early 20th centuries who “blacked up” for the minstrel shows is, after all, finite and soon exhausted. Even the more substantial element of the book -- its critique of the emergence of a middle-class and centrist cohort of African-American intellectuals -- proves redundant. The late Harold Cruse anticipated the trend in The Crisis of the Negro Intellectual more than 40 years ago, and Adolph Reed Jr. brought it up to date in 1995 in his blistering essay, “What are the Drums Saying, Booker? The Current Crisis of the Black Intellectual.” (It can be found in his book Class Notes: Posing as Politics and Other Thoughts on the American Scene, published almost 10 years ago but still an exemplary model of polemic as product of brain rather than spleen.)
What Betrayal offers, primarily, is repetition of arguments others have made, spiced up with denunciations of motive (everybody loves money and going on TV) and passages that ventriloquize what Baker's opponents are really saying. Thus, Shelby Steele tells white America: “You should have known the majority of these power-hungry, searching-for-weakness ‘minorities’ out there have no merit, excellence, or cultural treasure to add to the world’s store. It probably would have been better for American morality and its capital reserves had white supremacy never ended.”
So one reads Steele as quoted in Betrayal -- followed by Baker's quick, glossing addendum: “Again, my words.” For Steele never actually said it. ("Again, my words" indeed: Baker likes the method enough to use it every so often.) In a war of words, this qualifies less as a weapon of mass destruction than a labor-saving device.
Baker assures readers that he, at least, is using the best tools available to the true black public intellectual. “I am,” Baker assures us, “a confident, certified, and practiced reader of textual argument, implicit textual values and implications, and the ever-varying significations of written words in their multiple contexts of reception.... I forgo ad hominem sensationalism, generalized condemnation, and scintillating innuendo where black neoconservatives and centrists are concerned. The following pages represent a rigorous, scholarly reading practice seasoned with wit.”
After reading some two hundred pages of "ad hominem sensationalism, generalized condemnation, and scintillating innuendo," one wonders if this passage, at least, may be a case of the "irony" that one of the blurbs for Betrayal attributes to it. I am not quite sure. But one moment of reading the book certainly had a profound effect on my grasp of just how seriously the book must be taken. This was when Baker discusses the affinity of certain contemporary black public intellectuals (the non-true kind) for neoconservatism.
Baker points out that in the 1940s, Irving Kristol, the founding father of that neoconservatism, abandoned the constricted world of left-wing politics “in search of a more expansive field of intellectual and associational commerce (one in which he would be ‘permitted’ to read Max Faber)....”
That parenthetical reference stopped me cold. I have a certain familiarity with the history of Kristol and his cohort, but somehow the role of Max Faber in their bildung had escaped my notice. Indeed, the name itself was totally unfamiliar. And having been informed that this book was "the product of “a rigorous, scholarly reading practice” -- one “seasoned with wit,” mind you, and published by Columbia University Press -- I felt quite embarrassed by this gap in my knowledge.
Off to the library, then, to unearth the works of Max Faber! But before I could get out the door, a little light bulb went off. Baker (who assures us that he is a capable judge of social-scientific discussions of African-American life) was actually referring to Max Weber.
It's a good thing the author of this book is "a confident, certified, and practiced reader of textual argument, implicit textual values and implications, and the ever-varying significations of written words in their multiple contexts of reception.” Otherwise one would have to feel embarrassed for him, and for the press that published it. And not just for its copy editors, by any means.
Last week, Intellectual Affairs gave the recent cable TV miniseries “Sex: The Revolution” a nod of recognition, however qualified, for its possible educational value. The idea that sex has a history is not, as such, self-evident. The series covers the changes in attitudes and norms between roughly 1950 and 1990 through interviews and archival footage. Most of this flies past at a breakneck speed, alas. The past becomes a hostage of the audience’s presumably diminished attention span.
Then again, why be ungrateful? Watching the series, I kept thinking of a friend who teaches history at Sisyphus University, a not-very-distinguished institution in the American heartland. For every student in his classroom who seems promising, there are dozens who barely qualify as sentient. (It sounds like Professor X, whose article “In the Basement of the Ivory Tower” appears in the latest issue of The Atlantic, teaches in the English department there.) Anything, absolutely anything, that might help stimulate curiosity about the past would be a godsend for the history faculty at Sisyphus U.
With that consideration in mind, you tend to watch “Sex: The Revolution” with a certain indulgence -- as entertainment with benefits, so to speak. Unfortunately, the makers stopped short. They neglected to interview scholars who might have provided more insight than a viewer might glean from soundbites by demi-celebrities. And so we end up with a version of history not too different from the one presented by Philip Larkin in the poem “Annus Mirabilis” --
Sexual intercourse began
In nineteen sixty-three
(Which was rather late for me) -
Between the end of the Chatterley ban
And the Beatles' first LP.
-- except without the irony. A belief that people in the old days must have been repressed is taken for granted. Was this a good thing or not? Phyllis Schlafly and reasonable people may disagree; but the idea itself is common coin of public discourse.
But suppose a television network made a different sort of program -- one incorporating parts of what one might learn from reading the scholarship on the history of sex. What sense of the past might then emerge?
We might as well start with the Puritans. Everybody knows how up-tight they were -- hostile to sex, scared of it, prone to thinking of it as one of the Devil’s wiles. The very word “Puritan” now suggests an inability to regard pleasure as a good thing.
A case in point being Michael Wigglesworth -- early Harvard graduate, Puritan cleric, and author of the first American best-seller, The Day of Doom (1662), an exciting poem about the apocalypse. Reverend Wigglesworth found the laughter of children to be unbearable. He said it made him think of the agonies of the damned in hell. You can just imagine how he would respond to the sound of moaning. Somehow it is not altogether surprising to learn that the Rev’s journal contains encrypted entries mentioning the “filthy lust” he felt while tutoring male students.
In short, a typical Puritan -- right? Well, not according to Edmund Morgan, the prominent early-Americanist, whose many contributions to scholarship over the years included cracking the Wigglesworth code. (He is now professor emeritus of history at Yale.)
Far from being typical, Wigglesworth, it seems, was pretty high-strung even by the standards of the day. In a classic paper called “The Puritans and Sex,” published in 1942, Morgan assessed the evidence about how ordinary believers regarded the libido in early New England. He found that, clichés notwithstanding, the Puritans tended to be rather matter-of-fact about it.
Sermons and casual references in letters and diaries reveal that the Puritans took sexual pleasure for granted and even celebrated it -- so long, at least, as it was enjoyed within holy wedlock. Of course, the early colonies attracted many people of both sexes who were either too young to marry or in such tight economic circumstances that it was not practical. This naturally meant a fair bit of random carrying on, even in those un-Craigslist-ed days. All such activity was displeasing unto the Lord, not to mention His earthly enforcers; but the court records show none of the squeamishness about it that one might expect, given the Puritans’ reputation. Transgressions were punished, but the hungers of the flesh were taken for granted.
And Puritan enthusiasm for pleasures of the marriage bed was not quite so phallocentric as you might suppose. As a more recent study notes, New Englanders believed that both partners had to reach orgasm in order for conception to occur. Many Puritan women must have had their doubts on that score. Still, the currency of that particular bit of misinformation would tend to undermine the assumption that everybody was a walking bundle of dammed-up desire -- finding satisfaction only vicariously, through witch trials and the like.
Our imagined revisionist documentary would be full of such surprises. Recent scholarship suggests that American mores were pretty wild long before Alfred Kinsey quantified things in his famous reports.
Richard Godbeer’s Sexual Revolution in Early America (Johns Hopkins University Press, 2002) shows that abstinence education was not exactly the norm in the colonial period. Illegitimate births were commonplace; so was the arrival of children six or seven months after the wedding day. For that matter, cohabitation without benefit of clergy was the norm in some places. And while there were statutes on the books against sodomy -- understood as nonprocreative sexual activity in general -- it’s clear that many early Americans preferred to mind their own business.
Enforcing prohibitions on “unnatural acts” between members of the same sex was a remarkably low priority. “For the entire colonial period,” noted historians in a brief filed a few years ago when Lawrence v. Texas went to the U.S. Supreme Court, “we have reports of only two cases involving two women engaged in acts with one another.... The trial of Nicholas Sension, a married man living in Wethersfield, Connecticut, in 1677, revealed that he had been widely known for soliciting sexual contacts with the town’s men and youth for almost forty years but remained widely liked. Likewise, a Baptist minister in New London, Connecticut, was temporarily suspended from the pulpit in 1757 because of his repeatedly soliciting sex with men, but the congregation voted to restore him to the ministry after he publicly repented.”
History really comes alive, given details like that -- and we’ve barely reached the Continental Congress. The point is not that the country was engaged in one big orgy from Plymouth Rock onwards. But common attitudes and public policies were a lot more ambivalent and contradictory in the past than we’re usually prone to imagine.
There was certainly repression. In four or five cases from the colonial era, sodomy was punished by death. But in a society where things tend to be fluid -- where relocation is an option, and where money talks -- there will always be a significant share of the populace that lives and acts by its own lights, and places where the old rules don't much matter. And so every attempt to enforce inhibition is apt to seem like too little, too late (especially to those making the effort).
You catch some of that frantic sense of moral breakdown in the literature of anti-Mormonism cited by Sarah Barringer Gordon in her study The Mormon Question: Polygamy and Constitutional Conflict in Nineteenth-Century America, published by the University of North Carolina Press in 2002. Novels about polygamous life in Utah were full of dark fascination with the lascivious excess being practiced in the name of freedom of religion -- combined with fear that the very social and political order of the United States was being undermined. It was all very worrying, but also titillating. (Funny how often those qualities go together.)
The makers of “Sex: The Revolution” enjoyed the advantage of telling stories from recent history, which meant an abundance of film and video footage to document the past. Telling a revisionist story of American sexual history would suffer by visual comparison, tending either toward History Channel-style historical reenactments or Ken Burns-ish readings of documents over sepia-toned imagery.
But now, thanks to the efforts of phonographic archivists, we can at least listen to one part of the sexual discourse of long ago. A set of wax recordings from the 1890s -- released last year on a CD called “Actionable Offenses” -- preserves the kind of lewd entertainment enjoyed by some of the less respectable Americans of the Victorian era. And by “lewd,” I do not mean “somewhat racy.” The storytelling in dialect tends to be far coarser than anything that can be paraphrased in a family publication such as Inside Higher Ed. A performance called “Learning a City Gal How to Milk” is by no means the most obscene.
Anthony Comstock -- whose life’s work it was to preserve virtue by suppressing vice -- made every effort to wipe out such filth. It’s a small miracle that these recordings survived. The fact that they did gives us a hint at just how much of a challenge Comstock and associates must have faced.
When a popular program such as “Sex: The Revolution” recalls the past, it is usually an account of the struggle to free desire from inhibition. Or you can tell the same tale in a conservative vein: the good old days of restraint, followed by a decline into contemporary decadence.
Both versions are sentimental; both condescend to the past.
In the documentary I’d like to see, the forces of repression would be neither villains nor heroes. They would be hapless, helpless, confused -- and sinking fast in quicksand, pretty much from the start. It would be an eye-opening film. Not to mention commercially viable. After all, there would be a lot of sex in it.
"WALL-E," the latest animated production from Pixar Studios, is a heartwarming children’s film about ecological disaster. Its title character is a sturdy little trash-compacting robot whose name is the abbreviation for Waste Allocation Load-Lifter, Earth-class. He has been programmed to clear the vast junkpile left behind by mankind, which has long since absconded to live on a space station. His only companion -- at least as the film begins -- is a cockroach. Through plot developments it would spoil things to describe, WALL-E is transported to the human colony in deep space. In eight hundred years, it seems, our civilization will be a fusion of Wal-Mart, Club Med, and the World Wide Web.
Lots of kids will get their first taste of social satire from this film -- and chances are, they are going to enjoy it. Yet there is more to what Pixar has done than that. Some of the images are breathtaking. It turns out that robots have their romantic side, or at least WALL-E does; and the sight of him rescuing mementos from the wreckage (fragments shored up amidst human ruin) is perhaps more touching than the love story that later emerges.
I had heard almost nothing about the film before attending, so was not at all prepared for a strange surprise: It kept reminding me of Kenneth Burke’s writings about a grim future world he called Helhaven.
Burke, who died 15 years ago at the age of 96, was a poet, novelist, and critic who belonged to a cohort of modernist writers that included Hart Crane, Djuna Barnes, and William Carlos Williams. His name is not exactly a household word. It does not seem very likely that anyone at Pixar was counting on someone in the audience thinking, “Hey, this is a little bit like the essays that Kenneth Burke published in a couple of literary magazines in the early 1970s.” And I sure don’t mean to start an intellectual-property lawsuit here. The margin of overlap between Pixar and KB (as admirers tend to call him) is not a matter of direct influence. Rather, it’s a matter of each drawing out the most worrying implications of the way we live now.
Burke’s fiction and poetry tend to be overlooked by chroniclers of American literary history. But his experimental novel Towards a Better Life has exercised a strong influence on other writers -- especially Ralph Ellison, whose Invisible Man was deeply shaped by it. He also had a knack for being in interesting places at the right time. For example, he discovered and made the first English translation of Thomas Mann’s Death in Venice; and in the course of his day job as editor for The Dial, Burke helped prepare for its initial American publication a poem called “The Waste Land,” by one T.S. Eliot.
By the early 1930s, his occasional writings on aesthetic questions began to give shape to an increasingly systematic effort to analyze the full range of what Burke called “symbolic action,” a term that subsumed the entire range of human culture. His books were all over the disciplinary map -- part philosophy, part sociology, dashes of anthropology, plus elements from literature in various languages thrown in for good measure -- all tied together through his own idiosyncratic idioms.
Alas, given the vagaries of translation, Burke seems to have gone largely unnoticed by his theoretical peers in Europe; but it is fair to say that Burke’s method of “dramatism” is a kind of rough-hewn Yankee structuralism. His later speculations on “logology” have certain semi-Lacanian implications, even though KB was unaware of the French psychoanalyst’s work until very late in the game.
Along the way, Burke seems to have pioneered something that has only been given a name in more recent decades: the field of ecocriticism. In a book from 1937 called Attitudes Toward History, he noted that, among the recently emerging fields of study, “there is one little fellow called Ecology, and in time we shall pay him more attention.”
Burke often used the first-person plural -- so it is easy to read this as saying he meant to get back to the subject eventually. But his wording also implied that everyone would need to do so, sooner or later. Ecology teaches us “that the total economy of the planet cannot be guided by an efficient rationale of exploitation alone,” wrote Burke more than 70 years ago, “but that the exploiting part must eventually suffer if it too greatly disturbs the balance of the whole.”
In the early 1970s, Burke returned to this theme in a couple of texts that now seem more prophetic than ever. The Helhaven writings first appeared in The Sewanee Review and The Michigan Quarterly Review, and have been reprinted in the posthumous collection On Human Nature: A Gathering While Everything Flows, 1967-1984, published five years ago by the University of California Press.
The Helhaven writings -- a blend of science fiction and critical theory, with some of KB’s own poetry mixed in -- fall outside the familiar categories for labeling either creative or scholarly prose. In them, Burke imagined a future in which everyone who could escape from Earth did, relocating to a new, paradise-like home on the lunar surface he called Helhaven. The name was a pun combining “haven,” “heaven,” and “hell.”
The immediate context for Burke’s vision bears remembering: The Apollo missions were in progress, the first Earth Day was celebrated in 1970, and the release of the Pentagon Papers was making “technocratic rationality” sound like an oxymoron. And comments in the Helhaven writings make it clear all of these circumstances were on the author’s mind.
But just as important, it seems, was Burke’s realization that American life had completely trumped his previous effort to satirize it. At the very start of the Great Depression, Burke published a Jonathan Swift-like essay in The New Republic calling for his fellow citizens to destroy more of their natural resources. This was, he wrote, the key to prosperity. The old Protestant ethic of self-control and delayed gratification was a brake on the economy. “For though there is a limit to what a man can use,” he wrote, “there is no limit to what he can waste. The amount of production possible to a properly wasteful society is thus seen to be enormous.”
And if garbage was good, war was better. “If people simply won’t throw things out fast enough to create new needs in keeping with the increased output under improved methods of manufacture,” suggested Burke, “we can always have recourse to the still more thoroughgoing wastage of war. An intelligently managed war can leave whole nations to be rebuilt, thus providing peak productivity for millions of the surviving population.”
Not everyone understood that Burke’s tongue was in cheek. A newspaper columnist expressed outrage, and the letters of indignation came pouring in. Burke’s editor at The New Republic told him that this invariably happened with satire. Some readers always took it seriously and got mad.
Four decades later, though, Burke saw an even greater problem. The joking recommendation he made in the 1930s to stimulate the economy via waste was, by the 1970s, a policy known as “planned obsolescence.” The idea of war as an economic stimulus package evidently has its enthusiasts, too.
Furthermore, Burke now thought that the wasteful imperative was subsumed under what he called hypertechnologism -- the tendency for technology to develop its own momentum, and to reshape the world on its own terms. We had created machines to control and transform nature. But now they were controlling and transforming us. Our desires and attitudes tended to be the products of the latest innovations, rather than vice versa. (And to think that Burke died well before the rise of today’s market in consumer electronics.)
This wasn’t just a function of the economic system. It seemed to be part of the unfolding of our destiny as human beings. Borrowing a term from Aristotle, Burke referred to it as a manifestation of entelechy -- the tendency of a potential to realize itself. “Once human genius got implemented, or channelized, in terms of technological proliferation,” wrote Burke in 1974, “how [could we] turn back? Spontaneously what men hope for is more. And what realistic politician could ever hope to win on a platform that promised less?”
We were in “a self-perpetuating cycle,” he mused, “quite beyond our ability to adopt any major reforms in our ways of doing things.” Besides, failure to trust in progress is un-American. And so Burke tried to carry his speculations to their most extreme conclusion.
Suppose a beautiful lake were being turned into a chemical waste dump. Why try to figure out how to fix it? “That would be to turn back,” wrote Burke, “and we must fare ever forward. Hence with your eyes fixed on the beacon of the future, rather ask yourselves how, if you but polluted the lake ten times as much, you might convert it into some new source of energy ... a new fuel.”
By further extrapolation, Burke proposed letting the whole planet turn into a vast toxic cesspool as we built a new home -- a “gigantic womb-like Culture-Bubble, as it were” -- on the moon. The beautiful landscapes of Old Earth could be simulated on gigantic screens. Presumably there would be artificial gravity. Everything natural could be simulated by purely technological means.
We would have to take occasional trips back to be replenished by “the placenta of the Mother Earth,” our source for raw materials. Or rather, polluted materials. (Scientists on Helhaven would need to figure out how to purify them for human use.) Burke imagines a chapel on the lunar surface with telescopes pointed towards the Earth, with a passage from the Summa Theologica of Thomas Aquinas inscribed on the wall: “And the blessed in heaven shall look upon the sufferings of the damned, that they may love their blessedness more.”
The Helhaven writings seem darker -- and, well, battier -- than "WALL-E." Burke’s late work can get awfully wild, woolly, and self-referential; and these texts are a case in point. His imaginative streak is constantly disrupted by his theoretical glossolalia. He can barely sketch an image before his critical intelligence interrupts to begin picking it apart. The Helhaven texts, as such, can only appeal to someone already preoccupied with Burke's whole body of thought. You won't ever find in them the charm of watching a little robot struggle with a ping-pong paddle-ball.
But the similarities between KB’s perspective and that of the Pixar film are more striking than the differences. Both are warnings -- in each case, with a clear implication that the warning may have come much too late. For the point of such visions is not to picture how things might turn out. The planet-wide trash dump is not part of the future. Nor is the culture-bubble to be found in outer space. They are closer to us than that.
“Think of the many places in our country where the local drinking water is on the swill side, distastefully chlorinated, with traces of various contaminants,” he wrote almost four decades ago. “If, instead of putting up with that, you invest in bottled springwater, to that extent and by the same token you are already infused with the spirit of Helhaven. Even now, the kingdom of Helhaven is within you.”
Whatever happened to cinephilia? Does it still exist? I mean, in particular, the devotion of otherwise bookish souls to the screen. (The big screen, that is, not the kind you are looking at now.) Do they still go to movies the way they once did? With anything like the passion, that is -- the connoisseurship, the sheer appetite for seeing and comparing and discussing films?
I don’t think so. At least very few people that I know do. And certainly not in the way documented in Reborn (Farrar, Straus and Giroux), the recently published edition of Susan Sontag’s journals, which includes a few pages from a notebook listing the dozens of films the author attended over just three weeks in early 1961. An editorial comment provides more detail about Sontag’s record of her moviegoing that year: “On no occasion is there a break of more than four days between films seen; most often, SS notes having seen at least one, and not infrequently two or three per day.”
This was not just one writer’s personal quirk. It was clearly a generational phenomenon. In a memoir of his days as a student of philosophy at the Sorbonne in the late fifties and early sixties, the French political theorist Regis Debray describes how he and his friends would go from seminars to the cinema as often as their stipends allowed.
“We could afford to enjoy it several times a week,” he writes. “And that is not counting those crisis days when our satisfied and yet insatiable desire made us spend whole afternoons in its darkness. No sooner had we come out, scarcely had we left its embrace, our eyes still half-blind, than we would sit round a café table going over every detail.... Determinedly we discussed the montage, tracking shots, lighting, rhythms. There were directors, unknown to the wider public, whose names I have now forgotten, who let slip these passwords to the in-group of film enthusiasts. Are they still remembered, these names we went such distances to see? .... It may well be the case that our best and most sincere moments were those spent in front of the screen.”
Debray wrote this account of cinemania in the late spring of 1967, while imprisoned in Bolivia following his capture by the military. He had gone there on a mission to see Che Guevara. An actor bearing a striking resemblance to the young Debray appears in the second part of Steven Soderbergh’s Che, now in theaters.
That passage from his Prison Writings (published by Random House in the early 1970s and long out of print; some university press might want to look into this) came to mind on a recent weekday afternoon.
After a marathon course of reading for several days, I was sick of print, let alone of writing, and had snuck off to see Soderbergh’s film while it was still in the theater, on the assumption that it would lose something on the video screen. There was mild guilt: a feeling that, after all, I really ought to be doing some work. Debray ended up feeling a bit of guilt as well. Between trips to the cinema and arguing over concepts in Louis Althusser’s classroom, he found himself craving a more immediate sense of life -- which was, in part, how he ended up in the jungles of Bolivia, and then in its prisons.
Be that as it may, there was something appealing about this recollection of his younger self, which he composed at the ripe old age of 26. The same spirit comes through in the early pages of Richard Brody's Everything Is Cinema: The Working Life of Jean-Luc Godard (Metropolitan Books), now a finalist for one of the National Book Critics Circle awards. Brody evokes the world of cinema clubs in Paris that Godard fell into after dropping out of school -- from which there emerged a clique of Left Bank intellectuals (including Francois Truffaut, Claude Chabrol, and Eric Rohmer) who first wrote critical essays on film for small magazines and then began directing their own.
They got their education by way of mania – which was communicable: Debray and Sontag were examples of writers who caught it from the New Wave directors. Another would be the novelist, poet, and linguist Pier Paolo Pasolini, who also started making films in the early sixties.
It’s not clear who the contemporary equivalents would be. In the mid-1990s you heard a lot about how Quentin Tarantino had worked in a video store and immersed himself in the history of film in much the same way that the French directors had. But the resemblance is limited at best. Godard engaged in a sustained (if oblique) dialogue with literature and philosophy in his films -- while Tarantino seems to have acquired a formidable command of cinematic technique without ever having anything resembling a thought in his head. Apart, of course, from “violence is cool,” which doesn’t really count.
These stray musings come via my own reading and limited experience. They are impressions, nothing more -- and I put them down in full awareness that others may know better. My own sense of cinephilia's decline may reflect the fact that all of the movie theaters in my neighborhood (there used to be six within about a 15-minute walk) have gone out of business over the past ten years.
But over the same period cable television, Netflix, and the Internet have made it easier to see films than ever before. It is not that hard to get access to even fairly obscure work now. Coming across descriptions of Godard’s pre-Breathless short films, I found that they were readily available via YouTube. And while Godard ended up committing a good deal of petty crime to fund those early exercises, few aspiring directors would need to do so now: the tools for moviemaking are readily available.
So have I just gotten trapped (imprisoned, like Debray in Bolivia) by secondhand nostalgia? It wouldn't be the first time. Is cinephilia actually alive and well? Is there an underground renaissance, an alternative scene of digital cine clubs that I’m just not hearing about? Are you framing shots to take your mind off grad school or the job market? It would be good to think so -- to imagine a new New Wave, about to break.
The idea of "Little Orphan Annie" as a historical document full of clues to contemporary American political culture is not, perhaps, self-evident. Many of us remember the comic strip, if at all, primarily as the inspiration for a long-running Broadway musical; the latter being a genre of which I, for one, have an irrational fear. (If there is a hell, it has a chorus line.)
Yet there is a case to make for Annie as an ancestor of Joe the Plumber, and not just because both are fictional characters. The two volumes, so far, of The Complete Little Orphan Annie (issued last year by IDW Publishing in its "Library of American Comics" series) come with introductory essays by Jeet Heer, a graduate student in history at York University, in Toronto, who finds in the cartoonist Harold Gray one of the overlooked founding fathers of the American conservative movement. Heer contends that the adventures of the scrappy waif reflect a strain of right-wing populism that rejected the New Deal. He is now at work on a dissertation called "Letters to Orphan Annie: The Emergence of Conservative Populism in American Popular Culture, 1924-1968."
Heer is the co-editor, with Kent Worcester, of Arguing Comics: Literary Masters on a Popular Medium (2004) and A Comics Studies Reader (2009), both published by the University Press of Mississippi. I recently interviewed him about his work by e-mail. A transcript of that exchange follows.
Q: You've co-edited a couple of anthologies of writings on the critical reception of comics and are now at work on a dissertation about one iconic strip, "Little Orphan Annie." Suppose a cultural mandarin like George Steiner challenged the whole notion of "comics studies" as manifesting a trivial interest in ephemeral entertainments on rotting newsprint. In the name of what values would you defend your work?
A: Since I think George Steiner is a fraudulent windbag, he’s perhaps a bad hypothetical example. But let’s talk about some genuine mandarins, rather than those who just put on airs. I came to comics studies partially as a lifelong reader of comics (after my family immigrated to Canada from India I learned to read English by deciphering Archie comics as if they were hieroglyphics) but also intellectually via high modernism. As a graduate student, I was fascinated by mid-century Catholic intellectuals who did so much to inform our understanding of modernism (Marshall McLuhan, Walter Ong, Hugh Kenner). Erudite as all get out and working to reconcile Catholicism with modernity, these thinkers constantly emphasized that the great modernists (Joyce, Eliot, Pound) were deeply shaped by modern mass culture (Joyce kept a copy of the comic strip "Gasoline Alley" on his mantelshelf and stuffed Finnegans Wake with countless allusions to comics). McLuhan and company taught me that high and low culture don’t exist in hermetically sealed compartments but rather are part of an organic, mutually enriching, conversation: Culture is not an exclusive club, it’s a rent party where anyone can join in and dance.
Aesthetically, I’d argue that the best comics (Herriman’s "Krazy Kat," Art Spiegelman’s "Maus," Lynda Barry’s "Ernie Pook Comeeks") are as good as anything being done in the fine arts or literature. Most comics aren’t as good as "Krazy Kat," of course, but the sheer popularity and longevity of ordinary comics like "Archie" or "Blondie" makes them historically and sociologically interesting. "Little Orphan Annie" is a good example: while it is more than ordinary as a work of art, it is also historically fascinating, since it helped reshape conservatism in America, giving birth in the 1930s to a form of cultural populism that you can still see on Fox News. Read by millions (including politicians like Clare Booth Luce, Jesse Helms, and Ronald Reagan), Orphan Annie has a political significance that makes it worth studying.
Finally, comics are very interesting on a theoretical level. Comics involve a fusion of words and pictures (this is true even of pantomime strips, where we “read” the images as well as look at them). Therefore, comics are inherently hybrid, existing at the crossroads between literature and the fine arts. As French theorist Thierry Groensteen has noted, the hybrid nature of comics makes them a scandal to the “ideology of purity” that has long dominated art theory (i.e., philosophers and critics ranging from G.E. Lessing to Clement Greenberg). The best writing on comics (a sampling of which can be found in A Comics Studies Reader) grapples with formal issues raised by hybridity: How can words and pictures interact in the same work? What’s the relationship between seeing and reading? Do visual artifacts have their own language? These are all very challenging questions, which makes comics studies an exciting field.
Q: In addition to two deluxe volumes of the complete "Little Orphan Annie" from the 1920s (with more on the way), you have put together a collection of the proto-surrealist strip "Krazy Kat." Would you describe the process of assembling this sort of edition? It seems like the work would be as much curatorial as editorial.
A: Until fairly recently, most cartoonists didn’t keep their original art, which meant that reprints of old comic strips and comic books had to be shot from the published work (often yellowing old newsprint). This meant that the art often looked like photocopies of 1970s vintage: smudgy and muddy, frequently off-register.
In the last decade, thanks to digital technology, it’s become much easier to clean up old art and restore it to how it originally looked (the cost of printing in color has also gone down). I’m lucky to work with a group of publishers that are willing to put in the hours necessary to do the restoration work. I’ll single out Pete Maresca, whose books Sundays with Walt and Skeezix and Little Nemo in Slumberland: So Many Splendid Sundays reprint old Sunday pages at the exact size they originally appeared, with the dimensions of a newspaper broadsheet. Pete is meticulous in trying to restore the colors to their original form (the strips were, among other things, a marvel of engraving craftsmanship brought to the United States by German immigrants). To do so, Pete often has to spend a week or more on each page, in effect taking longer on the restoration work than the cartoonist took in drawing the page.
This is not a project I'm directly involved with, but my publisher Fantagraphics recently published an amazing edition of Humbug (a sophisticated humor magazine from the 1950s edited by Mad magazine founder Harvey Kurtzman). Paul Baresh, who also works on the "Krazy Kat" books, did a remarkable job, equal perhaps to someone cleaning up the icons on a medieval church, in restoring the original art. He talks about the production process here.
Q: We'll get to your dissertation's focus on the ideological dimension of character and plot in "Little Orphan Annie" in a moment. But first let me ask about the artwork. As someone who has studied comics closely, do you see anything innovative or distinctive in its visual style? Also, what's the deal with Annie's eyes? It looks like she just got back from a rave....
A: The earliest newspaper cartoonists mostly came out of Victorian magazine illustrations, which meant their art tended to be florid, dense with decoration and unnecessary details. Harold Gray, Annie’s creator, belonged to the second generation of comic strip artists who did art that was sensitive to the fact that newspaper drawings didn’t need to be so elaborate, indeed that simpler art was more effective because it pushed the narrative along quicker. Gray’s great gift was in character design. In her pilgrim's progress through the world Annie meets all sorts of people, ranging from decent, hard-working farmers to sinister, hoity-toity pseudo-aristocrats. Gray was able to distill the essence of each character so that you can tell, at a glance, that the farmers were care-worn and dowdy but decent, while Mrs. Bleating-Hart (the Eleanor Roosevelt stand-in) was pompous and exploitative.
The best description of Gray’s art I’ve ever seen was written by the 15-year-old fan John Updike, in a letter I found in Gray’s papers at Boston University. “The drawing is simple and clear, but extremely effective,” Updike wrote. “You could tell just by looking at the faces who is the trouble maker and who isn't, without any dialogue. The facial features, the big, blunt fingered hands, the way you handle light and shadows are all excellently done." Updike’s reference to “light and shadows” refers to Gray’s other skill, in creating mood and atmosphere. Annie lives in a dark, somber, gothic world, where evil blank eyes are always peering out of windows.
Annie’s blank eyeballs were a convention Gray inherited from his artistic mentor Sidney Smith, who did "The Gumps." But artistically, as Gray explained to a fan in 1959, the blank eyeballs served to enhance reader involvement with the strip: not seeing what is going on in the eyes of the characters, readers could impose their own fears and concerns into the narrative. Recent comics theorists, most famously Scott McCloud in his frequently-cited book Understanding Comics, have argued that blank, empty characters (Charlie Brown, Mickey Mouse) are easier to identify with. Gray seems to have understood that instinctively.
Q: You argue that from its start in the mid-1920s the strip manifests a strain of conservative populism. The honest, hard-working, "just folks" Annie makes her indomitable way in a world full of elitists, social-climbing poseurs, and pointy-headed do-gooders. How did the strip respond to the economic catastrophe of 1929 and the New Deal that came in its wake?
A: While big business Republicans like Herbert Hoover were politically vanquished by the Great Depression, Harold Gray actually prospered during the 1930s, with Annie becoming the star of the most popular radio show of the decade. How can we explain this, given that Gray was as much an advocate of two-fisted capitalism as Hoover?
Whatever the merits of Hoover’s policies, the President was tone deaf in responding to the Depression because he adopted a harsh rhetoric that denied the reality of poverty. “Nobody is actually starving,” Hoover said as millions had to line up in soup kitchens. “The hoboes, for example, are better fed than they ever have been. One hobo in New York got ten meals in one day.”
Orphan Annie and “Daddy” Warbucks never voiced such complacently unfeeling indifference to poverty. Annie was poor even in the prosperous 1920s, often living as a hobo and begging for food when separated from her capitalist guardian Warbucks. In the 1920s Gray was a progressive Republican in the tradition of Theodore Roosevelt: He praised labour unions, public schools, feminist reforms (Annie dreams of being President like her hero Lincoln), and mocked anti-socialist rhetoric. In reaction to the New Deal, Gray became much more of a partisan right winger, turning the template of his story (Annie and Warbucks battling against powerful and corrupt forces) into an explicitly conservative populist allegory.
In 1931, Daddy Warbucks loses his fortune to unscrupulous Wall Street speculators, is blinded, and lives for a time as a street beggar. But after hitting bottom he regains his fighting spirit and outwits the Wall Street sharks who brought him and America low. By 1932, the villains in the strip are increasingly identified with the political left: snide bohemian intellectuals who mock traditional values, upper-crust class traitors who give money to communists, officious bureaucrats who hamper big business, corrupt labour union leaders who sabotage industry, demagogic politicians who stir up class envy in order to win elections, and busybody social workers who won’t let a poor orphan girl work for a living because of their silly child labor laws. Gray started to identify liberalism with elitism, a potent bit of political framing which continues to shape political discourse in America today.
Q: What have you learned from going through the archives, including the cartoonist's fan mail? What does it tell you about how people responded to his politics? Were there people who supported FDR but still rooted for the plucky little orphan?
A: Reading Gray’s correspondence with his fans was what made me fall in love with this project. There was such a rich array of letters from such a wide spectrum of readers: some from little kids, some from famous or soon-to-be famous people (as mentioned, John Updike and Clare Boothe Luce, but also the journalist Pete Hamill), most from ordinary, run-of-the-mill but often eloquent adults. Politically the letters are all over the place; some readers loved the way Gray attacked liberals, but many readers (Updike and Hamill are good examples) were New Deal supporters. One such reader wrote that we love Annie because she’s just plain folks like the rest of us and Gray should stop ruining her stories by attacking President Roosevelt, who is trying to help the real Annies.
What I’ve learned is that people don’t read comics in a passive way: Many Annie readers were bringing to the strip their own life experiences and worldviews. This really helps us understand the way comics can weave themselves into the everyday life of readers.
One example close to my heart: In 1942 Annie forms a group called the Junior Commandos to help the war effort by collecting scrap metal. One of the Junior Commandos is an African-American boy named George, who is shown to be intelligent and resourceful. Rare for the time, George was drawn in a realistic, non-stereotypical way. Gray received many letters from black readers, praising him for showing that their race was contributing to winning the war (although some black readers also felt George was a little bit too servile). Gray also received a letter from an editor of a newspaper in Mobile, Alabama, who was upset that a white girl was shown consorting with a black boy. In these letters, we can see how Annie provoked discussion about wartime racial politics.
Q: How did the strip respond to the Sixties? Did Warbucks support the Goldwater campaign? Was Annie menaced by hippies?
A: With the rise of “movement conservatism” and the Goldwater campaign, Gray responded to the times by making Daddy Warbucks and his allies even more militantly anti-communist than before (mind you, the strip had featured communist villains since the early 1930s). Dissatisfied with the mealy-mouthed diplomacy of the State Department, Daddy Warbucks and his private army fight a Castro-style Latin American dictator.
Throughout the 1960s beatniks and hippies are cast as villains. As one sympathetic character complains, “I wonder why we see all these peculiar people nowadays” like “th’ beatnik types, th’ ones with long hair, the ones with beads and funny clothes.” There is a fascinating sequence in 1967 showing anti-war protesters burning American flags, and then being attacked by a group of patriotic immigrants from around the world who love America. “We are loyal Americans defending our flag!” one immigrant proclaims. “What are you unclean vermin?” In some ways, this episode prefigures the hardhats versus hippies drama that Rick Perlstein describes in his great book Nixonland.
Like the conservative writers at National Review and the Republican Party itself, Gray also became more sympathetic to the South as a region, seeing it as a bastion of traditional values. Many of Annie’s adventures in the 1960s are set in the South, although the issue of civil rights is scrupulously avoided. This was a big shift for Gray, who in the 1920s started off as a Lincoln Republican (his middle name was even Lincoln), with Annie explicitly and implicitly criticizing racism. We can see the emergence of the “Southern strategy” in Annie.
Q: In the dissertation, you draw a parallel between Gray's populist sensibility and the work of Willmoore Kendall, the right-wing political philosopher. Actually that's where you hooked my attention -- very few people remember Kendall, let alone write about him (though Garry Wills portrays him in Confessions of a Conservative and Kendall is the inspiration for the title character of Saul Bellow's story "Mosby's Memoirs"). Since you aren't arguing that the thinker influenced the cartoonist or vice versa, how do you account for the affinity between Kendall's take on John Locke and Little Orphan Annie?
A: Kendall is a fascinating figure, deserving much more attention than he’s received (although John Judis in his biography of William F. Buckley does a good job of describing Kendall’s pivotal role as mentor to the founder of National Review). Prior to Kendall’s pathbreaking work (he flourished as a thinker from the 1940s until his death in 1968), conservative intellectuals were almost always openly elitist and anti-democratic: think of T.S. Eliot’s royalism, Albert Jay Nock’s pinning his hopes on the “saving remnant,” H.L. Mencken’s Nietzsche-inspired scorn for the booboisie, F.A. Hayek’s belief that the courts should be used to curb the rise of the welfare state.
Kendall broke with this tradition of celebrating hierarchy and fearing the masses. He firmly believed that the vast majority of the American people were “conservative in their hips” and that American political institutions were designed not to thwart the will of the majority but to articulate the deeply held conservative principles of the masses. As a conservative who was closer on a theoretical level to Jean-Jacques Rousseau than to Edmund Burke, Kendall recast the language of American populism in an anti-liberal direction. To his mind, liberals were “the establishment” which needed to be overthrown. As the historian George Nash noted, Kendall was “a populist and a conservative. The contrast with much aristocratic, even explicitly antipopulist, conservatism in the postwar period was striking.”
The reason Kendall’s in the thesis is that the overwhelming majority of the literature on mid-century American conservatism deals with elite political and intellectual figures like Buckley, James Burnham, Whittaker Chambers, Richard Nixon, Barry Goldwater, etc. Historians haven’t done such a good job of locating the origins of conservative ideas in the broader culture, in movies, songs, and comic strips. In drawing parallels between Kendall’s worldview and the ideas that were earlier articulated in Annie, I’m trying to show that high and low culture don’t exist in isolation, but are part of a larger conversation, with common ideas and images percolating up and down the line. Consider the phrase “egg-head,” which Kendall often used when insulting liberal professors (a rather cheeky term since he himself was a Yale man).
Long before the phrase “egg-head” was coined, cartoonists like Gray drew oval-faced professors who lacked common sense and sneered at practical-minded businessmen. I don’t know whether Kendall read Annie or not (although some of his colleagues at National Review clearly did, since they wrote about her in their magazine). But it seems to me incontestable that Kendall and Gray shared an overlapping worldview, and can usefully be compared. The fact that Gray’s conservative populism preceded Kendall’s work by a decade also raises interesting questions as to whether elite intellectuals are always at the vanguard of ideological change.
Q: When cultural studies began implanting itself in American academic life about 20 years ago, there was a strong tendency to discover and celebrate the "subversive" and "emancipatory" aspects of popular culture. There was some blend of wishful thinking and willful ignorance about this, at times -- along with a narrow present-mindedness that tended to ignore popular culture from earlier decades, or to look only at things that seemed "counter-hegemonic" in comforting ways. Do you see your work as an explicit challenge to that sort of cultural studies, with its ahistorical perspective and cookie-cutter hermeneutics? Or do you understand what you are doing as part of the "cultural turn" within the historical profession itself?
A: I completely agree with your characterization of early American cultural studies, especially in the form it took in the 1980s and 1990s. The whole “Madonna is subversive” schtick exhausted whatever limited value it had very quickly. So yes, I hope my work challenges the limits of this mode of thinking by being more historical, more grounded in archival research, and more attentive to the divergent political voices found in popular culture. One of the great things about working in archives is that the very diversity of voices you find in the past (as in the letters to Orphan Annie I’m working with) forces you to rethink any “ahistorical perspective” or “cookie-cutter hermeneutics” you may have started off with.
Having said that, I wouldn’t be able to do the work I do without the opening created by cultural studies. One of the points Kent Worcester and I make in our two anthologies is that there was a wide variety of very interesting writers (ranging from Gershon Legman to Thomas Mann) who wrote on comics in the past but it was only with the advent of cultural studies that comics were able to find a secure home in the academy, with an infrastructure of journals, conferences, and library support. Cultural studies has greatly expanded the academic opportunities for anyone interested in popular culture.
My own discipline of history has been transformed by cultural studies. As you properly note, there has been a “cultural turn” in history. To my way of thinking, this “cultural turn” can be traced back to the original British New Left of the 1950s, and especially the writings of E.P. Thompson. My own work might seem far afield from Thompson’s epic work on the British working class, the moral economy of food riots, and the politics of Romantic poetry. Still, for all that his work has been criticized and challenged in the last few decades, it remains for me the best example of how to do cultural history. Thompson had a great ear: he could pick up nuances from the past that other historians were simply too tone-deaf to hear. The voices of ordinary people, in all their tangled complexity, came through in Thompson’s work. As more historians grapple with culture, Thompson remains the model to follow. I doubt my work has anywhere near the value of Thompson’s, but as a close friend always tells me, you have to aim high.
On the evening of June 10, 2007, several million people watching "The Sopranos" experienced a moment of simultaneous bewilderment. During the final scene of the final episode of its final season (a montage in which the tension built steadily from cut to cut), the screen went blank -- and the soundtrack, the Journey power ballad "Don't Stop Believin'," went dead. The impending narrative climax never arrived. But neither was this an anticlimax, exactly; it did not seem related at all to the events taking place onscreen. Many viewers probably assumed it was a technical glitch.
Once the credits began rolling, any anger at the service provider was usually redirected to the program’s creators. The willing suspension of disbelief had been not so much broken as violated. The blank screen could be (and was) interpreted variously: as an indication that Tony Soprano was blown away by an assassin, perhaps, or as a gesture of hostility by David Chase (towards the audience, or HBO, or even the notion of closure itself).
But analysis was not payoff. The end remained frustrating. The Sopranos offered its viewers an aporia they couldn’t refuse.
As I write this column (scheduled to appear two years to the day after that final episode aired) the bibliography of academic commentary on "The Sopranos" runs to more than half a dozen volumes. That's not counting all the stray conference papers and scattered volumes with chapters on it, let alone the knickknack books offering Tony Soprano's management secrets.
Life being as short as it is, I have not kept up with the literature, but did recently pause in the middle of watching the third season to read the latest book-length commentary, The Sopranos by Dana Polan, a professor of cinema studies at New York University, published in March by Duke University Press.
His departmental affiliation notwithstanding, Polan’s analysis challenges the idea that "The Sopranos" was much more akin to film than to television programming. This is certainly one of the more familiar tropes in critical discussion of "The Sopranos," whether around the water cooler or in more formal settings. An associated line of thought identifies it with a tradition of “quality TV” -- as when a critic in The New York Times suggested that the series “is strangely like 'Brideshead Revisited,' 'The Singing Detective,' and 'I, Claudius.' ”
(The fact that Tony Soprano’s mother is named Livia certainly did seem like a nod to the latter show’s monstrous matriarch. At least one classicist has argued that the real-life Livia Drusilla of the first century was the victim of an unscrupulous smear campaign. I mention this for the convenience of anyone who wants to attempt a revisionist interpretation of Livia Soprano’s role. Good luck with that.)
Rather than going along with the familiar judgment that "The Sopranos" stood above and apart from the usual run of mass-cultural fare, Polan reads it as continuous with both the traditions of genre television and the hierarchy-scrambling protocols of the postmodern condition.
The thugs in Tony Soprano’s crew are familiar, obsessed even, with the Godfather films, and cite them constantly – a bit of intertextuality that left the audience constantly scrambling to find and extrapolate on allusions within the unfolding story. But Polan maintains that the show was structured at least as much by parallels to the old-fashioned situation comedy. Or rather, to the especially ironic variation on sitcom themes found in one program in particular, "The Simpsons."
“In this revised form,” writes Polan, “the job front is a complicated site lorded over by capricious and all-powerful bosses; the sons are slackers who would prefer to get in trouble or watch television than succeed at school; the daughter is a liberal and intellectually ambitious child who is dismayed by her father’s déclassé way of life and political incorrectness but who deep down loves him and looks for moments of conciliation; the wife is a homemaker who often searches for something meaningful to her existence and frequently tries to bring cultural or moral enrichment into the home; the bar is a male sanctuary; and there is an overall tone of postmodern fascination with citation and a general sense of today’s life as lived out in an immersion in popular culture and with behaviors frequently modeled on that culture.”
Someone posting at the New York Times blog Paper Cuts a few months ago took the entirely predictable route of charging the book with “taking all the fun out of our favorite unstable texts” by smearing jargon on slices of the show.
But surely I cannot be the only reader who will respond with a kind of wistful nostalgia to Polan’s recurrent, urgent insistence that postmodern irony is the organizing principle of "The Sopranos."
The show “frustrates easy judgment,” he writes, “by incorporating a multiplicity of critical positions into the text so that it becomes unclear to what extent there is one overall moral or thematic attitude that governs the work.”
Man, that really takes me back. While "The Sopranos" itself premiered in 1999, this interpretation has something very 1989-ish about it.... The Berlin Wall was in ruins, and so were the metanarratives. Joe Isuzu was introducing a new generation to the liar's paradox. And it seemed like if you could just make your irony sufficiently ironic, brute contingency would never touch you. Those were "good" times.
Yet formally self-conscious and deliberately ambiguous though it tended to be, "The Sopranos" was by no means so completely decentered in its “overall moral or thematic attitude” as all that. On the contrary, it seems to me to have been very definitely grounded in what might be called (for want of any better phrase) a deeply pessimistic Freudian moral sensibility.
That label may sound almost oxymoronic to most people. We tend to think of Freud’s work as a negation of moralism: an attempt to liberate the individual from the excessive demands of the social order. But his view of the world was a far cry from that of the therapeutic culture that took shape in his wake. He was skeptical about how much insight most patients could ever achieve -- let alone the benefits following from the effort. The mass of humanity, Freud said, was “riffraff.” The best the analyst could hope for was to cure the client of enough “neurotic misery” to be able to deal with “ordinary human unhappiness.”
A regular consumer of new therapeutic commodities like Tony’s sister Janice Soprano may expect to get some profound and satisfying self-transformation for her money. But the original psychoanalytic perspective was far more dubious. Freud also had misgivings about how his work would be received in the United States. While approaching America by ship in 1909 (this year marks the centennial of his lectures at Clark University), Freud took exception to Jung’s remark that they were bringing enlightenment to the New World. No, said Freud, their ship was delivering the plague.
Indeed, someone like Tony Soprano entering treatment would have been one of the old doctor’s worst nightmares about the fate of his work. The question of Dr. Melfi’s willingness to continue treating Tony (not simply the danger this presents to her, but the moral puzzle of what “improvement” would even mean in the case of a sociopath) runs throughout the series.
When Carmela Soprano decides to seek therapy, she is referred to an old immigrant analyst named Dr. Krakower who refuses to indulge her belief that Tony is fundamentally decent. This is, of course, something the viewers, too, have been encouraged to believe from time to time -- in spite of seeing it disproved in one brutal encounter after another.
“Many patients want to be excused for their current predicament,” says Dr. Krakower, “because of events that occurred in their childhood. That's what psychiatry has become in America. Visit any shopping mall or ethnic pride parade, and witness the results.” He then refuses to accept payment from Carmela, or to continue treatment, until she breaks with Tony: “I'm not charging you because I won't take blood money, and you can't, either. One thing you can never say is that you haven't been told.”
Dr. Krakower then disappears from the show. A present absence, so to speak. We, the viewers, have by that point had numerous reminders that we are deriving vicarious pleasure from seeing how Tony and his crew earn the blood money that Dr. Krakower won't touch. We have been given a very clear indication of the difference between complicity and some version of the Freudian moral stance.
The deep pessimism of that outlook comes through time and again as we see how powerful are the psychic undercurrents within the family. Far from it being “unclear to what extent there is one overall moral or thematic attitude that governs the work,” we are on a terrain of almost Victorian naturalism, in which rare moments of insight are no match for the blind play of urges that define each character.
Take, again, the example of New Age gangster moll Janice Soprano. In his book, Polan notes that she “keeps hooking up with the dysfunctional and violent heads of Mafia crews within Tony’s jurisdiction.” In spite of everything, she never learns from her mistakes.
Polan treats this as an example of “the amnesia plot” – a sly, pomo-ironic wink, perhaps, at all those times on "Gilligan’s Island" when somebody got hit on the head with a coconut.
But surely some other interpretation is possible. Outside the play of televisual signifiers, there are people who, in one crucial area or other of their lives, never learn a damned thing – or if they do, it still makes no difference, because they make the same mistakes each time a fresh opportunity presents itself. This is, perhaps, the essence of Freud’s distinction between neurotic misery and normal unhappiness.
Not that the old misogynist necessarily gives us the key to understanding Janice Soprano. But her behavior, cycling through its compulsions in spite of various therapists and gurus, is consistent with Freud’s grimmer estimates of human nature.
The virtual impossibility of changing one’s life (even when staying alive depends on it) was also the lesson of the gay mobster Vito Spatafore’s trajectory during the final season. Having fled both the closet and his murderously homophobic peers, Vito has every reason to settle down to an idyllic life in New Hampshire, where he has both a low-carbohydrate diet and a studly fireman boyfriend.
But Vito feels compelled to return to New Jersey and his old way of life, with predictable results. It all plays out like something inspired by Beyond the Pleasure Principle, in which Freud’s speculations on the repetition compulsion lead him to the concept of thanatos, the death drive.
When the screen went blank two years ago, it was, among other things, a disruption of our daydream-like engrossment in the world of the Sopranos. It was a sharp, even brutal reminder that the viewer had made an investment in Tony's life. The audience was left frustrated: we wanted him to either escape the consequences of his actions or get killed. Neither motive is exactly creditable, but daydreams often manifest truths we'd rather disavow.
Polan’s book is often insightful about the visual dimension of The Sopranos, if a bit reductive about treating its self-consciousness as generically postmodern. The program’s long shadow, he writes, “tells us something serious about the workings of popular culture in the media economies of today. Irony sells, and that matters.”
We all make different meanings out of the raw materials provided by any given cultural artifact – so in the spirit of hermeneutic laissez faire, I won’t quibble. But the realization that "irony sells" does not exhaust the show's meaning. It seems, rather, like something one of the brighter gangsters might say.
For this viewer's part, at least, the lesson of "The Sopranos" is rather different: Life is over soon enough, and it is not full of second chances – even though we tend to expect them. (We often prove really good at kidding ourselves about how many chances there are.) Be as ironic about life as you want; it doesn’t help. You end up just as dead.
Last week’s column took an admittedly nostalgic look back at public television as it was 30 years ago -- when its programming was forthrightly didactic and unabashed about indulging culture-vulture appetites. To a kid living in a rural town in Texas – one in which the school system could only aspire to mediocrity – the area PBS affiliate was as close to an Advanced Placement program as the circumstances would allow. And so I remain grateful.
The column also noted that KERA (the station in Dallas that served as my alma mater) has lately been providing serious arts coverage via its Web site. One recent offering, for example, was a podcast about the exhibit, at an area university, of artwork from Fluxus, an avant garde movement of the 1960s. Local arts coverage of any substance can’t be taken for granted. As it happened, the reporter and critic who recorded that podcast is Jerome Weeks. Until a couple of years ago, he was a staff book critic for one of the Dallas newspapers – until it, like so many others, started cutting back on that sort of thing.
Jerome (we have had beverages together, hence the first-name basis) was once a graduate student in English before being seduced away from academe by Ephemera, the muse of journalism. And now he’s been lured still farther away, into the world of broadcasting. I’d heard bits and pieces of his story but wanted to find out more -- on the assumption that it might not be completely atypical. The whole cultural infrastructure is in upheaval. The ability to reinvent yourself from time to time seems increasingly obligatory.
Around the time I was watching “Waiting for Godot” on PBS in the late 1970s, Jerome was a graduate student in literature. By 1980, he was enrolled in the Ph.D. program at the University of Texas at Austin -- on track to specialize, as it turns out, in Samuel Beckett, whose papers are on campus. “Seemed like a good idea at the time,” he told me by e-mail, “considering the wealth of 20th century material owned by UT’s Humanities Research Center. ‘Wealth’ is the applicable word. At the time, late-‘70s/early-‘80s UT was flush with money.”
But then, he says, “I burned out for the reasons almost every one of my grad student-friends did at the time. The employment market cratered and many departments responded feebly or went into shock; they hadn’t faced such a death spiral since the beginning of the baby boom.”
It was the dawn of a new system of low-overhead pedagogy. Graduate students could be counted on as an endlessly replenishing reserve army of academic labor. He says he realized that he “could do all of this research and writing, and still end up in career limbo…. Frankly, I was naïve enough to be shocked by academia’s eating-its-young, economic cynicism, the perfect preparation for associate profs today. It’s not that little has changed; it’s that it has expanded and become entrenched.”
A few years earlier, he'd had a brief taste of life as a newspaperman at the Detroit Free Press. “While my brush with journalism had convinced me how much I wanted to be a literature professor,” he says now, “my apprenticeship in academia convinced me how much graduate research looked like journalism without the tape recorder.”
Faced with a choice between two evils, it's usually best to pick the one with a reliable paycheck. By the mid-1980s, Jerome was on the staffs of various magazines and newspapers in Texas covering books and theater, including the occasional Beckett production. For a decade he was the book columnist for the Dallas Morning News -- and might well have expected to continue in that job for the rest of his life, had the newspaper business not started imploding.
Not quite three years ago, he took a buyout and started a blog at ArtsJournal called book/daddy (an allusion to the slang expressions “mack daddy” and “bone daddy,” and I suspect possibly also a delayed reaction to having heard Gayatri Spivak discuss phallogocentrism in Austin thirty years ago).
Then, with no experience in broadcasting, he was hired by KERA as producer and reporter for Art & Seek, a multimedia program covering the arts. He does short radio features, 10-minute TV interviews with authors, and articles for the station’s website.
“It sometimes seems I'm dispensing culture chat with an eyedropper,” he told me. “But seriously, how many people can you name who regularly interview authors and artists and review their work -- on television, on radio and online? Commercial radio and broadcast television do nothing, of course, and the arts on cable are mostly a joke.”
So is the idea that reliable coverage can be expected to emerge from blogging via spontaneous generation. Disobliging as it may be to press the point, reporting is a skill. “It's a lot easier to teach someone Web procedures and Web news needs than it is to teach the ins and outs of an area's arts-music-theater-literature scene and how to write intelligently about this particular art form, frame it in a wider context…. Whatever might be said against them, NPR and PBS do have audiences that expect a thoughtful quality to their news and analysis, so turning over this new venture in arts coverage to twenty-somethings who are savvy about Flash but know little about Feydeau really wouldn’t make sense.”
NPR and PBS remain, he says, "the only national, electronic media with a book-reading, museum-going, theater-watching, concert-listening audience. When I was the book critic for the Dallas Morning News, publishers' reps and even publishers themselves would tell me to my face that they'd prefer their touring author were interviewed on the local NPR station than written up by me. A radio talk was more likely to produce an audience for a bookstore appearance than anything in print.”
No doubt he is right about that. On the other hand, as another friend puts it, an awful lot of National Public Radio involves recycling what was in yesterday’s New York Times.
But some public broadcasting affiliates are experimenting with locally produced arts coverage – an encouraging development, but a difficult one to sustain. Original reviews and reporting require a staff, “preferably a knowledgeable staff,” says Jerome, “and as newspapers know, that's expensive.” The one for Art & Seek consists of four people who developed the site “only after lengthy sessions with local arts leaders” and work in collaboration with volunteers and tipsters.
“I’m enough of a geek," he told me, "to have fun with the new gizmos and techniques, to catch a perfect piece of audio for a radio report, to learn how to edit on Final Cut Pro. But my heart is still in the essay, the hammered-out argument that expands my own thinking as much as any reader’s, the critique that deftly nails a subject.” One example is his recent commentary (longer and more far-ranging than any newspaper would publish) on the history of American controversies over public funding for the arts. And his radio piece on Fluxus contained “enough quirky human interest (man with oddball art taking over his house) to satisfy my editors” while also giving him a chance to discuss a movement he’d long found interesting.
Edmund Wilson once wrote that cultural journalism had been a good way to get other people to pay for continuing his education. I'm not sure Wilson would thrive in a multimedia environment, but then again he's dead and doesn't have to worry about that now. For the living, existence is a matter of flux (if not of Fluxus); and as the example of Jerome Weeks suggests, half the art is just to land on your feet.
As an untenured professor I live in constant dread that my voice will (like Ben Stein's in "Ferris Bueller's Day Off") morph into an endless monotone that will meet an equally endless silence, and that things will get so desperate that only a choreographed rendition of “Twist and Shout” during a German American Day parade in Chicago will shake me and my students out of our stupor.
As the generational distance between me and my students grows (they’ve probably only seen these Gen-X-defining scenes on DVD or YouTube, if at all), it seems as if Bueller moments are unavoidable.
But for all of the examples of generational disconnect in the movies of the late director John Hughes -- particularly those produced when my junior colleagues and I came of age in the mid-1980s -- Hughes (who died this month) also offers cues for avoiding the Bueller Triangle where meaningful interaction among adults and youth simply vanishes. In this light, Hughes’s films are revelatory for educators.
For example, “Ferris Bueller’s Day Off” affirms the pedagogical strategies of effective teachers. Students want to take ownership of their learning. Like Ferris, they don’t want to be passive receptors of information but active creators of meaningful knowledge.
They don’t just want to study the historical, economic, political, psycho-sexual, and post-colonial contours of the red Ferrari. They want to drive it. We’ve got to enable them to go where their passions and curiosities lead them, and learn to teach them the significance of our “ologies” and “isms” from the passenger’s seat.
Living up to expectations landed the popular girl, the weirdo, the geek, the jock, and the rebel in detention in “The Breakfast Club.” Ironically, Saturday morning detention provided the safe space for conversation without which these otherwise disparate characters would not have discovered the right blend of commonality and individuality needed to resist life-threatening pressures.
Professors who provide safe spaces in and outside of the classroom for discerning conversation successfully bridge the gap between our expectations of students and students’ expectations of us. Free of ridicule and judgment, students are liberated to ask themselves the eternal question on the road to adulthood: “Who do I want to become?” For further reading, see “She’s Having a Baby.”
“That’s why they call them crushes,” Samantha Baker’s dad explains in a rare Hughes moment of adult clarity and compassion in “Sixteen Candles.” “If they were easy they’d call them something else.” More than just re-telling a tale of teenage crushes, Hughes illuminates the struggle for authenticity when it comes to romance, dating and sex. What was glaringly absent in 1984 is also missing today, especially in the collegiate “hook up” culture. We need more open-minded adults willing to listen to students before pragmatically proposing a list of dos and don’ts.
And adults like Andy Walsh’s broken-hearted father, Jack, or her eclectic boss, Iona, in “Pretty in Pink,” who teach young people by demonstrating what learning looks like -- neither relating to them as peers nor hovering to try to protect them from life’s inevitable failures -- provide the materials students need to make their own prom gowns, a now classic metaphor for navigating the drama of adolescence.
How many times did Hughes depict the power of privilege and the misuse of teenage social capital? Millennials have to navigate social differences, many of which may be more divisive than they were 20 years ago in “Some Kind of Wonderful” because they are more subtle. While it is true that we “can’t tell a book by its cover,” to quote the protagonist Keith Nelson, relational power plays continue, to use Watts’s retort, to reveal “how much it’s gonna cost you.”
Taking responsibility for privilege so that we might use it wisely involves understanding and owning our particular contexts rather than simply rejecting them. In fact, Hughes’s films provide ample fodder for unpacking Peggy McIntosh’s “invisible knapsack of privilege,” given his preference for white suburbia and demeaning portrayals of ethnic minorities.
So if we don’t want to forget about Hughes we should not only reminisce about the way his characters spoke directly to our various adolescent selves. We might also remember how not to behave as adults when it comes to engaging our successors.
After all, we’re no longer writing the papers for Mr. Bender in detention. We’re grading them.
Maureen H. O’Connell is an assistant professor of theology at Fordham University and a participant in the 2009-10 pre-tenure teaching workshop at Wabash College's Wabash Center for Teaching and Learning in Theology and Religion.
To be sick for very long, confined to bed for days on end, is boring. Worse, you feel it making you boring. The world shrinks to the dimensions of the illness and its treatment. Recuperation means that things return to their proper scale; you remember that existence is more than the sum of all symptoms.
The past few weeks – while undergoing tests that ruled out H1N1 and narrowed the diagnosis to some especially enthusiastic strain of viral bronchitis – I began to suspect that the tuberculosis patients in Thomas Mann’s novel had it lucky. On the Magic Mountain, there was prophetic dialogue about the impending collapse of Western Civilization, and flirtation with Russian countesses over dinner. At the same time, even. By contrast, the range of my own conversation was dwindling down to the potential side-effects of my prescribed antibiotics. (“May cause tendons to disintegrate.”)
Things were getting pretty dire when the Independent Film Channel began showing a six-part documentary called "Monty Python: Almost the Truth (The Lawyer’s Cut)." It was something like a happiness pill. Laughter, as the saying goes, is the best medicine. At least it won’t poison you.
The series, already available on DVD, is long on anecdote and short on cultural history -- which is probably for the best, given how the Pythons always treated professors and critics. Sure, it might add something to have Stuart Hall on screen to recount how "Monty Python's Flying Circus" was received at the Birmingham Centre for Contemporary Cultural Studies when the show first aired in 1969. But they would have probably made him wear lingerie.
And to judge by the documentary, this lingering academic habitus extended beyond the Pythons’ knack for turning cultural capital into high silliness. While brainstorming for their feature films Monty Python and the Holy Grail and The Life of Brian, the group enjoyed reading up on the Arthurian legends and the world of Roman-occupied Palestine. And it shows. The irreverence works because there is, to begin with, a core of reverence for the primary sources. That the Pythons could then create Dadaist collages out of Le Morte d’Arthur or the Dead Sea Scrolls – meanwhile doing things with the grammar of comedy it would take a seminar in Russian formalist critical theory even to begin to explain – is evidence of some kind of crazy collective genius.
While under the weather, I really wasn’t up to analyzing the Pythons. The idea was to let their humor lift my own. But sooner or later, the question was bound to come up: How had the professoriate responded to them?
At the Library of Congress, the earliest work I was able to get a look at was John O. Thompson’s Monty Python: Complete and Utter Theory of the Grotesque, published by the British Film Institute in 1982. This consisted of a series of excerpts from Python scripts juxtaposed with passages from Freud, Bakhtin, and other worthies. As a work of criticism, it was not very satisfying. It seemed less like a book than a packet of freeze-dried coffee crystals.
No surprise to find a volume of papers called Monty Python and Philosophy: Nudge, Nudge, Think, Think, published in 2006 by Open Court in its “Popular Culture and Philosophy” series. There are now at least three academic-press series of this sort, and it seems like only a matter of time before there is at least one volume on every prime-time TV program of the past half-century -- not excluding “Joanie Loves Chachi.” (Was the sitcom's worldview consequentialist or deontological? Discuss.)
With the Pythons, at least, there is an elective affinity between the show and philosophy as a discipline. I don’t know how much Wittgenstein any of them read while at Cambridge, but some of his work must have gotten through by osmosis. Many skits are examples of two or more language games in collision. And in an interview for the IFC documentary, John Cleese mentions that he’s always been impressed by Henri Bergson’s theory of the comic as a response to inflexible behavior.
The Open Court volume was a bit stronger than some of its ilk, but uneven. More consistently rewarding is Monty Python’s Flying Circus by Marcia Landy, published by Wayne State University Press in 2005. Landy, a professor of English and film studies at the University of Pittsburgh, gives an account of the ensemble's history, then provides a succinct analytic survey of the Pythons’ recurrent obsessions and themes, and explores their formal innovations, which owed as much to film and literature as to the history of television.
The book is thorough and very smart, and surprisingly compact. I read it in roughly the time required to eat a bowl of chicken soup. In some ways, Landy’s monograph seems like a primer for people who have never seen Python and wonder what the fuss is about. At the other extreme is Monty Python’s Flying Circus: An Utterly Complete, Thoroughly Unillustrated, Absolutely Unauthorized Guide to Possibly All the References From Arthur “Two Sheds” Jackson to Zambesi by Darl Larsen, published last year by Scarecrow Press.
Larsen, an associate professor of media arts at Brigham Young University, records and annotates literally thousands of literary, cultural, and political references and allusions made in the course of the show’s 45 episodes. It is the ultimate nerd encyclopedia. (I mean that, of course, in a good way.)
In 2003, the author published Monty Python, Shakespeare, and English Renaissance Drama (McFarland), based on a dissertation accepted by Northern Illinois University three years earlier. The argument of Larsen’s earlier work was not simply that the TV show sent up the Bard, among other figures, but that both Shakespeare and the Pythons had created a vocabulary of “Englishness” -- a set of endlessly citable, easily recognizable elements that came to embody various aspects of British culture and history. Like Shakespeare, the Pythons deployed “elements of satire, the grotesque, carnival, and ribald wordplay.” And both played fast and loose with real history in the interest of entertainment.
With his reference book (which at 550 pages resembles a phone directory), Larsen carries his argument to the next level. He tracks the scores of contemporary references and cultural allusions in each episode, explicating them as systematically as another scholar might gloss the in-jokes found in an Elizabethan poem or play. It is the product of hundreds of hours, at least, of watching the program -- and thousands more of research to document and annotate the references.
Here we are in the zone where Pythonophilia turns into Pythonomania. I was in awe of the book, and got in touch with Larsen to find out how he’d come to compile it.
“As I'd watched the 'Flying Circus' episodes as a fan and then a researcher,” he wrote me by e-mail, “I was struck by how many of the references flew right past me. Maybe because I was too young? Or too American? Maybe, but there was much going on. This wasn't 'Benny Hill' with a silly costume and a fixed leer -- there was more. The Pythons were clearly waving their Oxbridge educations, their collective fascination with history and popular culture, their love/hate of the TV medium in the faces of stodgy BBC Directors General and Programme Planners and the general viewing public…. The episodes seemed ripe for annotation, simply, and I couldn't help myself.”
He had a model in mind: Larsen says that during his student years he was deeply impressed by the apparatus for A.C. Hamilton’s edition of The Faerie Queene. The extensive annotations brought Spenser’s allegorical poem “into real currency” for him, Larsen says, “and made studying for my comp exams much more bearable.”
His desire to map the Pythonian intertext was also driven by dissatisfaction. “It kind of irked me,” he explains, “that studying D.H. Lawrence was perfectly acceptable, but that we should avoid artists or works who reference or are influenced by Lawrence and his world, as the Pythons clearly were. The Pythons are as much about 20th century philosophy and Man's place in a lonely universe as they are about dead parrots and missing cheese…. Try and think of them this way: The Pythons were born in a time of world war, grew up in austerity and privation, and came of age just when people like Sartre and The Beatles were doing their best work.”
Well, no need to persuade me. As Eric Idle says in the spam sketch, “I love it.” But how did colleagues respond to his interest in Python?
As a graduate student, Larsen had the support of “the eminent Shakespearean scholar William Proctor Williams, who both suffered and championed me and the subject matter through the dissertation process.” But to continue with Python research after his first book was accepted for publication was not an obvious career-booster. At BYU, he says, “there were those who were concerned, perhaps rightly, that working on such things between a third- and sixth-year review might not be the best use of time, and they said so. Pig-headed, I pushed on.”
Early proofs of his reference work were included when Larsen's sixth-year portfolio went for outside review. The response to his work “was very heartening (and I achieved rank and status), with one interesting anomaly. One reviewer praised the scope and depth of the scholarship before essentially shaking his head in wonderment at the ‘silly’ subject matter.”
Well, yes. Quite. Silliness being, after all, the point.
This weekend, having just recovered from my bout of illness, I attended a performance of Ben Jonson’s The Alchemist at the Shakespeare Theatre in Washington. This play, written in 1610, contains a conspicuous number of proto- or quasi-Pythonesque elements: upper-class twits, religious fanatics, horny babes, a fake Spaniard, and improbable plotlines that collide like drunks on a bender. There was also at least one anilingus joke. And it does help to know at least a little about alchemy, since the playwright is making fun of that, too.
The whole thing was very silly indeed. Anyone claiming otherwise just wasn’t paying attention. But there is the merely silly and the greatly silly. The great stuff lasts over time. It improves the quality of life. Unless, of course, it kills you.