On the evening of June 10, 2007, several million people watching "The Sopranos" experienced a moment of simultaneous bewilderment. During the final scene of the final episode of its final season (a montage in which the tension built up steadily from cut to cut) the screen went blank -- and the soundtrack, consisting of the Journey power ballad "Don't Stop Believin'," went dead with it. The impending narrative climax never arrived. But neither was this an anticlimax, exactly; it did not seem to be related at all to the events taking place onscreen. Many viewers probably assumed it was a technical glitch.
Once the credits began rolling, any anger at the service provider was usually redirected to the program’s creators. The willing suspension of disbelief had been not so much broken as violated. The blank screen could be (and was) interpreted variously: as an indication that Tony Soprano was blown away by an assassin, perhaps, or as a gesture of hostility by David Chase (towards the audience, or HBO, or even the notion of closure itself).
But analysis was not payoff. The end remained frustrating. The Sopranos offered its viewers an aporia they couldn’t refuse.
As I write this column (scheduled to appear two years to the day after that final episode aired) the bibliography of academic commentary on "The Sopranos" runs to more than half a dozen volumes. That's not counting all the stray conference papers and scattered volumes with chapters on it, let alone the knickknack books offering Tony Soprano's management secrets.
Life being as short as it is, I have not kept up with the literature, but did recently pause in the middle of watching the third season to read the latest book-length commentary, The Sopranos by Dana Polan, a professor of cinema studies at New York University, published in March by Duke University Press.
His departmental affiliation notwithstanding, Polan’s analysis challenges the idea that "The Sopranos" was much more akin to film than to television programming. This is certainly one of the more familiar tropes in critical discussion of "The Sopranos," whether around the water cooler or in more formal settings. An associated line of thought identifies it with a tradition of “quality TV” -- as when a critic in The New York Times suggested that the series “is strangely like 'Brideshead Revisited,' 'The Singing Detective,' and 'I, Claudius.' ”
(The fact that Tony Soprano’s mother is named Livia certainly did seem like a nod to the latter show’s monstrous matriarch. At least one classicist has argued that the real-life Livia Drusilla of the first century was the victim of an unscrupulous smear campaign. I mention this for the convenience of anyone who wants to attempt a revisionist interpretation of Livia Soprano’s role. Good luck with that.)
Rather than going along with the familiar judgment that "The Sopranos" stood above and apart from the usual run of mass-cultural fare, Polan reads it as continuous with both the traditions of genre television and the hierarchy-scrambling protocols of the postmodern condition.
The thugs in Tony Soprano’s crew are familiar, obsessed even, with the Godfather films, and cite them constantly – a bit of intertextuality that kept the audience scrambling to find and extrapolate allusions within the unfolding story. But Polan maintains that the show was structured at least as much by parallels to the old-fashioned situation comedy. Or rather, to the especially ironic variation on sitcom themes found in one program in particular, "The Simpsons."
“In this revised form,” writes Polan, “the job front is a complicated site lorded over by capricious and all-powerful bosses; the sons are slackers who would prefer to get in trouble or watch television than succeed at school; the daughter is a liberal and intellectually ambitious child who is dismayed by her father’s déclassé way of life and political incorrectness but who deep down loves him and looks for moments of conciliation; the wife is a homemaker who often searches for something meaningful to her existence and frequently tries to bring cultural or moral enrichment into the home; the bar is a male sanctuary; and there is an overall tone of postmodern fascination with citation and a general sense of today’s life as lived out in an immersion in popular culture and with behaviors frequently modeled on that culture.”
Someone posting at the New York Times blog Paper Cuts a few months ago took the entirely predictable route of charging the book with “taking all the fun out of our favorite unstable texts” by smearing jargon on slices of the show.
But surely I cannot be the only reader who will respond with a kind of wistful nostalgia to Polan’s recurrent, urgent insistence that postmodern irony is the organizing principle of "The Sopranos."
The show “frustrates easy judgment,” he writes, “by incorporating a multiplicity of critical positions into the text so that it becomes unclear to what extent there is one overall moral or thematic attitude that governs the work.”
Man, that really takes me back. While "The Sopranos" itself premiered in 1999, this interpretation has something very 1989-ish about it.... The Berlin Wall was in ruins, and so were the metanarratives. Joe Isuzu was introducing a new generation to the liar's paradox. And it seemed like if you could just make your irony sufficiently ironic, brute contingency would never touch you. Those were "good" times.
Yet formally self-conscious and deliberately ambiguous though it tended to be, "The Sopranos" was by no means so completely decentered in its “overall moral or thematic attitude” as all that. On the contrary, it seems to me to have been very definitely grounded in what might be called (for want of any better phrase) a deeply pessimistic Freudian moral sensibility.
That label may sound almost oxymoronic to most people. We tend to think of Freud’s work as a negation of moralism: an attempt to liberate the individual from the excessive demands of the social order. But his view of the world was a far cry from that of the therapeutic culture that took shape in his wake. He was skeptical about how much insight most patients could ever achieve -- let alone the benefits following from the effort. The mass of humanity, Freud said, was “riffraff.” The best the analyst could hope for was to cure the client of enough “neurotic misery” to be able to deal with “ordinary human unhappiness.”
A regular consumer of new therapeutic commodities like Tony’s sister Janice Soprano may expect to get some profound and satisfying self-transformation for her money. But the original psychoanalytic perspective was far more dubious. Freud also had misgivings about how his work would be received in the United States. While approaching America by ship in 1909 (this year marks the centennial of his lectures at Clark University), Freud took exception to Jung’s remark that they were bringing enlightenment to the New World. No, said Freud, their ship was delivering the plague.
Indeed, someone like Tony Soprano entering treatment would have been one of the old doctor’s worst nightmares about the fate of his work. The question of Dr. Melfi’s willingness to continue treating Tony (not simply the danger this presents to her, but the moral puzzle of what “improvement” would even mean in the case of a sociopath) runs throughout the series.
When Carmela Soprano decides to seek therapy, she is referred to an old immigrant analyst named Dr. Krakower who refuses to indulge her belief that Tony is fundamentally decent. This is, of course, something the viewers, too, have been encouraged to believe from time to time -- in spite of seeing it disproved in one brutal encounter after another.
“Many patients want to be excused for their current predicament,” says Dr. Krakower, “because of events that occurred in their childhood. That's what psychiatry has become in America. Visit any shopping mall or ethnic pride parade, and witness the results.” He then refuses to accept payment from Carmela, or to continue treatment, until she breaks with Tony: “I'm not charging you because I won't take blood money, and you can't, either. One thing you can never say is that you haven't been told.”
Dr. Krakower then disappears from the show. A present absence, so to speak. We, the viewers, have by that point had numerous reminders that we are deriving vicarious pleasure from seeing how Tony and his crew earn the blood money that Dr. Krakower won't touch. We have been given a very clear indication of the difference between complicity and some version of the Freudian moral stance.
The deep pessimism of that outlook comes through time and again as we see how powerful are the psychic undercurrents within the family. Far from it being “unclear to what extent there is one overall moral or thematic attitude that governs the work,” we are on a terrain of almost Victorian naturalism, in which rare moments of insight are no match for the blind play of urges that define each character.
Take, again, the example of New Age gangster moll Janice Soprano. In his book, Polan notes that she “keeps hooking up with the dysfunctional and violent heads of Mafia crews within Tony’s jurisdiction.” In spite of everything, she never learns from her mistakes.
Polan treats this as an example of “the amnesia plot” – a sly, pomo-ironic wink, perhaps, at all those times on "Gilligan’s Island" when somebody got hit on the head with a coconut.
But surely some other interpretation is possible. Outside the play of televisual signifiers, there are people who, in one crucial area or other of their lives, never learn a damned thing – or if they do, it still makes no difference, because they make the same mistakes each time a fresh opportunity presents itself. This is, perhaps, the essence of Freud’s distinction between neurotic misery and normal unhappiness.
Not that the old misogynist necessarily gives us the key to understanding Janice Soprano. But her behavior, cycling through its compulsions in spite of various therapists and gurus, is consistent with Freud’s grimmer estimates of human nature.
The virtual impossibility of changing one’s life (even when staying alive depends on it) was also the lesson of the gay mobster Vito Spatafore’s trajectory during the final season. Having fled both the closet and his murderously homophobic peers, Vito has every reason to settle down to an idyllic life in New Hampshire, where he has both a low-carbohydrate diet and a studly fireman boyfriend.
But Vito feels compelled to return to New Jersey and his old way of life, with predictable results. It all plays out like something inspired by Beyond the Pleasure Principle, in which Freud’s speculations on the repetition compulsion lead him to the concept of thanatos, the death drive.
When the screen went blank two years ago, it was, among other things, a disruption of our daydream-like engrossment in the world of the Sopranos. It was a sharp, even brutal reminder that the viewer had made an investment in Tony's life. The audience was left frustrated: we wanted him to either escape the consequences of his actions or get killed. Neither motive is exactly creditable, but daydreams often manifest truths we'd rather disavow.
Polan’s book is often insightful about the visual dimension of The Sopranos, if a bit reductive about treating its self-consciousness as generically postmodern. The program’s long shadow, he writes, “tells us something serious about the workings of popular culture in the media economies of today. Irony sells, and that matters.”
We all make different meanings out of the raw materials provided by any given cultural artifact – so in the spirit of hermeneutic laissez faire, I won’t quibble. But the realization that "irony sells" does not exhaust the show's meaning. It seems, rather, like something one of the brighter gangsters might say.
For this viewer's part, at least, the lesson of "The Sopranos" is rather different: Life is over soon enough, and it is not full of second chances – even though we tend to expect them. (We often prove really good at kidding ourselves about how many chances there are.) Be as ironic about life as you want; it doesn’t help. You end up just as dead.
Last week’s column took an admittedly nostalgic look back at public television as it was 30 years ago -- when its programming was forthrightly didactic and unabashed about indulging culture-vulture appetites. To a kid living in a rural town in Texas – one in which the school system could only aspire to mediocrity – the area PBS affiliate was as close to an Advanced Placement program as the circumstances would allow. And so I remain grateful.
The column also noted that KERA (the station in Dallas that served as my alma mater) has lately been providing serious arts coverage via its Web site. One recent offering, for example, was a podcast about the exhibit, at an area university, of artwork from Fluxus, an avant-garde movement of the 1960s. Local arts coverage of any substance can’t be taken for granted. As it happened, the reporter and critic who recorded that podcast is Jerome Weeks. Until a couple of years ago, he was a staff book critic for one of the Dallas newspapers – until it, like so many others, started cutting back on that sort of thing.
Jerome (we have had beverages together, hence the first-name basis) was once a graduate student in English before being seduced away from academe by Ephemera, the muse of journalism. And now he’s been lured still farther away, into the world of broadcasting. I’d heard bits and pieces of his story but wanted to find out more -- on the assumption that it might not be completely atypical. The whole cultural infrastructure is in upheaval. The ability to reinvent yourself from time to time seems increasingly obligatory.
Around the time I was watching “Waiting for Godot” on PBS in the late 1970s, Jerome was a graduate student in literature. By 1980, he was enrolled in the Ph.D. program at the University of Texas at Austin -- on track to specialize, as it turns out, in Samuel Beckett, whose papers are on campus. “Seemed like a good idea at the time,” he told me by e-mail, “considering the wealth of 20th century material owned by UT’s Humanities Research Center. ‘Wealth’ is the applicable word. At the time, late-‘70s/early-‘80s UT was flush with money.”
But then, he says, “I burned out for the reasons almost every one of my grad student-friends did at the time. The employment market cratered and many departments responded feebly or went into shock; they hadn’t faced such a death spiral since the beginning of the baby boom.”
It was the dawn of a new system of low-overhead pedagogy. Graduate students could be counted on as an endlessly replenishing reserve army of academic labor. He says he realized that he “could do all of this research and writing, and still end up in career limbo…. Frankly, I was naïve enough to be shocked by academia’s eating-its-young, economic cynicism, the perfect preparation for associate profs today. It’s not that little has changed; it’s that it has expanded and become entrenched.”
A few years earlier, he'd had a brief taste of life as a newspaperman at the Detroit Free Press. “While my brush with journalism had convinced me how much I wanted to be a literature professor,” he says now, “my apprenticeship in academia convinced me how much graduate research looked like journalism without the tape recorder.”
Faced with a choice between two evils, it's usually best to pick the one with a reliable paycheck. By the mid-1980s, Jerome was on the staffs of various magazines and newspapers in Texas covering books and theater, including the occasional Beckett production. For a decade he was the book columnist for the Dallas Morning News -- and might well have expected to continue in that job for the rest of his life, had the newspaper business not started imploding.
Not quite three years ago, he took a buyout and started a blog at ArtsJournal called book/daddy (an allusion to the slang expressions “mack daddy” and “bone daddy,” and I suspect possibly also a delayed reaction to having heard Gayatri Spivak discuss phallogocentrism in Austin thirty years ago).
Then, with no experience in broadcasting, he was hired by KERA as producer and reporter for Art & Seek, a multimedia program covering the arts. He does short radio features, 10-minute TV interviews with authors, and articles for the station’s website.
“It sometimes seems I'm dispensing culture chat with an eyedropper,” he told me. “But seriously, how many people can you name who regularly interview authors and artists and review their work -- on television, on radio and online? Commercial radio and broadcast television do nothing, of course, and the arts on cable are mostly a joke.”
So is the idea that reliable coverage can be expected from blogging via spontaneous generation. Disobliging as it may be to press the point, reporting is a skill. “It's a lot easier to teach someone Web procedures and Web news needs than it is to teach the ins and outs of an area's arts-music-theater-literature scene and how to write intelligently about this particular art form, frame it in a wider context…. Whatever might be said against them, NPR and PBS do have audiences that expect a thoughtful quality to their news and analysis, so turning over this new venture in arts coverage to twenty-somethings who are savvy about Flash but know little about Feydeau really wouldn’t make sense.”
NPR and PBS remain, he says, "the only national, electronic media with a book-reading, museum-going, theater-watching, concert-listening audience. When I was the book critic for the Dallas Morning News, publishers' reps and even publishers themselves would tell me to my face that they'd prefer their touring author were interviewed on the local NPR station than written up by me. A radio talk was more likely to produce an audience for a bookstore appearance than anything in print.”
No doubt he is right about that. On the other hand, as another friend puts it, an awful lot of National Public Radio involves recycling what was in yesterday’s New York Times.
But some public broadcasting affiliates are experimenting with locally produced arts coverage – an encouraging development, but difficult to sustain. Original reviews and reporting require a staff, “preferably a knowledgeable staff,” says Jerome, “and as newspapers know, that's expensive.” The one for Art & Seek consists of four people who developed the site “only after lengthy sessions with local arts leaders” and work in collaboration with volunteers and tipsters who also contribute.
“I’m enough of a geek," he told me, "to have fun with the new gizmos and techniques, to catch a perfect piece of audio for a radio report, to learn how to edit on Final Cut Pro. But my heart is still in the essay, the hammered-out argument that expands my own thinking as much as any reader’s, the critique that deftly nails a subject.” One example is his recent commentary (longer and more far-ranging than any newspaper would publish) on the history of American controversies over public funding for the arts. And his radio piece on Fluxus contained “enough quirky human interest (man with oddball art taking over his house) to satisfy my editors” while also giving him a chance to discuss a movement he’d long found interesting.
Edmund Wilson once wrote that cultural journalism had been a good way to get other people to pay for continuing his education. I'm not sure Wilson would thrive in a multimedia environment, but then again he's dead and doesn't have to worry about that now. For the living, existence is a matter of flux (if not of Fluxus); and as the example of Jerome Weeks suggests, half the art is just to land on your feet.
As an untenured professor I live in constant dread that my voice will (like Ben Stein's in "Ferris Bueller's Day Off") morph into an endless monotone that will meet an equally endless silence, and that things will get so desperate that only a choreographed rendition of “Twist and Shout” during a German American Day parade in Chicago will shake me and my students out of our stupor.
As the generational distance between me and my students grows (they’ve probably only seen these Gen-X-defining scenes on DVD or YouTube, if at all), it seems as if Bueller moments are unavoidable.
But for all of the examples of generational disconnect in the movies of the late director John Hughes -- particularly those produced when my junior colleagues and I came of age in the mid-1980s -- Hughes (who died this month) also offers cues for avoiding the Bueller Triangle where meaningful interaction among adults and youth simply vanishes. In this light, Hughes’s films are revelatory for educators.
For example, “Ferris Bueller’s Day Off” affirms the pedagogical strategies of effective teachers. Students want to take ownership of their learning. Like Ferris, they don’t want to be passive receptors of information but active creators of meaningful knowledge.
They don’t just want to study the historical, economic, political, psycho-sexual, and post-colonial contours of the red Ferrari. They want to drive it. We’ve got to enable them to go where their passions and curiosities lead them, and learn to teach them the significance of our “ologies” and “isms” from the passenger’s seat.
Living up to expectations landed the popular girl, the weirdo, the geek, the jock and the rebel in “The Breakfast Club.” Ironically, Saturday morning detention provided safe space for conversation without which these otherwise disparate characters would not have discovered the right blend of commonality and individuality needed to resist life-threatening pressures.
Professors who provide safe spaces in and outside of the classroom for discerning conversation successfully bridge the gap between our expectations of students and students’ expectations of us. Free of ridicule and judgment, students are liberated to ask themselves the eternal question on the road to adulthood: “Who do I want to become?” For further reading, see “She’s Having a Baby.”
“That’s why they call them crushes,” Samantha Baker’s dad explains in a rare Hughes moment of adult clarity and compassion in “Sixteen Candles.” “If they were easy they’d call them something else.” More than just re-telling a tale of teenage crushes, Hughes illuminates the struggle for authenticity when it comes to romance, dating and sex. What was glaringly absent in 1984 is also missing today, especially in the collegiate “hook up” culture. We need more open-minded adults willing to listen to students before pragmatically proposing a list of dos and don’ts.
And adults like Andy Walsh’s broken-hearted father, Jack, or her eclectic boss, Iona, in “Pretty in Pink,” who teach young people by demonstrating what learning looks like -- neither relating to them as peers nor hovering to try to protect them from life’s inevitable failures -- provide the materials students need to make their own prom gowns, a now classic metaphor for navigating the drama of adolescence.
How many times did Hughes depict the power of privilege and the misuse of teenage social capital? Millennials have to navigate social differences, many of which may be more divisive than they were 20 years ago in “Some Kind of Wonderful” because they are more subtle. While it is true that we “can’t tell a book by its cover,” to quote the protagonist Keith Nelson, relational power plays continue, to use Watts’s retort, to reveal “how much it’s gonna cost you.”
Taking responsibility for privilege so that we might use it wisely involves understanding and owning our particular contexts rather than simply rejecting them. In fact, Hughes’s films provide ample fodder for unpacking Peggy McIntosh’s “invisible knapsack of privilege,” given his preference for white suburbia and demeaning portrayals of ethnic minorities.
So if we don’t want to forget about Hughes we should not only reminisce about the way his characters spoke directly to our various adolescent selves. We might also remember how not to behave as adults when it comes to engaging our successors.
After all, we’re no longer writing the papers for Mr. Bender in detention. We’re grading them.
Maureen H. O’Connell is an assistant professor of theology at Fordham University and a participant in the 2009-10 pre-tenure teaching workshop at Wabash College's Wabash Center for Teaching and Learning in Theology and Religion.
To be sick for very long, confined to bed for days on end, is boring. Worse, you feel it making you boring. The world shrinks to the dimensions of the illness and its treatment. Recuperation means that things return to their proper scale; you remember that existence is more than the sum of all symptoms.
The past few weeks – while undergoing tests that ruled out H1N1 and narrowed the diagnosis to some especially enthusiastic strain of viral bronchitis – I began to suspect that the tuberculosis patients in Thomas Mann’s novel had it lucky. On the Magic Mountain, there was prophetic dialogue about the impending collapse of Western Civilization, and flirtation with Russian countesses over dinner. At the same time, even. By contrast, the range of my own conversation was dwindling down to the potential side-effects of my prescribed antibiotics. (“May cause tendons to disintegrate.”)
Things were getting pretty dire when the Independent Film Channel began showing a six-part documentary called "Monty Python: Almost the Truth (The Lawyer’s Cut)." It was something like a happiness pill. Laughter, as the saying goes, is the best medicine. At least it won’t poison you.
The series, already available on DVD, is long on anecdote and short on cultural history -- which is probably for the best, given how the Pythons always treated professors and critics. Sure, it might add something to have Stuart Hall on screen to recount how "Monty Python's Flying Circus" was received at the Birmingham Centre for Contemporary Cultural Studies when the show first aired in 1969. But they would have probably made him wear lingerie.
And to judge by the documentary, this lingering academic habitus extended beyond the Pythons’ knack for turning cultural capital into high silliness. While brainstorming for their feature films Monty Python and the Holy Grail and The Life of Brian, the group enjoyed reading up on the Arthurian legends and the world of Roman-occupied Palestine. And it shows. The irreverence works because there is, to begin with, a core of reverence for the primary sources. That the Pythons could then create Dadaist collages out of Le Morte d’Arthur or the Dead Sea Scrolls – meanwhile doing things with the grammar of comedy it would take a seminar in Russian formalist critical theory even to begin to explain – is evidence of some kind of crazy collective genius.
While under the weather, I really wasn’t up to analyzing the Pythons. The idea was to let their humor lift my own. But sooner or later, the question was bound to come up: How had the professoriate responded to them?
At the Library of Congress, the earliest work I was able to get a look at was John O. Thompson’s Monty Python: Complete and Utter Theory of the Grotesque, published by the British Film Institute in 1982. This consisted of a series of excerpts from Python scripts juxtaposed with passages from Freud, Bakhtin, and other worthies. As a work of criticism, this was not that satisfying to read. It seemed less like a book than a packet of freeze-dried coffee crystals.
No surprise to find a volume of papers called Monty Python and Philosophy: Nudge, Nudge, Think, Think, published in 2006 by Open Court in its “Popular Culture and Philosophy” series. There are now at least three academic-press series of this sort, and it seems like a matter of time before there is at least one volume on every prime-time TV program of the past half-century -- not excluding “Joanie Loves Chachi.” (Was the sitcom's worldview consequentialist or deontological? Discuss.)
With the Pythons, at least, there is an elective affinity between the show and philosophy as a discipline. I don’t know how much Wittgenstein any of them read while at Cambridge, but some of his work must have gotten through by osmosis. Many skits are examples of two or more language games in collision. And in an interview for the IFC documentary, John Cleese mentions that he’s always been impressed by Henri Bergson’s theory of the comic as a response to inflexible behavior.
The Open Court volume was a bit stronger than some of its ilk, but uneven. More consistently rewarding is Monty Python’s Flying Circus by Marcia Landy, published by Wayne State University Press in 2005. Landy, a professor of English and film studies at the University of Pittsburgh, gives an account of the ensemble's history, then provides a succinct analytic survey of the Pythons’ recurrent obsessions and themes, and explores their formal innovations, which owed as much to film and literature as to the history of television.
The book is thorough and very smart, and surprisingly compact. I read it in roughly the time required to eat a bowl of chicken soup. In some ways, Landy’s monograph seems like a primer for people who have never seen Python and wonder what the fuss is about. At the other extreme is Monty Python’s Flying Circus: An Utterly Complete, Thoroughly Unillustrated, Absolutely Unauthorized Guide to Possibly All the References From Arthur “Two Sheds” Jackson to Zambesi by Darl Larsen, published last year by Scarecrow Press.
Larsen, an associate professor of media arts at Brigham Young University, records and annotates literally thousands of literary, cultural, and political references and allusions made in the course of the show’s 45 episodes. It is the ultimate nerd encyclopedia. (I mean that, of course, in a good way.)
In 2003, the author published Monty Python, Shakespeare, and English Renaissance Drama (McFarland), based on a dissertation accepted by Northern Illinois University three years earlier. Larsen’s earlier argument was not simply that the TV show sent up the Bard, among other figures, but that both Shakespeare and the Pythons had created a vocabulary of “Englishness” -- a set of endlessly citable, easily recognizable elements that came to embody various aspects of British culture and history. Like Shakespeare, the Pythons deployed “elements of satire, the grotesque, carnival, and ribald wordplay.” And both played fast and loose with real history in the interest of entertainment.
With his reference book (which at 550 pages resembles a phone directory) Larsen carries his argument to the next level. He tracks the scores of contemporary references and cultural allusions in each episode, explicating them as systematically as another scholar might gloss the in-jokes found in an Elizabethan poem or play. It is the product of hundreds of hours, at least, of watching the program -- and thousands more of research to document and annotate the references.
Here we are in the zone where Pythonophilia turns into Pythonomania. I was in awe of the book, and got in touch with Larsen to find out how he’d come to compile it.
“As I'd watched the 'Flying Circus' episodes as a fan and then a researcher,” he wrote me by e-mail, “I was struck by how many of the references flew right past me. Maybe because I was too young? Or too American? Maybe, but there was much going on. This wasn't 'Benny Hill' with a silly costume and a fixed leer -- there was more. The Pythons were clearly waving their Oxbridge educations, their collective fascination with history and popular culture, their love/hate of the TV medium in the faces of stodgy BBC Directors General and Programme Planners and the general viewing public…. The episodes seemed ripe for annotation, simply, and I couldn't help myself.”
He had a model in mind: Larsen says that during his student years he was deeply impressed by the apparatus for A.C. Hamilton’s edition of The Faerie Queene. The extensive annotations brought Spenser’s allegorical poem “into real currency” for him, Larsen says, “and made studying for my comp exams much more bearable.”
His desire to map the Pythonian intertext was also driven by dissatisfaction. “It kind of irked me,” he explains, “that studying D.H. Lawrence was perfectly acceptable, but that we should avoid artists or works who reference or are influenced by Lawrence and his world, as the Pythons clearly were. The Pythons are as much about 20th century philosophy and Man's place in a lonely universe as they are about dead parrots and missing cheese…. Try and think of them this way: The Pythons were born in a time of world war, grew up in austerity and privation, and came of age just when people like Sartre and The Beatles were doing their best work.”
Well, no need to persuade me. As Eric Idle says in the spam sketch, “I love it.” But how did colleagues respond to his interest in Python?
As a graduate student, Larsen had the support of “the eminent Shakespearean scholar William Proctor Williams, who both suffered and championed me and the subject matter through the dissertation process.” But to continue with Python research after his first book was accepted for publication was not an obvious career-booster. At BYU, he says, “there were those who were concerned, perhaps rightly, that working on such things between a third- and sixth-year review might not be the best use of time, and they said so. Pig-headed, I pushed on.”
Early proofs of his reference work were included when Larsen's sixth-year portfolio went for outside review. The response to his work “was very heartening (and I achieved rank and status), with one interesting anomaly. One reviewer praised the scope and depth of the scholarship before essentially shaking his head in wonderment at the ‘silly’ subject matter.”
Well, yes. Quite. Silliness being, after all, the point.
This weekend, having just recovered from my bout of illness, I attended a performance of Ben Jonson’s The Alchemist at the Shakespeare Theater in Washington. This play, written in 1610, contains a conspicuous number of proto- or quasi-Pythonesque elements: upper-class twits, religious fanatics, horny babes, a fake Spaniard, and improbable plotlines that collide like drunks on a bender. There was also at least one anilingus joke. And it does help to know at least a little about alchemy, since the playwright is making fun of that, too.
The whole thing was very silly indeed. Anyone claiming otherwise just wasn’t paying attention. But there is the merely silly and the greatly silly. The great stuff lasts over time. It improves the quality of life. Unless, of course, it kills you.
Ten years ago, in the final pages of a collection of his selected writings, Cornel West gave readers a look at the work he had in progress, or at least in mind, for the years ahead. One would be “a major treatment of African-American literature and modern Greek literature.” Another was “a meditation on Chekhov and Coltrane that delves into the distinctive conceptions of the tragic in American civilization and of the comic in Russian civilization.” He would be writing an intellectual autobiography “modeled on black musical forms.” Nor had he given up on plans to complete a study of David Hume. There would also be a book on Josiah Royce.
West described his projects as “bold,” “challenging” and “exciting.” These are adjectives, it must be said, better applied by someone other than their author. But the books did sound interesting, and I looked forward to them – especially the one on Royce. In recent years, whenever West released an album of vocal stylings or appeared in a sequel to The Matrix, I would think, “Maybe he’s finally gotten that out of his system and will go back to work on The Spirit of Modern Philosophy.” (Royce was stressing the importance of Hegel's Phenomenology back when Kojève was just a gleam in his daddy's eye.)
I have been following West since the early 1980s, when his papers were appearing in journals such as Social Text, Boundary 2, and Cultural Critique, as well as the occasional issue of The Village Voice. His first three monographs were interesting if not definitive. More appealing in a lot of ways are the two volumes called Beyond Eurocentrism and Multiculturalism, published by Common Courage in 1993, which I have turned to a few times over the years for a shot of energy; the lectures and essays reprinted there are West at his best, shifting between theoretical and vernacular vocabularies in a way that suggests a fusion of Dialectic of Enlightenment and Democratic Vistas by way of Run DMC.
Cornel West’s work was once bold, challenging, exciting. The past tense here is unavoidable. His critical edge and creative powers might yet be reborn (he is 56). But in the wake of his latest book, Brother West: Living and Loving Out Loud, this hope requires a considerable leap of faith. Published by Hay House, the book also bears a second subtitle: “A Memoir.” It is the most disappointing thing I have read in at least a year.
This is not the intellectual autobiography West promised a decade ago. In essence it is a fawning celebrity profile -- one in which reporter and superstar have somehow fused into a single first-person voice. And in fact that turns out to be quite literally true. In the final pages, West pays tribute to David Ritz, his collaborator, who has undertaken similar projects with Marvin Gaye and Grandmaster Flash, among others.
“David Ritz and I have worked together to sculpt a voice that I hear as my own,” explains West, or someone trying to sound like him. “Many of my other books were written in what I consider an ‘academic voice.’ Brother West is rendered in a ‘conversational’ voice.”
In this respect, of course, the Class of 1943 University Professor in the Center for African American Studies at Princeton University is following the lead of David Hume – who, after writing A Treatise of Human Nature, published numerous very popular essays with the help of a writer from Entertainment Weekly.
The problem, to be clear, is not that this is meant to be a popular book, or even that West himself could not be bothered to write it. Brother West offers much evidence that amour propre and self-knowledge are not the same thing. One tends to be in conflict with the other. A memoir will often show traces of the struggle between them.
Not so here. That battle is plainly over. Self-knowledge has been taken hostage, and amour propre curdled into self-infatuation.
One whole page at the start of the book reads as follows:
I’m a bluesman in the life of the mind, and a jazzman in the world of ideas. -- Cornel West
It will not be the reader’s last encounter with this sentiment. West repeats it at least a few dozen more times -- never with any variation or development. (Clearly this is minimalist jazz: West plays one note, then goes up half a step, then back again.) The rich history of writing by African-American intellectuals -- the essays by Richard Wright, Ralph Ellison, and Amiri Baraka, to make the list no longer than that -- has left no discernible trace on this book. Some of West's own work from the 1980s suggests he has thoughts on that tradition, as well as the capacity to contribute to it. But here we are just reminded every so often that he likes to think of himself as a performer. This is not enlightening.
The broad outlines of West's life are interesting enough. His family lived in California, along the edge between the ghetto and the lower middle class. As a teenager in the 1960s he had one foot planted in the church and the other at Black Panther Party headquarters. His academic career started with getting his B.A. from Harvard in three years, then picked up speed. He has had bestsellers. His love life sounds complicated enough to merit an HBO mini-series.
But all of this is just penciled in. There is seldom much detail and never any depth. West makes a few references to academic mentors. He notes his intense interest in various philosophers or authors. Yet there is never a sustained effort to grapple with them as influences on his life and thinking. He mentions his own scholarly books on Marxism and pragmatism (for some odd reason forgetting that he also published one on African-American theology) but does not describe the process of thinking and writing that went into them.
That is not to say that Brother West fails to discuss authorship at all. You catch glimpses of its joys as rendered in the clunky prose of his collaborator: "I like seeing Race Matters translated into Japanese, Italian, and Portuguese. I like seeing The American Evasion of Philosophy translated into Chinese, Spanish, and Italian. I like that there are hundreds of thousands of copies of my book Democracy Matters translated into Spanish. There’s also an edition that’s selling in the French-speaking world. I like the fact that all nineteen of my books are still in print with the exception of the two that won the American Book Award in 1993.”
If sketchy in other regards, Brother West is never anything but expansive on how Cornel West feels about Cornel West. He is deeply committed to his committed-ness, and passionately passionate about being full of passion. Various works of art, literature, music, and philosophy remind West of himself. He finds Augustinian humility to be deeply meaningful. This is mentioned in one sentence. His taste for three-piece suits is full of subtle implications that require a couple of substantial paragraphs to elucidate.
As mentioned, his romantic life sounds complicated. Brother West is a reminder of Samuel Johnson’s description of remarriage as the triumph of hope over experience. One paragraph of musings following his third divorce obliged me to put the book down and think about things for a long while. Here it is:
“The basic problem with my love relationships with women is that my standards are so high -- and they apply equally to both of us. I seek full-blast mutual intensity, fully fledged mutual acceptance, full-blown mutual flourishing, and fully felt peace and joy with each other. This requires a level of physical attraction, personal adoration, and moral admiration that is hard to find. And it shares a depth of trust and openness for a genuine soul-sharing with a mutual respect for a calling to each other and to others. Does such a woman exist for me? Only God knows and I eagerly await this divine unfolding. Like Heathcliff and Catherine’s relationship in Emily Bronte’s remarkable novel Wuthering Heights or Franz Schubert’s tempestuous piano Sonata No. 21 in B flat (D.960) I will not let life or death stand in the way of this sublime and funky love that I crave!”
No doubt this is meant to be inspirational. It is at any rate exemplary. Rendered more or less speechless, I pointed the passage out to my wife.
She looked it over and said, “Any woman who reads this needs to run in the opposite direction when she sees him coming.”
Returning to the book, I found, just a few pages later, that West was getting divorced for a fourth time. Seldom does reader response yield results that prove so empirically verifiable.
The longest episode narrated in Brother West is its account of the conflict with Larry Summers, then president of Harvard University, starting in October 2001. West reports that Summers began their now-legendary meeting by indicating that they should join forces against the neoconservative Harvard prof Harvey Mansfield.
“Help me f___ him up,” said Summers (according to West, as rendered by his quasi-ghostwriter).
West had recently released his first hip hop CD, so perhaps Summers thought this would put him at ease. Not so. West says he made clear to Summers that his feeling for Mansfield was collegial.
With popping a cap in a fellow faculty member’s ass now off the table, the exchange took the form that has since become famous, culminating in Summers’ demand that West make himself available for fortnightly meetings to evaluate his grades and publication plans.
“If you think that I’m going to trot in here every two weeks to be monitored like a miscreant graduate student,” West says he said, “I’m afraid, my brother, that you’ve messed with the wrong brother.”
As the conflict continued to escalate -- ultimately leading to West’s departure for Princeton -- he was diagnosed with prostate cancer. He has been in treatment and is on the mend. Meanwhile, alas, West has not published a single one of the books he said he was working on 10 years ago. The last one before Brother West was a collection of inspirational passages that is usually shelved in the self-help section.
I would much prefer to think that all of this is a matter of his life being in turmoil throughout this decade, rather than Larry Summers being right about anything. But the painful truth is that West's work has grown ever less substantial over time. He has gone from being a public intellectual to being a mere celebrity -- someone well-known for being well-known. Brother West marks the extremity of that process.
Legend has it that the blues guitarist Robert Johnson acquired his haunting style by selling his soul to the devil at a crossroads. West, as a “bluesman in the life of the mind,” has clearly also been to the crossroads. The devil gave him a team of publicists. I don't think this was a good bargain on West's part. It left him unable to recognize that self-respect is often the enemy of self-esteem.
But where there’s life, there’s hope. West might eventually tear up the contract. Perhaps the professorial bluesman should take his own trope seriously and undergo a long period of what jazz musicians call "woodshedding."
The woodshed is where you retire with your instrument. You practice and practice -- and then you practice some more -- and eventually something happens. You reconnect with the instrument. Your fingers shape the sound in a fresh way. In the woodshed you don’t think about the audience, because there isn’t one, apart from the crickets and termites, who don’t much care and aren’t going to be impressed in any case.
It is clearly time for Cornel West to take himself to the woodshed -- and not for a weekend either. He needs to perform for the crickets for a good long while, until he finds something new and meaningful to play. His greatest gift to the public and to himself might be to ignore both for as long as possible.
Last week this column offered an assessment, not exactly glowing, of Cornel West’s latest book, an autobiography of sorts. The review failed to say anything about race. That will sound like a paradox to some people but it is not meant to be. A commentary on a book by a prominent black academic is not necessarily a commentary on race in America. (Still less need it be an editorial on the state of African-American studies, which some people took my review to be.) Race matters, but it is not all that matters. The quality of the book as such also counts for something.
Nor did the column discuss West’s politics, which are more or less my own. It is not that I am uninterested in either race or politics, but they were not the focus of the piece. Rather, West’s book was. But people seem to want to discuss race and politics -- so let’s.
It is impossible to make any point so clearly that some portion of the audience will not grow wildly indignant because they think you are saying the exact opposite. You may repeat the point in various ways, hoping that the message will prevail. (Redundancy within a signal is what distinguishes it from the noise in a channel.) But there are limits to just how much good this will do. The will to misunderstand seems to be invincible, and some people enjoy indignation very much.
It would have been difficult to make any more explicit than I did that my disappointment with Brother West – and, yes, anger at West for perpetrating something comparable to the “book” that Sarah Palin “wrote” – followed from a deep admiration for his best work. It is a bad thing to see proof of someone’s talent and intellect being siphoned off by the entertainment industry. The corruption of the best is the worst. Such was the feeling my review expressed, more than once, and in more than one way.
Some readers did get the message. A few suggest that it matched their own impressions. “I read The American Evasion of Philosophy at the beginning of my graduate studies,” one person told me, “and it changed my life, but now I look at the excerpt from this new book at MSNBC and it makes me feel despair.” Considering that Brother West was published by a prominent vendor of inspirational and self-help books, this seems like a bad sign on a number of levels.
But to a significant layer of the public, West is not someone whose actual work, as such, means anything at all. They celebrate him, or loathe him, but that does not mean they have ever given any part of his work five minutes of thought. For them, West is neither an intellectual nor even an individual. He is a synecdoche. He stands for black academics in general, or hip hop, or the history of affirmative action, or the entire history of African-American writing beginning with Phillis Wheatley. Or something.
This figure is an avatar in the video game of the culture wars. Depending on how the player has adjusted the settings, the character is (a) the relentless and noble freedom fighter whose every move on screen strikes a blow for human liberation or (b) Al Sharpton plus Cliff’s Notes.
Now, my own estimate of Cornel West shares nothing with either of those attitudes – nor do I have much time for video games, actual or metaphorical. But that did not keep lots of people from trying to enlist me in the fantasy.
One especially feverish player announced that the column was so racist that it had only just stopped short of sending West a box of fried chicken. It would be patently impossible to demonstrate this from anything the column actually said. But the statement, while lacking any correspondence to reality, could be regarded as at least coherent on its own terms, once the premises were unpacked.
The most important premise being that no white writer can say anything critical about a black writer without having vile and probably violent motives. This axiom is typically nested, in turn, within an assumption that American life is best understood as having two distinct cultural complexes. One is coded white and the other black. They are accessible via distinct (and well-guarded) entrances, and obey incommensurable zoning codes. You are supposed to stay in your proper matrix.
A system of internal colonies, between which a spirit of mutual disinterest prevails, is not my idea of a good society. As a basis for cultural criticism, “separate but equal” is not that appealing a principle. Nor do I feel deeply accountable to any “tolerance” found choking on its own stifled aggression. My sense of life owes a lot to the work of C.L.R. James, who thought that the multiracial crew of the Pequod was what made Moby Dick such a touchstone for understanding American possibilities.
We should leave racial essentialism to the stand-up comedians who finesse it best. This is a hybrid culture. That is perhaps the one good thing you can say about it. I don’t intend to give that up.
In any case, the notion that a white critic has no business assessing a black writer begs an important question. (And not just, "Does that apply vice versa too?")
One of the decisive early influences on my own writing was Anatole Broyard. For many years he was a critic at The New York Times; he died in 1990. Some time before that, I read a collection of his pieces called Aroused By Books, and looking it over again recently, it seems clear that my response was to steal everything about his style and method that wasn’t nailed down. A few years ago, the public learned that Broyard had taken considerable pains to conceal the fact that he came from an African-American family.
By the logic of old-fashioned, real-deal, no-doubt-about-it white supremacy, anybody who looks white but has a “single drop” of “black blood” is actually black. This is binary thinking gone berserk. But it creates a problem for anyone who wants to insist on criticizing the critic for wandering into someone else’s ethnic enclave.
To bring this down to the matter at hand: How do you know I am white? How, indeed, do I? This society tells me that I am. But then, this society tells me plenty of things that serve its own interests – usually in ways it wouldn’t want questioned too closely. Perhaps obsession with patrolling the perimeters of our gated communities is not a good thing. I’m just putting that out there.
In any case, I want to make clear that there is no way I would ever send Cornel West a box of fried chicken. If we’re going to indulge in identity politics, let me just mention that I come from a Southern working-class family. If I had a box of fried chicken, I would eat it myself. Cornel West earns more in a weekend of public speaking than I do from a year of writing. Let him buy his own food.
As to politics.... It is said that American universities are under the control of tenured radicals trying to continue the revolution by other means. This is constantly repeated but it is utter nonsense. A thin layer of such people do exist, but their power is limited. The prevailing culture of the institution seems far more responsive to the spirit of corporate governance than to any belief that “democracy is in the streets.”
On that score, my admiration for Cornel West remains very much alive. People criticizing him as a “typical” leftist professor could not be more mistaken. For many soi-disant radical academics, the policing of one another’s verbal behavior is as close to activism as they will ever get. By contrast, Brother West gives intriguing glimpses of his involvement with the Black Panther Party, the Social Text collective, and the Democratic Socialists of America. Although the book does not mention it, he also contributed to the socialist journal New Politics in the 1980s, when it was just being revived after a long period of suspension. As a member of the current New Politics editorial board, I want to express thanks to him for that, and hope he got the copies sent as payment.
Cornel West's heart is in the right place. You can tell that it beats harder when there is a movement towards justice. He walks the walk. This cannot be taken for granted. But here, again, Brother West proves so terribly disappointing. It conveys nothing, absolutely nothing, of what it is like to work in a movement. Politics is not just exhortation. The ability to make a fiery speech is part of being an activist; that is true. But so is assessing your experience as part of a group of people trying to work together. Not a bit of that comes through in his writing.
“From each according to his abilities,” as the old spiritual says, “to each according to his needs.” The professor’s needs are being well met. It is how he is using his abilities that is in question. This is called taking someone seriously.
My review did so – but at the cost of violating certain norms of etiquette. This was explained to me, after the fact, by a tenured professor who is an admirer of West. A cardinal if unwritten rule of the academic world, it seems, is that one must never go on the record with a pointed criticism of anyone prominent or influential. This was not so much a moral principle as the wisdom required to survive in the marketplace. After all, they might retaliate.
Well, so much for speaking truth to power. You can’t please everyone. It would be pretty craven to try.
This column now approaches its fifth anniversary. At the risk of succumbing to the contagious influence of Brother West, I will say that it has had a mission. It has engaged with hundreds of books and authors with the simple intention of trying to communicate and assess their essences for as wide an audience as cares to pay attention – using a variety of formats and tones, and employing whatever degree of vigor, or earnestness, or broadness of humor, or allusive riffing, or explicit citation, or double-encrypted irony, as may seem necessary and appropriate at any given moment.
“Will it ever stop?” as the poet so memorably puts it,
Yo I don’t know. Turn off the lights, and I’ll glow. To the extreme, I rock the mic like a vandal, Light up the stage and wax a chump like a candle.
It is not only people living on islands who count as insular -- etymology notwithstanding. Consider a recent piece by New York Times columnist David Brooks, whose usual shtick might be called “Thorstein Veblen for Dummies.” Using the disaster in Haiti following the earthquake earlier this month as his peg, Brooks diagnosed the country’s poverty and imploding civil society as byproducts of Voodoo – which, with its magical worldview, discourages rational calculation and planning.
Evidently the pundit is growing ambitious; he has graduated to Max Weber for Dummies. The thesis makes perfect sense, as long as you ignore as much economic and political history as possible.
After enslaved people of African descent liberated themselves during the Haitian revolution of the 1790s (creating an independent state, at enormous cost of life) they were forced to pay reparations to France, which had fought a war to resubjugate the island, but lost. The price of diplomatic recognition was not cheap; by one estimate, France demanded the equivalent of $21 billion in today’s currency. Haiti continued to pay it well into the middle of the 20th century. The resources of a poor country were transferred, decade after decade, to a rich country. Was this more rational than a belief in zombies? Would it not tend to foster a belief that the world is governed by capricious forces that must be placated?
The response of sundry blowhards to the news from Haiti is only partly the result of unabashed ignorance, of course. Moral callousness is also a factor. But even among people feeling empathy and a sense of responsibility to help there is often a blind spot with regard to the Caribbean – an underestimation of its place in the history of Atlantic societies, its role in connecting Europe, Africa, and the Americas. This was one of the points tirelessly emphasized by the late C.L.R. James, the Trinidadian historian and political theorist, whose classic book The Black Jacobins: Toussaint Louverture and the San Domingo Revolution (1938), is being rediscovered now. Perhaps it is not too late to grasp the Caribbean as a crucial part of the process shaping global society.
For some more up-to-date reflections on the region at this moment of crisis, I got in touch with Nicholas Laughlin, editor of The Caribbean Review of Books, who lives near James’s old hometown of Port of Spain, Trinidad. In addition to transforming CRB from a quarterly print magazine to an online journal, he is co-editor of the poetry magazine Town and an administrator of Alice Yard, a small contemporary arts space in Port of Spain.
Laughlin is the editor of Letters from London by C.L.R. James (University Press of New England, 2003) and of the revised, expanded, and re-annotated edition of Letters Between a Father and Son by V.S. Naipaul (Pan Macmillan, 2009). I reviewed the earlier book some years ago, and have had the occasional brief dialogue with him by e-mail in the meantime. Following the events of the past two weeks, we had a much more substantial discussion – one touching on the history and politics of the Caribbean, and how its cultural institutions (academic and otherwise) fit into the “economy of attention” of the 21st century.
A transcript of that exchange follows. Some of Laughlin’s spelling has been Americanized, for I have yielded to the cultural imperialism of WordPerfect.
Q: You know how the disaster in Haiti is being discussed by the mass media here. What can you say about how it is being framed within the Caribbean?
A: The Caribbean is so various, it's hard to generalize. I don't really know how recent events in Haiti are being framed in the Hispano- and Francophone Caribbean. Within the Anglophone Caribbean, responses vary from country to country or island to island. In the Bahamas – just north of Haiti, where there are significant numbers of Haitian immigrants – there's been concern about being swamped by refugees. My colleague Nicolette Bethel – anthropologist, playwright, theatre director, and editor of the online literary journal tongues of the ocean – has criticized the way the earthquake has been reported in the Bahamian press, even as many Bahamians have thrown themselves into organizing relief efforts.
In Trinidad, on the other hand, at the opposite end of the Caribbean, there's been some anger about the way the government has responded, i.e. with what looks to many of us like faint concern. It took our prime minister nearly a full day to make any kind of statement about the earthquake, in the form of off-the-cuff remarks to the press.
Of course many of us in the Caribbean have CNN, the BBC, and the U.S. networks on cable, and read the international papers online. Where there's been local reporting, it's mostly focused on local angles – the Jamaican press gave lots of coverage to the visit their prime minister made to Haiti last week, and here in Trinidad there have been several stories on Trinidadians who happened to be in Haiti during the earthquake.
As with the media anywhere, the media here "like" stories of chaos and mayhem. So there's been ample coverage, via wire service stories, of looting, machete-wielding gangs, street violence etc., even though there are many people on the ground in Haiti who say that violent incidents have been rare and very localized, and there has been extraordinary cooperation among displaced Haitians – and the whole issue of "looting" needs serious deconstruction.
Q: Yes, it’s quite similar to how the coverage of the disaster in New Orleans unfolded in the American media just after Katrina. Do you notice anything distinctive about the Caribbean discussion of the crisis now?
A: I think there is wider awareness in the Caribbean (than in the U.S., say) of some of the historical circumstances that contributed to the present crisis – crippling and unjust debt, meddling by foreign powers, and so on. I'm pretty sure that the Haitian Revolution and its aftermath, including the "reparations" payments to the French (by now we can all quote the amounts by heart) are on the secondary schools history syllabus in Trinidad.
I've read a few incisive op-ed pieces in the local press – by the historian Hilary Beckles in Barbados, the activist Raffique Shah in Trinidad and Tobago, the literary scholar Carolyn Cooper in Jamaica, and I'm sure I've missed others – that explain Haiti's history of being bullied by wealthier countries with much bigger armies. Certainly among what you might call the Caribbean's intellectual elite, for want of a better term, there's a definite sense of the wider Caribbean's moral debt to Haiti. We've almost all read at least some of The Black Jacobins.
Still, the rhetoric of "failure," the idea that Haiti is "unlucky," has a foothold in the discourse. I got into a sort of argument the other day, on Facebook, with a friend who was riffing off the Senegalese president's offer to "repatriate" Haitians. This friend suggested, no doubt as a kind of deliberately absurd thought experiment, that the "seemingly interminable problem" of Haiti would be solved by permanently evacuating the whole country – resettling all nine or ten million people elsewhere. Even among well-educated and well-meaning people, the idea of Haiti as a "problem" is entrenched.
But another friend who entered the conversation said something that struck and moved me. She said she was appalled by her own attitudes towards Haiti – meaning, I think, that the horrors of the earthquake and its aftermath had forced her to confront her own unconscious prejudices and ignorance. I feel the same, and there seems to be a wider sense here that Caribbean citizens must take some blame for Haiti's troubles in recent decades. We haven't been interested enough, haven't pressured our politicians enough, haven't bothered to try to understand. Many friends and colleagues seem to share an awareness that real recovery for Haiti means meaningful involvement by Caribbean citizens, and a still-unfocused resolve to be a part of that. I hope we stick to our guns.
Q: The possibility of pan-Caribbean citizenship is familiar from C.L.R. James’s writings. It was something he saw as necessary and urgent. But it sounds like there hasn’t been much progress on that front in the two decades since his death. Why is that?
A: Individual Caribbean countries have much in common, of course, but there are real knowledge gaps separating us and very real prejudices behind the facade of solidarity that we generally like to put forward. At the best of times there are strong prejudices against the region's less wealthy nations. In the southern Caribbean, that means Guyana, and Haiti seems to fill that role further north. If some Bahamians are worried about being overrun by Haitians, some Barbadians feel the same way about Guyanese, and Trinidadians have long been suspicious of "small islanders" wanting to settle here.
It's more acute when it comes to Haiti – not only is it a very "poor" and "undeveloped" country, but it has a reputation for violence, HIV, and voodoo. Never mind that violent crime and HIV infection rates are rising everywhere in the region, and every Caribbean territory has one or more versions of a syncretic religion combining elements of belief and practice from West Africa, Christianity, and sometimes other traditions.
Q: As editor of The Caribbean Review of Books, you are in a good position to assess the literary and intellectual traffic within the region, and between the Caribbean and the rest of the Atlantic. Would you say something about this?
A: I often think I'd be in a better position to, as you put it, "assess the literary and intellectual traffic" if I lived not in Port of Spain but in New York or London or Toronto or even Miami. I'm also pretty sure it would be easier to publish a magazine like the CRB in one of those places. It would probably be easier to secure grant funding, and the magazine would be physically closer to a critical mass of potential readers.
One of the hot concepts in Caribbean academic circles these days is the "transnational" Caribbean. It can mean different things. The positive spin is the notion of the Caribbean not as a physical region but as a cultural or social phenomenon – a space, not a place – that includes the major Caribbean populations in metropolitan centers like the ones I listed above. So we can claim Brooklyn and Brixton as "ours," and we quote the late Jamaican poet Louise Bennett about "colonizing in reverse."
On the one hand, this notion of the transnational is simply descriptive. There are millions of Caribbean immigrants and their immediate descendants in the U.S., Canada, the U.K. and elsewhere, and there is certainly a sense in which they continue to be Caribbean, participate in conversations about Caribbean society, contribute to Caribbean economies (through remittances), etc. On the other hand, and especially from the point of view of someone who actually does live here, it sometimes seems like wishful thinking, or at least like an attempt to put a hopeful face on a situation that is not so hopeful. Colonizing in reverse, or brain drain?
The last census in Guyana suggested some shockingly high percentage of Guyanese of my generation with a secondary education now live abroad – over 80 percent. Anecdotally, I can say that about half of my graduating class at secondary school (one of Trinidad's "elite" schools) is now abroad.
This isn't a new phenomenon, of course. Going abroad for education or to expand intellectual possibilities has been part of the standard narrative of Caribbean intellectual life at least as far back as 1932, when James left Trinidad. It's widely held that West Indian literature suddenly sprang into existence in the 1950s when various aspiring writers from different British West Indian territories went to London, discovered common cultural elements, and found an audience for their work via the BBC's Caribbean Voices program and postwar publishers with a taste for exotica from the colonies. (Though that narrative is now disputed by some younger Caribbean lit scholars working on earlier writers and texts.)
There was a moment in the late '60s when it seemed the center of intellectual gravity might shift back to the Caribbean itself, but it didn't take long for post-Independence disillusion to set in. It's very moving but also puzzling for me to read a book like Andrew Salkey's Georgetown Journal (1972), set just at that moment when Independence optimism was beginning to tremble.
Q: I asked about this without thinking about how the cultural history would overlap with your own personal experience. Would you say a little more about that?
A: I came of age in the 1980s, which with adult hindsight I can see was a very pessimistic time for Caribbean people of my parents' generation, but I remember as a schoolchild thinking that people who "went away to live" were specially lucky, even if it was an eventuality I couldn't imagine for myself. Had I gone to university abroad, it's likely I wouldn't have come back to Trinidad, not to live. I still can't decide whether that would have been a better thing.
Having reached my mid-30s, having never lived anywhere else, I'm now fairly certain I'll stay here. But that's something I still think about often – almost every time I travel to the U.S. or Britain, I spend a good chunk of my time trying to imagine an alternative life there. I think that for many Caribbean people of my generation and approximate background – middle class, relatively well-educated – the question of going or staying remains acute.
Sitting here in Diego Martin, west of Port of Spain, it seems to me that in 2010 the literary and intellectual traffic within the Caribbean – and between the region and North America and Europe – is still directed mainly by agents physically located outside the Caribbean itself. Most of our intellectuals and writers are elsewhere. Almost all our books are published elsewhere. There are only two publishers of consequence in the Anglophone Caribbean – both based in Jamaica, both quite small. The main intellectual journal of the Anglophone Caribbean, Small Axe, is based in New York.
Most serious contemporary Caribbean artists either live abroad or depend heavily on financial support from abroad via grants, residencies, etc. Many if not most intellectual or cultural initiatives in the Caribbean similarly depend on financial support from abroad. What's kept the CRB going in the past couple years is a grant from the Prince Claus Fund in the Netherlands. And the audiences for all of these are also now, in the main, in North America and Europe. In the money economy as well as the attention economy, we still depend on investment and, frankly, charity from elsewhere.
Q: What role does academe play in all of this?
A: It's true that indigenous institutions like the University of the West Indies do a great deal to promote conversations between separate territories, and the three campuses are relatively important centers of activity. But the university faculty are vastly outnumbered by Caribbean scholars in the academy abroad who are inevitably better funded and better positioned to insert themselves into essential debates in their disciplines.
It's terribly revealing that the theme of the Caribbean Studies Association's 2009 conference, held in Jamaica, was "Centering the Caribbean in Caribbean Studies." You can read the phrase in more ways than one, but my interpretation is: “bringing the Caribbean back to the center of Caribbean studies.” Well, where else was it?
I don't mean to set up a binary opposition between here and there, local and diaspora, us and them, because of course the reality is far more complex. There is conversation and exchange and movement between all these nodes, and they are often fruitful. But aspects of the situation are depressing. For the better part of five centuries the Caribbean was devoted to producing raw materials to enrich already wealthy countries further north. Now sometimes it feels like we're producing cultural raw materials to be turned into books, films, lectures, etc. by intellectual agents in New York or London or Toronto.
Q: Is there a silver lining to contemporary developments?
A: When I (infrequently) attend academic conferences or meetings in the Caribbean, like a stuck record I implore the assembled scholars to make more strategic use of the web to share their research, to make it more widely available. I remind them that many of us in the Caribbean don't have easy access to research libraries or online journal subscriptions. In the age of WordPress and Blogger, when anyone who can use a word processor can also set up a website, there's no excuse.
One of the interesting and encouraging developments in the Trinidad art scene in the past year or so has been the rapid flourishing of artists' blogs. The writers will follow close behind, I hope. There is no serious engagement with visual art in the press here – no real reviewing, no professional critics, and commercial galleries are generally highly conservative and mercenary. So younger artists are increasingly creating work to share online, and using their blogs and websites to document their practice and comment on the work of their peers. It's early yet, and if there's a conversation going on, it's still happening within a small circle, but the odd international curator has peeked or poked in.
Some of my friends and colleagues in the art scene here have been energized and encouraged by this development. It may fizzle out, or these small individual initiatives may coalesce. In the past year or two, I've been more involved in, and paid more attention to, the Caribbean visual art scene than to the literary scene – partly because that's where the energy seems to be, partly because Caribbean visual images seem to be doing better than Caribbean literary texts in the economy of attention.
Q: That seems like a useful expression – “the economy of attention.” Clearly it is bound up, in all sorts of complicated ways, with economics in the more familiar sense. But it’s also political....
A: At the moment everything going through my head is colored by the fact of Haiti. Who gets to decide what help Haiti needs and how to rebuild? I'm not sure Haitians will. Who gets to decide what contemporary Caribbean literature is? Publishers in New York and London and literary scholars in American, British, and Canadian universities. Those two questions aren't comparable in degree, but are bound together in a common dilemma.
Howard Zinn -- whose A People’s History of the United States, first published by Harper & Row in 1980, has sold some two million copies -- died last week at the age of 87. His passing has inspired numerous tributes to his role in bringing a radical, pacifist perspective on American history to a wide audience.
It has also provoked denunciations of Zinn as “un-American,” which seems both predictable and entirely to his credit. One of Zinn’s lessons was that protest is a deeply American inclination. The thought is unbearable in some quarters.
One of the most affectionate tributes came from the sports writer Dave Zirin. As with many other readers, he found that reading Zinn changed his whole sense of why you would even want to study the past. “When I was 17 and picked up a dog-eared copy of Zinn's book,” he writes, “I thought history was about learning that the Magna Carta was signed in 1215. I couldn't tell you what the Magna Carta was, but I knew it was signed in 1215. Howard took this history of great men in powdered wigs and turned it on its pompous head.” Zirin went on to write A People’s History of Sports (New Press, 2008), which is Zinnian down to its cells.
Another noteworthy commentary comes from Christopher Phelps, an intellectual historian now in the American and Canadian studies program at the University of Nottingham. He assesses Zinn as a kind of existentialist whose perspective was shaped by the experience of the civil rights struggle. (He had joined the movement in the 1950s as a young professor at Spelman College, a historically black institution in Atlanta.)
An existentialist sensibility -- the tendency to think in terms of radical commitment, of decision making as a matter of courage in the face of the Absurd -- was common to activists of his generation. That Phelps can hear the lingering accent in Zinn’s later work is evidence of a good ear.
Zinn “challenged national pieties and encouraged critical reflection on received wisdom,” writes Phelps. “He understood that America’s various radicalisms, far from being ‘un-American,’ have propelled the nation toward more humane and democratic arrangements.... He urged others to seek in the past the inspiration to dispel resignation, demoralization, and deference, the foundations of inertia. The past meant nothing, he argued, if severed from present and future.”
Others have claimed that Zinn did not sufficiently denounce Stalinism and its ilk. The earliest example of the complaint that I know came in a review of People’s History that appeared in The American Scholar in 1980, when that magazine was a cultural suburb of the neoconservative movement. The charge has been recycled since Zinn’s death.
This is thrifty. It is also intellectually dishonest. For what is most offensive about Zinn (to those who find him so) is that he held both the United States and the Soviet Union to the same standard. He even dared to suggest that they were in the grip of a similar dynamic.
“Expansionism,” he wrote in an essay from 1970, “with its accompanying excuses, seems to be a constant characteristic of the nation-state, whether liberal or conservative, socialist or capitalist. I am not trying to argue that the liberal-democratic state is especially culpable, only that it is not less so than other nations. Russian expansionism into Eastern Europe, the Chinese moving into Tibet and battling with India over border territories -- seem as belligerent as the pushings of that earlier revolutionary upstart, the United States.... Socialism and liberalism both have advantages over feudal monarchies in their ability to throw a benign light over vicious actions.”
Given certain cretinizing trends in recent American political discourse, it bears stressing that Zinn here uses “liberalism” and “socialism” as antonyms. A liberal supports individual rights in a market economy. By any rigorous definition, Sarah Palin is a liberal. And so, of course, is Barack Obama, who can only be called a “socialist” by an abuse of language. (But such abuse is an industry now, and I feel like Sisyphus just for complaining about it.)
The most substantial critique of A People’s History remains the review by Michael Kazin that appeared in Dissent in 2004. Kazin’s polemic seems to me too stringent by half. Zinn's book is not offered as the last word on the history of the United States, but as a corrective to dominant trends. It is meant to be part of an education, rather than the totality of it.
But Kazin does make points sometimes acknowledged even by the book’s admirers: “Zinn reduces the past to a Manichean fable and makes no serious attempt to address the biggest question a leftist can ask about U.S. history: why have most Americans accepted the legitimacy of the capitalist republic in which they live?”
That is indeed the elephant in the room. Coercion has certainly been a factor in preserving the established order, but persuasion and consent have usually played the greater part. Any American leftist who came of age after Antonio Gramsci’s work began to be assimilated is bound to consider hegemony a starting point for discussion, rather than an afterthought.
But Zinn was the product of an earlier moment -- one for which the stark question of commitment had priority. A strategic map of the political landscape was less urgent than knowing that you stood at a crossroads. You either joined the civil rights struggle or you didn’t. You were fighting against nuclear proliferation or the Vietnam War, or you were going along with them. It is possible to avoid recognizing such alternatives -- though you do end up making the choice between them, one way or the other.
There were subtler interpretations of American history than Howard Zinn’s. Anyone whose understanding of the past begins and ends with it has mistaken taking a vitamin for consuming a meal. But that does not make it worthless. The appreciation of complexity is a virtue, but there are times when a moment of clarity is worth something, too.
Certain research topics seem destined to inspire the question, “Seriously, you study that?” So it is with the field of Twitter scholarship. Which -- just to get this out of the way -- is not actually published in 140 characters or less. (The average “tweet” is the equivalent of two fairly terse sentences. It is like haiku, only more self-involved.)
The Library of Congress announced in April that it was acquiring the complete digital archives of the “microblogging” service, beginning with the very first tweet, from ancient times. At present, the Twitter archive consists of 5 terabytes of data. If all of the printed holdings of the LC were digitized, it would come to 10 to 20 terabytes (this figure does not include manuscripts, photographs, films, or audio recordings).
Some 50 million new messages are sent on Twitter each day, although one recent discussion at the LC suggested that the rate is much higher -- at least when the site is not shutting down from sheer traffic volume, which seems to be happening a lot lately. A new video on YouTube shows a few seconds from the "garden hose" of incoming Twitter content.
When word of this acquisition was posted to the Library of Congress news blog two months ago, it elicited comment by people who could not believe that anything so casual and hyper-ephemeral as the average tweet was worth preserving for posterity – let alone analyzing. Thanks to the Twitter archive, historians will know that someone ate a sandwich. Why would they care?
Other citizens became agitated at the thought that “private” communications posted to Twitter were being stored and made available to a vast public. Which really does seem rather unclear on the concept. I’m as prone to dire mutterings about the panopticon as anybody -- but come on, folks. The era of digital media reinforces the basic principle that privacy is at least in part a matter of impulse control. Keeping something to yourself is not compatible with posting it to a public forum. Evidently this is not as obvious as it should be. Things you send directly to friends on Twitter won't be part of the Library's holdings, but if you celebrated a hook-up by announcing it to all and sundry, it now belongs to the ages.
A working group of librarians is figuring out how to “process” this material (to adapt the lingo we used when I worked as an archival technician in the Library's manuscript division) before making the collection available to researchers. But it’s not as if scholars have been waiting around until the collection is ready. Public response to the notion of “Twitter studies” might be incredulous, but the existing literature gives you some idea of what can be done with this giant pulsing mass of random discursive particles.
A reading of the scholarship suggests that individual tweets, as such, are not the focus of very much attention. I suppose the presidential papers of Barack Obama will one day include an annotated edition of postings to his Twitter feed. But that is the exception and not the rule.
Instead, the research, so far, tends to fall into two broad categories. One body focuses on the properties of Twitter as a medium. (Or, what amounts to a variation on the same thing, as one part of an emerging new-media ecosystem.) The other approach involves analyzing gigantic masses of Twitter data to find evidence concerning public opinion or mood.
Before giving a thumbnail account of some of this work – which, as the bibliography I’ve consulted suggests, seems intrinsically interdisciplinary – it may be worth pointing out something mildly paradoxical: the very qualities that make Twitter seem unworthy of study are precisely what render it potentially quite interesting. The spontaneity and impulsiveness of expression it encourages, and the fact that millions of people use it to communicate in ways that often blur the distinction between public and private space, mean that Twitter has generated an almost real-time documentary record of ordinary existence over the past four years.
There may be some value to developing tools for understanding ordinary existence. It is, after all, where we spend most of our time.
Twitter shares properties found in numerous other new-media formats. The term “information stream” is sometimes used to characterize digital communication, of whatever sort. Inside Higher Ed “flows” at the rate of a certain number of articles per day during the workweek. An online scholarly journal, by contrast, will probably trickle. A television network’s website -- or the more manic sort of Twitter feed -- will tend to gush. But the “streaming” principle is the same in any case, and you never step into the same river twice.
A recent paper by Mor Naaman and others from the School of Communication and Information at Rutgers University uses a significant variation on this concept, the “social awareness stream,” to label Twitter and Facebook, among other formats. Social awareness streams, according to Naaman et al., “are typified by three factors distinguishing them from other communication: a) the public (or personal-public) nature of the communication and conversation; b) the brevity of posted content; and, c) a highly connected social space, where most of the information consumption is enabled and driven by articulated online contact networks.”
Understanding those “articulated online contact networks” involves, for one thing, mapping them. And such mapping efforts have been underway since well before Twitter came on the scene. What makes the Twitter “stream” particularly interesting is that – unlike Facebook and other social-network services -- the design of the service permits both reciprocal connections (person A “follows” person B, and vice versa) and one-sided (A follows B, but that’s it). This makes for both strong and weak communicative bonds within networks -- but also among them. And various conventions have emerged to allow Twitter users to signal one another or to urge attention to a particular topic or comment. Besides “retweeting” someone’s message, you can address a particular person (using the @ symbol, like so: @JohnDoe) or index a message by topic (noted with the hashtag, thusly: #topicdujour).
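These signaling conventions are simple enough to extract mechanically, which is part of what makes Twitter data tractable for researchers. A minimal sketch in Python -- the regular expressions here are rough illustrative approximations, not Twitter's official tokenization rules, and the sample tweet is invented:

```python
import re

# Rough patterns for the conventions described above; real tokenization
# handles many more edge cases (punctuation, Unicode names, etc.).
MENTION = re.compile(r"@(\w+)")
HASHTAG = re.compile(r"#(\w+)")

def parse_tweet(text):
    """Return the @mentions and #hashtags found in a tweet's text."""
    return {
        "mentions": MENTION.findall(text),
        "hashtags": HASHTAG.findall(text),
    }

tweet = "RT @JohnDoe: interesting thread on new-media research #topicdujour"
parsed = parse_tweet(tweet)
print(parsed["mentions"])  # ['JohnDoe']
print(parsed["hashtags"])  # ['topicdujour']
```

Counting these markers across millions of messages is, in essence, how the network-mapping studies described below identify who is talking to whom, and about what.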
All of this is, of course, familiar enough to anyone who uses Twitter. But it has important implications for just what kind of communication system Twitter fosters. To quote the title of an impressive paper by Haewoon Kwak and three other researchers from the department of computer science at the Korea Advanced Institute of Science and Technology: “What is Twitter, a Social Network or a News Media?” (No sense protesting that “media” is not a singular noun. Best to grind one’s teeth quietly.)
Analyzing almost 42 million user profiles and 106 million tweets, Kwak and colleagues find that Twitter occupies a strange niche that combines elements of both mass media and homophilous social groups. (Homophily is defined as the tendency of people to sustain more contact with those they judge to be similar to themselves than with those whom they perceive to be dissimilar.) "Twitter shows a low level of reciprocity," they write. "77.9 percent of user pairs with any link between them are connected one-way, and only 22.1 percent have reciprocal relationships between them.... Previous studies have reported much higher reciprocity on other social networking services: 68 percent on Flickr and 84 percent on Yahoo."
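The reciprocity figure itself is straightforward to compute once the follow graph is represented as directed edges. A toy sketch of the calculation, with an invented three-user graph standing in for the 42 million profiles the study analyzed:

```python
def reciprocity(edges):
    """Given directed (follower, followed) pairs, return the fraction
    of connected user pairs whose link runs in both directions."""
    edge_set = set(edges)
    # Every unordered pair of users with at least one link between them.
    pairs = {frozenset(e) for e in edge_set if e[0] != e[1]}
    mutual = sum(
        1 for p in pairs
        if all((a, b) in edge_set for a in p for b in p if a != b)
    )
    return mutual / len(pairs)

# A and B follow each other (mutual); A follows C (one-way).
follows = [("A", "B"), ("B", "A"), ("A", "C")]
print(reciprocity(follows))  # 0.5 -- one of two connected pairs is mutual
```

On Kwak's numbers, this function applied to the real follow graph would return roughly 0.221, against 0.68 for Flickr and 0.84 for Yahoo.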
In part, this reflects the presence on Twitter of already established mass-media outlets – not to mention already-famous people who have millions of “followers” without reciprocating. But the researchers find that a system of informal but efficient “retweet trees” also functions “as communication channels of information diffusion.” Interest in a given Twitter post can rapidly spread across otherwise disconnected social networks. Kwak’s team found that any retweeted item would “reach an average of 1,000 users no matter what the number of followers is of the original tweet. Once retweeted, a tweet gets retweeted [again] almost instantly on the second, third, and fourth hops away from the source, signifying fast diffusion of information after the first retweet.”
Eventually someone will synthesize these and other analyses of Twitter’s functioning -- along with studies of other institutional and mass-media networks -- and give us some way to understand this post-McLuhanesque cultural system. In the meantime, research is being done on how to use the constant landslide of Twitter messages to gauge public attitudes and mood.
As Brendan O’Connor and his co-authors from Carnegie Mellon University note in a paper published last month, the usual method of conducting a public-opinion poll by telephone can cost tens of thousands of dollars. (Besides, lots of us hang up immediately on the suspicion that it will turn into a telemarketing call.)
Using one billion Twitter messages from 2008 and ’09 as a database, O’Connor and colleagues ran searches for keywords related to politics and the economy, then generated a “sentiment score” based on lists of 1,600 “positive” and 1,200 “negative” words. They then compared these “text sentiment” findings to the results of more traditional public opinion polls concerning consumer confidence, the election of 2008, and the new president’s approval ratings. They found sufficiently strong correlation to be encouraging -- and noted that by the summer of 2009, when many more people were on Twitter than had been the case in 2008, the text-sentiment results proved a good predictor of consumer confidence levels.
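The core of that text-sentiment method is keyword counting against opinion-word lists. A simplified sketch of the idea -- the tiny lexicons and sample tweets below are invented stand-ins for the 1,600-word and 1,200-word lists the paper used, and this version computes one aggregate ratio rather than the paper's day-by-day smoothed series:

```python
# Tiny stand-in lexicons; the actual study drew ~1,600 positive and
# ~1,200 negative words from a published subjectivity lexicon.
POSITIVE = {"good", "great", "hope", "confident"}
NEGATIVE = {"bad", "fear", "worried", "angry"}

def sentiment_ratio(tweets):
    """Ratio of messages containing a positive word to messages
    containing a negative word -- the aggregate sentiment-score idea."""
    pos = sum(1 for t in tweets if POSITIVE & set(t.lower().split()))
    neg = sum(1 for t in tweets if NEGATIVE & set(t.lower().split()))
    return pos / neg if neg else float("inf")

sample = [
    "feeling confident about the economy",
    "great jobs report today",
    "worried about my savings",
]
print(sentiment_ratio(sample))  # 2.0
```

Run daily over keyword-filtered messages (say, tweets containing "economy" or "jobs"), a ratio like this yields the time series the researchers correlated with conventional polls.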
A different methodology was used in “Modeling Public Mood and Emotion: Twitter Sentiment and Socio-Economic Phenomena” by Johan Bollen of Indiana University and two other authors. They collected all public tweets from August 1 to December 20, 2008 and harvested from them data about the content that could be plugged into “a well-established psychometric instrument, the Profile of Mood States” which “measures six individual dimensions of mood, namely Tension, Depression, Anger, Vigor, Fatigue, and Confusion.” This sounds like something from one of Woody Allen’s better movies.
The data crunching yielded “a six dimensional mood vector” covering the months in question. Which, as luck would have it, coincided with both the financial meltdown and the presidential election of 2008. The resulting graphs are intriguing.
Following the election, the negative moods (Tension, Depression, etc.) fell off. There was “a significant spike in Vigor.” Examination of samples of Twitter traffic showed “a preponderance of tweets expressing high levels of energy and positive sentiments.”
But by December 2008, as the Dow Jones Industrial Average fell to below 9000 points, the charts show a conspicuous rise in Anger -- and an even stronger one for Depression. The researchers write that this may have been an early signal of “what appears to be a populist movement in opposition to the new Obama administration.”
“Tweets may be regarded,” write Bollen and colleagues, “as microscopic instantiations of mood.” And they speculate that the microblogging system may do more than reflect shifts of public temper: “The social network of Twitter may highly affect the dynamics of public sentiment…[O]ur results are suggestive of escalating bursts of mood activity, suggesting that sentiment spreads across network ties.”
As good a reason as any to put this archive of the everyday into the time capsule. And while my perspective on this may be a little off-center, I think it is fair that the Twitter record should be stored at the Library of Congress, which also houses the papers of the American presidents up through Theodore Roosevelt.
Almost 20 years ago, I started to work there just around the corner from the bound volumes containing, among other things, the diaries of George Washington. The experience of taking a quick look at them was something like a rite of passage for people working in the manuscript division. And to judge by later conversations among colleagues, the experience was usually slightly bewildering.
You would open the volume and gaze at the very page where his hand had guided the quill. You would start to read, expecting deep thoughts, or historical-seeming ones, at any rate. And this, more or less, is what you found on every page:
"Rained today. Three goats died. Need to buy a new plow.”
He had another 85 characters to spare.
The first three seasons of "Mad Men" (the fourth begins on Sunday) were set in a world recognizable from The Hidden Persuaders, Vance Packard’s landmark work of pop sociology from 1957. Reviving the spirit of muckraking to probe the inner workings of postwar affluence, Packard reported on how the ad agencies on Madison Avenue used psychological research to boost the manipulative power of their imagery and catchphrases.
To prime the consumer market, habits and attitudes left over from the Great Depression had to be liquidated. Desire must be set free -- or at least educated into enough confidence to be assertive. Advertising meant selling not just a product but a dream. There was, for example, the famous ad campaign portraying women who found themselves in public, in interesting situations, while wearing little more than their Maidenform undergarments. The idea was to lodge the product in the potential consumer’s unconscious by associating it with a common dream situation.
But my sense is that "Mad Men" is poised to enter a new, post-Packardian phase. At the end of the third season, several characters left the established firm of Sterling Cooper and set out to create their own advertising “shop” – all of this not very long after the Kennedy assassination. Trauma seldom stalls the wheels of commerce for long. And we know, with hindsight, that American mass culture was just about to undergo a sudden, swift de-massification – the proliferation, over the next few years, of ever more sharply defined consumer niches and episodic subcultures.
Stimulating consumer desire by making an end run around the superego was no longer the name of the game. The new emphasis took a different form. It is best expressed by the term “lifestyle” -- which, as far as I can tell, was seldom used before the mid-60s, except as a piece of jargon from the Adlerian school of psychoanalytic revisionism.
Alfred Adler had coined the term to describe the functioning of the inferiority complex. (“Inferiority complex” was another Adler-ism; this was the concept that precipitated his break with Freud in the 1910s.) The neurotic, according to Adler, transformed his inferiority complex into a comprehensive structure of psychic defense – a whole pattern of life, designed to avoid its more disagreeable realities as much as possible.
Obviously “lifestyle” would acquire other meanings. But arguably that original sense is always there, below the surface. What looks like an identity or a niche has its shadow -- its underside of insecurity.
I don’t know how much Alfred Adler the creators of "Mad Men" have read. But they have certainly tuned into this dimension of its central characters.
Don and Peggy have crafted lives for themselves that express, not who they are, but who they want to be. (Or in Don’s case, who he wants to be taken to be. We’re talking double-encrypted personal inauthenticity.) They have turned feelings of inferiority and powerlessness into ambition -- rising to positions in advertising that enable them to elicit and channel those feelings in the consumer.
Pete (easily the most unlikable figure on the show) is the walking embodiment of status anxiety and a borderline sociopath. His only saving grace is that he is too ineptly Machiavellian to succeed at any scheme he might hatch. Unable to advance within the hierarchy at Sterling Cooper, he walked away to help start the new agency.
We’ve seen that he has one forward-looking idea: Pete realizes that there is an African-American market out there that advertisers could target. Nobody else at Sterling Cooper had any interest in crafting campaigns to run in Jet magazine. But any sense that his role might be “progressive” runs up against the most salient thing about him: he is a hollow man, incapable of empathy but ready to turn whichever way the wind blows.
Vance Packard portrayed Madison Avenue as a place staffed by people who were competent and lucid, if not particularly scrupulous. Packard intended The Hidden Persuaders as social criticism, but the book participated in the technocratic imagination. It assumed that advertising’s best and brightest both possessed knowledge and could apply it, steering the marketplace by remote control.
Against this, "Mad Men" has been slowly building up a counternarrative. Its first season was set in 1960 -- the final year of the Eisenhower administration. The third season closed just after an assassin’s shots ended what would, in short order, be recalled as Camelot. A few scattered references have been made to a war underway in Southeast Asia.
Trust in the foresight of technocrats is about to take a hard fall. And the center of gravity in the advertising world is about to shift from masterful “hidden persuaders” to figures who can ride the wave of cultural upheaval because they are skilled at manufacturing niches for themselves.
The characters running the new agency are not confident engineers of consumer desire but -- albeit in a special sense -- confidence artists. Not that they are swindlers. But they know how to fabricate a self and sell it to other people.
With its fourth season, "Mad Men" is on the verge of finally becoming a series about the Sixties. It is also a work of historical fiction about where consumerism came from, and what it was like. I suppose the past tense is unavoidable. Over the next decade, to judge by recent trends, people will need a leap of the imagination to remember the Golden Age of Lifestyles.