The first three seasons of "Mad Men" (the fourth begins on Sunday) were set in a world recognizable from The Hidden Persuaders, Vance Packard’s landmark work of pop sociology from 1957. Reviving the spirit of muckraking to probe the inner workings of postwar affluence, Packard reported on how the ad agencies on Madison Avenue used psychological research to boost the manipulative power of their imagery and catchphrases.
To prime the consumer market, habits and attitudes left over from the Great Depression had to be liquidated. Desire must be set free -- or at least educated into enough confidence to be assertive. Advertising meant selling not just a product but a dream. There was, for example, the famous ad campaign portraying women who found themselves in public, in interesting situations, while wearing little more than their Maidenform undergarments. The idea was to lodge the product in the potential consumer’s unconscious by associating it with a common dream situation.
But my sense is that "Mad Men" is poised to enter a new, post-Packardian phase. At the end of the third season, several characters left the established firm of Sterling Cooper and set out to create their own advertising “shop” – all of this not very long after the Kennedy assassination. Trauma seldom stalls the wheels of commerce for long. And we know, with hindsight, that American mass culture was just about to undergo a sudden, swift de-massification – the proliferation, over the next few years, of ever more sharply defined consumer niches and episodic subcultures.
Stimulating consumer desire by making an end run around the superego was no longer the name of the game. The new emphasis took a different form. It is best expressed by the term “lifestyle” -- which, as far as I can tell, was seldom used before the mid-60s, except as a piece of jargon from the Adlerian school of psychoanalytic revisionism.
Alfred Adler had coined the term to describe the functioning of the inferiority complex. (“Inferiority complex” was another Adler-ism; this was the concept that precipitated his break with Freud in the 1910s.) The neurotic, according to Adler, transformed his inferiority complex into a comprehensive structure of psychic defense – a whole pattern of life, designed to avoid its more disagreeable realities as much as possible.
Obviously “lifestyle” would acquire other meanings. But arguably that original sense is always there, below the surface. What looks like an identity or a niche has its shadow -- its underside of insecurity.
I don’t know how much Alfred Adler the creators of "Mad Men" have read. But they have certainly tuned in to this dimension of the show's central characters.
Don and Peggy have crafted lives for themselves that express, not who they are, but who they want to be. (Or in Don’s case, who he wants to be taken to be. We’re talking double-encrypted personal inauthenticity.) They have turned feelings of inferiority and powerlessness into ambition -- rising to positions in advertising that enable them to elicit and channel those feelings in the consumer.
Pete (easily the most unlikable figure on the show) is the walking embodiment of status anxiety and a borderline sociopath. His only saving grace is that he is too ineptly Machiavellian to succeed at any scheme he might hatch. Unable to advance within the hierarchy at Sterling Cooper, he walked away to help start the new agency.
We’ve seen that he has one forward-looking idea: Pete realizes that there is an African-American market out there that advertisers could target. Nobody at Sterling Cooper had any interest in crafting campaigns to run in Jet magazine. But any sense that his role might be “progressive” runs up against the most salient thing about him: he is a hollow man, incapable of empathy but ready to turn the way the wind blows.
Vance Packard portrayed Madison Avenue as a place staffed by people who were competent and lucid, if not particularly scrupulous. Packard intended The Hidden Persuaders as social criticism, but the book participated in the technocratic imagination. It assumed that advertising’s best and brightest both possessed knowledge and could apply it, steering the marketplace by remote control.
Against this, "Mad Men" has been slowly building up a counternarrative. Its first season was set in 1960 -- the final year of the Eisenhower administration. The third season closed just after an assassin’s shots ended what would, in short order, be recalled as Camelot. A few scattered references have been made to a war underway in Southeast Asia.
Trust in the foresight of technocrats is about to take a hard fall. And the center of gravity in the advertising world is about to shift from masterful “hidden manipulators” to figures who can ride the wave of cultural upheaval because they are skilled at manufacturing niches for themselves.
The characters running the new agency are not confident engineers of consumer desire but – albeit in a special sense -- confidence artists. Not that they are swindlers. But they know how to fabricate a self and sell it to other people.
With its fourth season, "Mad Men" is on the verge of finally becoming a series about the Sixties. It is also a work of historical fiction about where consumerism came from, and what it was like. I suppose the past tense is unavoidable. Over the next decade, to judge by recent trends, people will need a leap of the imagination to remember the Golden Age of Lifestyles.
Once upon a time -- long, long ago -- I spent rather a lot of time reading about the theory of narrative. This was not the most self-indulgent way to spend the 1980s, whatever else you can say about it. Arguably the whole enterprise had begun with Aristotle, but it seemed to be reaching some kind of endgame around the time I was paying attention. You got the sense that narratologists would soon be able to map the genome of all storytelling. It was hard to tell whether this would be a good thing or a bad thing, but they sure seemed to be close.
The turning point had been the work of the Russian folklorist Vladimir Propp. In the late 1920s, he had broken down 100 fairy tales into a set of elementary “functions” performed by the characters, which could then be analyzed as occurring in various combinations according to a handful of fixed sequences. The unrelated-seeming stories were just variations on a very few algebraic formulas.
Of course, fairy tales tend to be pretty formulaic to begin with -- but with some tweaking, Propp's approach could be applied to literary texts. By the 1960s, French structuralist critics such as Roland Barthes and Gerard Genette were analyzing the writings of Poe and Proust (not to mention James Bond novels) to extract their narrative DNA. And then came Hayden White’s Metahistory: The Historical Imagination in Nineteenth Century Europe (1973), which showed how narratology might be able to handle nonfiction. White found four basic modes of “emplotment” -- romantic, comic, tragic, and satirical -- in the storytelling done by historians.
It was obviously just a matter of time before some genius came along to synthesize and supersede all of this work in a book called Of Narratology, at least half of which would be written in mathematical symbols. The prospect seemed mildly depressing. In the end, I was more interested in consuming narratives (and perhaps even emitting them, from time to time) than in finding the key to all mythologies. Apart from revisiting Peter Brooks's Reading for the Plot: Design and Intention in Narrative (1984) -- the only book on the topic I recall with any pleasure -- narratology is one of those preoccupations long since forgotten.
And so Christian Salmon’s Storytelling: Bewitching the Modern Mind reads like a dispatch from the road not taken. Published in France in 2007 and recently issued in English translation by Verso, it is not a contribution to the theory of narrative but a report on its practical applications. Which, it turns out, involve tremendous amounts of power and money -- a plot development nobody would have anticipated two or three decades ago.
“From the mid-1990s onward,” writes Salmon, concentration on narrative structure “affected domains as diverse as management, marketing, politics, and the defense of the nation.” To a degree, perhaps, this is obvious. The expression “getting control of the narrative” has long since become part of the lexicon of mass-media knowingness, at least in the United States. And Salmon -- who is a member of the Centre for Research in the Arts and Language in Paris and a columnist for Le Monde -- has one eye trained on the American cultural landscape, seeing it as the epicenter of globalization.
Roughly half of Salmon’s book is devoted to explaining to French readers the history and nuances of such ubiquitous American notions as “spin” and "branding." He uses the expression “narratocracy” to characterize the form of presidential leadership that has emerged since the days of Ronald Reagan. The ability to tell a compelling story is part of governance. (And not only here. Salmon includes French president Sarkozy as a practitioner of “power through narrative.”)
Less familiar, perhaps, is the evidence of a major shift toward narrative as a category within marketing and management. Corporations treat storytelling as an integral part of branding; the public is offered not just a commodity but a narrative to consume. He quotes Barbara Stone, a professor of marketing at Rutgers University: “When you have a product that’s just like another product, there are any number of ways to compete. The stupid way is to lower prices. The smart way is to change the value of the product by telling a story about it.” And so you are not just buying a pair of pants, for example, but continuing the legacy of the Beat Generation.
“It is not as though legends and brands have disappeared,” writes Salmon. But now they “talk to us and captivate us by telling us stories that fit in with our expectations and worldviews. When they are used on the Web, they transform us into storytellers. We spread their stories. Good stories are so fascinating that we are encouraged to tell them again.”
Other stories are crafted for internal consumption. Citing management gurus, Salmon shows the emergence of a movement to use storytelling to regulate the internal life of business organizations. This sometimes draws upon the insights of narrative artists of canonical renown, as in books like Shakespeare on Management. (Or Motivational Secrets of the Marquis de Sade, if I can ever sell that idea.) But it also involves monitoring and analyzing the stories that circulate within a business – the lore, the gossip, the tales that a new employee hears to explain how things got the way they are.
An organization’s internal culture is, from this perspective, the totality of the narratives circulating within it. “It is polyphonic,” notes Salmon, “but it is also discontinuous and made up of interwoven fragments, of histories that are talked about and swapped. They can sometimes be contradictory, but the company becomes a storytelling organization whose stories can be listened to, regulated, and, of course, controlled ... by introducing systematized forms of in-house communications and management based upon the telling of anecdotes.”
At the same time, the old tools of structuralist narratology (with its dream of reducing the world’s stock of stories to a few basic patterns) are reinvented as an applied science. One management guru draws on Vladimir Propp’s Morphology of the Folktale in his own work. And there are software packages that “make it possible to break a narrative text down into segments, to label its main elements and arrange its propositions into temporal-causal sequences, to identify scenes, and to draw up trees of causes and decisions.”
One day corporations will be able to harvest all the stories told about them by consumers and employees, then run them through a computer to produce brand-friendly counter-narratives in real time. That sort of thing used to happen in Philip K. Dick's paranoid science-fiction novels, but now it's hard to read him as anything but a social realist.
All of this diligent and relentless narrativizing (whether in business or politics) comes as a response to ever more fluid social relations under high-speed, quick-turnover capitalism.
The old system, in which big factories and well-established institutions were central, has given way to a much more fluid arrangement. Storytelling, then, becomes the glue that holds things together -- to the degree that they do.
The “new organizational paradigm,” writes Salmon, is “a decentralized and nomadic company…that is light, nimble, and furtive, and which acknowledges no law but the story it tells about itself, and no reality other than the fictions it sends out into the world.”
Not long after Storytelling originally appeared in 2007, the world’s economy grew less forgiving of purely fictive endeavors. The postscript to the English-language edition offers Salmon’s reflections on the presidential campaign of 2008, with Barack Obama here figured as a narratocrat-in-chief “hold[ing] out to a disoriented America a mirror in which shattered narrative elements can be put together again.”
This, it seems to me, resembles an image from a fairy tale. The “mirror” is a magical implement restoring to order everything that has been tending towards chaos throughout the rest of the narrative. Storytelling is a smart and interesting book, for the most part, but it suffers from an almost American defect: the desire for a happy ending.
The genome biologist Gregory Petsko has gone to bat for the humanities in an open letter to the president of the State University of New York at Albany, who recently (and underhandedly) announced significant cuts. (For those who haven’t been paying attention: the departments of theater, Italian, Russian, classics, and French at SUNY-Albany are all going to be eliminated.)
If you are in academia, and Petsko’s missive (which appeared on this site Monday) hasn’t appeared on your Facebook wall, it will soon. And here’s the passage that everyone seizes on, evidence that Petsko understands us and has our back (that is, we in the humanities): "The real world is pretty fickle about what it wants. The best way for people to be prepared for the inevitable shock of change is to be as broadly educated as possible, because today's backwater is often tomorrow's hot field. And interdisciplinary research, which is all the rage these days, is only possible if people aren't too narrowly trained."
He's right. And if scientists want to speak up for the humanities, I’m all for it. But Petsko understands us differently than we understand ourselves. Why fund the humanities, even if they don’t bring in grant money or produce patents? Petsko points out that "universities aren't just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment."
How many of us willingly embrace that interpretation of what we do? "My interest is not merely antiquarian...." is how we frame the justification for our cutting-edge research. Even as we express our dismay when crucial texts go out of print, any sacred flame that we were tending was blown out when the canon wars were fought to a draw. Why should we resurrect it? Because, says Petsko, "what seems to be archaic today can become vital in the future." His examples are virology and Middle Eastern studies. Mine is 18th-century literature — and with all the imaginative vigor at my disposal, I have trouble discerning the variation on the AIDS scare or 9/11 that would revive interest in my field. That’s OK, though: Petsko has other reasons why the humanities matter:
"Our ability to manipulate the human genome is going to pose some very difficult questions for humanity in the next few decades, including the question of just what it means to be human. That isn't a question for science alone; it's a question that must be answered with input from every sphere of human thought, including -- especially including -- the humanities and arts... If I'm right that what it means to be human is going to be one of the central issues of our time, then universities that are best equipped to deal with it, in all its many facets, will be the most important institutions of higher learning in the future."
Well, that would be great. I have no confidence, though, that we in the humanities are positioned to take advantage of this dawning world, even if our departments escape SUNY-style cost-cutting. How many of us can meaningfully apply what we do to "the question of just what it means to be human" without cringing, or adopting an ironic pose, or immediately distancing ourselves from that very question? How many of us see our real purpose as teaching students to draw the kinds of connections between literature and life that Petsko uses to such clever effect in his diatribe?
Petsko is not necessarily right in his perception of what the humanities are good for, nor are professionals in the humanities necessarily wrong to pursue another vision of what our fields are about. But there is a profound disconnect between how we see ourselves (and how our work is valued and remunerated in the university and how we organize our professional lives to respond to those expectations) and how others see us. If we're going to take comfort in the affirmations of Petsko and those outside of the humanities whom he speaks for, perhaps we need to take seriously how he understands what we do. Perhaps the future is asking something of us that we are not providing — or perhaps we need to do a better job of explaining why anyone other than us should care about what we do.
Kirstin Wilcox is senior lecturer in English at the University of Illinois at Urbana-Champaign.
For this week’s column (the last one until the new year) I asked a number of interesting people what book they’d read in 2010 that left a big impression on them, or filled them with intellectual energy, or made them wish it were better known. If all three, then so much the better. I didn’t specify that it had to be a new book, nor was availability in English a requirement.
My correspondents were enthusiastic about expressing their enthusiasm. One of them was prepared to name 10 books – but that’s making a list, rather than a selection. I drew the line at two titles per person. Here are the results.
Lila Guterman is a senior editor at Chemical and Engineering News, the weekly magazine published by the American Chemical Society. She said it was easier to pick an outstanding title from 2010 than it might have been in previous years: “Not sleeping, thanks to a difficult pregnancy followed by a crazy newborn, makes it almost impossible for me to read!”
She named Rebecca Skloot’s The Immortal Life of Henrietta Lacks, published by Crown in February. She called it an “elegantly balanced account of a heartbreaking situation for one family that simultaneously became one of the most important tools of biology and medicine. It was a fast-paced read driven by an incredible amount of reporting: A really exemplary book about bioethics.”
Neil Jumonville, a professor of history at Florida State University, is editor of The New York Intellectual Reader (Routledge, 2007). A couple of collections of essays he recently read while conducting a graduate seminar on the history of liberal and conservative thought in the United States struck him as timely.
“The first is Gregory Schneider, ed., Conservatives in America Since 1930 (NYU Press, 2003). Here we find a very useful progression of essays from the Old Right, Classical Liberals, Traditional Conservatives, anticommunists, and the various guises of the New Right. The second book is Michael Sandel, Liberalism and Its Critics (NYU Press, 1984). Here, among others, are essays from Isaiah Berlin, John Rawls, Robert Nozick, Alasdair MacIntyre, Michael Walzer, a few communitarians represented by Sandel and others, and important pieces by Peter Berger and Hannah Arendt.”
Reading the books alongside one another, he said, tends both to sharpen one's sense of the variety of political positions covered by broad labels like “liberal” and “conservative” and to point out how the traditions may converge or blend. “Some people understand this beneficial complexity of political positions,” he told me, “but many do not.”
Michael Yates retired as a professor of economics and labor relations at the University of Pittsburgh at Johnstown in 2001. His most recent book is In and Out of the Working Class, published by Arbeiter Ring in 2009.
He named Wallace Stegner’s The Gathering of Zion: The Story of the Mormon Trail, originally published in 1964. “I am not a Mormon or religious in the slightest degree,” he said, “and I am well aware of the many dastardly deeds done in the name of the angel Moroni, but I cannot read the history of the Mormons without a feeling of wonder, and I cannot look at the sculpture of the hand cart pioneers in Temple Square [in Salt Lake City] without crying. If only I could live my life with the same sense of purpose and devotion…. It is not possible to understand the West without a thorough knowledge of the Mormons. Their footprints are everywhere."
Adam Kotsko is a visiting assistant professor of religion at Kalamazoo College. This year he published Politics of Redemption: The Social Logic of Salvation (Continuum) and Awkwardness (Zero Books).
“My vote," he said, "would be for Sergey Dogopolski's What is Talmud? The Art of Disagreement, on all three counts. It puts forth the practices of Talmudic debate as a fundamental challenge to one of the deepest preconceptions of Western thought: that agreement is fundamental and disagreement is only the result of a mistake or other contingent obstacle. The notion that disagreements are to be maintained and sharpened rather than dissolved is a major reversal that I'll be processing for a long time to come. Unfortunately, the book is currently only available as an expensive hardcover.”
Helena Fitzgerald is a contributing editor for The New Inquiry, a website occupying some ambiguous position between a New York salon and an online magazine.
She named Patti Smith’s memoir of her relationship with Robert Mapplethorpe, Just Kids, published by Ecco earlier this year and recently issued in paperback. “I've found Smith to be one of the most invigorating artists in existence ever since I heard ‘Land’ for the first time and subsequently spent about 24 straight hours with it on repeat. She's one of those artists who I've long suspected has all her big secrets hoarded somewhere in her private New York City. This book shares a satisfying number of those secrets and that privately legendary city. Just Kids is like the conversation that Patti Smith albums always made you want to have with Patti Smith.”
Cathy Davidson, a professor of English and interdisciplinary studies at Duke University, was recently nominated by President Obama to serve on the National Council on the Humanities. She, too, named Patti Smith’s memoir as one of the books “that rocked my world this year.” (And here the columnist will interrupt to give a third upturned thumb. Just Kids is a moving and very memorable book.)
Davidson also mentioned rereading Tim Berners-Lee's memoir Weaving the Web, first published by HarperSanFrancisco in 1999. She was “inspired by his honesty in letting us know how, at every turn, the World Wide Web's creation was a surprise, including the astonishing willingness of an international community of coders to contribute their unpaid labor for free in order to create the free and open World Wide Web. Many traditional, conventional scientists had no idea what Berners-Lee was up to or what it could possibly mean and, at times, neither did he. His genius is in admitting that he forged ahead, not fully knowing where he was going….”
Bill Fletcher Jr., a senior scholar at the Institute for Policy Studies, is co-author, with Fernando Gapasin, of Solidarity Divided: The Crisis in Organized Labor and a New Path Toward Social Justice, published by the University of California Press in 2009.
He named Marcus Rediker and Peter Linebaugh’s The Many-Headed Hydra: The Hidden History of the Revolutionary Atlantic (Beacon, 2001), calling it “a fascinating look at the development of capitalism in the North Atlantic. It is about class struggle, the anti-racist struggle, gender, forms of organization, and the methods used by the ruling elites to divide the oppressed. It was a GREAT book.”
Astra Taylor has directed two documentaries, Zizek! and Examined Life. She got hold of the bound galleys for James Miller’s Examined Lives: From Socrates to Nietzsche, out next month from Farrar Straus and Giroux. She called it “a book by the last guy I took a university course with and one I've been eagerly awaiting for years. Like a modern-day Diogenes Laertius, Miller presents 12 biographical sketches of philosophers, an exploration of self-knowledge and its limits. As anyone who read his biography of Foucault knows, Miller's a master of this sort of thing. The profiles are full of insight and sometimes hilarious.”
Arthur Goldhammer is a senior affiliate of the Center for European Studies at Harvard University and a prolific translator, and he runs an engaging blog called French Politics.
“I would say that Florence Aubenas' Le Quai de Ouistreham (2010) deserves to be better known,” he told me. “Aubenas is a journalist who was held prisoner in Iraq for many months, but upon returning to France she did not choose to sit behind a desk. Rather, she elected to explore the plight of France's ‘precarious’ workers -- those who accept temporary work contracts to perform unskilled labor for low pay and no job security. The indignities she endures in her months of janitorial work make vivid the abstract concept of a ‘dual labor market.’ Astonishingly, despite her fame, only one person recognized her, in itself evidence of the invisibility of social misery in our ‘advanced’ societies.”
The book that made the biggest impression on her this year was Judith Giesberg's Army at Home: Women and the Civil War on the Northern Home Front, published by the University of North Carolina Press in 2009. “Too often,” Rubin told me, “historians ignore the lives of working-class women, arguing that we don't have the sources to get inside their lives, but Giesberg proves us wrong. She tells us about women working in Union armories, about soldiers' wives forced to move into almshouses, and African Americans protesting segregated streetcars. This book expands our understanding of the Civil War North, and I am telling everyone about it.”
Siva Vaidhyanathan is a professor of media studies and law at the University of Virginia. His next book, The Googlization of Everything: (And Why We Should Worry), will be published by the University of California Press in March.
He thinks there should have been more attention for Carolyn de la Pena's Empty Pleasures: The Story of Artificial Sweeteners from Saccharin to Splenda, published this year by the University of North Carolina Press: “De la Pena (who is a friend and graduate-school colleague) shows artificial sweeteners have had a powerful cultural influence -- one that far exceeds their power to help people lose weight. In fact, as she demonstrates, there is no empirical reason to believe that using artificial sweeteners helps one lose weight. One clear effect, de la Pena shows, is that artificial sweeteners extend the pernicious notion that we Americans can have something for nothing. And we know how that turns out.”
Vaidhyanathan noted a parallel with his own recent research: “de la Pena's critique of our indulgent dependence on Splenda echoes the argument I make about how the speed and simplicity of Google degrades our own abilities to judge and deliberate about knowledge. Google does not help people lose weight either, it turns out.”
Michael Tomasky covers U.S. politics for The Guardian and is editor-in-chief of Democracy: A Journal of Ideas.
“On my beat,” he said, “the best book I read in 2010 was The Spirit Level (Bloomsbury, 2009), by the British social scientists Richard Wilkinson and Kate Pickett, whose message is summed up in the book's subtitle, which is far better than its execrable title: ‘Why Greater Equality Makes Societies Stronger.’ In non-work life, I'm working my way through Vasily Grossman's Life and Fate from 1959; it's centered around the battle of Stalingrad and is often called the War and Peace of the 20th century. I'm just realizing as I type this how sad it is that Stalingrad is my escape from American politics.”
I was a graduate student in the 1980s, during the heyday of the so-called “culture wars” and the curricular attacks on "Western civilization." Those days were punctuated by some Stanford students chanting slogans like "Hey hey, ho ho, Western Civ has got to go," and by fiery debates about Allan Bloom’s book The Closing of the American Mind, which appeared in 1987, toward the end of my years in graduate school. Back then the battle lines seemed clear: conservatives were for Western civilization courses and the traditional literary canon, while liberals and progressives were against those things and for a new, more liberating approach to education.
In retrospect I find that decade and its arguments increasingly difficult to comprehend, even though I experienced them firsthand. I ask myself: What on earth were we thinking? Exactly why was it considered progressive in the 1980s to get rid of courses like Western civilization (courses that frequently included both progressives and conservatives on their reading lists)? And why did supporting a traditional liberal arts education automatically make one a conservative — especially if such an education included philosophers like Jean-Jacques Rousseau and Karl Marx?
A quarter of a century later, with the humanities in crisis across the country and students and parents demanding ever more pragmatic, ever more job-oriented kinds of education, the curricular debates of the 1980s over courses about Western civilization and the canon seem as if they had happened on another planet, with completely different preconceptions and assumptions than the ones that prevail today. We now live in a radically different world, one in which most students are not forced to take courses like Western civilization or, in most cases, courses in foreign languages or cultures, or even the supposedly more progressive courses that were designed to replace them. And whereas as late as the 1980s English was the most popular major at many colleges and universities, by far the most popular undergraduate major in the country now is business.
The battle between self-identified conservatives and progressives in the 1980s seems increasingly like rearranging the deck chairs on the Titanic. While humanists were busy arguing amongst themselves, American college students and their families were turning in ever-increasing numbers away from the humanities and toward seemingly more pragmatic, more vocational concerns.
And who can really blame them? If humanists themselves could not even agree on the basic value, structure, and content of a liberal arts education — if some saw the tradition of Western civilization as one of oppression and tyranny, while others defended and validated it; if some argued that a humanistic education ought to be devoted to the voices of those previously excluded from "civilized" discussion, such as people of color and women, while others argued that such changes constituted a betrayal of the liberal arts — is it any wonder that students and their families began turning away from the humanities?
After all, economics and business professors did not fight about the basic structure of business or economics majors, even though there were differences among Keynesian and Friedmanite economists, for instance, over monetary policy. And physics professors did not engage in fundamental debates about physics curriculums — which should one teach, quantum mechanics or relativity? — in spite of Einstein’s problems with quantum mechanics ("God does not play dice with the universe"). In the 1980s the humanities as a whole seemed to be the only field where even experts were unable to agree on what constituted the appropriate object of study.
If I go to a doctor’s office and witness doctors and nurses fighting about whether or not I should take a particular medication, I’m likely to go elsewhere for my health care needs. I think something analogous happened to the humanities in the 1980s, and it is continuing to happen today, although by now the humanities are so diminished institutionally that these changes no longer have the overall significance they had in the 1980s. In the 1980s the humanities still constituted the core of most major universities; by now, at most universities, even major ones, the humanities are relatively marginal, far surpassed, in institutional strength, by business, medical, and law schools.
One of the core functions of the humanities for centuries was the passing down of a tradition from one generation to the next. The idea behind Western civilization courses was supposed to be that students needed them in order to understand the origins and development of their own culture. In the 1980s three developments worked against that idea. The first was an educational establishment that was no longer content simply to pass knowledge down from one generation to the next, and that wanted to create new knowledge. The second development, which dovetailed with the first, was the emergence of new approaches to the humanities that examined structures of oppression and domination in traditions previously viewed as unimpeachable. One could examine women's history, for instance, or non-Western cultures. The third development, which dovetailed with the first and second, was the increasing demand for “relevance” in higher education, with "relevance" being understood as present-oriented and pragmatic, i.e. job-related.
The conflation of these three developments led to the widespread perception — and not just among self-proclaimed progressives — that anything traditional or old was also, almost by definition, conservative, fuddy-duddy, and impractical. In essence those three developments have now long since triumphed, and the educational world of today is largely the result of that triumph.
Unfortunately, however, traditions that are not passed on from one generation to the next die. If an entire generation grows up largely unexposed to a particular tradition, then that tradition can in essence be said to be dead, because it is no longer capable of reproducing itself. It does not matter whether the tradition in question is imagined as the Western tradition, the Christian tradition, or the Marxist tradition (and of course both Christianity and Marxism are part of the Western tradition). Traditions are like languages: if they are not passed on, they die. Most traditions, of course, have good and bad elements in them (some might argue for Christianity, some for Marxism, relatively few for both), and what dies when a tradition dies is therefore often both good and bad, no matter what one’s perspective. But what also dies with a tradition is any possibility of self-critique from within the tradition (in the sense that Marxism, for instance, constituted a self-critique from within the Western tradition), since a tradition’s self-critique presupposes the existence of the tradition. Therefore the death of a tradition is not just the death of the oppression and tyranny that might be associated with the tradition, but also the death of progressive and liberating impulses within the tradition.
We all know, of course, that nature abhors a vacuum, and for that reason when a tradition dies, what fills in the vacuum where the tradition used to be is whatever is strongest in the surrounding culture. In our culture we know quite well what that is: the belief in money, in business, in economics, and in popular culture. That is our real religion, and it has largely triumphed over any tradition, either progressive or tyrannical. It is no more a coincidence that business is the most popular major in the United States today than it was that theology was one of the major fields of the 1700s.
As a result of the triumph of relevance and pragmatism over tradition, the ivy-covered walls of academia, which once seemed so separated from what is often called the “real world,” now offer very little protection from it. In fact the so-called "real world" almost entirely dominates the supposedly unreal world of academia. It may have once been true that academia offered at least a temporary sanctuary for American students on their way to being productive, hard-working contributors to a booming economy; now, however, academia offers very little refuge to students on their way into a shaky, shell-shocked economy where even the seemingly rock-solid belief in the “free market” has been thrown into question. In 1987 Allan Bloom wrote: "Education is not sermonizing to children against their instincts and pleasures, but providing a natural continuity between what they feel and what they can and should be. But this is a lost art. Now we have come to exactly the opposite point." Over two decades later, it seems to me that Bloom was right, and that indeed we have come “to exactly the opposite point.” Unfortunately now, neither self-styled conservatives nor self-styled progressives are likely to want to defend a vision of education that even in Bloom’s view was long gone. And sadder still is the fact that few of our students will even realize what has been lost.
And so I think we owe an apology to our students. We humanists inherited a tradition more or less intact, with all its strengths and weaknesses, but it appears highly likely that we will not be able or willing to pass it on to them. That is a signal failure, and it is one for which we will pay dearly. No doubt there is lots of blame to go around, but instead of looking around for people to blame, it would be more constructive to save what we can and pass it along to the next generation. They are waiting, and we have a responsibility.
Stephen Brockmann is professor of German at Carnegie Mellon University and president of the German Studies Association.
In the context of the news that day in February, the announcement was almost jarring in its banality. On a day when legislators at all levels and all over the country were in full panic mode about budget deficits, and at a time when public investments in education, particularly higher education and most particularly the liberal arts, were being offered as examples of excessive government spending, a new commission had been formed.
At the request of a bipartisan group of members of Congress, the American Academy of Arts and Sciences had gathered a group of distinguished citizens and asked them to recommend 10 actions "that Congress, state governments, universities, foundations, educators, individual benefactors, and others should take now to maintain national excellence in humanities and social scientific scholarship and education, and to achieve long-term national goals for our intellectual and economic well-being." A bipartisan request to form a group to engage in long-range planning about the nation’s intellectual well-being by focusing on the liberal arts — such an announcement not only seemed out of place in the newspapers that day, it seemed almost to come from another generation.
Had these people not heard that, as House Speaker John Boehner put it, "We’re broke"? Didn’t they — these misguidedly bipartisan legislators and anachronistic advocates of the liberal arts — realize that we were in a crisis that precluded long-term planning and collective action? How could they fail to see that education today must focus on job training and economic competitiveness? And what were they thinking in focusing on liberal arts?
It has indeed been hard in recent months to hear anything other than the voices of doom. But the language spoken by these voices represents its own form of crisis, for it is almost entirely economic, as if all relevant factors in our current situation could be captured on a spreadsheet or a ledger. The reduction of complex social and political issues to economics signifies a failure of imagination; and "fiscal responsibility," while an excellent principle at all times, has come to serve as a proxy for our fears that we have lost our way in the world, that the future will not be as bright for our children as it was for us when we were young, that America is being outcompeted by countries that used to be "third world," that the future has somehow gotten away from us.
Fear, whose radical form is terror, has temporarily crippled our national imagination. Many young people today can barely recall a time when we were not subject to the shadowy horrors of terror and terrorists. Today, 10 years after 9/11, terror is a fact of life, and fear makes all the sense in the world. How else to explain the emergence of what are in effect survivalist and vigilante attitudes among so many of our political leaders?
At this time, it is useful for those with longer memories to recall that "other generation" that the current effort to support the liberal arts so strongly evokes. This would be the generation that, having fought their way out of the Great Depression, went out and won World War II. That generation, like ours, had things to fear, but they conquered their fears by taking action, including creating a commission charged with long-term planning for the nation’s educational system, focusing on liberal education.
This commission, created by President James Bryant Conant of Harvard, was formed in 1943, in the middle of the war, and completed virtually all of its work while the outcome of the war was still uncertain. Still, the vision its members announced was confident, spacious and radical. Their report, General Education in a Free Society — or the “Redbook,” as it was called — outlined a program of liberal education for both high school and college students, with required courses in the sciences, the social sciences, and the humanities. The intention was to extend to masses of people — including the hundreds of thousands of returning soldiers who would be going to college on the new GI Bill — the kind of non-vocational education previously available only to a select few.
Such a program, the commission thought, would be profoundly American in that it would prepare people for citizenship in a democracy, giving them what they needed not just to find a job but to live rich and abundant lives, the kinds of lives that people in less fortunate societies could only dream about. Announcing the great mission of American education and the new shape of American society after the war, the Redbook was hailed as a powerful symbol of national renewal, and served as an announcement of America’s cultural maturity. Its main arguments were translated into national policy by the six-volume 1947 "Truman Report," called Higher Education for American Democracy.
The program bespoke confidence in democracy, and in the ability of people to decide the course of their lives for themselves. It suggested, too, a conviction that a democracy based on individual freedom required some principle of cohesion, which, in the program they outlined, would be provided by an understanding of history and culture, a task they entrusted to the humanities.
Of course, not every institution of higher education has followed this extraordinarily ambitious and idealistic vision. Indeed, by one recent account, only 8 percent of all American institutions of higher education give their students a liberal education. But that 8 percent includes virtually every institution known to the general populace, including Cal Tech and MIT. With their unique dedication to liberal education, American universities are acknowledged to be the best in the world at two of the central tasks of higher education: educating citizens and conducting research.
Mass liberal education was advocated in the face of challenges every bit as great as those we face today. As a consequence of the war, the national debt had exploded, reaching unprecedented levels (121 percent of GDP in 1946, compared with 93 percent in 2010). And as the grim realities of the Cold War set in, including the prospect of nuclear annihilation and the widespread fear of enemies within, many people felt that the nation was vulnerable in ways it never had been. It would have been understandable if the nation had tried to hedge against an unpredictable future by cutting spending, turning inward, and retooling the educational system so that it would produce not well-rounded citizens but technocrats, managers, nuclear engineers, and scientists.
Instead, we created the Marshall Plan, built the interstate highway system, and increased access to higher education so dramatically that, by 1960, there were twice as many people in higher education as in 1945. And incidentally, the middle class was strong and growing, and the fight for civil rights acquired an irresistible momentum. Things were very far from perfect, but we unhesitatingly call the generation that accomplished all this "the greatest."
What really distinguished the American philosophy of higher education in the generation after WWII was its faith in the future. People educated under a system of liberal education were expected not to fill slots but to create their lives in a world that could not be predicted but did not need to be feared. The lesson for today is perfectly clear. Terrors will always be with us, but we can choose to confront them through collective action and a recommitment to the core principles of democracy, including access, for those who wish to have it and are able to profit from it, to a liberal education. "We’re broke" is a sorry substitute for the kind of imagination and boldness needed now, or at any time. We must take the long view, the global view, and the view that does the most credit to ourselves.
I would not presume to tell the new commission which steps to support the liberal arts they should endorse. But I would urge on them a general principle: that liberal education should not be considered a luxury that can be eliminated without cost, much less an expensive distraction from the urgent task of economic growth, but a service to the state and its citizens. It is an essential service because it reflects and strengthens our core commitments as a nation, without which we truly would be broke.
In a memorable passage from The Philosophy of History, Hegel quotes a common saying of his day that runs, “No man is a hero to his valet-de-chambre.” This corresponds, in contemporary terms, to the familiar sentiment that even the most distinguished individual “puts his pants on one leg at a time like everybody else.” It is somewhere between wisdom and truism. But Hegel seems to take it badly. After quoting the proverb, he adds his own twist: “not because the former is no hero, but because the latter is a valet.”
In other words, the portrait of a world-transforming figure -- say, Napoleon -- left by somebody who shined his shoes and helped him to bed after a night of drinking is no basis for judging the meaning of said figure’s life. For that, presumably, you need a philosopher. Hegel mentions in passing that his quip was repeated “ten years later” by Goethe. I imagine being very casual while dropping that reference, as his students in the lecture hall go “Dude!” (or whatever the German equivalent of emphatic amazement was in 1830).
The dig at butlers seems awfully snobbish – and also rather unwise, at least to admirers of P.G. Wodehouse. But its thrust is really aimed elsewhere. He is thinking of something that is still fairly new in the early 19th century: a mass public, eager to consume intimate revelations and psychological speculations regarding powerful and influential people. This means wallowing in envy and egotism. Hegel says it is driven by the “undying worm” of realizing that one’s “excellent views and vituperations remain absolutely without result in the world.” Anyone distinguished is thereby reduced “to a level with – or rather a few degrees lower than – the morality of such exquisite discerners of spirits.”
This sounds irritable enough. And remember, the telegraph hadn’t even been invented yet. The golden age of cutting everybody down to size was still to come. Nor, indeed, has it ended.
But Joel Best’s new book Everyone’s a Winner: Life in Our Congratulatory Culture, published by the University of California Press, describes a situation that appears, at first blush, the exact opposite of the one that bothered Hegel. The word “heroic,” writes Best, a professor of sociology at the University of Delaware, “once applied narrowly to characterize great deeds by either mythic or historical figures,” but is now often “broadened to encompass virtually anyone who behaves well under difficult – even potentially difficult – circumstances.” And sometimes not even that. (When Stephen Colbert tells his audience that they’re the real heroes, it satirizes the way certain cable TV demagogues flatter the American couch potato.)
“Activists are heroes,” he writes. “Coal miners are heroes. People with terminal cancer are heroes. A word once reserved for the extraordinary is now applied to the merely admirable.”
This is one aspect of a pattern that Best finds emerging in numerous domains of American life. There is an abundance of claims to eminence and excellence. The awards proliferate as we hold public celebration of achievement in every activity imaginable. Restaurants display their rankings from local newsweeklies. Universities are almost always certifiably distinguished, in some regard or other. A horror movie called The Human Centipede (First Sequence) won the 2010 Scream Award in the category “most memorable mutilation.” I have seen the film and believe it deserved this honor. (Seriously, you don’t want to know.)
Anyone possessing even a slight curmudgeonly streak will already have had suspicions about this trend, of course. Best corroborates it with much evidence. A case in point is his graph of the number of British and American awards for mystery novels. In 1946, the figure stood at five. By 1979, it had grown to five times that many, and in 2006 (the last year he charts), there were roughly 110. “Nor is the trend confined to book awards,” he notes. “The number of film prizes awarded worldwide has grown to the point that there are now nearly twice as many awards as there are full-length movies produced. For both books and films, the number of prizes has grown at a far faster clip than the numbers of new books or movies.”
The Congressional Gold Medal honoring an outstanding contribution to the nation (its first recipient, in 1776, was George Washington) was presented five times in the course of the 1950s. The frequency of the award has grown since. Between 2000 and 2009, it was given out 22 times.
The examples could be multiplied, perhaps exponentially. The range of people, products, and activities being honored has expanded. At the same time, the number of awards in each category tends to grow. In short, the total energy invested in assessing, marking, and celebrating claims about status (that is, worthiness of respect or deference) seems to have increased steadily over the past few decades in the United States -- and Best says that discussions with colleagues in Canada, Japan, and Western Europe suggest that the same trend has emerged in other countries.
Older ways of looking at status regarded it as a rare commodity. Gaining it, or losing it, was fraught with anxiety. And it still is, but something important has changed. Hegel’s comments imply that powerlessness and lack of status were bound to inspire resentment over established claims to excellence and significance. In a condition of “status scarcity,” there is bound to be a struggle that unleashes destructive tendencies. But Best maintains that another dynamic has emerged -- the manufacture of status on an almost industrial scale, rather than a mass society in which status is smashed.
This tendency overlaps with the profusion of what he terms “social worlds” (what might otherwise be called subcultures or lifestyle cohorts) that emerge as people with shared interests or commitments gather and form their own organizations. Giving and getting awards often becomes part of consolidating the niche.
“The perceived shortage of status,” he writes, reflecting a sense of “insufficient status being given to people like us,” is one of the reasons disenchanted people form new social worlds. Doing so “means that folks aren’t forced to spend their whole lives in circles where they inevitably lose the competition for status. Rather, by creating their own worlds, they acquire the ability to mint status of their own. They can decide who deserves respect and why.” The result is what Best calls "status affluence." There are, he acknowledges, grounds to criticize this situation – an obvious one being that status, like currency, becomes devalued when too much of it is being put into circulation. On the whole, though, he judges it as salutary, and as making for greater social cohesion and stability.
And in any case, there is no obvious way to change it. A few years ago, a bill calling for no more than two Congressional Gold Medals to be issued per year won some support -- only to end in limbo. If there is a tap to control the flow of awards, nobody knows how to work it.
"Status affluence" isn't the same as equality -- and I'm struck by the sense that it coexists with profound and growing socioeconomic disparities. As Joseph E. Stiglitz recently pointed out, the income of the top 1 percent in the United States has grown by 18 percent over the past decade, while people in the middle have seen their incomes shrink: "While many of the old centers of inequality in Latin America, such as Brazil, have been striving in recent years, rather successfully, to improve the plight of the poor and reduce gaps in income, America has allowed inequality to grow." As interesting as Best's book is, it leaves me wondering if status affluence isn't a symptom, rather than a sign that the distribution of recognition has grown more equitable. A parachute is better than nothing, but this one seems like it might be made of papier-mâché.
This coming weekend's conference on the late Ellen Willis -- essayist, radical feminist, and founder of the cultural reporting and criticism program at New York University -- begins to look as if it is going to be rather a big deal. It coincides with publication by the University of Minnesota Press of Out of the Vinyl Deeps: Ellen Willis on Rock Music, which, besides doing wonders for the reputations of Moby Grape and Creedence Clearwater Revival, is going to consolidate Willis’s role as a figure young writers read, and reread, and dream of somehow becoming. Originally the conference was planned for a small meeting space somewhere in downtown New York, but it’s been relocated to the Tishman Auditorium at NYU, which holds 450 people. Five years after her death, this is Ellen Willis’s moment.
As someone who began reading her work almost 30 years ago (to an 18-year-old Velvet Underground fanatic, any collection of essays called Beginning to See the Light needed no further recommendation), I am happy to think so. And as someone scheduled to speak during the first session -- but nowhere near finishing his paper -- I am terrified to think so. Meanwhile, the organizers keep reminding the panelists that the event is being moved to a bigger venue, due to popular demand. And would we please be sure to get there on time? Maybe they are afraid of an unruly crowd; the warm-up act needs to get on stage without undue delay.
So, yes: a big, anxious deal. Though mostly a celebration. A small sampling of her work is available on a website run by her daughter, although this is no substitute for the three collections of essays on social and cultural matters that appeared during her lifetime.
Speaking of which, somebody at the conference needs to address the issue of how it happens that Out of the Vinyl Deeps is only appearing just now. Why is it only in 2011 that we have a book demonstrating that she was one of the best rock critics of the 1960s and ‘70s? Those decades have been mythologized as the era when rock writers of gigantic stature -- Lester Bangs, Robert Christgau, Nick Kent, Greil Marcus, Dave Marsh, Richard Meltzer, and Nick Tosches -- thundered across the countercultural landscape, sometimes doing battle, like dinosaurs. (Big, stoned dinosaurs.) You can find collections of work by all of these guys, and ardent fanboys ready to debate their respective degrees of eminence. In fact, I listed them alphabetically to avoid that sort of thing.
There were only a handful of pieces on rock in Willis's first collection of essays (and none in the subsequent volumes, which focused on feminist theory and cultural politics), but they were stunning. Anecdotal evidence and personal experience suggest that rereading them repeatedly was not an uncommon response. And when you did, you heard (and felt) songs by Bob Dylan, or the Who, or the Velvet Underground, in ways you never had before. She was as insightful as any of the dino-critics -- and a much better writer than some of them -- yet Willis never really figured in the legend.
With dozens of her writings on popular music now gathered between covers, this will change. But again, what took so long? This must be explained. (The possibility of an all-male species of dinosaur was unlikely in any event.)
Most of the pieces in the new book appeared in The New Yorker, to which Willis began contributing in 1968. A few months later, in response to the prevailing and otherwise intractable sexism of the New Left, she started the influential group Redstockings along with Shulamith Firestone, who soon wrote The Dialectic of Sex: The Case for Feminist Revolution (1970).
It was another Redstockings member, Carol Hanisch, who coined the phrase “the personal is political.” And on a personal-political note, I will mention that reading Firestone’s manifesto as a teenager scared the hell out of me, in a salutary way. The trauma had passed by the time Willis collected her own feminist writings in No More Nice Girls: Countercultural Essays (Wesleyan, 1992) -- a volume it is particularly interesting to read alongside Daring to Be Bad: Radical Feminism in America (University of Minnesota Press, 1989), for which Willis wrote the introduction. Clearly this sort of material is still upsetting to some people. A blogger named Doug Phillips, for example, blames Ellen Willis and the Willis-ites for “promot[ing] ultra-radical lesbian-feminist politics, trans-sexuality, and mother goddess worship.” Like that’s a bad thing.
While her libertarian worldview would certainly accommodate transsexual lesbian pagans in its conception of the good society, anyone who actually reads Ellen Willis will learn that she was, in fact, an enthusiastically heterosexual atheist who, at some point, accepted monogamy in practice, if not in theory. None of which will give Doug Phillips much comfort. But apart from specifying her exact position within the culture wars, the stray bits of personal information in her work are interesting for what they reveal about Willis as a writer.
Some of her most memorable pieces were in the vein of what used to be called the New Journalism, in which the reporter’s subjectivity is part of the narrative. But this amounts to only a small part of her output. The proliferation of memoir may be an indirect effect of feminism (“the personal is the literary”), but the role of the “I” in Willis is rarely confessional. Her essays, while usually familiar in tone, tend to be analytic in spirit. The first-person is a lens, not a mirror.
As mentioned, Out of the Vinyl Deeps is Willis’s fourth volume of essays. Following the last one she saw through the press, Don’t Think, Smile! Notes on a Decade of Denial (Beacon, 2000), she published a fair amount of uncollected material and was working on an interpretation of American culture from the perspective of Wilhelm Reich’s psychoanalytic theory. So perhaps there will be another posthumous volume at some point.
If so, it would be her fifth collection -- and her sixth book. Like most readers, I have always assumed that Beginning to See the Light, from 1981, was her first title. (It was reprinted by Wesleyan in 1992.) But almost 20 years earlier, Willis published another book. She did not list it in the summary of her career appearing in volume 106 of the reference-book series Contemporary Authors (Gale Publishers) and seems never to have referred to it in print. Indeed, I wondered if the Library of Congress cataloger didn’t make a mistake by listing Questions Freshmen Ask: A Guide for College Girls (E.P. Dutton, 1962) as written by the same author as No More Nice Girls. After all, there could be two Ellen Willises.
And in a way, there were. I’m still trying to figure out the relationship between them -- how the one became the other.
On page 4, the author of Questions Freshmen Ask explains her qualifications for writing the book: “As a graduate of Barnard College, I feel I have had the kind of experience that enables me to provide the answers to many of your questions. Since Barnard is on the one hand a small women’s college and on the other, part of a large coeducational institution (Columbia University), I am aware of the problems of both types of schools.”
The entry for Ellen Willis in Contemporary Authors notes that she graduated from Barnard in 1962. The 20-year-old author occasionally turns a phrase or writes in a rhythm that will sound familiar to aficionados of her older self -- and the introduction by Barbara S. Musgrave, class dean of Smith College, commends the book as “written so engagingly it gives something of the flavor of college ahead of time.”
It is certainly a time capsule. Exhibit A: “Most colleges estimate that books will cost you in the neighborhood of seventy-five dollars a year.” Exhibit B: “Freshmen often resent all the new regulations under which they are asked to live…. The fact is that your college is less interested in your individual welfare than in the smooth running of the community as a whole.” (Fifty years later, the in loco parentis rules Willis has in mind are long dead. And the administration's communitarian motives count less than its interest in not getting sued.)
Some of the advice remains valid -- especially the parts about the need to budget time and money. And the occasional bit of historical context can be glimpsed between the lines. The author’s freshman year would have been not long after the Sputnik launch. The push was on to expand access to higher education so that the nation would not be overwhelmed by superior brainpower. Willis is explicit about offering guidance to girls who will enter college with no idea what to expect, because their parents didn’t go.
“In the old days,” she writes, “when money or an influential relative seemed almost a ticket of admission to the campus, a student didn’t have to be too purposeful about college. A girl could shrug and say she wanted to go to college, well, because all her friends were going and it had never occurred to her not to go. But times have changed, and you can’t afford to be aimless -- not if you want to justify the admissions director’s faith in you.”
As with a recommendation to “be a good sport” about nitpicky campus rules, this stress on living up to the expectation of an authority figure is hard to square with the later Ellen Willis. But there are passages in which (with abundant hindsight, admittedly) you can see the fault lines.
“No matter what you eventually do after you graduate,” she writes, “you will want to have a mind that’s alert and full of ideas. There will be books you want to understand, important decisions to make, leisure time to fill. With the mental resources your education provides, you will be able to enjoy life more fully….”
Here, the Willis fan thinks: Yes, I know this author. But then you hit a passage like this: “If you spend four years at college single-mindedly preparing yourself for a television production job in New York, and then end up marrying an anthropologist who has to live in the Middle East, what have you accomplished?”
The drive for autonomy vs. the destiny of matrimony: the center cannot hold. Five years after Questions Freshmen Ask: A Guide for College Girls appeared, Janis Joplin recorded her first album and Ellen Willis wrote the first piece in Out of the Vinyl Deeps: an essay on Bob Dylan that is more rewarding than certain books on him that come to mind. Whatever it was that transformed Ellen Willis in the meantime, it almost certainly involved a record player.
“I am not a donkey,” Max Weber once said, “and I do not have a field.” And yet it is always possible to label Weber as a sociologist without unduly provoking anybody. Things are decidedly more complicated in the case of the American thinker Kenneth Burke (1897-1993). Situating such Burkean treatises as Permanence and Change (1935), A Grammar of Motives (1945), and Language as Symbolic Action (1966) in cultural and intellectual history is a task to test the limits of interdisciplinary research. His theories concerning aesthetics, communications, social order and ecology took shape through dialogue with the work of Aristotle, Marx, Freud, Nietzsche, Bergson, and the American pragmatist philosophers, to make the list as short as possible. (And Weber too, of course.) It’s still hard to improve upon the assessment made by Stanley Edgar Hyman, the literary critic and Bennington College professor, more than 60 years ago: “He has no field, unless it be Burkology.”
The triennial meeting of the Kenneth Burke Society, held at Clemson University over the Memorial Day weekend, drew a diverse crowd, numbering just over one hundred people -- with at least a third, by my estimate, being graduate students or junior faculty. The Burkological elders told tales of the days when incorporating more than a couple of citations from “KB” in a dissertation would get you scolded by an adviser. Clearly things have changed in the meantime. Tables near registration were crowded with secondary literature from the past decade or so, as well as a couple of posthumous collections of KB's work. The program featured papers on the implications of his ideas for composition textbooks, disability studies, jazz, environmental activism, and the headscarf controversy.
There were also Burkean discussions of “Mad Men,” Mein Kampf, and the Westboro Baptist Church. Unfortunately I missed it, but Camille Kaminski Lewis gave a paper based on her continuing analysis of the history and ideology of Bob Jones University, where she once taught. (Her book on the subject did not meet with the institution's approval, a matter she discussed in an essay for the Burke Society's Journal.)
The range of topics would sound bewildering to anyone uninitiated into KB’s work; likewise with the vocabulary he created along the way (“dramatism,” “logology,” “terministic screen,” “socio-anagogic interpretation”). But people attending the conference received commemorative tee-shirts bearing excerpts from KB’s “Definition of Man” -- an essay attempting to reduce his thinking to a succinct formula, devoid of any jargon:
"Man is the symbol-making animal, inventor of the negative, separated from his natural condition by instruments of his own making, moved by the sense of order, and rotten with perfection."
Quite a bit is going on within that nutshell. (The phrase “rotten with perfection,” for example, is Burke’s idiosyncratic take on Aristotle’s idea of entelechy.) But an academic organization devoted to an esoteric thinker who fits comfortably in no particular departmental pigeonhole would seem unlikely to have much potential for growth. On the final day of the conference, David Cratis Williams told me that when the Kenneth Burke Society formed in 1984, he suspected that it would for the most part appeal to people who had known KB personally. And that small circle was bound to shrink over time, as people retired.
Something else has happened instead. There was more to it than a few then-young Burkologists becoming institutionally well-situated -- though that no doubt made a huge difference. Williams, for example, is the director of the graduate program in communication and media studies at Florida Atlantic University. (He is also working on a biography of the maverick thinker.) And David Blakesley, who organized the conference at Clemson just a few months after arriving there to assume an endowed chair in English, is also the founder of Parlor Press, a peer-reviewed scholarly publishing house. The name of the press is taken from a passage in which Burke describes the world as a parlor where an unending conversation unfolds.
Having a few well-placed and entrepreneurial Burkeans has certainly helped to consolidate the Society. But I suspect that other factors are involved in the continuing vitality of the KB scholarship. Three things stood out about the conference: the crowd was multigenerational; many of the younger Burkeans have a strong interest in archival research; and the scholarship is now orienting toward digital media, not just to study it but to use it.
These tendencies seem to be mutually reinforcing. Since the early 1990s, Jack Selzer, a professor of English at Pennsylvania State University's main campus, has not only been doing archival research on Burke’s involvement with a number of literary and intellectual circles, but encouraging his students to use the Burke papers at Penn State as well. One of his graduate students was Ann George, now an associate professor of English at Texas Christian University. In 2007, the University of South Carolina Press published Kenneth Burke in the 1930s, the study she wrote with Selzer, which situates its subject in the political and cultural context of the Depression. (While specialized and extremely suggestive to the longtime Burkean, it’s also the book I’d be most likely to recommend to someone new to KB.)
Now students of both Selzer and George are digging around in the 55 linear feet of Burke papers at PSU -- and sometimes taking trips out to the farmhouse in New Jersey where Burke lived and worked, full of still more manuscripts as well as KB’s heavily annotated library. Besides his correspondence with other literary and academic figures, they’re finding unpublished manuscripts and notes showing his concern with economics, music, and other areas relatively neglected by earlier Burke scholars. One senior figure told me that the influx of graduate students was both encouraging and anxiety-inducing: “I really have to finish the project I’ve been working on because now it’s just a matter of time before one of them beats me to it.”
The value of having digital editions of his writings seems clear -- especially in the case of works that Burke revised from edition to edition. In the meantime, two graduate students are digitizing "Conversations with Kenneth Burke," which consists of eight hours of interview footage with Burke conducted by Clarke Rountree at the University of Iowa in 1986. (He is now a professor of communication arts at the University of Alabama in Huntsville.)
Joel Overall, who is one of Ann George's students at TCU, told me about it. "Our project involves upgrading 8 hours of interview footage from VHS to DVD format,” he said. “In addition to upgrading the graphic design of packaging materials, DVD titles, and credits, we're also working on transcriptions of the interview that will be included through subtitles and a searchable pdf file. This is a particularly valuable contribution since KB was somewhat difficult to understand at the age of 89.” (The other member of the project, Ethan Sproat, is at Purdue, where he worked with David Blakesley before DB's move to Clemson.)
The DVD will be released by the Society within the next year. “Since [Burke’s] written works are often difficult when first encountered, these interviews allow us to hear his voice and see him in cinematic motion, providing us with extra-textual elements that are crucial to understanding his work.”
Following the conference, David Blakesley pointed out another development in the Burkological world. While Burke was a polyglot as well as a polymath -- reading and translating from French and German, and an ardent student of Latin literature as well -- his reputation has long been almost exclusively confined to the United States. But Belgian and French scholars were at the conference.
“They, too, felt welcome,” he said, “and are excited about their prospects for future work on Burke. In fact, Ronald Soetart (University of Ghent) wants to organize a European Burke conference now. The French contingent was eager to see that as well since there appears to be a groundswell of interest in Burke throughout Europe. I noticed that when I presented on Burke and visual rhetoric at the International Association of Visual Semiotics in Venice last April, too.”
I attended the conference as a keynote speaker and also delivered a paper -- and so was sitting there feeling mildly fried when I was invited to participate in another multimedia project. A group of Clemson graduate students in the master of arts in professional communication (MAPC) program were conducting a series of interviews for a video on the field of rhetoric. (That is rhetoric understood as the well-established study of effective communication, rather than in the modern sense of a technique for evading reality.)
Drew Stowe, a second-year student in the program, explained that the project would “show the importance of rhetoric for modern students, in the modern university, and to lay audiences such as parents of prospective students, the board of trustees and other corporate partners who recruit graduates from the MAPC program.” Burke is considered one of the most innovative thinkers in rhetoric since antiquity, so scouting the conference for talking heads made sense.
In front of the camera, I aspired to coherence rather than eloquence. My main point was that KB’s work is a toolbox of ideas useful for analyzing the messages with which everyone is bombarded. As someone who’s read a few of Burke’s books until they’ve worn out -- my hardback copy of the first edition of Philosophy of Literary Form (1941), for example, started falling apart during the conference -- I take his continuing relevance as a given. But where did it come from?
“I've always sensed that KB lived at a particularly interesting cultural moment,” wrote Jack Selzer to me by email, following the conference. “Major wars were changing international affairs fundamentally, new communications technologies were so important, and of course postmodern and post-Nietzschean philosophies (and the linguistic turn) were troubling modernist and rationalist assumptions. Somehow he was brilliant enough to perceive the vitality of these changes even as he was living amidst them, and he was able to theorize and meditate on things so productively -- even though (or because?) he was so close to them. As a consequence, what he has to say remains very contemporary. It was wonderful to see the younger scholars drawn to his work in every way imaginable, and I think it has to do with how shrewd KB was about such important intellectual currents.”
Ann George described teaching Burke in a couple of courses over the past years and finding that students “were struck with, and even a little dispirited by,” the parallels between Burke’s motivating concerns and the present scene. “His political, economic, and environmental insights are remarkable: American exceptionalism and the war in Iraq; 'socialization of losses' via government bailouts, 'rereadings' of the Constitution, Ponzi schemes -- it's all there. Of all the theorists we read in the modern rhetoric course … though, students felt Burke offered more answers -- or more hope -- because he didn't idealize human motives or overestimate how much we might be able to change things for the better.”
That’s a very good point -- and there is a profoundly humanist vision that emerges as the pieces of Burke’s theoretical jigsaw puzzle come together.
He put it best in Attitudes Toward History (1937): "The progress of human enlightenment can go no further than in picturing people not as vicious, but as mistaken.
“When you add that people are necessarily mistaken, that all people are exposed to situations in which they must act as fools, that every insight contains its own special kind of blindness, you complete the comic circle, returning again to the lesson of humility that undergirds great tragedy.” Studying Burke is sometimes difficult, but there are moments when it makes the world seem a little less mad.