Political science


Important as it was, the campaign of Barack Obama was not the only history-making element of the 2008 presidential election. With Sarah Palin, we crossed another epochal divide. The boundary between reality television and American politics (already somewhat weakened by the continuous "American Idol" plebiscite) finally collapsed.

Her campaign's basic formula was familiar: members of an ordinary middle-class family turn into instantly recognizable national celebrities while competing for valuable prizes.

But like any contestant at this late stage of an already decadent genre, Palin seemed much less conscious of the stakes of the game (power) than of how it let her broadcast her own sense of herself.

At that level she could not lose – the ballot box notwithstanding. I’m not sure what Sarah Palin’s favorite work of postmodern theory might be (all of them, probably) but she seems to take her lead from Jean Baudrillard’s Seduction. Other political figures use the media as part of what Baudrillard calls “production.” That is, they generate signs and images meant to create an effect within politics. For the Baudrillardian “seducer,” by contrast, the power to create fascination is its own reward.

Watching Palin respond to questions about her book Going Rogue (or not respond to them, often enough) is, from this perspective, no laughing matter. She grows ever more comfortable talking about herself. If no more capable of simulating knowledge of public issues, she is getting her story straight, more or less. And this matters. For now she does not have to be accurate, just coherent. She is consolidating her presence, her "brand." Teams of professional ideologists can feed Palin her lines later.

Is this too cynical? I fear it may not be cynical enough. For it assumes that Palin will eventually be integrated into her party’s apparatus and turned into a mouthpiece of old-school Republican electoral politics -- a basic platform of tax cuts for the rich and unregulated handgun ownership for everybody else.

That is not the only possible outcome, however. Someone with Palin’s developing command of the arts of media seduction -- and whose knack on that score is largely a matter of her performative maverickiness -- has the potential to change the rules of the game.

The editors of a new collection of essays called Going Rouge – a punning title that belies its basic seriousness – recognize that in Palin we may have something more than a new celebrity. “No one speaks of McCainism or Doleism,” write Richard Kim and Betsy Reed in their introduction, “but Palinism signals not just a political position but a political style, a whole way of doing politics.”

The volume itself is the product of a whole new way of doing serious nonfiction. It is the first title from OR Books, which has a staff, so far, of two people. One of them is Colin Robinson, who roughly this time last year lost his job as an editor at Simon and Schuster. He tells me that OR now has two offices. One is the coffee shop where he and his partner John Oakes (co-founder of independent publisher Four Walls Eight Windows) work in the morning. The other is the bar they go to at night.

When we talked earlier this year, Robinson described his idea for a new kind of trade publishing. The usual approach is to print an enormous number of copies of a title to get an economy of scale, then give large discounts to chain bookstores – leaving almost no money to promote it. For serious nonfiction, this was a miserable system. Any money for advertising tended to go to publicize, say, The Stephen King Cookbook or suchlike. (Palin's autobiography is an example of a book enjoying just such heavy promotion.)

His plan, Robinson said, would be to publish a few titles that he thought were worthwhile, making them available as e-books and print-on-demand paperbacks -- and then concentrate on advertising them online, among other ways via video. So far you have to buy Going Rouge directly from the publisher (it sold about 4,000 copies before its official publication date on November 16) but it will be available for order from bookstores next month.

Most of the chapters are reprints from magazines such as The New Yorker, The New Republic, and The Nation; a few first appeared on Web sites. The list of contributors is a Who’s Who of left-leaning journalists and commentators: Max Blumenthal, Juan Cole, Naomi Klein, Rick Perlstein, and Katha Pollitt, among others. There are a few critical evaluations of Palin by her fellow Republicans, including one by a conservative columnist who suggests that she makes George W. Bush “sound like Cicero.” The editors also reprint a number of interviews with and public statements by Palin herself – among them, selections from her Twitter and Facebook writings.

A celebration, then, it is not. But Going Rouge does represent an acknowledgment of Palin’s importance, ambiguous though the precise nature of that importance may be. It cannot be reduced to her short-term plans. She remains circumspect about them, for now anyway. But she is busy demonstrating a strong intuitive grasp of how mass media can be used – among other things, to change the subject.

An example is the item Palin posted on Facebook in early August: “The America I know and love is not one in which my parents or my baby with Down Syndrome will have to stand in front of Obama’s ‘death panel’ so his bureaucrats can decide, based on a subjective judgment of their ‘level of productivity in society,’ whether they are worthy of health care. Such a system is downright evil.”

This was fantasy. But it was effective fantasy. To borrow again from Baudrillard, it seduced -- abolishing reality and replacing it with a delirious facsimile.

The editors of Going Rouge give Palin credit for the rhetorical power generated by her words, and perhaps also by her canny use of the social-networking venue: “With remarkable economy of prose, Palin cast health care reform as an assault on the country, put a face on its supposed victims (her baby Trig), coined the expression ‘death panel’ (linking it directly to Obama), raised the specter of euthanasia in the service of a state-run economy, and rallied the troops around a fight against ‘evil.’ In short, she personalized, popularized, and polarized the debate. Never mind that Democratic health care reform bills merely funded optional end-of-life consultations that had heretofore been almost universally acknowledged as a good. (Indeed, Palin herself once championed them in Alaska.)”

Well, consistency is, after all, the hobgoblin of tiny minds. Sarah Palin is playing the political game on a much grander scale -- with rules she may be rewriting as she goes.

With a first printing of 1.5 million copies of her book, I don’t know that the intervention of an upstart press can pose much of a challenge. But OR Books deserves credit for trying. Someone has to speak up for reality from time to time. Otherwise it will just disappear.

Scott McLemee

The Eggheads Scramble

Two journals from opposite ends of the political spectrum have just run discussions of the role of American intellectuals in the age of Obama. It is the sort of coincidence that seems meaningful – or would, at least, if small-circulation magazines played the role they once did in shaping discussions about culture and politics. These days the Zeitgeist prefers to express itself on basic cable.

Tevi Troy’s essay “Bush, Obama, and the Intellectuals” appears in the third issue of National Affairs. When it first showed up on newsstands last year, the graphic design and wonky substance of National Affairs made it look as if it had been cloned from the genetic remains of The Public Interest, the flagship journal of neoconservatism, published between 1965 and 2005. Which it turns out is more or less the case. The editors “strive to walk in the footsteps of our intellectual and institutional predecessor,” they write, calling their predecessor “a journal that for decades enriched our public life with its unparalleled clarity and wisdom.”

Meanwhile the social-democratic journal Dissent began running its symposium “Intellectuals and Their America” in its winter number; part two is in the spring issue. The contributors have included Jackson Lears, Martha Nussbaum, Katha Pollitt, Michael Tomasky, Sam Tanenhaus, and Leon Wieseltier, among others. The common denominator is that these figures do not belong to Dissent’s editorial board. Nor are they, to my recollection, frequent contributors.

Here, too, the mood is elegiac. Dissent’s editors invoke “Our Country and Our Culture,” the symposium that ran across several issues of Partisan Review in 1952 – when that magazine was, they note, “near the apex of its influence.” One detects a wistfulness.

In 2002, Tevi Troy, who is now a visiting fellow at the right-wing Hudson Institute, came out with Intellectuals and the American Presidency: Philosophers, Jesters, and Technicians, published by Rowman and Littlefield. His new article is a serviceable précis of the book, offering a quick assessment of how several presidents have sought to court the intelligentsia.

No easy task, for they are a fickle lot: a milieu "with its own, often low-minded, politics and culture," writes Troy, "and its own complex connections to the popular culture and the rough-and-tumble of American politics.”

Obama came into office enjoying much goodwill -- especially from the sort of citizen who can follow the implications of an allusion to Reinhold Niebuhr. But this is not a blank check. Obama needs to avoid “underestimat[ing] the damage he would suffer if the cultural and academic elites who have backed him so far suddenly turned their knives against him. Precisely because Obama’s presidency rests, in part, on his status as a cultural phenomenon, he would pay a heavy price for losing their support.”

But for practical guidance, the president would need to turn from the article to the appendix to Intellectuals and the American Presidency, where Troy presents a set of maxims on how the White House can handle this constituency.

“Don’t make meetings with intellectuals public,” he warns, “and don’t reveal what was said in any official way.... Do use the president’s meal times liberally as a way to garner support from intellectuals. Even if you don’t back their policies, few people will refuse a free meal at the White House.... Do not, as president, publicly rely on think-tank guidance.... Do let it be known when the president is reading a popular work by a well-known scholar, as long as it is not a Swedish planning text, à la Michael Dukakis.”

Troy served in a number of positions during the presidency of George W. Bush. His article in National Affairs includes the most wonderfully counterintuitive sentence anyone has written in some time: “As an institutional matter, Bush’s outreach to intellectuals could well serve as a model for future presidents....”

Anybody craving documentation for this arresting claim should recall Troy's maxims: “Do expect that any intellectual in the White House will produce a book describing the experience.” The implications seem clear. I look forward to Troy’s memoirs.

The Dissent symposium does not explicitly address the change of occupants in the White House. But its timing suggests that as a subtext; and so does the editors' reference to “Our Country and Our Culture.”

The joke about Partisan Review in the 1940s was that its offices contained special typewriters with the word “alienation” on one key. So when PR held its legendary symposium in 1952, the editors’ willingness to use the first-person possessive pronoun was a meaningful gesture. It suggested the end of alienation; it signaled that intellectuals were ready to accept a place in the scene before them.

And so things stand again now, perhaps -- after eight years when the unofficial national slogan amounted to “Ignorance is Strength.”

The first part of "Intellectuals and Their America" is now available online, with more responses to follow. While I was initially intrigued by the idea (several of the contributors are writers whose work it is always worth making the time to read), the cumulative effect of reading it has been disappointment and discouragement. For the most salient thing about “Intellectuals and Their America” is how lackluster the whole enterprise seems -- how vigor-free the taking of positions.

Not to deny that certain simulations of polemic are attempted. But they prove tired and rote.

In the second part of the symposium (not yet online), Jean Bethke Elshtain, a professor of social and political ethics at the University of Chicago, revisits a familiar complaint: professors are too inclined to leftist groupthink. “I refer to Harold Rosenberg,” she writes, “who in 1948 characterized the contemporary academy as ‘the herd of independent minds.’ ”

Except that he didn’t. Rosenberg's barbed phrase was not aimed at the academy. He was complaining about his colleagues, those New York Intellectuals of song and legend, who in those days seldom gave university life a second thought. Yet their tendency to assume positions in politics and culture appeared awfully well-synchronized, even so.

It is not a defect of avant-garde-inclined thinkers only, as Elshtain herself should know. She was part of that herd of liberal intellectuals who offered arguments in favor of the Iraq war in 2003 -- often proving quite strenuous in declaring themselves independent-minded on that score.

In his contribution to the symposium, Michael Eric Dyson, a professor of sociology at Georgetown University, wants to put in a good word for public intellectuality: “Let’s not pretend that quarantining the life of the mind to the academy hasn’t at times made the rest of the culture sick.”

Ignoring the special qualities of this metaphor (it is both inapposite and incoherent), what seems most striking here is the implication that his point will prove controversial, somehow. Again, the contributor’s own example disproves his point. Dyson’s impressive academic career has been built almost entirely around trade-press books and mass-media appearances.

An army of university publicists works to make faculty part of the public conversation. The real issue is the quality of their interventions. Let's not pretend that generating soundbites on the topic of the day qualifies as a contribution to intellectual life, as such.

Once, the publication of this kind of symposium in a journal might clarify what was at stake in arguments among intellectuals. It could leave participants, and readers, with a sense of the state of the nation and its culture.

And indeed, one of the most important things about “Our Country and Our Culture” was that three contributors to it -- Irving Howe, Norman Mailer, and C. Wright Mills -- clearly defined themselves as unhappy with the drift of the discussion. They were sufficiently opposed to its implicit invitation to join the American consensus that, two years later, they joined forces to help start a magazine called Dissent. And then, after a few more years, they dissented so much that they parted ways. (It is a tradition.)

If “Intellectuals and Their America” suggests that all life has gone out of the symposium as ritual, that has little to do with the era itself. The stakes now seem high enough. But the shape of public space itself has changed. Intellectual life is not a herd of independent minds. Rather, it involves any number of herds, some of them more furious than others. And the shepherding role of any given publication is now severely limited. I feel as much nostalgia for old formats as anyone, but this much seems clear: imitating a model from the Truman years seems a poor incentive to intellectual debate in the Obama era.

Scott McLemee


Once upon a time -- long, long ago -- I spent rather a lot of time reading about the theory of narrative. This was not the most self-indulgent way to spend the 1980s, whatever else you can say about it. Arguably the whole enterprise had begun with Aristotle, but it seemed to be reaching some kind of endgame around the time I was paying attention. You got the sense that narratologists would soon be able to map the genome of all storytelling. It was hard to tell whether this would be a good thing or a bad thing, but they sure seemed to be close.

The turning point had been the work of the Russian folklorist Vladimir Propp. In the late 1920s, he had broken down 100 fairy tales into a set of elementary “functions” performed by the characters, which could then be analyzed as occurring in various combinations according to a handful of fixed sequences. The unrelated-seeming stories were just variations on a very few algebraic formulas.

Of course, fairy tales tend to be pretty formulaic to begin with -- but with some tweaking, Propp's approach could be applied to literary texts. By the 1960s, French structuralist critics such as Roland Barthes and Gerard Genette were analyzing the writings of Poe and Proust (not to mention James Bond novels) to extract their narrative DNA. And then came Hayden White’s Metahistory: The Historical Imagination in Nineteenth-Century Europe (1973), which showed how narratology might be able to handle nonfiction. White found four basic modes of “emplotment” -- romantic, comic, tragic, and satirical -- in the storytelling done by historians.

It was obviously just a matter of time before some genius came along to synthesize and supersede all of this work in a book called Of Narratology, at least half of which would be written in mathematical symbols. The prospect seemed mildly depressing. In the end, I was more interested in consuming narratives (and perhaps even emitting them, from time to time) than in finding the key to all mythologies. Apart from revisiting Peter Brooks's Reading for the Plot: Design and Intention in Narrative (1984) -- the only book on the topic I recall with any pleasure -- I left narratology behind as one of those preoccupations long since forgotten.

And so Christian Salmon’s Storytelling: Bewitching the Modern Mind reads like a dispatch from the road not taken. Published in France in 2007 and recently issued in English translation by Verso, it is not a contribution to the theory of narrative but a report on its practical applications. Which, it turns out, involve tremendous amounts of power and money -- a plot development nobody would have anticipated two or three decades ago.

“From the mid-1990s onward,” writes Salmon, concentration on narrative structure “affected domains as diverse as management, marketing, politics, and the defense of the nation.” To a degree, perhaps, this is obvious. The expression “getting control of the narrative” has long since become part of the lexicon of mass-media knowingness, at least in the United States. And Salmon -- who is a member of the Centre for Research in the Arts and Language in Paris and a columnist for Le Monde -- has one eye trained on the American cultural landscape, seeing it as the epicenter of globalization.

Roughly half of Salmon’s book is devoted to explaining to French readers the history and nuances of such ubiquitous American notions as “spin” and “branding.” He uses the expression “narratocracy” to characterize the form of presidential leadership that has emerged since the days of Ronald Reagan. The ability to tell a compelling story is part of governance. (And not only here. Salmon includes French president Sarkozy as a practitioner of “power through narrative.”)

Less familiar, perhaps, is the evidence of a major shift toward narrative as a category within marketing and management. Corporations treat storytelling as an integral part of branding; the public is offered not just a commodity but a narrative to consume. He quotes Barbara Stone, a professor of marketing at Rutgers University: “When you have a product that’s just like another product, there are any number of ways to compete. The stupid way is to lower prices. The smart way is to change the value of the product by telling a story about it.” And so you are not just buying a pair of pants, for example, but continuing the legacy of the Beat Generation.

“It is not as though legends and brands have disappeared,” writes Salmon. But now they “talk to us and captivate us by telling us stories that fit in with our expectations and worldviews. When they are used on the Web, they transform us into storytellers. We spread their stories. Good stories are so fascinating that we are encouraged to tell them again.”

Other stories are crafted for internal consumption. Citing management gurus, Salmon shows the emergence of a movement to use storytelling to regulate the internal life of business organizations. This sometimes draws upon the insights of narrative artists of canonical renown, as in books like Shakespeare on Management. (Or Motivational Secrets of the Marquis de Sade, if I can ever sell that idea.) But it also involves monitoring and analyzing the stories that circulate within a business – the lore, the gossip, the tales that a new employee hears to explain how things got the way they are.

An organization’s internal culture is, from this perspective, the totality of the narratives circulating within it. “It is polyphonic,” notes Salmon, “but it is also discontinuous and made up of interwoven fragments, of histories that are talked about and swapped. They can sometimes be contradictory, but the company becomes a storytelling organization whose stories can be listened to, regulated, and, of course, controlled ... by introducing systematized forms of in-house communications and management based upon the telling of anecdotes.”

At the same time, the old tools of structuralist narratology (with its dream of reducing the world’s stock of stories to a few basic patterns) are reinvented as an applied science. One management guru draws on Vladimir Propp’s Morphology of the Folktale in his own work. And there are software packages that “make it possible to break a narrative text down into segments, to label its main elements and arrange its propositions into temporal-causal sequences, to identify scenes, and to draw up trees of causes and decisions.”

One day corporations will be able to harvest all the stories told about them by consumers and employees, then run them through a computer to produce brand-friendly counter-narratives in real time. That sort of thing used to happen in Philip K. Dick's paranoid science-fiction novels, but now it's hard to read him as anything but a social realist.

All of this diligent and relentless narrativizing (whether in business or politics) comes as a response to ever more fluid social relations under high-speed, quick-turnover capitalism.

The old system, in which big factories and well-established institutions were central, has given way to a much more fluid arrangement. Storytelling, then, becomes the glue that holds things together -- to the degree that they do.

The “new organizational paradigm,” writes Salmon, is “a decentralized and nomadic company…that is light, nimble, and furtive, and which acknowledges no law but the story it tells about itself, and no reality other than the fictions it sends out into the world.”

Not long after Storytelling originally appeared in 2007, the world’s economy grew less forgiving of purely fictive endeavors. The postscript to the English-language edition offers Salmon’s reflections on the presidential campaign of 2008, with Barack Obama here figured as a narratocrat-in-chief “hold[ing] out to a disoriented America a mirror in which shattered narrative elements can be put together again.”

This, it seems to me, resembles an image from a fairy tale. The “mirror” is a magical implement restoring to order everything that has been tending towards chaos throughout the rest of the narrative. Storytelling is a smart and interesting book, for the most part, but it suffers from an almost American defect: the desire for a happy ending.

Scott McLemee

Perils of Presidential Parallels

My cyber-savvy son recently e-mailed me a message board entry he’d spotted on Google, titled “Stupid UnAmerican Writers Like Robert Schmuhl,” along with a tender, filial sentiment: “Haha!”

The day before, an essay I’d written was posted on one website before quickly finding its way to several others. Some reprinted the entire article, while others offered selected quotations and pointed reactions focusing on the sanity, seriousness and style of the author.

Maybe it was the question AOL’s Politics Daily posed in its headline introducing the initial internet iteration: “Is Sarah Palin the Next Barack Obama?” Maybe it was the consideration of strange-yet-true similarities between the former governor and the current president. Maybe it was the yoking together of polar opposite political figures who provoke don’t-confront-me-with-the-facts convictions among devout supporters.

Whatever the case, the virility of today’s viral blogosphere robustly flexed its inflamed muscles. More quickly than a garrulous pedagogue can finish answering a student’s query, poison darts — composed in high dudgeon How-Dare-You? rage — started to clutter my inbox and to show up on the net. Defenders of both Palin and Obama attacked their keyboards to deride and denigrate any suggestion of parallels. Some responsorial eruptions matched or exceeded the word count of the original article.

For over three decades of teaching and writing about contemporary American politics, I’ve tried (however vainly) to make sense of the forces, patterns and trends animating our civic life. With Palin scheduled to deliver a major speech in Iowa, the first state in the 2012 presidential nominating process, it struck me that she might be following a path somewhat similar to Obama’s in 2008.

  • Both figures emerged with stunning rapidity on the national scene and possess media magnetism.
  • Both used major speeches at their parties’ national conventions as their national political launching pads.
  • Both produced well-publicized and best-selling books to flesh out their life histories and views.
  • Both have somewhat exotic backgrounds by nature of their upbringings in states distant from the continental U.S.
  • Both arrived at national prominence with limited experience in upper-level governmental service.
  • Both positioned themselves as outsiders as they became more widely known, with a willingness to take on their parties’ establishments and Washington’s traditional ways.
  • Both have created intense followings that stay connected through social media.

I could continue — and did — taking note early on of “their continent-spanning differences on issues and ideology.” Though that point was repeated near the end, where I mentioned politicians often face now-or-never moments of decision in their careers, gentle readers seemed inclined to disregard or dismiss the basic, non-partisan facts. Palinites were aggrieved to see any comparisons to their hero, and Obamaniacs felt the same way — but from the opposite perspective.

In retrospect, I probably should have expected the charges of “intellectual laziness” and worse that the essay evoked. Back in early 2007, even before Obama announced his presidential candidacy, I had composed a similar confection of comparative analysis, which the Chicago Tribune ran as a Sunday feature headlined “Reagan and Obama: Not So Different?”

Well, as Ronald Reagan began so many sentences, that explanatory effort (again with appropriate flashing warning signs that the two figures were “distinctly different”) proved that in the 21st century the words flog and blog have become synonymous. One cyber spitball sticks in my memory like a wad of discarded gum on a shoe: “Good grief! Barack baby wouldn’t make a pimple on Ronald Reagan’s posterior.”

For someone who refuses to choose political sides (never even voting in races I write or talk about) and who just tries to interpret civic affairs fairly, I suppose the reactions to my acts of describing more than coincidental parallels reflect the partisan, polarized toxicity infecting American political culture today. Even if you assiduously avoid taking a stand, others at high volume will do so for you. A discourse of conflict is de rigueur now.

In classes and articles, I repeat like a broken record — a retro expression suggesting advancing age — that the phrase “media bias” is considerably more than a knee-jerk epithet to be tossed around as an all-purpose, back-of-the-hand complaint of journalistic prejudice. The two words joined together in unholy cohabitation are rife with complexities that require inquisitive minds and individual judgments.

Bias of some kind is inherent in all human communication, but that doesn’t mean every source of information approaches a story with a preordained perspective or agenda-advancing opinion. Especially in coverage of government and politics, the emphasis in mainstream outlets tends to revolve around institutional criteria (conflict, novelty, consequences and the like) rather than ideological ones, with more concern for accountability than advocating a cause.

Saying this usually sends hidebound conservatives and liberals into paroxysms of disbelief — but that’s really the way it is in traditional newsrooms. Increasingly, though, in this take-no-prisoners political environment, the perception of bias can be self-generating, with an individual bringing personal, preconceived dispositions to whatever’s read, seen and heard.

A work of journalism can be as straight and as balanced as possible, but people in the audience impose their own slant, basing their reactions on firmly planted thoughts and emotions. In this process, neutrality rapidly morphs into partiality — and we’re off to the races of full-throated rejoinders to an ignominious outrage of, say, identifying similarities between two public figures of competing parties.

For an academic more accustomed to point-making than point-scoring, today’s ecosystem of information is both boon and bane. Outlets abound to disseminate arguments and analysis to audiences never before imagined, yet those messages can be misinterpreted by people more fixated on how they already think than on learning something new.

As the digital denunciations of the Palin-Obama disquisition piled up, I lamented to a friend that nobody seemed to be reading the article with any semblance of objectivity. He provided comfort by paraphrasing Oscar Wilde: “The only thing worse than being read is not being read.” So it goes.

The story’s often told that H.L. Mencken’s mother once asked the 20th century’s most incendiary pundit-cum-provocateur: “What are you doing, Harry?”

With alacrity, the sage of Baltimore shot back: “I’m stirring up the animals.”

In today’s political and communications world, it’s possible to stir up the animals of every species, phylum and partisan orientation without even trying. Mencken probably would have reveled in our raucous and interactive age, but some of us might intrude a worry, now and then, about democracy and its discontents.

Robert Schmuhl

Robert Schmuhl is Walter H. Annenberg-Edmund P. Joyce Professor of American Studies and Journalism and director of the John W. Gallivan Program in Journalism, Ethics and Democracy at the University of Notre Dame. His collection of essays, In So Many More Words: Arguments and Adventures, has just been published by University of Notre Dame Press.

The Last Utopia

The German critic Walter Benjamin once gave a set of satirical pointers about how to write fat books -- for example, by making the same point repeatedly, giving numerous examples of the same thing, and writing a long introduction to outline the project, then reminding the reader of the plan as often as possible. Whether or not they are aware of doing so, many academic authors seem to follow his advice closely. Samuel Moyn's The Last Utopia: Human Rights in History, published by Harvard University Press, is a remarkable exception. Its survey of the legacy of ideas later claimed as cornerstones of the politics of human rights is both dense and lucid; its challenging reassessment of recent history is made in a little over two hundred pages. It's almost as if the book were written with the thought that people might want to read it.

After writing a review of The Last Utopia, I interviewed the author by e-mail; a transcript follows. Moyn is a professor of history at Columbia University and the editor of Humanity: An International Journal of Human Rights, Humanitarianism, and Development, published by the University of Pennsylvania Press.

Q: Describing your book as "a critique of the politics of human rights" has occasionally gotten me puzzled looks. After all, what's to criticize about human rights? How do you describe or explain your project in The Last Utopia?

A: As a historical project, The Last Utopia mainly tries to sort out when "international human rights" -- whether as a set of concepts or a collection of movements -- came about. I conclude: pretty recently. They are slightly older as a set of concepts than as a collection of movements, but in both senses they came to prominence in the 1970s, not before.

But then it follows that human rights are just one set of mobilizing notions that humans have had reason to adopt over the years. Looking far back, I try to dispute that human universalism -- treating humanity as what the Universal Declaration of Human Rights calls a single moral family -- never existed before international human rights did. Actually the number of ideologies (notably religious worldviews) based on the moral unity of the species is stupendously high. If so, the moral principles, and even more the practices, associated with human rights turn out to be just one version of a commitment to "humanity."

And in modern times, different universalistic projects have coexisted and competed all along, at least until many people began to assume that human rights were the only kind of universalism there is. One main argument of the book is that this process is visible even within the history of rights talk. Rights -- especially natural rights and the rights of man -- were authority for very different projects and practices than human rights now imply. In the years of the American and French Revolutions, the appeal to rights justified violent state founding (and national integration). Today, in a postcolonial world, human rights imply not simply a different, supranational agenda, but also wildly different mechanisms of mobilization, from lighting candles, to naming and shaming, to putting checks in the mail.

Ultimately, I conclude, both the affirmation of human rights and criticism of them must begin with the fact that they are new and recent, not timeless or age-old.

Q: You maintain that there is a significant difference between the version of human rights that came to the fore internationally beginning in the late 1970s and earlier notions. You seem to be arguing that campaigns for human rights in recent decades have tended to be antipolitical -- forms of moral renewal, even. But you also show that the relatively small circles taking up the idea of human rights in the 1940s and '50s often involved people of faith who understood it in terms of some kind of religious humanism. So what was different about the later embrace of human rights?

A: It's true I do emphasize the participation of European and trans-Atlantic Christians -- both Catholics and Protestants -- in the early story of international human rights in the 1940s and after. But their most frequent associations with human rights were to "Western civilization" and moral community, along with worries about materialism and hedonism. They supported human rights built around freedom of conscience and religious practice, which they saw threatened most fundamentally by the Soviet Union to the east, in an interesting version of orientalism that targeted communist secularism.

Thirty years later, the move to morality was still available within Christian idiom, and Catholics, in particular, were key participants in the origins of human rights movements both behind the Iron Curtain and in the Southern Cone. Indeed, the Catholic Church amplified its connection of human rights and dignity in the Vatican II era. But all things considered, these affiliations were not the crucial ones for the fortunes of human rights as a galvanizing notion.

Rather, it was reforming leftists -- who had once thrown in their lots with versions of socialism -- who moved to moral humanism in circumstances of foreclosure or exhaustion. They had no space under their regimes to offer political alternatives, or after tiring years of political agitation were looking for something outside and above politics. And indeed these very figures found themselves making alliances -- tactical and coalitional at first -- with forces they would have decried a few years before. Dissidence in the name of "human rights" had replaced a political championship of divisive social alternatives.

Q: You note that earlier discussions of rights posited them as being exercised only within a political community -- while the notion of human rights tended to see them as existing outside of the nation-state, and even as defined against it. But here I want to ask you about someone you mention only in passing. The Universal Declaration of Human Rights that the United Nations issued in 1948 was championed (even to some degree rough-drafted) by H.G. Wells, who very definitely did regard the notion of human rights as something that would be advanced by a definite political authority -- namely, some kind of world state.

Wells wrote about that sort of thing in his science fiction, of course, but also tried to get a global state off the ground. Apart from the Teabaggers worried about One World Government, nobody much thinks about that sort of super-state anymore. Still, wouldn't it suggest that the notion of human rights does imply some kind of sovereignty able to enforce its claims?

A: The Tea Party is not new in this regard. But the perfervid fantasies on the American right of "world government" over the years shouldn't lead one to think that circles supporting that move were ever large, let alone politically influential. You're absolutely correct about Wells, whose globalist dreams went back to long before he began to champion the rights of man as a World War II battle cry, and relate in interesting ways to his fiction. To my knowledge no one so far has offered a synthetic vision of the long campaign for world government, which in some sense still has not gotten off the ground.

In any case, the dream of world government is most revealing about human rights for helping show that the Universal Declaration came to a world of states and indeed empires and perhaps even helped stabilize it. After all, in the beginning the larger United Nations system was more about maintaining that world than overcoming it. This gets to a key theme of the book, in its attempt to demystify the founding of the United Nations and the role of universal rights in its origins.

By the same token, those who have agitated for world government have arguably understood something deep about human rights. As Hannah Arendt argued in the 1940s in her dismissive treatment of the concept, rights presuppose a bounded citizenship space. Her challenge to votaries of human rights is that their declaration is meaningless unless there is a real plan to incorporate "humans" as citizens. And this may have been exactly what the partisans of world government, however few in number, have wanted all along to accomplish.

Q: The term "utopia" has a range of connotations -- some clearly disparaging, others much more honorific. What do you mean by the utopianism of human rights activism? And what's the overtone? Sometimes your characterization sounds a bit dismissive, while at other points it seems as if there's an implication that utopian desire is a necessary thing.

A: Thanks for giving the opportunity to answer this question, because the book evidently gives rise to some confusion on this score. I am a utopian, and -- as I say in the book -- I admire human rights activists for trying to make the world better rather than doing nothing or trying to make it worse.

One of my critics, Gary Bass, has argued that human rights movements follow a "liberalism of fear" which merely tries to stave off terrible evil rather than construct the good life for the world. As the book shows, however, that wasn't true in the 1970s, since several of those who most bitterly scorned prior utopias transmuted their idealism into new forms associated with human rights, rather than dropping idealism altogether. And it certainly isn't true now, when human rights have expanded -- notably in the global south -- far beyond their original antitotalitarianism to embrace a host of causes of improvement, intersecting humanitarianism and development.

If human rights are utopian, however, they are only one version of that commitment. I associate them with utopianism in order to ask the right questions of the movement. How much difference has it made? Were the utopias human rights replaced more or less plausible in light of their successor? Is it time to reorient human rights as an energizing agenda, or replace it?

I'm not sure of the answer. The title The Last Utopia could mean that human rights are the final idealistic cause -- or simply the most recent.

Q: Was your book inspired by any particular sense of frustration, disillusionment, or disappointment? How would you characterize your own political stance?

A: Probably my own trajectory simply reflects the collective learning of many Americans who in the past 10 years evolved away from a somewhat naive belief in the transformative implications of human rights for the post-Cold War world order. While a young law student, I actually worked in the White House during the Kosovo bombing campaign, and vividly remember that it was a time when it seemed possible for universal justice to be implemented by American power. Then 9/11, and Iraq, happened.

But while many fewer people put stock in the meaning of America's leadership today, the past decade has also seen a profusion of professional scholarship on the history of human rights, and it was my main goal to synthesize and comment on what is known so far, beyond any immediate political agenda. I actually don't get in the book much beyond the crucial turning point of the 1970s, so only a brief epilogue comments on how my analysis provides a way to think about the past couple of decades as contemporary history -- though I hope someday a sequel to the book will go much further!

This sequel would have to fit together three things: hope and then disillusionment about human rights in and through to the unipolar world America has been leading (though perhaps not for even the foreseeable future!), the imaginative and institutional crystallization of human rights as a specifically European language of self-understanding and governance, and the uptake in global humanitarianism of human rights as the language with which both helpers and opponents address fragile postcolonial states and their periodic "crises." Definitely, human rights have transformed our world in recent decades, but in ways that make it no less problematic.

Q: How did you manage to write such a short book on such a big subject?

A: In complete honesty, it helps to have signed a contract whose fine print your editor later enforces! More seriously, I wanted to try to write a book that summarizes what historians have unearthed so far, and shows what they still need to figure out, in short compass. And since I am the sort of historian who spends less time in archives looking for new information than in my armchair reading philosophy and political theory, I thought I could best prioritize different ways of integrating and rethinking existing evidence. It made most sense, in other words, to use the book mainly to make distinctions and pose questions. Definitive answers would have taken a lot more space -- and a different author.

Scott McLemee

The Year in Reading

For this week’s column (the last one until the new year) I asked a number of interesting people what book they’d read in 2010 that left a big impression on them, or filled them with intellectual energy, or made them wish it were better known. If all three, then so much the better. I didn’t specify that it had to be a new book, nor was availability in English a requirement.

My correspondents were enthusiastic about expressing their enthusiasm. One of them was prepared to name 10 books – but that’s making a list, rather than a selection. I drew the line at two titles per person. Here are the results.

Lila Guterman is a senior editor at Chemical and Engineering News, the weekly magazine published by the American Chemical Society. She said it was easier to pick an outstanding title from 2010 than it might have been in previous years: “Not sleeping, thanks to a difficult pregnancy followed by a crazy newborn, makes it almost impossible for me to read!”

She named Rebecca Skloot’s The Immortal Life of Henrietta Lacks, published by Crown in February. She called it an “elegantly balanced account of a heartbreaking situation for one family that simultaneously became one of the most important tools of biology and medicine. It was a fast-paced read driven by an incredible amount of reporting: A really exemplary book about bioethics.”

Neil Jumonville, a professor of history at Florida State University, is editor of The New York Intellectual Reader (Routledge, 2007). A couple of collections of essays he recently read while conducting a graduate seminar on the history of liberal and conservative thought in the United States struck him as timely.

“The first is Gregory Schneider, ed., Conservatives in America Since 1930 (NYU Press, 2003). Here we find a very useful progression of essays from the Old Right, Classical Liberals, Traditional Conservatives, anticommunists, and the various guises of the New Right. The second book is Michael Sandel, Liberalism and Its Critics (NYU Press, 1984). Here, among others, are essays from Isaiah Berlin, John Rawls, Robert Nozick, Alasdair MacIntyre, Michael Walzer, a few communitarians represented by Sandel and others, and important pieces by Peter Berger and Hannah Arendt.”

Reading the books alongside one another, he said, tends both to sharpen one's sense of the variety of political positions covered by broad labels like “liberal” and “conservative” and to point out how the traditions may converge or blend. “Some people understand this beneficial complexity of political positions,” he told me, “but many do not.”

Michael Yates retired as a professor of economics and labor relations at the University of Pittsburgh at Johnstown in 2001. His most recent book is In and Out of the Working Class, published by Arbeiter Ring in 2009.

He named Wallace Stegner’s The Gathering of Zion: The Story of the Mormon Trail, originally published in 1964. “I am not a Mormon or religious in the slightest degree,” he said, “and I am well aware of the many dastardly deeds done in the name of the angel Moroni, but I cannot read the history of the Mormons without a feeling of wonder, and I cannot look at the sculpture of the hand cart pioneers in Temple Square [in Salt Lake City] without crying. If only I could live my life with the same sense of purpose and devotion…. It is not possible to understand the West without a thorough knowledge of the Mormons. Their footprints are everywhere."

Adam Kotsko is a visiting assistant professor of religion at Kalamazoo College. This year he published Politics of Redemption: The Social Logic of Salvation (Continuum) and Awkwardness (Zero Books).

“My vote," he said, "would be for Sergey Dolgopolsky's What is Talmud? The Art of Disagreement, on all three counts. It puts forth the practices of Talmudic debate as a fundamental challenge to one of the deepest preconceptions of Western thought: that agreement is fundamental and disagreement is only the result of a mistake or other contingent obstacle. The notion that disagreements are to be maintained and sharpened rather than dissolved is a major reversal that I'll be processing for a long time to come. Unfortunately, the book is currently only available as an expensive hardcover.”

Helena Fitzgerald is a contributing editor for The New Inquiry, a website occupying some ambiguous position between a New York salon and an online magazine.

She named Patti Smith’s memoir of her relationship with Robert Mapplethorpe, Just Kids, published by Ecco earlier this year and recently issued in paperback. “I've found Smith to be one of the most invigorating artists in existence ever since I heard ‘Land’ for the first time and subsequently spent about 24 straight hours with it on repeat. She's one of those artists who I've long suspected has all her big secrets hoarded somewhere in her private New York City. This book shares a satisfying number of those secrets and that privately legendary city. Just Kids is like the conversation that Patti Smith albums always made you want to have with Patti Smith.”

Cathy Davidson, a professor of English and interdisciplinary studies at Duke University, was recently nominated by President Obama to serve on the National Council on the Humanities. She, too, named Patti Smith’s memoir as one of the books “that rocked my world this year.” (And here the columnist will interrupt to give a third upturned thumb. Just Kids is a moving and very memorable book.)

Davidson also mentioned rereading Tim Berners-Lee's memoir Weaving the Web, first published by HarperSanFrancisco in 1999. She was “inspired by his honesty in letting us know how, at every turn, the World Wide Web's creation was a surprise, including the astonishing willingness of an international community of coders to contribute their unpaid labor for free in order to create the free and open World Wide Web. Many traditional, conventional scientists had no idea what Berners-Lee was up to or what it could possibly mean and, at times, neither did he. His genius is in admitting that he forged ahead, not fully knowing where he was going….”

Bill Fletcher Jr., a senior scholar at the Institute for Policy Studies, is co-author, with Fernando Gapasin, of Solidarity Divided: The Crisis in Organized Labor and a New Path Toward Social Justice, published by the University of California Press in 2009.

He named Marcus Rediker and Peter Linebaugh’s The Many-Headed Hydra: The Hidden History of the Revolutionary Atlantic (Beacon, 2001), calling it “a fascinating look at the development of capitalism in the North Atlantic. It is about class struggle, the anti-racist struggle, gender, forms of organization, and the methods used by the ruling elites to divide the oppressed. It was a GREAT book.”

Astra Taylor has directed two documentaries, Zizek! and Examined Life. She got hold of the bound galleys for James Miller’s Examined Lives: From Socrates to Nietzsche, out next month from Farrar, Straus and Giroux. She called it “a book by the last guy I took a university course with and one I've been eagerly awaiting for years. Like a modern-day Diogenes Laertius, Miller presents 12 biographical sketches of philosophers, an exploration of self-knowledge and its limits. As anyone who read his biography of Foucault knows, Miller's a master of this sort of thing. The profiles are full of insight and sometimes hilarious.”

Arthur Goldhammer is a senior affiliate of the Center for European Studies at Harvard University and a prolific translator, and he runs an engaging blog called French Politics.

“I would say that Florence Aubenas' Le Quai de Ouistreham (2010) deserves to be better known,” he told me. “Aubenas is a journalist who was held prisoner in Iraq for many months, but upon returning to France she did not choose to sit behind a desk. Rather, she elected to explore the plight of France's ‘precarious’ workers -- those who accept temporary work contracts to perform unskilled labor for low pay and no job security. The indignities she endures in her months of janitorial work make vivid the abstract concept of a ‘dual labor market.’ Astonishingly, despite her fame, only one person recognized her, in itself evidence of the invisibility of social misery in our ‘advanced’ societies.”

Anne Sarah Rubin is an associate professor of history at the University of Maryland, Baltimore County and project director for Sherman’s March and America: Mapping Memory, an interactive historical website.

The book that made the biggest impression on her this year was Judith Giesberg's Army at Home: Women and the Civil War on the Northern Home Front, published by the University of North Carolina Press in 2009. “Too often,” Rubin told me, “historians ignore the lives of working-class women, arguing that we don't have the sources to get inside their lives, but Giesberg proves us wrong. She tells us about women working in Union armories, about soldiers' wives forced to move into almshouses, and African Americans protesting segregated streetcars. This book expands our understanding of the Civil War North, and I am telling everyone about it.”

Siva Vaidhyanathan is a professor of media studies and law at the University of Virginia. His next book, The Googlization of Everything: (And Why We Should Worry), will be published by the University of California Press in March.

He thinks there should have been more attention paid to Carolyn de la Pena's Empty Pleasures: The Story of Artificial Sweeteners from Saccharin to Splenda, published this year by the University of North Carolina Press: “De la Pena (who is a friend and graduate-school colleague) shows artificial sweeteners have had a powerful cultural influence -- one that far exceeds their power to help people lose weight. In fact, as she demonstrates, there is no empirical reason to believe that using artificial sweeteners helps one lose weight. One clear effect, de la Pena shows, is that artificial sweeteners extend the pernicious notion that we Americans can have something for nothing. And we know how that turns out.”

Vaidhyanathan noted a parallel with his own recent research: “de la Pena's critique of our indulgent dependence on Splenda echoes the argument I make about how the speed and simplicity of Google degrades our own abilities to judge and deliberate about knowledge. Google does not help people lose weight either, it turns out.”

Michael Tomasky covers U.S. politics for The Guardian and is editor-in-chief of Democracy: A Journal of Ideas.

“On my beat,” he said, “the best book I read in 2010 was The Spirit Level (Bloomsbury, 2009), by the British social scientists Richard Wilkinson and Kate Pickett, whose message is summed up in the book's subtitle, which is far better than its execrable title: ‘Why Greater Equality Makes Societies Stronger.’ In non-work life, I'm working my way through Vasily Grossman's Life and Fate from 1959; it's centered around the battle of Stalingrad and is often called the War and Peace of the 20th century. I'm just realizing as I type this how sad it is that Stalingrad is my escape from American politics.”

Scott McLemee

The Revolution Will Not Be Tweeted

In an essay from the late 1960s, Umberto Eco wrote that the days when revolutionaries could seize power by storming the central government offices of the old regime were over. The distribution of forces had changed over the course of the 20th century. Taking control of the television stations had become at least as important -- possibly more so.

Reading this in the 1980s, I felt a certain skepticism. The Italian semiotician was responding, in part, to Marshall McLuhan, who made all sorts of gnomic and frequently silly pronouncements about the mass media; it appeared as if Eco were succumbing to the same impulse. A revolutionary strategist once defined the state as "bodies of armed men" -- and confronting those armed men, rather than the anchormen, still seemed to me like the decisive moment in any change of system or regime.

That passage has come to mind again over the past month – especially last Thursday, when the Egyptian government tried to derail the protests against Hosni Mubarak by shutting off access to the Internet. It amounts to a significant wrinkle in Eco’s argument. This time, it was the counterrevolutionaries seizing control of the media (not to use it, but to dissolve it). Likewise, the authorities in Tunisia responded to the popular uprising there by running a piece of code on the country’s servers to harvest the Facebook passwords of its citizens. They then tried to shut down those accounts.

The role of social networking and online communication in anti-authoritarian uprisings is a topic that gained special currency during the protests over the Iranian presidential election in June 2009. And the discussion often resonates with the familiar themes of what might be called the new digital populism: established authority shaking in its boots before the distributed power of the netizens. Watching American television coverage of the Egyptian events, in particular, one could be forgiven for supposing that new media sparked the uprising, since nothing in that country’s history over the past three decades is discussed as much as the arrival of Twitter and Facebook.

Considerably more thoughtful discussion has been taking place on academic blogs. Ulises Mejias, an assistant professor of new media at the State University of New York at Oswego, points out “how absurd it is to refer to events in Iran, Tunisia, Egypt and elsewhere as the Twitter Revolution, the Facebook Revolution, and so on. What we call things, the names we use to identify them, has incredible symbolic power, and I, for one, refuse to associate corporate brands with struggles for human dignity.”

While acknowledging the role that “tech-savvy youth” have played in recent mass protests, Jillian York of the Berkman Center for Internet & Society at Harvard University avoids calling them “social-media revolutions” because “the implication of such nomenclature is that Twitter or Facebook can make or break a protest, turn a revolt into a revolution. This is not the case: Neither in Iran nor Tunisia was social media the catalyst for uprising.”

The most categorical challenge to the notion that digital tools have some intrinsic democratogenic potency comes from Evgeny Morozov, a visiting scholar at Stanford University, whose polemic The Net Delusion: The Dark Side of Internet Freedom was published last month by PublicAffairs. He argues “that it's wrong to assess the political power of the Internet solely based on its contribution to social mobilization: We should also consider how it empowers the government via surveillance, how it disempowers citizens via entertainment, how it transforms the nature of dissent by shifting it into a more virtual realm, how it enables governments to produce better and more effective propaganda, and so forth. All of this might decrease the likelihood that a revolutionary situation like the one in Tunisia actually happens -- even if the Internet might be of tremendous help in social mobilization. The point here is that while the Internet could make the next revolution more effective, it could also make it less likely.”

Cory Doctorow, the novelist and a co-editor of the website Boingboing, has published an extensive critique of The Net Delusion -- arguing that its broadsides against net activism are misdirected. “Where Morozov describes people who see the internet as a ‘deterministic one-directional force for either global liberation or oppression,’ or ‘refusing to acknowledge that the web can strengthen rather than undermine authoritarian regimes,’ I see only straw-men, cartoons drawn from CNN headlines, talking head sound bites from the US administrative branch, and quips from press conferences.”

Doctorow may be right that the net activists he knows are much more sophisticated than Morozov allows. But it cannot be said that simplistic ideas one hears articulated in “CNN headlines, talking head sound bites from the US administrative branch, and quips from press conferences” are without effect or consequence.

A graph of the Internet traffic going to and from Egypt last Thursday shows online activity proceeding at a brisk pace all afternoon -- then suddenly collapsing to a bare minimum around 5 o'clock, as the country’s service providers shut access down.

This did not have the desired effect. The protests occurring the next day were bigger than before, and have grown steadily ever since -- with labor unions organizing a general strike, and people carrying on with the strangely festive brand of courage that seems always to emerge during this sort of historical episode. A very few Egyptians have managed to get access to Twitter and the like. But nobody can claim that digital technology is driving events there.

How to understand this dynamic, then? In August, the United States Institute of Peace issued a report called “Blogs and Bullets: New Media in Contentious Politics,” co-authored by half a dozen political scientists and media analysts. (One of them is Henry Farrell, an associate professor of political science at the George Washington University, and a friend.) It offers the smartest assessment I have seen of the impact of new media on movements such as the upheavals sweeping North Africa lately -- because it makes clear that we just don’t know very much.

“Conclusions are generally drawn from potentially nonrepresentative anecdotes,” the authors write, sometimes combined with "laborious hand coding of a subset of easily identified major (usually English) media.” There's a tendency to focus on new media as "the magic bullet" explaining the course of events when "at best, it may be a 'rusty bullet,' " since "traditional media sources [may prove] equally if not more important." Nor is it clear how digital tools affect the various dimensions of political conflict -- whether they serve to forge alliances among groups, for example, or tend to make each one close in upon itself more.

As the familiar refrain goes, more research is needed. For now, all generalizations are guesses.

Scott McLemee

Our Zombies, Ourselves

In October, the cable network AMC launched a new series called "The Walking Dead," which -- as a long-time zombie-apocalypse aficionado -- I watched throughout its six-episode opening season, even after it became obvious that the best-developed characters were the reanimated corpses. Their lines consisted of gurgling noises. This was surely preferable to the dialogue of the living, who tended to explain their motivations and back stories in exchanges that were at once histrionic and wooden. Someone playing a zombie must work within the expressive limits of shuffling and moaning, which precludes chewing up the scenery; nor does the actor need to work up a deep sense of motivation. Biting requires very little backstory, so they just get on with it.

Four weeks into the series, there was an opening scene in which two (living) sisters sat in a boat with their fishing poles, explaining to each other that they were sisters and that their father had enjoyed fishing -- a wistful yet laborious recollection, it seemed. Two questions came to mind at almost the same time: “Why am I still watching this?” and “Can zombies swim?” (The latter thought was more hopeful than anxious.) I still don't have an answer. But one of the sisters did end up a zombie, which put an end to their conversations and improved the series considerably.

As my morbid fascinations go, this one is evidently not that idiosyncratic. In the opening pages of Theories of International Politics and Zombies, just published by Princeton University Press, Daniel W. Drezner presents a couple of graphs showing the long-term growth of popular and academic interest in zombies in the years since George Romero and his friends in Pittsburgh made their low-budget masterpiece Night of the Living Dead (1968). The number of movie releases and scholarly publications appearing each year remains modest until the early 1990s. At that point, the curves spike upward and continue to grow rapidly, year by year. The output of cultural commodities about the living dead surged again around 2005.

Nor was it slowed by the economic downturn. On the contrary. Today, the danger that cannibalistic ghouls might swarm the planet, laying waste to the routines of everyday life, is, if not exactly plausible, at any rate part of the standard repertoire of worst-case scenarios.

This hardly means the genre has a great future ahead of it. Clearly the rot is setting in. But the mythology is now so well-established that, for example, the University of Florida posted an emergency-preparedness contingency plan for a zombie apocalypse on its homepage in 2009 (available here in PDF) and received praise in a local newspaper: "Simple actions such as this sharing of information and planning ahead will be what stays our species from annihilation."

Whatever else it may be, an attack by bloodthirsty ghouls offers a teachable moment. And Drezner, who is a professor of international politics at Tufts University, does not waste it. Besides offering a condensed and accessible survey of how various schools of international-relations theory would respond, he reviews the implications of a zombie crisis for a nation's internal politics and its psychosocial impact. He also considers the role of standard bureaucratic dynamics in managing the effects of relentless insurgency by the living dead. While a quick and entertaining read, Theories of International Politics and Zombies is a useful introductory textbook on public policy -- as well as a definitive monograph for the field of zombie studies. (The author is a member of the Zombie Research Society, which should give him a plaque or something.)

By Drezner’s account, no single approach to international politics -- whether realist or liberal, neoconservative or constructivist -- would provide the “magic bullet” for solving the crisis. For one thing, they are not designed for it. The sudden reanimation of corpses driven by an insatiable appetite for human flesh would involve a considerable departure from more familiar problems of governance.

But if I read him correctly, the author does seem to think that the realist paradigm in international relations theory has a special relationship with the zombie-apocalypse scenario. It rests on the intertwined principles that “anarchy is the overarching constraint of world politics” (that is, there is no “centralized, legitimate authority” able to enforce a particular order among nation-states) and that “the actors that count are those with the greatest ability to use force,” namely “states with sizable armed forces.” While nation-states possessing an advanced military-industrial complex would have a definite advantage in human-zombie combat, the balance of terror is not one-sided. The tendency of zombies to swarm is a staple of movies and fiction; it turns them into something like an army. The logic of the realist paradigm is to treat states as driven by “an innate lust for power.” Likewise, the undead “have an innate lust for human flesh.” Power and flesh alike count as scarce resources. One has an interest in preserving them both.

The realist assumes that powerful nations have -- and may expect to continue to enjoy -- the advantage over weaker ones in defining the world order. But the tendency of might to create its own right also benefits the zombies. They are single-minded (if that’s how to put it, since they are dead) and can create more zombies just by biting. This gives them enormous power, and that power is highly renewable. Not all realists are zombies, of course; but all zombies, by default, practice realpolitik.

Challenging the realist emphasis on raw military power, the liberal paradigm would stress the possibility of international cooperation in facing the menace, perhaps through the creation of a World Zombie Organization -- though Drezner also suggests that the effort might be undermined by the emergence of institutions of global civil society, such as Zombies Without Borders or People for the Ethical Treatment of Zombies. By contrast, neoconservatives would be prepared for the United States to “go it alone,” if necessary, by “deploying armed forces in ghoul-infested territory” in order to divert zombies from attacking the homeland. This could be successful in the short term. However, neoconservative anti-zombie policy might tend to militarize US society, leading the authorities to treat living citizens as if they were undead.

Finally, the school of constructivist international relations theory (admittedly less influential than the others in actual policy making circles) would stress the role of narratives and identity-claims in formulating a response to outbreaks of violence by the recently dead. Treating this as an apocalyptic scenario might prove a self-fulfilling prophecy. The complete disintegration of the social order, possibly followed by the consolidation of an authoritarian regime, need not be taken as a given. Instead, constructivists would strive to create “a Kantian ‘pluralistic counter-zombie security community’ in which governments share sovereignty and resources.” This would mitigate the impact of cannibalistic hordes on society; meanwhile, the boundary line between the living and the dead would, over time, be redrawn.

Tongue in cheek? Yes, but with serious intent. In ways that are more entertaining than allegory tends to be, the zombie scenario can express any number of social anxieties -- about terrorism, consumerism, pandemics, and mass culture itself, for example. These problems seem unrelated. But it is perhaps not a total coincidence that the output of zombie films, literature and scholarship began growing rapidly during the 1990s and 2000s.

Thomas Friedman (whose prose often makes me feel that someone is eating my brain) once defined the contemporary world as flat. Zombies started swarming across it in record numbers once the Cold War was over. Ghouls went global. Possessing no memory or long-term plans -- and tending to move in groups, their sheer numbers generating surplus dread -- they are the nightmare side of recent world politics.

Drezner’s assessment of the international implications of an attack by the living dead corresponds fairly closely to Max Brooks’s World War Z: An Oral History of the Zombie War (Three Rivers, 2006) -- a novel that is something like a cross between George Romero and Studs Terkel, and one of the true masterpieces of this genre. Like Theories of International Politics and Zombies, it can't be recommended highly enough. The two books seem destined to share a syllabus.

“Powerful states would be more likely to withstand an army of flesh-eating ghouls,” Drezner writes. “Weaker and developing countries would be more vulnerable to zombie infestation. Whether due to realist disinterest, waning public support, bureaucratic wrangling, or the fallibility of individual decision-makers, international interventions would likely be ephemeral or imperfect. Complete eradication of the zombie menace would be extremely unlikely. The plague of the undead would join the roster of threats that disproportionately affect the poorest and weakest countries.”

A sobering conclusion. In other words, a zombie apocalypse would be terrible -- but it would not really change things very much.

Scott McLemee

Antiwar No More?

Among Barack Obama’s distinguishing characteristics in the field of presidential hopefuls, four years ago, was his opposition to the Iraq war, which he had denounced at an antiwar rally in Chicago in October 2002, when invasion was still a gleam in the neocon eye. As Obama’s reelection campaign begins this week, his administration continues the military occupations of Afghanistan and Iraq, while making the down payment on a third in Libya.

This is not what people who supported Obama expected -- and public opinion polls suggest that opposition to the wars in Afghanistan and Iraq remains as high as it was during Bush’s second term, or higher. But the streets no longer fill with protesters. This coming weekend there will be antiwar demonstrations in New York (April 9) and San Francisco (April 10). They won’t be on the scale that became almost routine a few years ago, however, when hundreds of thousands of people attended such events. They will be one-tenth the size, more or less.

We can predict that with greater confidence than the weather this weekend. But why? And what would it take to change the situation?

Part of the answer might be found in a paper by Michael T. Heaney and Fabio Rojas called “The Partisan Dynamics of Contention: Demobilization of the Antiwar Movement in the United States, 2007-2009,” appearing in the latest issue of the social-science journal Mobilization. (It is available here in PDF.)

Heaney is an assistant professor of organizational studies and political science at the University of Michigan, while Rojas is an associate professor of sociology at Indiana University.

Drawing on more than 5,300 surveys the authors conducted with people attending antiwar rallies in recent years, the paper is the latest in a series of studies of the relationship between social movements and political institutions -- in particular, American political parties, major and otherwise.

I first wrote about their work four years ago, as demonstrations against the Iraq war were at their peak. (See also this column.) My interest is not, as the expression goes, purely academic. Any activist develops certain hunches about the relationship between mass movements, on the one hand, and more established and durable political entities, on the other. Such intuitions tend not to be theorized, but you need them as maps of the terrain. Folks in the Tea Party are not likely ever to read Robert Michels, though I’d guess they’ve had a taste of the iron law of oligarchy by now. Sometimes you have to work these things out for yourself.

Years ago, a friend with long involvement in organizing against the Vietnam war explained how the national election cycle had affected the ebb and flow of the protest movement. “In an odd-numbered year,” he told me, “you’d have masses of people coming out to demonstrations. If it was an even-numbered year, lots of the same people would stay at home because they figured voting for Democrats who criticized the war was enough.” He had been one of those out marching no matter what, and you could hear the frustration in his voice.

My friend lacked the sophisticated statistical tools deployed by Heaney and Rojas (henceforth, H&R), whose understanding of organizational dynamics is also more subtle. But their paper largely corroborates his thumbnail analysis.

Mass movements and political parties are very different animals, at least in the United States. Sociologists and political scientists usually put them in separate cages, and activists and policy wonks would tend to agree. “Party activists may view movements as marginal and unlikely to achieve their goals,” write H&R. “Movement activists may reject parties as too willing to compromise principles and too focused on power as an end in itself.”

But dichotomizing things so sharply means overlooking a third cohort: what H&R call the “movement-partisans” or, a bit more trenchantly, “the party in the streets.” These are people who identify themselves as belonging to an electoral party but consider mass protest to be as valid as the more routine sorts of political action. They might march on Washington, if strongly enough motivated -- but will also make it a point, while there, to visit their Congressional representatives for a quick round of citizen-lobbying.

To movement-partisans, each approach seems a potentially effective way to express their concerns and try to change things. In deciding which one to use at a given moment -- or whether to combine them -- ideological consistency usually counts less than their ad hoc estimate of the respective costs and benefits.

According to H&R’s earlier research, movement-partisans are likely to be members of unions, civic organizations, and community groups. This makes them indispensable to building broad support for a cause. (They are also crucial to shoring up what party leaders call “the base” these days.) But the intensity of their involvement varies with the degree of threat they perceive in the political environment.

“When the balance of power between the parties changes,” the social scientists write, movement-partisans will “reassess the benefits and costs of taking action. The rise of an unfriendly party may generate suspicions that the movement will be threatened by a wide range of hostile policies, while the rise of a friendly party may lead to a sense of relief that the threat has ended. Since people tend to work more aggressively to avoid losses than to achieve gains, grassroots mobilization is more likely to flow from the emergence of new threats than from the prospect of beneficial opportunities.”

From surveys conducted during national antiwar actions, the researchers found that people who self-identified as Democrats represented “a major constituency in the antiwar movement during 2007 and 2008,” accounting for 37 to 54 percent of participants. Those who identified as members of third parties represented 7 to 13 percent. (The rest indicated that they were independents, Republicans, or members of more than one party.)

In January 2007, an antiwar protest in Washington, D.C., drew hundreds of thousands of people. In H&R’s terms, a “perceived threat” from the Bush administration still existed among Democratic movement-partisans; so did their conviction that it made sense to put pressure on Congress as it shifted from Republican to Democratic control following the midterm elections.

But as the presidential campaigns ramped up, the dynamic changed. By late 2008, turnout at demonstrations contracted “by an order of magnitude to roughly the tens of thousands” -- and kept shrinking over the following year. At the same time, the composition began to change. By November 2009, the portion of antiwar protesters identifying as Democrats had fallen to a low of 19 percent, while the involvement of third-party members grew to a peak of 34 percent (almost three times the share just a couple of years earlier).

In some cases, decreasing or ending their participation in antiwar protests was a matter of conscious decision-making by members of the Democratic “party in the street,” as H&R call it. They may have approved of Obama’s handling of the Iraq war or sensed that other issues, such as health care, required more attention. At the same time, movement-partisans of the Republican sort were beginning to mobilize. Even people strongly opposed to the wars often felt this as a disincentive to challenge the administration: they didn't want to risk seeming to join forces with the president's political enemies.

But shifts in attitudes and priorities among individual activists only explain so much. In the final analysis, organization is everything. H&R stress the role of coalitions “in enabling parties and movements to coordinate their actions and share resources.”

The largest and broadest national antiwar coalition, United for Peace and Justice, was also the one most likely to supplement mass demonstrations with messages linked to the electoral arena (“The voters want peace”) and lobbying efforts. Arguably, UFPJ even had influence over groups completely rejecting its approach, since it gave them an incentive to cooperate with each other in organizing alternative antiwar protests.

Following Obama’s election, UFPJ began to disintegrate as Democrats withdrew. (It still exists, but just barely, and now calls itself a "network" rather than a coalition.) The most important effect of its unraveling, according to the paper, is “the fragmentation of the movement into smaller coalitions” -- groupings that tended to act on their own initiative, without the capacity to coordinate work with one another. The number of antiwar demonstrations grew even as the turnout shrank. Another consequence was “the expression of more radical and anti-Obama attitudes by leading organizations.” And this had the predictable effect of narrowing the base of likely supporters.

At this point it seems worth mentioning an insight by another friend whose education in such matters took place in the laboratory of the 1960s. For many years, he said, being engaged in antiwar activism or civil rights work meant going to events where, after a while, you were able to recognize almost everybody. Then one day he attended a demonstration and saw that something had changed. There were some familiar faces, but he had no idea who most of the people were.

“That’s how you know that the cause has actually become a movement," he said. "You look around and see a lot of new faces. It’s no longer just the usual suspects.”

What H&R describe in their paper is, in effect, the film running in reverse. Beyond a certain point, fragmentation becomes self-reinforcing. I wondered about the implications of H&R’s work for what now remains of the antiwar movement. Was there anything in their analysis that would suggest the possibility of its revival, on a broader basis, in the immediate future? Could it happen with Obama still in office? Or would it take the “perceived threat” of a Republican president?

I wrote Heaney to ask. His short answer was, simply, no -- the chances of a major revival in the short term are slim. The more nuanced version of his response went somewhat beyond my question, though, and seems of interest:

“As long as voters remain highly polarized along party lines,” he responded by e-mail, “self-identified Democrats are unlikely to protest against Obama's policies, even if they disagree with some of them strongly. A sudden end to the era of partisan polarization seems highly unlikely. So I would say that it is a very good bet that Obama will not confront large left-wing demonstrations. Of course, LBJ faced large left-wing demonstrations, but the party system was not polarized back then in the way that it is today.”

The same dynamics apply to the Tea Party: “Our analysis implies that the Tea Party will have a lower degree of organization and success in 2012 than it did in 2010. Because the Republicans won the House and made gains in the Senate, Tea Party activists feel much less threatened today than they did a year ago. So, while the Tea Party will obviously be around in 2012 -- and it will likely factor into the Republican presidential contest -- our analysis suggests that the Tea Party will not generate the same level of enthusiasm next year as it did last year.”

Well, you take what grounds for optimism you can find.

Heaney might be right about everything. It could be that the antiwar movement will remain in its doldrums until, say, the Gingrich-Palin ticket proves victorious. That’d put some teeth back into the concept of “perceived threat,” anyway.

But it hardly follows that resignation is the best course. I can’t make it to New York on Saturday (let alone San Francisco), but am buying a bus ticket for someone else who wants to go. And while finishing up this column, I got in touch with Ashley Smith, a member of the steering committee of the United National Antiwar Committee, who seems to have a pretty sober perspective on where things stand.

“The most important part of these demonstrations,” he told me, “is bringing together old and new forces to rebuild an antiwar movement that has been weak for the last several years. We have made a high priority of including demands that open up the movement to forces too often held at arm's length from antiwar mobilizations in the past -- Palestinians, Muslims, South Asians and Arabs. We have also had some success in reaching out to labor unions like SEIU 1199 and TWU 100 in New York. It is crucial that we rebuild the antiwar movement now.”

To paraphrase Donald Rumsfeld just slightly, you go into the antiwar struggle with the forces you have, not the ones you want, or wish to have at a later time.

Scott McLemee

Ears to the Ground

While I spend much of my time on administration, my scholarly field of public opinion shifts so radically, and in such interesting ways, that it still holds my rapt attention. A new generation of opinion scholars are asking the big questions about public discourse and democratic representation, and their having grown up with the Internet gives them the sensibility to understand political communication in a new age. One thing they rarely address, however, is how changes in American public opinion measurement and expression might inform analysis of public opinion on their own campuses. Does the evolving nature of public opinion, and our academic approach to it, have anything to do with the assessment of campus or student opinion? To my mind, much of what public opinion scholars have learned has direct implications for campus citizens and leaders.

Public opinion, as a concept, was born in the ancient Greek democracies, with an acceleration of interest in the heady days of the Enlightenment. While of course citizens always had opinions, recognition that those opinions might matter somehow in affairs of state became vital with the French and American revolutions on the horizon. In the 18th century, particularly in France, an abundance of tools for expressing and assessing public opinion -- political tracts, oratory, theatrical productions, legal cases, interpersonal networks for gossip -- were recognized by kings and then by our founding fathers. All of these mechanisms were useful in measuring something, although it was in the eye of the beholder as to what the broader, underlying sentiment might be. History tells us, eventually, who was right about public opinion in a place and time, who was wrong, and where it didn’t matter at all. But in any given moment, political actors and leaders do their best, since their lives and agendas depend on it.

With the rise of positivism, the penny press, and competitive general elections in the American 19th century, straw polls began to appear with great regularity in newspapers and political speeches. The quantification of public opinion was a natural outcome of a growing scientific culture, although early polls were largely unsystematic and infused with partisanship. From the 1830s until the 1930s, newspapers would often publish what I have called "people’s polls," where a quilting bee, union, or fraternity would poll itself and send the results to papers like the Chicago Tribune or The New York Times. Journalists were productive as well, polling people in taverns and train cars as they traveled a vast new nation, often asking for a simple show of hands: “How many for Lincoln and how many for Douglas?” (Interestingly, women were included in published straw polls, even though they could not vote in presidential elections. I surmise that they were included because women were, as they still are, the primary consumers in a household; newspapers needed women to read the paper so they would see product advertising.)

The 20th-century advent of polling and survey research is well-studied, and is a rocky but largely linear progression toward the increasing dominance of quantitative polling. There were some embarrassments for pollsters -- the presidential mis-predictions of Landon over Roosevelt in 1936 and then Dewey over Truman in 1948. But the survey industry -- so closely tied to market research -- became a wildly successful one. So successful, in fact, that journalists and policymakers came to see polling as the primary source of "scientific" opinion measurement. Public opinion became synonymous with polls, and journalists in particular bandied poll results about with an authority that made (and still makes) most social scientists cringe.

But things have changed, and abruptly so, over the past few years. While many social scientists are reluctant to give up the survey as a means of understanding public opinion, it is coming to the point where the survey looks like a bizarre dinosaur. We have wonderful new means of understanding public opinion, far more nuanced than anything a poll could capture. People write blogs, tweet, post comments to news sites, produce videos for YouTube, and use an incredibly diverse and complex set of communication tools to make their opinions known. Are they scientific? No, but neither were most polls. Surveys were always reliant on anonymity and question framing. If people told a pollster something in private, were they ever willing to act on that opinion? How much did they care? Was that the opinion of a moment or a closely held belief?

Thousands of researchers and articles explore these topics in the general field of public opinion research, and we will continue to cling to polls to answer some questions. But to believe that they are superior to the wealth of complex public opinion expression we now have, whenever we turn on a computer, is to be blind to our communication universe. We may not be able to measure with great accuracy the influence of Daily Kos or Fox News on public opinion, or vice versa. But the days of relying on a poll of a thousand random people, and their responses to a short set of multiple-choice questions, for anything particularly useful have slipped away.

What does this profound change in the technological and cultural environments for public opinion expression and measurement mean for campuses? First, we should probably stop leaning on surveys of students, faculty, or staff. I have designed many such surveys, on many subjects, from faculty exit interviews to student opinion polls about civility and tolerance. These have ranged from small, specific surveys to large random ones, but in any case, they seem primitive to me at this point. They are less costly than ever, thanks to tools like Survey Monkey and low computing costs. But that doesn’t mean they are worthwhile.

They constrain opinion to what you ask, and open-ended queries tend to be difficult to work with. If I take a survey of women faculty about gender discrimination on campus, and the results turn up just a few comments -- in the open-ended query sections -- with some heart-wrenching stories of harassment or discrimination, what have I found? I have been in this position more times than I can count, where quantitative results reveal no systematic problems, but the qualitative data hint at something very bad.

This doesn’t mean that you can dismiss all surveying; just keep doing the ones where you can actually triangulate with contextual data. For example, while those who have an ax to grind will use end-of-semester teacher evaluation surveys as a way to trash a professor unfairly, these are still important surveys. We just need to balance them with our other evaluation techniques -- observation, review of syllabuses and assignments, learning outcomes, and the like. Of all these techniques, I find student surveys the least instructive, but they sometimes yield valuable data.

If not surveys of students, faculty, and staff, then what? How should we navigate the flood of opinion on any issue, and locate “public opinion,” or at least the general popular sentiment?

Introduce more sophisticated notions of "data." I wish we could outlaw the bizarre and worthless phrase “data-driven”: it has actually become dangerous, as it values quantitative data (no matter how bad) over other forms of data. And, as any sociology major knows, all data are social constructions in any case, formed by the biases of collectors and questions asked. “Data-driven” is an odd legitimation of all things quantitative, when we all know darned well that some data are better than others. I have seen tremendous damage done with the broadcast of lousy data, just because it was numbers, and some soul bothered to collect it. In fact, bad data are rampant on campuses, because a lot of people are in a hurry.

In our severe budget-cutting of late, the most criminal cut is the reduction of institutional research offices, where you actually find the people who can discern good, useful campus data from dubious, incomplete data. I hope when the economy improves, these offices can be re-staffed. But in the meantime, beware survey numbers and other super-easy quantitative measures of public opinion (e.g., number of hits on a website or visits to the library). If you believe in proper, meaningful data, then figure out how not to lose all the terrific data around you: alumni letters, faculty/staff meeting conversation, insights from your campus police, depositions in lawsuits, the nature of the departures of key faculty, attendance at events, types of events being held, etc. These data, messy as they are, are often far richer in useful information than typical systematic quantitative data.

Newspaper coverage is not synonymous with public opinion. One of the most difficult psychological challenges for people who love a campus is the negativity of the local press. It did not start with the economic downturn, but it has been worsened by it, as universities look like wealthy enclaves relative to the broader community. You may have faculty departing, staff reductions in critical areas, programs cut, and scary deferred maintenance, but a campus still looks like a happy park to most people. And it is the case that colleges are upbeat places, no matter the economy, because young people lift the spirit of any organization. But even if morale on your campus is high, and people are going about the business of teaching, learning, and research, inevitably there are cynical or angry people who find local journalists/editorialists, and vice versa.

It is a way of life in public higher education, and we need to do a better job of separating the reality of campus opinion from the partial view of some media covering us. Bad news, or isolated problems on a large campus, may sell traditional papers -- a function of the challenge facing journalism more generally, as the profession searches for a new paradigm. Until American journalists figure out how to both improve their business model (which could take decades, should they even survive) and hire full-time higher education reporters, take the measurement of public opinion into your own hands.

The Internet, on the other hand, is likely closer to meaningful public opinion. Even if it is overwhelming in its complexity, and less systematic than the results of a survey, the opinions found on the Internet -- anonymous and attributed -- are more interesting and important. They are, in most cases, more organic than anything you could collect with a survey. Your campus is portrayed in wonderful ways -- sometimes orchestrated by your PR department, sometimes not -- as your students and faculty do amazing things. They discover, create artwork, put on performances, give compelling lectures. But there are some less benign links going viral, or at least garnering a lot of attention, as well, and monitoring these is a way to learn who is unhappy and whether their complaints have any validity. For example, I highly recommend putting your college's name into YouTube every few days to see what is being communicated and why. In the best case, you learn about fabulous people and initiatives, or discover concerns about the community. But whatever is out there, you should probably have a look. It’s an expensive proposition for your vice presidents, deans, and senior faculty, but a fine task for student assistants, who know how to find things better than we do in any case.

Focus groups. In my field, as opposed to market research, we largely dismiss focus groups. Political scientists make very little use of any tool like this, since the folks in the group are not chosen at random, are paid, and are seen as too small a sample for generalization. On a campus, however, your "population" is far smaller than the U.S. population, so conducting a decent number of focus groups can be very instructive, and the results can in fact represent huge swaths of your campus. You collect textured, rich data, as compared to numbers that are hard to parse (I personally have never heard data "speak"!). Instead of yet more committee work, ask one of your experienced marketing professors to train a few other faculty, trusted staff, and strong students to run focus groups, and have them "on call." This way, when you have an issue, you have a group of excellent facilitators who can be asked to lead some discussions and give you the feedback. I find that most faculty, staff, and students enjoy this, and will be happy to help, as long as it does not get too arduous. Just train enough people that you can spread the workload around.


Public opinion expression and assessment are among the most glorious and the most challenging aspects of democratic practice, whether in America or anywhere else. The "Arab Spring," for example -- an incredible urge toward self-governance and participatory citizenship -- has been so moving. But as the inspiration and emotion fade, new democracies are faced with the same challenge we have on any campus: What do people think, how strongly, and why? These measurement issues do not defy social science. Yet if we fail to call upon all of the tools and talented people we have in our midst, we’ll never get a handle on the real opinions of our colleagues or our students.

Susan Herbst

Susan Herbst is the new president of the University of Connecticut.

