Political science

Confess!

This past weekend, a comic playing Bill Clinton on Saturday Night Live told the world’s leaders not to pull anything on Hillary when she becomes Secretary of State. It's not even worth trying, he indicated, because she’ll see right through you. But he offered some reassuring advice on how to finesse things, if necessary. “The only words you’re gonna need when Hillary shows up: ‘I ... am ... sorry.’ It don’t work all the time, but it’s a good place to start.”

A friend recounted this skit to me when he saw the galleys of Susan Wise Bauer’s new book The Art of the Public Grovel: Sexual Sin and Public Confession in America (Princeton University Press). Its cover shows the former president in a posture of contrition: hands in front of his face, as if to pray; his eyes both wide and averted. But Bauer’s point is that effective public groveling requires a lot more than just assuming the position, let alone saying “I am sorry.”

There is (so her argument goes) a specific pattern for how a public figure must behave in order to save his hide when caught in a scandal. It is not sufficient to apologize for the pain, or offense to public sensibility, that one has caused. Still less will it do to list the motivating or extenuating circumstances of one’s actions. Full-scale confession is required: recognizing and admitting the grievous nature of one’s deeds, accepting responsibility, and making a plea for forgiveness and support (divine or communal, though preferably both).

The process corresponds to a general pattern that Bauer traces back to the Puritan conversion narratives of the 17th century. Confession started out as a way to deal with Calvinist anxieties over the precarious nature of any given believer’s status in the grand scheme of predestination. Revealing to fellow believers an awareness of the wickedness in one’s own life was, at the very least, evidence of a profound change of heart, possibly signaling the work of God’s grace.

Secularized via pop psychology and mass media, public confession now serves a different function. In the 20th century, it became “a ceremonial laying down of power,” writes Bauer, “made so that followers can pick that power up and hand it back. American democratic expectations have woven themselves into the practice of public confession, converting it from a vertical act between God and a sinner into a primarily horizontal act, one intended to re-balance the relationship between leaders and their followers. We both idolize and hate our leaders; we need and resent them; we want to submit, but only once we are reassured that the person to whom we submit is no better than we are. Beyond the demand that leaders publicly confess their sins is our fear that we will be overwhelmed by their power.”

Leaders who follow the pattern may recover from embarrassing revelations about their behavior. The major examples Bauer considers are Jimmy Swaggart (with his hobby of photographing prostitutes in hotel rooms) and Bill Clinton (intern, humidor, etc.). Because they understood and accepted the protocol for a “ceremonial laying down of power” through confession, they were absolved and returned to their positions of authority.

By contrast, public figures who neglect the proper mode of groveling will suffer a loss of support. Thus Edward Kennedy’s evasive account of what happened at Chappaquiddick cost him a shot at the presidency. The empire of televangelist Jim Bakker collapsed when he claimed that he was entrapped into extramarital canoodling. And Bernard Cardinal Law, the bishop overseeing the Catholic community in Boston, declined to accept personal responsibility for assigning known pedophile priests to positions where they had access to children. Cardinal Law did eventually grovel a bit – more or less along the lines Bauer suggests – but only after first blaming the scandal on the Boston Globe, his own predecessors, and earlier church policy. The pope accepted his resignation six years ago.

It’s one thing to suspect that deep continuities exist between evangelical religion, group psychotherapy, and “performances of self” in an age of mass media. Many of us found ourselves positing this quite often during the late 1990s, usually while yelling at the TV news.

But it’s a much tougher prospect to establish that such continuities really exist – or that they add up to an ethos that is accepted by something called “the American public” (a diverse and argumentative conglomeration, if ever there were one). At the very least, it seems necessary to look at how scandals unfold in nations shaped by a different religious matrix. Bauer doesn’t make such comparisons, unfortunately. And her case studies of American scandals don’t always clinch the argument as well as it may first appear.

The discussions of Jim Bakker and Bill Clinton form a center of gravity for the whole book. The chapters on them are of almost equal length. (This may testify less to the historical significance of Jim Bakker’s troubles than to their very considerable entertainment value.) And in keeping with Bauer’s analysis, the men’s responses to embarrassment form a neat contrast in approaches to the demand for confession.

Having been exposed for using church funds to pay blackmail to cover up an affair with a church secretary, Bakker has always presented himself as more sinned against than sinning – the victim of a wicked conspiracy by jealous rivals. In other words, he never performed the sort of confession prescribed by the cultural norms that Bauer identifies. He never handed over his power through suitable groveling, and so his followers punished him.

“Refusing to confess, unable to show his one-ness with his followers,” she writes, “Bakker remains unable to return to ministry.” Which is inaccurate, actually. He has been televangelizing for the past five years, albeit on a less grandiose scale than was once his wont. Bakker’s inability to reclaim his earlier power may have something to do with his failure to follow the rules for confessing his sins and begging forgiveness. But he still owes the IRS several million dollars, which would be something of a distraction.

Bakker’s claims to have been lured into immorality and disgrace are self-serving, of course. Yet Bauer’s account makes clear that his competitors in the broadcast-holiness business wasted little time in turning on him – the better to shore up their own reputational capital and customer base, perhaps. The critical reader may suspect that Bakker’s eclipse had more to do with economics than with the reverend's failures of rhetorical efficacy.

Former president Clinton, by contrast, is rhetorical efficacy incarnate. Bauer’s chapter on l’affaire Lewinsky attributes his survival to having met the demand for confession.

Of course, he did not exactly make haste to do so. Bauer includes a set of appendices reprinting pertinent statements by the various figures she discusses. The section on Clinton is the longest of any of them. More than a third of the material consists of deceptive statements and lawyerly evasions. But the tireless investigative pornographers of the Starr Commission eventually cornered the president and left him with no choice. “In Bill Clinton’s America,” writes Bauer, “the intersection of Protestant practice, therapeutic technique, and talk-show ethos was fully complete. In order to survive, he had to confess.”

He pulled out all the stops – quoting from the Bible on having a “broken spirit,” as well as a Yom Kippur liturgy on the need to turn “from callousness to sensitivity, from pettiness to purpose” (and so forth). It worked. “Against all odds,” writes Bauer, “his confessions managed to convince a significant segment of the American public that he was neither a predator nor an evildoer, and that he was fighting the good fight against evil. Most amazingly, this white, male lawyer, this Rhodes Scholar, who held the highest elected office in the land, persuaded his followers that he was just like the country’s poorest and most oppressed.”

That is one way to understand how things unfolded ten years ago. According to Bauer’s schema, Clinton underwent a “ceremonial laying down of power,” only to have it handed back with interest. No doubt that description accounts for some people’s experience of the events. But plenty of others found the whole thing to be sordid, cynical, and cheesy as hell – with the confession less a process that strengthened social bonds than a moment of relief, when it seemed like the soap opera might end.

So it did, eventually. But there will always be another one, perhaps involving some politician we've never heard of before. That is why The Art of the Public Grovel ought to be kept in stock at Trover’s, the bookshop on Capitol Hill, from now on. While not entirely persuasive in its overall analysis, it might still have non-scholarly applications.

By Scott McLemee (scott.mclemee@insidehighered.com)

Stop the Insani-Tea!

On April 15, tens of thousands of people attended “tea parties” to denounce Obama’s economic policies – dressed up, some of the protesters were, like refugees from a disaster at a Colonial theme park. “No taxation without representation!” they demanded, having evidently hibernated through the recent election cycle. The right-wing publicity machine dutifully ground out its message that a mass movement was being born.

Suppose we grant the claim (however generous, however imaginative) that the tea parties drew 250,000 supporters. Compare that with the turnout, not quite three years ago, for the “Day Without an Immigrant” rallies, which involved somewhere between 1 and 1.5 million workers – many of them undocumented, which meant that their decision to attend involved some risk of losing a job or being deported. By contrast, last week’s anti-Obama protest made no real demands on its participants, and came after weeks of free and constant publicity by a major television network. Teabaggery also enjoyed the support of prominent figures in the conservative establishment. Yet with all this backing, the entire nationwide turnout for the tea parties involved fewer people than attended the immigrant rallies in a single large city.

The events of April 15 may not have marked the death agonies of the Republican Party. But they certainly amounted to a case of profound rhetorical failure: a moment when old modes of persuasion lost their power. The claim to speak for the concerns of “ordinary Americans” choked on its own pseudo-populist bile. The tea bags were less memorable than the cracked pots. It was hard to watch the footage without thinking that the next Timothy McVeigh must be a face in the crowd – and wondering if his victims ought to bring a class-action suit against Fox News.

Only so much of the failure of the teabagging movement can be attributed to its instigators’ unfamiliarity with contemporary slang. A new book from the University of Chicago Press helps to clarify why alarmist denunciations of higher taxation and (shudder!) “redistribution of the wealth” just won’t cut it.

The publication of Class War? What Americans Really Think About Economic Inequality by Benjamin I. Page and Lawrence R. Jacobs could not be better timed. Page is a professor of political science at Northwestern University, while Jacobs directs the Center for the Study of Politics and Governance at the University of Minnesota. The authors conducted a national public-opinion survey during the summer of 2007 – just before the global economic spasms started – and they also draw on several decades’ worth of polling data in framing their analysis.

The question mark in the title is no accident. Page and Jacobs are not radicals. They insist that there is no class war in the United States. (This, in spite of quoting Warren Buffett’s remark that there actually is one, and that his class has been winning.) They provide evidence that “even Democrats and lower-income workers harbor rather conservative views about free enterprise, the value of material incentives to motivate work, individual self-reliance, and a generalized suspicion of government waste and unresponsiveness.” Their survey found that 58 percent of Democrats and 62 percent of low-income earners agreed that “large differences in pay are probably necessary to get people to work hard.”

But at the same time, they report a widespread concern that the gap between extremes of wealth and poverty is growing and poses a danger. “Although Americans accept the idea that unequal pay motivates hard work,” they find, “a solid majority (59 percent) disagree with the proposition that large differences in income are ‘necessary for America’s prosperity.’”

Not quite three quarters of those polled agreed that “differences in income in America are too large,” and more than two-thirds reject the idea that “the current distribution of money and wealth is ‘fair.’” The proposition that “the money and wealth in this country should be more evenly distributed among a larger percentage of the people” was supported by a large majority of respondents.

While inequality may sound like a Democratic talking point (at least during campaign seasons), the authors note that “solid majorities of Republicans (56 percent) and of high income earners (60 percent) agree that income differences are ‘too large’ in the United States. ... Majorities of Republicans (52 percent) and of the affluent (51 percent) favor more evenly distributing money and wealth.” A footnote indicates that the category of “high income” or “affluent” applied to “the 25.2 percent of our respondents who reported family incomes of $80,000 or more per year.”

While informed sources tell me that sales of small left-wing newspapers are up lately, Page and Jacobs are doubtless correct to describe the default setting of American public opinion as a kind of “conservative egalitarianism.” Citizens “want opportunities for economic success,” they write, “and want individuals to take care of themselves when possible. But they also want genuine opportunity for themselves and others, and a measure of economic security to pursue opportunity and to insure themselves and their neighbors against disasters beyond their control.”

And to make this possible, they are reconciled to taxation. “There is not in fact a groundswell of sentiment for cutting taxes. When asked about tax levels in general, only a small minority favored lowering them; most wanted to keep them about the same. Asked to choose among a range of estate-tax rates on very large ($100 million) estates, only a very small minority of Americans – just 13 percent of them – picked a rate of zero. The average American favors an estate-tax range of about 25 percent. ... Most Americans say the government should rely a lot on taxes they see as progressive, like corporate income taxes, rather than on regressive measures like payroll taxes. To our surprise, a majority of Americans even say that our government should ‘redistribute wealth by heavy taxes on the rich,’ a sentiment that has grown markedly over the past seventy years.”

And all of this data was gathered, mind you, well before jobs, housing, and retirement savings began to vaporize.

Nothing in Class War? quite answers the question of what political consequences logically follow from the polling data. Perhaps none do, in particular. What people want (or say that they want) is notoriously distinct from what they will actually bestir themselves to do. But it’s worth noting that Page and Jacobs found broad support for increasing the pay of low-income jobs, and drastically reducing the income of those who earn a lot.

“Sales clerks and factory workers should earn $5,000 more a year (about 23 percent more), according to the median responses of those we interviewed,” they write. At the same time, people “want to cut the income of corporate titans by more than half – from the perceived $500,000 to a desired $200,000. Imagine the reaction of ordinary working Americans if they learned that the CEOs of major national corporations actually pulled in $14 million a year.” Yes, imagine. Then something other than tea might start brewing.

By Scott McLemee (scott.mclemee@insidehighered.com)

You Say You Want a Reference Book About Revolution?

Thanks to an edition now available online from the University of Michigan Library, you can easily look up the word "Revolt" in the great Encyclopedia that Diderot and d'Alembert compiled in the 18th century as part of their challenge to the pre-Enlightenment order of things. A revolt is an "uprising of the people against the sovereign" resulting from (here the entry borrows from Fénelon) "the despair of mistreated people" and "the severity, the loftiness of kings."

That certainly counts as fair warning -- and indeed, the Encyclopedia then shifts into wonkish mode, advising any monarch who happened to be reading that he could best control his subjects "by making himself likable to them... by punishing the guilty, and by relieving the unhappy." Plus he should remember to fund education. Won't someone please, please think of the children? While Louis XVI was by no means a total dullard, it seems this advice was wasted on him. (See also "Regicide.")

Scores of other occasions when "the despair of mistreated people" collided with severe, lofty, and unlikable authority are covered in The International Encyclopedia of Revolution and Protest, 1500 to the Present, just published by Wiley-Blackwell. With seven volumes of text, plus one of index, it covers upheavals on every continent except Antarctica, which tends to be pretty quiet. A digital edition is also available.

The work of an international team, the Encyclopedia is edited by Immanuel Ness, a professor of political science at Brooklyn College of the City University of New York. I have been reading around in it (as is my wont) for the past few weeks, when not following the latest tweets of resistance from within Iran, and wanted to ask Ness a few questions about his project. He responded to them by email. A transcript of the exchange follows.

Q: The title of this reference work raises a question. Protests do sometimes lead to revolution, of course, but none that I've ever been to has. Although both activities involve departures from (and opposition to) the routines of a given society, revolution and protest seem to be rather distinct processes. Why bring them together like this?

A: Revolution and social transformation are ultimate goals of protests that have arisen from collective grievances when leaders of states and societies are unable or unwilling to come to terms with abject inequality or injustice. Undeniably not all protests lead to revolutionary change and most protesters will not live to see the results of their actions. But when successful, they are the culmination of waves of social grievances against authoritarianism, social and economic inequality and injustices frequently expressed over decades and even centuries.

In the project, we document when protests lead to revolution as well as demonstrations that are manifestations of systemic injustices, even if a revolution did not result immediately thereafter. Thus, the Bolshevik Revolution consolidated the mass peasant and worker movements that peaked in the early 20th century. By contrast, in the Philippines, the powerful peasant protest movements have failed to lead to a transformation of society. While the goal of creating a democratic and equitable society remains unfulfilled there, mass protests persist against injustice.

Despotic systems of rule can stave off resistance, but the history of the last 500 years demonstrates that revolutionary change is an ineluctable process.

Q: OK, but that assumes revolution and protest are means to the ends of justice and equality. I'm not sure the entries in the Encyclopedia all confirm that notion. There is one on fascism, for example: a movement that regarded itself as revolutionary, as the sociologist Michael Mann has emphasized. And come to think of it, the "Tea Party" events in the United States earlier this year were protests, of a sort -- but for the most part they were just media stunts. How do you square this messy reality with what sounds like a base-line conception of revolution and protest as the midwives of progress?

A: In developing the work, we debated whether to include fascism and totalitarianism as social movements, and decided they were necessary to maintain a definitive unbiased understanding of the history of protests and revolutions. In many instances, demagogues across the political spectrum have used populist rhetoric and forces to defend violence and repression.

We were cognizant of the manipulative use of revolutionary rhetoric and symbols by repressive leaders to maintain and achieve power. But these entries also examine the popular discontent and resistance to injustice and oppression. For example, throughout Europe, we focused on the proliferation of partisan opposition to Francoism, Nazism, Fascism, and Stalinism. Similarly in the Global South, we documented popular opposition that emerged in response to dominant religious, ethnic, class, and oligarchic rulers that have relied on violence to repress the powerless.

As sociologist James Scott exposed in his work on guerrilla movements, we documented cases of armed resistance that often redounded against the most powerless, who are frequently caught, reluctantly, in the crosshairs of conflicts. But, in researching modern history, while we may disparage the motivations of some reactionary movements that were cynically manipulated by leaders, the vast majority of social protest was engaged in by ordinary people seeking justice, equality, and social inclusion.

Q: Your project is ambitious; it seems to cover the whole world. Is this a matter of some editorial orientation towards the new global or transnational history, or was it simply a matter of the various movements and uprisings seeming to be interconnected and to influence each other (as the cross-references tend to show)? And why does the period it covers start in 1500?

A: In crafting the project, from the outset, we were mindful of utilizing an approach rooted in world history, which seeks a broader examination of human civilization rather than the geographically parochial and theoretically circumscribed western civilization that I consider fairly indifferent to the majority of people who live throughout the world. Geographically, the project is framed from the perspective of world history, which appreciates the dominant processes of empire, migration, capitalism, environmental change, and political and social movements.

Using a world history frame, we found that many political movements are interconnected as arcs of resistance that appear through the processes of imperial expansion and resistance in various spheres of influence. For instance, Latin American resistance to Spanish colonial rule and then the post-colonial era can be viewed through arcs of resistance against European dominance, slavery, racism, and then indigenous struggles for civil and equal rights that appeared through the last 500 years, though they emerge more decisively in various historical moments. Indeed, the numerous essays on indigenous movements reveal that resistance throughout the Americas is reaching a new apogee in the contemporary era.

The decision to begin with protests and revolutions in 1500 recognizes the important historical and social science research that identifies the beginning of the modern era with the dramatic expansion of European imperialism, emergent capitalism, and slavery, which emerged and rapidly expanded as major forces throughout the world. The temporal organization owes much to the path-breaking historical work of Fernand Braudel and the Annales School and Immanuel Wallerstein and subsequent World Systems Theorists.

Q: Any revolution is an interpretive minefield. Even nomenclature provokes arguments. (You can't refer to the Khmer Rouge in Cambodia or the Shining Path in Peru without somebody calling you a running dog lackey of the imperialist bourgeoisie for using those terms, since the respective organizations preferred to be called something else.) How did you deal with the need for balance and objectivity -- given that in some cases the very possibility of them is in dispute? Your introductory note for the Encyclopedia says that each article was examined by two members of the editorial board, and that more than half of the submitted pieces were rejected on various grounds. Did that mean you had to leave certain subjects out?

A: Realizing balance and objectivity in each entry was one of the greatest challenges in editing the work. In part this involved seeking to include editors with erudition in their respective fields who had a range of perspectives on the history of protest and revolution. While contributors were enthusiastic about this work, writers with similar perspectives did not necessarily agree with all the interpretations and conclusions. It reminds me of the Italian adage on the divisions on the left: “amici nemici parenti serpenti” (friends can be enemies but families are like a nest of vipers). Of course, the editors engaged in a respectful exchange of views, but people had different interpretations of the events and organizations. The encyclopedia includes arguments with a variance of opinion, but through the referee process, I ensured that the historical facts were correct, even if people reached different conclusions.

The history of the Cold War demonstrates that the US and Soviet Union supported various movements for the purpose of maintaining influence, even if those movements engaged in horrible acts of genocidal violence and brutality. We document each of these cases candidly even if the facts are jarring to one’s political affinities. The US supported the Khmer Rouge in Cambodia even after the party killed some 2 million civilians and was deposed through armed intervention by Soviet-supported communist Vietnam. Even if the narrative histories are disturbing to Maoist supporters of the Khmer Rouge and other groups, it is crucial that we document the horrific unfolding of events.

Still, in the case of Cambodia, while it is easy to blame the Khmer Rouge for all the violence, history demonstrates that for more than 100 years, European and then U.S. colonialists bear responsibility for destroying a culture and society. So, I think that it is crucial to understand the imperialist antecedents that set the stage for militant separatism, as in the cases of the Khmer Rouge and the Shining Path.

Through peer review we selected the most erudite essays submitted on similar topics. Our goal was to have each entry provide a point of entry into a historical field of enquiry by supplying extensive references. Even in an eight-volume work, our objective was achieving historical significance while remaining comprehensive. We are updating this work next year to include any essays that are worthy.

Q: The situation in Iran has taken a dramatic turn over the past month. Is this a new stage of the revolution that began in 1978-79? A repudiation of it? Something provoked by the CIA? Part of a larger wave of protests stimulated (directly or indirectly) by the global economy? A predictable consequence of so much of the population being young and full of rising expectations?

A: Well, as a rule, we avoided entries on recent events of the last five to ten years, since the jury is still out and it is impossible at this point to gain more than a general sense of the social forces on the ground. Thus, while some recent events are included, the passage of time is essential to understand the forces at play. As such we excluded some of the “color revolutions,” as it is too soon to discern the various groups engaged in the contestation for power. I have noticed that some in the West have already dubbed the Iranian protests the “Green Velvet Revolution,” almost as if it were part of a branding process.

It appears that some sort of democracy is in play today, irrespective of the forces manipulating the protests for their personal or factional benefit. In the Encyclopedia one can learn that the democracy movement in Iran is not a recent phenomenon but endures from the decisive electoral victory of Mohammad Mossadegh in 1951, which represented a repudiation of British interference in Iran, a theme in the unfolding of events today. But the CIA-supported Shah Mohammad Reza Pahlavi’s 1953 military putsch went on to annihilate all democratic opposition. With all democratic forces crushed by the Shah, the Shiite Islamic clerics gained currency in the wake of the Iranian Revolution of 1979, just as Napoleon consolidated power after the French Revolution. No less, the popular will for democracy, equality, and popular control remains a significant force in Iran, as in nineteenth-century France. I think that while foreign meddling may have occurred, last month’s elections also reveal that the vision of a democratic and egalitarian society remains unvanquished.

By Scott McLemee (scott.mclemee@insidehighered.com)

Neve Gordon's Academic Freedom

It’s not easy to find a country in the Middle East whose universities honor academic freedom as we know it in most Western countries. Syria is a police state, comparable in some ways to North Korea or Myanmar. Iran has substantially become one. Egypt’s security police maintain a chilling presence on campus. The one country that maintains academic freedom is Israel, though of course not in the occupied territories. The comparative climate for intellectual debate in the region is too often ignored or slighted in discussions promoted by the various boycott movements. Simple intellectual honesty and political accuracy require that every discussion of Israeli academic conduct be framed with a reminder of the regional context. Otherwise, inadequately informed audiences can become victims of demagoguery, and an exceptionalist fantasy of Israeli monstrosity can be promoted.

But the dynamic of debate in the Israeli academy has suddenly changed, and part of the debate is now being conducted in American venues. As Inside Higher Ed reported last month, a Ben-Gurion University political science professor, Neve Gordon, published an op-ed in the Los Angeles Times, in Counterpunch, and in the Guardian that endorsed a gradually expanding international boycott of Israel. In her response, also published in the LA Times, Ben-Gurion University’s president, Rivka Carmi, ventured not only to castigate Gordon but also to redefine academic freedom in ways contrary to the traditions of the American Association of University Professors.

With these very troubling ideas circulating in the United States, a clear need for the AAUP to address the story has arisen. That need is underlined by the fact that several American scholars writing about the Middle East have either lost their jobs or had their tenure cases challenged because of their scholarly or extramural publications. Statements by Carmi and other Israeli administrators thus have the potential to help undermine academic freedom not only in Israel but elsewhere. These are in every sense worldwide debates.

As the Inside Higher Ed story points out, Gordon has been critical of Israeli conduct for some time. His protest columns regularly appear in The Nation here in the United States and in the Guardian in Britain, and he is the author of a 2008 book called Israel’s Occupation, published by the University of California Press. All this work, including the LA Times column, falls within his areas of academic specialization. It ranges from scholarly publication to extramural speech. It is all without question covered by academic freedom. Carmi’s assertion that the LA Times column “oversteps the boundaries of academic freedom — because it has nothing to do with it” is wholly unsupportable.

Gordon’s column, it is worth noting, adopts a somewhat different persona from that of a number of his other pieces about Israeli policy. It is not, for example, a straightforward protest against Israeli military actions, but rather a confessional staging of his anguished journey toward boycott advocacy: “as I watch my two boys playing in the yard, I am convinced that it is the only way that Israel can be saved from itself.” He has, he is suggesting, had a breakthrough amounting to a recovery of his humanity, something that, by implication, his opponents lack. Throughout his 2009 responses to the Gaza invasion he has been moving in that direction, suggesting earlier that he opposed Israel’s military action despite Hamas rockets falling near the home he shares with his children, and arguing that the invasion is distorting the humanity of Israeli children.

I am willing to believe that this tactic is both genuine and a calculated rhetorical strategy, but in either case it has probably contributed to the intensity of the response, since it frames the LA Times piece not as political polemic but as a personal narrative about, as he puts it, “the question that keeps me up at night.” It thus has special power to move ordinary readers, and many of those readers here and abroad have responded passionately. Publishing the column in the United States, rather than Israel, was, to be sure, a deliberate provocation. It moved the argumentative terrain to that of Israel’s major military and political ally, and to the home of many of Israel’s and his own university’s most important donors. The affront was not simply in what he said but where he said it, though it is hardly the first time Israeli scholars of both the Right and the Left have brought these debates to American shores. The response both here and in Israel has been intense. As we saw in the Ward Churchill case, academic freedom does not always fare well in a public firestorm.

The public response called for a principled defense of academic freedom by President Carmi. Instead, she made herself part of the public outcry against Gordon. Worse still, Carmi has sought to narrow academic freedom and undermine the protections it offers, calling Gordon’s column an effort “to advocate a personal opinion, which is really demagoguery cloaked in academic theory.” The notion that a political scientist cannot combine academic arguments with conclusions, theory with advocacy, strikes at the heart of the principle that academics have the right to advise the public and seek an impact on public policy. As Matthew Finkin and Robert Post argue effectively in their 2009 book For the Common Good: Principles of American Academic Freedom, faculty speech in scholarly venues and in the classroom cannot be protected (and cannot fully serve society) if faculty members are not also free to deploy their expertise in the public sphere without fear of government or university reprisal.

Gordon calls for a boycott of the state of Israel, thereby advocating something much more comprehensive than the focused boycott of academic institutions that the AAUP opposes. Some Israeli commentary claims Gordon’s remarks amount to treason, a dangerous and overheated accusation that responsible opinion must reject. Gordon is in fact performing his job as a political scientist and following reasoned moral and professional standards in doing so. Even if he were not a political scientist, he would have the right to say these things, but as a political scientist who writes about Israeli policy he has a disciplinary justification to offer advice and opinion in the public sphere. But academic freedom should protect still more extreme statements than those Gordon has made; it should hold harmless a faculty member who argues that his or her country has no moral or political legitimacy and thus no right to exist.

Extramural statements by faculty are especially vulnerable in times of national crisis. The United States can hardly be said to have protected them during World War I or in the McCarthy period. Many in the Middle East, including many Israelis, consider themselves to be in a permanent state of war. In many area countries Gordon would already be imprisoned or worse. In Israel his right to public speech is being eloquently defended by many both within and without the academy — but not, deplorably, by his own university administration. On several Israeli campuses petitions supporting Gordon have circulated, and a number of scholars have come to his defense. Once again, such robust debate hardly typifies all area countries.

Since Gordon is tenured and cannot be fired, Carmi instead bellowed that he “has forfeited his ability to work effectively within the university setting.” A few days before publishing her LA Times piece, Carmi had already urged Gordon to resign, a view endorsed by Ben-Gurion University’s rector and faculty member Jimmy Weinblatt.

On August 28th, Ilana Curiel reported in Israel News that Carmi and Weinblatt were also exploring options for removing Gordon as department head. There, it should be clear, Ben-Gurion administrators are on more secure ground. In the United States a faculty member serving as an administrator -- including a department chair -- is essentially an at-will employee. He or she can be removed from an administrative post and returned to the faculty for displeasing a supervisor. In a case as public as this one, of course, Carmi will be contemplating the fallout from a decision to force Gordon out of his chairmanship, so a good deal more than simple line administrative authority is at stake.

Indeed it has been clear from the outset, as Carmi openly acknowledged in an August 27th letter to Ben-Gurion faculty, that donor anger is a major factor in her attacks on Gordon. Inside Higher Ed reported that Amos Drory, Ben-Gurion’s vice president for external affairs, wrote to complaining donors to say “the university is currently exploring the legal options to take disciplinary action.” It is not the first time fund-raising priorities, not principle, have shaped administrative understandings of academic freedom, but that does not blunt the lesson that this represents one of the most severe threats to academic freedom.

Carmi’s own academic freedom, one may note, would have allowed her to reject Gordon’s views while asserting his right to hold them. That is, in effect, what Gordon recommended: “She has to cater to the people that provide the money, so a strong letter of condemnation of my views would have been fine with me. But there’s a difference between saying you disagree with me, and threatening me.” Instead she mounted an international assault and sought to gut academic freedom in the process. While Gordon has job security, his vulnerability to myriad other forms of internal reprisal is obvious. There are many kinds of research support and institutional recognition that require administrative endorsement. More serious still is the message Carmi has sent to untenured and contingent faculty: exercise your academic freedom at your peril. The chilling effects at Ben-Gurion University have hardened into a deep freeze. There is reason for principled faculty to question the president’s ability to serve in her position.

By Cary Nelson (info@insidehighered.com)

Cary Nelson is national president of the American Association of University Professors.

The King of Pompeii

The term “neoconservative” is now routinely applied to any right-wing policy wonk inside the Beltway or the mass media. This usage reflects no understanding of the movement's history – or, often enough, a largely delusional notion of it, based on third-hand guesses about the influence of Leo Strauss and Leon Trotsky. Such rumors tend to be circulated by people who would be hard pressed to name a single book by either of them, let alone to grasp that their ideas were utterly incompatible.

Properly used, the label applies to a rather small cohort of social scientists and journalists who, during the 1950s and ‘60s, became anxious about Communist influence abroad – but equally uneasy at movements for black power, women’s liberation, and (a bit later) gay rights within the United States. “We regarded ourselves originally as dissident liberals,” wrote Irving Kristol, who died last week at the age of 89.

Kristol is often, and rightly, called the godfather of neoconservatism (in something akin to the Marlon Brando sense). An editor at the CIA-funded journal Encounter during the 1950s, he was later one of the founders of the journal The Public Interest, and a columnist for The Wall Street Journal.

“We were skeptical of many of Lyndon Johnson’s Great Society initiatives," he wrote in Neoconservatism: The Autobiography of an Idea (Free Press, 1995), "and increasingly disbelieving of the liberal metaphysics, the view of human nature and of social and economic realities, on which those programs were based. Then, after 1965, our dissidence accelerated into a barely disguised hostility. As the ‘counterculture’ engulfed our universities and began to refashion our popular culture, we discovered that traditional ‘bourgeois’ values were what we believed all along, had indeed simply taken for granted.”

Translating this perception of themselves as the very guardians of civilization into a politically efficacious movement was not a swift or simple matter. Nor was the Republican Party its obvious or immediate destination. With Kristol as its helmsman, the movement built up a network of magazines, think tanks, and mass-media perches for punditry. These amounted to a counter-counterculture. Thirty years ago, Peter Steinfels’s intelligent and well-researched book The Neoconservatives: The Men Who Are Changing America’s Politics provided a group portrait of the movement on the eve of Ronald Reagan’s election – an ideological cohort with one foot planted in each party.

This wide stance cannot have been comfortable. And in any case, the political realignment of 1980 settled the matter. Reagan was, Kristol wrote in 1995, “the first Republican president since Theodore Roosevelt whose politics were optimistically future-oriented rather than bitterly nostalgic or passively adaptive. The Congressional elections of 1994 ratified this change, just as the person of Newt Gingrich exemplified it. As a consequence, neoconservatism today is an integral part of the new language of conservative politics.”

Indeed it is, for better or worse. But not through sheer intellectual firepower alone. As that flourish of tribute to “the person of Newt Gingrich” may suggest, the progress of neoconservatism has also involved cultivating the courtier’s grace. There is a knack for knowing just when to apply one’s lips to the fundament of power.

In the early 1990s, it was still possible for Gary Dorrien, now a professor of ethics at Union Theological Seminary and of religion at Columbia University, to write a critical but sympathetic book called The Neoconservative Mind: Politics, Culture, and the War of Ideology (Temple University Press, 1993) that treated it primarily as a movement of ideas, locked in struggle against the prevailing drift of American society. It would be difficult to write about the intervening years in similar terms. Neoconservatism itself became part of that prevailing drift.

Whatever elan its intellectuals once displayed in challenging accepted ideas and trends turned into the kind of second-hand energy available from just going with the flow. This is not good for anyone's critical faculties. A new book by Sam Tanenhaus called The Death of Conservatism, published by Random House, spells out some of what has happened.

Tanenhaus, the editor of The New York Times Book Review, might fairly be called a fellow-traveler of neoconservatism, if not a full-fledged member of its counter-counterculture. His criticism is presented, not in the spirit of polemic, but with the tone of someone grappling with home truths. “During the two terms of George W. Bush,” he writes, “conservative ideas were not merely tested but also pursued with dogmatic fixity, though few conservatives will admit it, just as few seem ready to think honestly about the consequences of a presidency that failed not because it ‘betrayed’ movement ideology but because it often enacted that ideology so rigidly: the aggressive unilateralist foreign policy; the blind faith in a deregulated, Wall Street-centric market; the harshly punitive ‘culture war’ waged against liberal enemies.”

There is considerable nostalgia in Tanenhaus’s evocation of an earlier period, when argument “about the nature of government and society, and about the role of politics in binding the two” was conducted by “a small group of thinkers and writers” whose ideas “then ramified outward to become a broader quarrel that shaped, and at times defined, the political stakes of several generations.”

But now this is all just a memory. “Today’s conservatives resemble the exhumed figures of Pompeii,” he writes, “trapped in postures of frozen flight, clenched in the rigor mortis of a defunct ideology.”

I picture them clutching signs that read “Keep the government out of Medicare,” in Latin.

By Scott McLemee (scott.mclemee@insidehighered.com)

Palintology

Important as it was, the campaign of Barack Obama was not the only history-making element of the 2008 presidential election. With Sarah Palin, we crossed another epochal divide. The boundary between reality television and American politics (already somewhat weakened by the continuous "American Idol" plebiscite) finally collapsed.

Her campaign's basic formula was familiar: members of an ordinary middle-class family turn into instantly recognizable national celebrities while competing for valuable prizes.

But like any contestant at this late stage of an already decadent genre, Palin seemed much less conscious of the stakes of the game (power) than of how it let her broadcast her own sense of herself.

At that level she could not lose – the ballot box notwithstanding. I’m not sure what Sarah Palin’s favorite work of postmodern theory might be (all of them, probably) but she seems to take her lead from Jean Baudrillard’s Seduction. Other political figures use the media as part of what JB calls “production.” That is, they generate signs and images meant to create an effect within politics. For the Baudrillardian “seducer,” by contrast, the power to create fascination is its own reward.

Watching Palin respond to questions about her book Going Rogue (or not respond to them, often enough) is, from this perspective, no laughing matter. She grows ever more comfortable talking about herself. If no more capable of simulating knowledge of public issues, she is getting her story straight, more or less. And this matters. For now she does not have to be accurate, just coherent. She is consolidating her presence, her "brand." Teams of professional ideologists can feed Palin her lines later.

Is this too cynical? I fear it may not be cynical enough. For it assumes that Palin will eventually be integrated into her party’s apparatus and turned into a mouthpiece of old-school Republican electoral politics -- a basic platform of tax cuts for the rich and unregulated handgun ownership for everybody else.

That is not the only possible outcome, however. Someone with Palin’s developing command of the arts of media seduction -- and whose knack on that score is largely a matter of her performative maverickiness -- has the potential to change the rules of the game.

The editors of a new collection of essays called Going Rouge – a punning title that belies its basic seriousness – recognize that in Palin we may have something more than a new celebrity. “No one speaks of McCainism or Doleism,” write Richard Kim and Betsy Reed in their introduction, “but Palinism signals not just a political position but a political style, a whole way of doing politics.”

The volume itself is the product of a whole new way of doing serious nonfiction. It is the first title from OR Books, which has a staff, so far, of two people. One of them is Colin Robinson, who roughly this time last year lost his job as an editor at Simon and Schuster. He tells me that OR now has two offices. One is the coffee shop where he and his partner John Oakes (co-founder of independent publisher Four Walls Eight Windows) work in the morning. The other is the bar they go to at night.

When we talked earlier this year, Robinson described his idea for a new kind of trade publishing. The usual approach is to print an enormous number of copies of a title to get an economy of scale, then give large discounts to chain bookstores – leaving almost no money to promote it. For serious nonfiction, this was a miserable system. Any money for advertising tended to go to publicize, say, The Stephen King Cookbook or suchlike. (Palin's autobiography is an example of a book enjoying just such heavy promotion.)

His plan, Robinson said, would be to publish a few titles that he thought were worthwhile, making them available as e-books and print-on-demand paperbacks -- and then concentrate on advertising them online, among other ways via video. So far you have to buy Going Rouge directly from the publisher (it sold about 4,000 copies before its official publication date on November 16) but it will be available for order from bookstores next month.

Most of the chapters are reprints from magazines such as The New Yorker, The New Republic, and The Nation; a few first appeared on Web sites. The list of contributors is a Who’s Who of left-leaning journalists and commentators: Max Blumenthal, Juan Cole, Naomi Klein, Rick Perlstein, and Katha Pollitt, among others. There are a few critical evaluations of Palin by her fellow Republicans, including one by a conservative columnist who suggests that she makes George W. Bush “sound like Cicero.” The editors also reprint a number of interviews with and public statements by Palin herself – among them, selections from her Twitter and Facebook writings.

A celebration, then, it is not. But Going Rouge does represent an acknowledgment of Palin’s importance, ambiguous though the precise nature of that importance may be. It cannot be reduced to her short-term plans. She remains circumspect about them, for now anyway. But she is busy demonstrating a strong intuitive grasp of how mass media can be used – among other things, to change the subject.

An example is the item Palin posted on Facebook in early August: “The America I know and love is not one in which my parents or my baby with Down Syndrome will have to stand in front of Obama’s ‘death panel’ so his bureaucrats can decide, based on a subjective judgment of their ‘level of productivity in society,’ whether they are worthy of health care. Such a system is downright evil.”

This was fantasy. But it was effective fantasy. To borrow again from Baudrillard, it seduced -- abolishing reality and replacing it with a delirious facsimile.

The editors of Going Rouge give Palin credit for the rhetorical power generated by her words, and perhaps also by her canny use of the social-networking venue: “With remarkable economy of prose, Palin cast health care reform as an assault on the country, put a face on its supposed victims (her baby Trig), coined the expression ‘death panel’ (linking it directly to Obama), raised the specter of euthanasia in the service of a state-run economy, and rallied the troops around a fight against ‘evil.’ In short, she personalized, popularized, and polarized the debate. Never mind that Democratic health care reform bills merely funded optional end-of-life consultations that had heretofore been almost universally acknowledged as a good. (Indeed, Palin herself once championed them in Alaska.)”

Well, consistency is, after all, the hobgoblin of tiny minds. Sarah Palin is playing the political game on a much grander scale -- with rules she may be rewriting as she goes.

With a first printing of 1.5 million copies of her book, I don’t know that the intervention of an upstart press can pose much of a challenge. But OR Books deserves credit for trying. Someone has to speak up for reality from time to time. Otherwise it will just disappear.

By Scott McLemee (scott.mclemee@insidehighered.com)

The Eggheads Scramble

Two journals from opposite ends of the political spectrum have just run discussions of the role of American intellectuals in the age of Obama. It is the sort of coincidence that seems meaningful – or would, at least, if small-circulation magazines played the role they once did in shaping discussions about culture and politics. These days the Zeitgeist prefers to express itself on basic cable.

Tevi Troy’s essay “Bush, Obama, and the Intellectuals” appears in the third issue of National Affairs. When it first showed up on newsstands last year, the graphic design and wonky substance of National Affairs made it look as if it had been cloned from the genetic remains of The Public Interest, the flagship journal of neoconservatism, published between 1965 and 2005. Which it turns out is more or less the case. The editors “strive to walk in the footsteps of our intellectual and institutional predecessor,” they write, calling their predecessor “a journal that for decades enriched our public life with its unparalleled clarity and wisdom.”

Meanwhile the social-democratic journal Dissent began running its symposium “Intellectuals and Their America” in its winter number; part two is in the spring issue. The contributors have included Jackson Lears, Martha Nussbaum, Katha Pollitt, Michael Tomasky, Sam Tanenhaus, and Leon Wieseltier, among others. The common denominator is that these figures do not belong to Dissent’s editorial board. Nor are they, to my recollection, frequent contributors.

Here, too, the mood is elegiac. Dissent’s editors invoke “Our Country and Our Culture,” the symposium that ran across several issues of Partisan Review in 1952 – when that magazine was, they note, “near the apex of its influence.” One detects a wistfulness.

In 2002, Tevi Troy, who is now a visiting fellow at the right-wing Hudson Institute, came out with Intellectuals and the American Presidency: Philosophers, Jesters, and Technicians, published by Rowman and Littlefield. His new article is a serviceable précis of the book, offering a quick assessment of how several presidents have sought to court the intelligentsia.

No easy task, for they are a fickle lot: a milieu "with its own, often low-minded, politics and culture," writes Troy, "and its own complex connections to the popular culture and the rough-and-tumble of American politics.”

Obama came into office enjoying much goodwill -- especially from the sort of citizen who can follow the implications of an allusion to Reinhold Niebuhr. But this is not a blank check. Obama needs to avoid “underestimat[ing] the damage he would suffer if the cultural and academic elites who have backed him so far suddenly turned their knives against him. Precisely because Obama’s presidency rests, in part, on his status as a cultural phenomenon, he would pay a heavy price for losing their support.”

But for practical guidance, the president would need to turn from the article to the appendix to Intellectuals and the American Presidency, where Troy presents a set of maxims on how the White House can handle this constituency.

“Don’t make meetings with intellectuals public,” he warns, “and don’t reveal what was said in any official way.... Do use the president’s meal times liberally as a way to garner support from intellectuals. Even if you don’t back their policies, few people will refuse a free meal at the White House.... Do not, as president, publicly rely on think-tank guidance.... Do let it be known when the president is reading a popular work by a well-known scholar, as long as it is not a Swedish planning text, à la Michael Dukakis.”

Troy served in a number of positions during the presidency of George W. Bush. His article in National Affairs includes the most wonderfully counterintuitive sentence anyone has written in some time: “As an institutional matter, Bush’s outreach to intellectuals could well serve as a model for future presidents....”

Anybody craving documentation for this arresting claim should recall one of Troy’s maxims: “Do expect that any intellectual in the White House will produce a book describing the experience.” The implications seem clear. I look forward to Troy’s memoirs.

The Dissent symposium does not explicitly address the change of occupants in the White House. But its timing suggests that as a subtext; and so does the editors' reference to “Our Country and Our Culture.”

The joke about Partisan Review in the 1940s was that its offices contained special typewriters with the word “alienation” on one key. So when PR held its legendary symposium in 1952, the editors’ willingness to use the first-person possessive pronoun was a meaningful gesture. It suggested the end of alienation; it signaled that intellectuals were ready to accept a place in the scene before them.

And so things stand again now, perhaps -- after eight years when the unofficial national slogan amounted to “Ignorance is Strength.”

The first part of “Intellectuals and Their America” is now available online, with more responses to follow. While I was initially intrigued by the idea (several of the contributors are writers whose work is always worth making time to read), the cumulative effect of reading it has been disappointment and discouragement. For the most salient thing about “Intellectuals and Their America” is how lackluster the whole enterprise seems -- how vigor-free the taking of positions.

Not to deny that certain simulations of polemic are attempted. But they prove tired and rote.

In the second part of the symposium (not yet online), Jean Bethke Elshtain, a professor of social and political ethics at the University of Chicago, revisits a familiar complaint: professors are too inclined to leftist groupthink. “I refer to Harold Rosenberg,” she writes, “who in 1948 characterized the contemporary academy as ‘the herd of independent minds.’ ”

Except that he didn’t. Rosenberg's barbed phrase was not aimed at the academy. He was complaining about his colleagues, those New York Intellectuals of song and legend, who in those days seldom gave university life a second thought. Yet their tendency to assume positions in politics and culture appeared awfully well-synchronized, even so.

Nor is this a defect of avant-garde-inclined thinkers only, as Elshtain herself should know. She was part of that herd of liberal intellectuals who offered arguments in favor of the Iraq war in 2003 -- often proving quite strenuous in declaring themselves independent-minded on that score.

In his contribution to the symposium, Michael Eric Dyson, a professor of sociology at Georgetown University, wants to put in a good word for public intellectuality: “Let’s not pretend that quarantining the life of the mind to the academy hasn’t at times made the rest of the culture sick.”

Setting aside the special qualities of this metaphor (it is both inapposite and incoherent), what seems most striking here is the implication that his point will prove controversial, somehow. Again, the contributor’s own example belies his complaint. Dyson’s impressive academic career has been built almost entirely around trade-press books and mass-media appearances.

An army of university publicists works to make faculty part of the public conversation. The real issue is the quality of their interventions. Let's not pretend that generating soundbites on the topic of the day qualifies as a contribution to intellectual life, as such.

Once, the publication of this kind of symposium in a journal might have clarified what was at stake in arguments among intellectuals. It could have left participants, and readers, with a sense of the state of the nation and its culture.

And indeed, one of the most important things about “Our Country and Our Culture” was that three contributors to it -- Irving Howe, Norman Mailer, and C. Wright Mills -- clearly defined themselves as unhappy with the drift of the discussion. They were sufficiently opposed to its implicit invitation to join the American consensus that, two years later, they joined forces to help start a magazine called Dissent. And then, after a few more years, they dissented so much that they parted ways. (It is a tradition.)

If “Intellectuals and Their America” suggests that all life has gone out of the symposium as ritual, that has little to do with the era itself. The stakes now seem high enough. But the shape of public space itself has changed. Intellectual life is not a herd of independent minds. Rather, it involves any number of herds, some of them more furious than others. And the shepherding role of any given publication is now severely limited. I feel as much nostalgia for old formats as anyone, but this much seems clear: imitating a model from the Truman years seems a poor spur to intellectual debate in the Obama era.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Storytelling

Once upon a time -- long, long ago -- I spent rather a lot of time reading about the theory of narrative. This was not the most self-indulgent way to spend the 1980s, whatever else you can say about it. Arguably the whole enterprise had begun with Aristotle, but it seemed to be reaching some kind of endgame around the time I was paying attention. You got the sense that narratologists would soon be able to map the genome of all storytelling. It was hard to tell whether this would be a good thing or a bad thing, but they sure seemed to be close.

The turning point had been the work of the Russian folklorist Vladimir Propp. In the late 1920s, he had broken down 100 fairy tales into a set of elementary “functions” performed by the characters, which could then be analyzed as occurring in various combinations according to a handful of fixed sequences. The unrelated-seeming stories were just variations on a very few algebraic formulas.
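
Propp's reduction is concrete enough to compute with. As a minimal sketch of the idea (the abbreviated function inventory and the sample tales below are my own toy stand-ins, not Propp's actual list of 31 functions), a tale can be encoded as a sequence of function labels and checked against one fixed canonical ordering:

```python
# Toy Propp-style analysis: a tale reduces to a sequence of "functions,"
# and a well-formed tale presents its functions in one fixed canonical
# order (not every function need occur in every tale). The inventory
# here is an abbreviated stand-in, not Propp's actual 31 functions.

CANONICAL_ORDER = [
    "absentation",   # a family member leaves home
    "interdiction",  # the hero is warned
    "violation",     # the warning is ignored
    "villainy",      # the villain causes harm
    "departure",     # the hero sets out
    "struggle",      # hero and villain meet directly
    "victory",       # the villain is defeated
    "return",        # the hero comes home
    "wedding",       # the hero is rewarded
]
RANK = {name: i for i, name in enumerate(CANONICAL_ORDER)}

def is_well_formed(tale):
    """True if the tale's functions appear in canonical order."""
    ranks = [RANK[function] for function in tale]
    return all(a <= b for a, b in zip(ranks, ranks[1:]))

# Two tales with no surface resemblance satisfy the same formula:
print(is_well_formed(["interdiction", "violation", "villainy", "victory"]))  # True
print(is_well_formed(["absentation", "villainy", "struggle", "wedding"]))    # True
print(is_well_formed(["return", "villainy"]))  # False: out of order
```

The point of the toy is Propp's point: stories that look unrelated on the surface can turn out to be variations on the same underlying formula.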

Of course, fairy tales tend to be pretty formulaic to begin with -- but with some tweaking, Propp's approach could be applied to literary texts. By the 1960s, French structuralist critics such as Roland Barthes and Gérard Genette were analyzing the writings of Poe and Proust (not to mention James Bond novels) to extract their narrative DNA. And then came Hayden White’s Metahistory: The Historical Imagination in Nineteenth-Century Europe (1973), which showed how narratology might be able to handle nonfiction. White found four basic modes of “emplotment” -- romantic, comic, tragic, and satirical -- in the storytelling done by historians.

It was obviously just a matter of time before some genius came along to synthesize and supersede all of this work in a book called Of Narratology, at least half of which would be written in mathematical symbols. The prospect seemed mildly depressing. In the end, I was more interested in consuming narratives (and perhaps even emitting them, from time to time) than in finding the key to all mythologies. Apart from occasional returns to Peter Brooks's Reading for the Plot: Design and Intention in Narrative (1984) -- the only book on the topic I recall with any pleasure -- narratology is one of those preoccupations I long since left behind.

And so Christian Salmon’s Storytelling: Bewitching the Modern Mind reads like a dispatch from the road not taken. Published in France in 2007 and recently issued in English translation by Verso, it is not a contribution to the theory of narrative but a report on its practical applications. Which, it turns out, involve tremendous amounts of power and money -- a plot development nobody would have anticipated two or three decades ago.

“From the mid-1990s onward,” writes Salmon, concentration on narrative structure “affected domains as diverse as management, marketing, politics, and the defense of the nation.” To a degree, perhaps, this is obvious. The expression “getting control of the narrative” has long since become part of the lexicon of mass-media knowingness, at least in the United States. And Salmon -- who is a member of the Centre for Research in the Arts and Language in Paris and a columnist for Le Monde -- has one eye trained on the American cultural landscape, seeing it as the epicenter of globalization.

Roughly half of Salmon’s book is devoted to explaining to French readers the history and nuances of such ubiquitous American notions as “spin” and “branding.” He uses the expression “narratocracy” to characterize the form of presidential leadership that has emerged since the days of Ronald Reagan. The ability to tell a compelling story is part of governance. (And not only here. Salmon includes French president Sarkozy as a practitioner of “power through narrative.”)

Less familiar, perhaps, is the evidence of a major shift toward narrative as a category within marketing and management. Corporations treat storytelling as an integral part of branding; the public is offered not just a commodity but a narrative to consume. He quotes Barbara Stone, a professor of marketing at Rutgers University: “When you have a product that’s just like another product, there are any number of ways to compete. The stupid way is to lower prices. The smart way is to change the value of the product by telling a story about it.” And so you are not just buying a pair of pants, for example, but continuing the legacy of the Beat Generation.

“It is not as though legends and brands have disappeared,” writes Salmon. But now they “talk to us and captivate us by telling us stories that fit in with our expectations and worldviews. When they are used on the Web, they transform us into storytellers. We spread their stories. Good stories are so fascinating that we are encouraged to tell them again.”

Other stories are crafted for internal consumption. Citing management gurus, Salmon shows the emergence of a movement to use storytelling to regulate the internal life of business organizations. This sometimes draws upon the insights of narrative artists of canonical renown, as in books like Shakespeare on Management. (Or Motivational Secrets of the Marquis de Sade, if I can ever sell that idea.) But it also involves monitoring and analyzing the stories that circulate within a business – the lore, the gossip, the tales that a new employee hears to explain how things got the way they are.

An organization’s internal culture is, from this perspective, the totality of the narratives circulating within it. “It is polyphonic,” notes Salmon, “but it is also discontinuous and made up of interwoven fragments, of histories that are talked about and swapped. They can sometimes be contradictory, but the company becomes a storytelling organization whose stories can be listened to, regulated, and, of course, controlled ... by introducing systematized forms of in-house communications and management based upon the telling of anecdotes.”

At the same time, the old tools of structuralist narratology (with its dream of reducing the world’s stock of stories to a few basic patterns) are being reinvented as an applied science. One management guru draws on Vladimir Propp’s Morphology of the Folktale in his own work. And there are software packages that “make it possible to break a narrative text down into segments, to label its main elements and arrange its propositions into temporal-causal sequences, to identify scenes, and to draw up trees of causes and decisions.”
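
It is not hard to guess, in rough outline, what the data model behind such a package might look like. The following sketch is conjectural (the Segment class, its labels, and the causal_chains helper are my own inventions, not any actual product's API), but it shows how labeled narrative segments linked by temporal-causal edges yield the sort of “trees of causes and decisions” Salmon mentions:

```python
# A conjectural minimal data model for narrative-analysis software
# (purely illustrative; no actual product's API is described here):
# a narrative is a list of labeled segments, with explicit
# temporal-causal links between them.

from dataclasses import dataclass, field

@dataclass
class Segment:
    ident: str
    text: str
    label: str                                   # e.g. "scene", "decision", "outcome"
    causes: list = field(default_factory=list)   # idents of downstream segments

def causal_chains(segments, start):
    """Enumerate every temporal-causal path from a starting segment."""
    index = {segment.ident: segment for segment in segments}
    def walk(ident, path):
        node = index[ident]
        path = path + ["{}: {}".format(node.label, node.text)]
        if not node.causes:          # leaf: a complete chain ends here
            yield path
        for successor in node.causes:
            yield from walk(successor, path)
    return list(walk(start, []))

story = [
    Segment("s1", "The founder quits", "scene", ["s2"]),
    Segment("s2", "The board must pick a successor", "decision", ["s3", "s4"]),
    Segment("s3", "An insider is promoted", "outcome"),
    Segment("s4", "An outsider is hired", "outcome"),
]
for chain in causal_chains(story, "s1"):
    print(" -> ".join(chain))
```

Enumerating every path through such a graph is all a "tree of causes and decisions" amounts to; the commercial packages presumably add the hard part, which is getting from raw prose to the labeled segments in the first place.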

One day corporations will be able to harvest all the stories told about them by consumers and employees, then run them through a computer to produce brand-friendly counter-narratives in real time. That sort of thing used to happen in Philip K. Dick's paranoid science-fiction novels, but now it's hard to read him as anything but a social realist.

All of this diligent and relentless narrativizing (whether in business or politics) comes as a response to ever more fluid social relations under high-speed, quick-turnover capitalism.

The old system, in which big factories and well-established institutions were central, has given way to a far more provisional arrangement. Storytelling, then, becomes the glue that holds things together -- to the degree that they do.

The “new organizational paradigm,” writes Salmon, is “a decentralized and nomadic company…that is light, nimble, and furtive, and which acknowledges no law but the story it tells about itself, and no reality other than the fictions it sends out into the world.”

Not long after Storytelling originally appeared in 2007, the world’s economy grew less forgiving of purely fictive endeavors. The postscript to the English-language edition offers Salmon’s reflections on the presidential campaign of 2008, with Barack Obama here figured as a narratocrat-in-chief “hold[ing] out to a disoriented America a mirror in which shattered narrative elements can be put together again.”

This, it seems to me, resembles an image from a fairy tale. The “mirror” is a magical implement restoring to order everything that has been tending towards chaos throughout the rest of the narrative. Storytelling is a smart and interesting book, for the most part, but it suffers from an almost American defect: the desire for a happy ending.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Perils of Presidential Parallels

My cyber-savvy son recently e-mailed a message board entry he’d spotted on Google, titled “Stupid UnAmerican Writers Like Robert Schmuhl,” along with a tender, filial sentiment: “Haha!”

The day before, an essay I’d written was posted on one website before quickly finding its way to several others. Some reprinted the entire article, while others offered selected quotations and pointed reactions focusing on the sanity, seriousness and style of the author.

Maybe it was the question AOL’s Politics Daily posed in its headline introducing the initial internet iteration: “Is Sarah Palin the Next Barack Obama?” Maybe it was the consideration of strange-yet-true similarities between the former governor and the current president. Maybe it was the yoking together of polar-opposite political figures who provoke don’t-confront-me-with-the-facts convictions among devout supporters.

Whatever the case, the virility of today’s viral blogosphere robustly flexed its inflamed muscles. More quickly than a garrulous pedagogue can finish answering a student’s query, poison darts — composed in high dudgeon How-Dare-You? rage — started to clutter my inbox and to show up on the net. Defenders of both Palin and Obama attacked their keyboards to deride and denigrate any suggestion of parallels. Some responsorial eruptions matched or exceeded the word count of the original article.

For over three decades of teaching and writing about contemporary American politics, I’ve tried (however vainly) to make sense of the forces, patterns and trends animating our civic life. With Palin scheduled to deliver a major speech in Iowa, the first state in the 2012 presidential nominating process, it struck me that she might be following a path somewhat similar to Obama’s in 2008.

  • Both figures emerged with stunning rapidity on the national scene and possess media magnetism.
  • Both used major speeches at their parties’ national conventions as their national political launching pads.
  • Both produced well-publicized and best-selling books to flesh out their life histories and views.
  • Both have somewhat exotic backgrounds, by virtue of their upbringings in states distant from the contiguous U.S.
  • Both arrived at national prominence with limited experience in upper-level governmental service.
  • Both positioned themselves as outsiders as they became more widely known, with a willingness to take on their parties’ establishments and Washington’s traditional ways.
  • Both inspire intense followings that stay connected through social media.

I could continue — and did — taking note early on of “their continent-spanning differences on issues and ideology.” Though that point was repeated near the end, where I mentioned politicians often face now-or-never moments of decision in their careers, gentle readers seemed inclined to disregard or dismiss the basic, non-partisan facts. Palinites were aggrieved to see any comparisons to their hero, and Obamaniacs felt the same way — but from the opposite perspective.

In retrospect, I probably should have expected the charges of “intellectual laziness” and worse that the essay evoked. Back in early 2007, even before Obama announced his presidential candidacy, I had composed a similar confection of comparative analysis, which the Chicago Tribune ran as a Sunday feature headlined “Reagan and Obama: Not So Different?”

Well, as Ronald Reagan began so many sentences, that explanatory effort (again with appropriate flashing warning signs that the two figures were “distinctly different”) proved that in the 21st century the words flog and blog have become synonymous. One cyber spitball sticks in my memory like a wad of discarded gum on a shoe: “Good grief! Barack baby wouldn’t make a pimple on Ronald Reagan’s posterior.”

For someone who refuses to choose political sides (never even voting in races I write or talk about) and who just tries to interpret civic affairs fairly, I suppose the reactions to my describing more-than-coincidental parallels reflect the partisan, polarized toxicity infecting American political culture today. Even if you assiduously avoid taking a stand, others will do so for you, at high volume. A discourse of conflict is de rigueur now.

In classes and articles, I repeat like a broken record — a retro expression suggesting advancing age — that the phrase “media bias” is considerably more than a knee-jerk epithet to be tossed around as an all-purpose, back-of-the-hand complaint of journalistic prejudice. The two words, joined in unholy cohabitation, are rife with complexities that require inquisitive minds and individual judgments.

Bias of some kind is inherent in all human communication, but that doesn’t mean every source of information approaches a story with a preordained perspective or agenda-advancing opinion. Especially in coverage of government and politics, the emphasis in mainstream outlets tends to revolve around institutional criteria (conflict, novelty, consequences and the like) rather than ideological ones, with more concern for accountability than advocating a cause.

Saying this usually sends hidebound conservatives and liberals into paroxysms of disbelief — but that’s really the way it is in traditional newsrooms. Increasingly, though, in this take-no-prisoners political environment, the perception of bias can be self-generating, with an individual bringing personal, preconceived dispositions to whatever’s read, seen and heard.

A work of journalism can be as straight and as balanced as possible, but people in the audience impose their own slant, basing their reactions on firmly planted thoughts and emotions. In this process, neutrality rapidly morphs into partiality — and we’re off to the races of full-throated rejoinders to an ignominious outrage of, say, identifying similarities between two public figures of competing parties.

For an academic more accustomed to point-making than point-scoring, today’s ecosystem of information is both boon and bane. Outlets abound to disseminate arguments and analysis to audiences never before imagined, yet those messages can be misinterpreted by people more fixated on how they already think than on learning something new.

As the digital denunciations of the Palin-Obama disquisition piled up, I lamented to a friend that nobody seemed to be reading the article with any semblance of objectivity. He provided comfort by paraphrasing Oscar Wilde: “The only thing worse than being read is not being read.” So it goes.

The story’s often told that H.L. Mencken’s mother once asked the 20th century’s most incendiary pundit-cum-provocateur: “What are you doing, Harry?”

With alacrity, the sage of Baltimore shot back: “I’m stirring up the animals.”

In today’s political and communications world, it’s possible to stir up the animals of every species, phylum and partisan orientation without even trying. Mencken probably would have reveled in our raucous and interactive age, but some of us might intrude a worry, now and then, about democracy and its discontents.

Author/s: 
Robert Schmuhl
Author's email: 
info@insidehighered.com

Robert Schmuhl is Walter H. Annenberg-Edmund P. Joyce Professor of American Studies and Journalism and director of the John W. Gallivan Program in Journalism, Ethics and Democracy at the University of Notre Dame. His collection of essays, In So Many More Words: Arguments and Adventures, has just been published by University of Notre Dame Press.

The Last Utopia

The German critic Walter Benjamin once gave a set of satirical pointers about how to write fat books -- for example, by making the same point repeatedly, giving numerous examples of the same thing, and writing a long introduction to outline the project, then reminding the reader of the plan as often as possible. Whether or not they are aware of doing so, many academic authors seem to follow his advice closely. Samuel Moyn's The Last Utopia: Human Rights in History, published by Harvard University Press, is a remarkable exception. Its survey of the legacy of ideas later claimed as cornerstones of the politics of human rights is both dense and lucid; its challenging reassessment of recent history is made in a little over two hundred pages. It's almost as if the book were written with the thought that people might want to read it.

After writing a review of The Last Utopia, I interviewed the author by e-mail; a transcript follows. Moyn is a professor of history at Columbia University and the editor of Humanity: An International Journal of Human Rights, Humanitarianism, and Development, published by the University of Pennsylvania Press.

Q: Describing your book as "a critique of the politics of human rights" has occasionally gotten me puzzled looks. After all, what's to criticize about human rights? How do you describe or explain your project in The Last Utopia?

A: As a historical project, The Last Utopia mainly tries to sort out when "international human rights" -- whether as a set of concepts or a collection of movements -- came about. I conclude: pretty recently. They are slightly older as a set of concepts than as a collection of movements, but in both senses they came to prominence in the 1970s, not before.

But then it follows that human rights are just one set of mobilizing notions that humans have had reason to adopt over the years. Looking far back, I try to dispute the claim that human universalism -- treating humanity as what the Universal Declaration of Human Rights calls a single moral family -- never existed before international human rights did. Actually the number of ideologies (notably religious worldviews) based on the moral unity of the species is stupendously high. If so, the moral principles, and even more the practices, associated with human rights turn out to be just one version of a commitment to “humanity.”

And in modern times, different universalistic projects have coexisted and competed all along, at least until many people began to assume that human rights were the only kind of universalism there is. One main argument of the book is that this process is visible even within the history of rights talk. Rights -- especially natural rights and the rights of man -- were authority for very different projects and practices than human rights now imply. In the years of the American and French Revolutions, the appeal to rights justified violent state founding (and national integration). Today, in a postcolonial world, human rights imply not simply a different, supranational agenda, but also wildly different mechanisms of mobilization, from lighting candles, to naming and shaming, to putting checks in the mail.

Ultimately, I conclude, both the affirmation of human rights and criticism of them must begin with the fact that they are new and recent, not timeless or age-old.

Q: You maintain that there is a significant difference between the version of human rights that came to the fore internationally beginning in the late 1970s and earlier notions. You seem to be arguing that campaigns for human rights in recent decades have tended to be antipolitical -- forms of moral renewal, even. But you also show that the relatively small circles taking up the idea of human rights in the 1940s and '50s often involved people of faith who understood it in terms of some kind of religious humanism. So what was different about the later embrace of human rights?

A: It's true I do emphasize the participation of European and trans-Atlantic Christians -- both Catholics and Protestants -- in the early story of international human rights in the 1940s and after. But their most frequent associations with human rights were to “Western civilization” and moral community, along with worries about materialism and hedonism. They supported human rights built around freedom of conscience and religious practice, which they saw threatened most fundamentally by the Soviet Union to the east, in an interesting version of orientalism that targeted communist secularism.

Thirty years later, the move to morality was still available within Christian idiom, and Catholics, in particular, were key participants in the origins of human rights movements both behind the Iron Curtain and in the Southern Cone. Indeed, the Catholic Church amplified its connection of human rights and dignity in the Vatican II era. But all things considered, these affiliations were not the crucial ones for the fortunes of human rights as a galvanizing notion.

Rather, it was reforming leftists -- who had once thrown in their lot with versions of socialism -- who moved to moral humanism in circumstances of foreclosure or exhaustion. They had no space under their regimes to offer political alternatives, or after tiring years of political agitation were looking for something outside and above politics. And indeed these very figures found themselves making alliances -- tactical and coalitional at first -- with forces they would have decried a few years before. Dissidence in the name of “human rights” had replaced a political championship of divisive social alternatives.

Q: You note that earlier discussions of rights posited them as being exercised only within a political community -- while the notion of human rights tended to see them as existing outside of the nation-state, and even as defined against it. But here I want to ask you about someone you mention only in passing. The Universal Declaration of Human Rights that the United Nations issued in 1948 was championed (even to some degree rough-drafted) by H.G. Wells, who very definitely did regard the notion of human rights as something that would be advanced by a definite political authority -- namely, some kind of world state.

Wells wrote about that sort of thing in his science fiction, of course, but also tried to get a global state off the ground. Apart from the Teabaggers worried about One World Government, nobody much thinks about that sort of super-state anymore. Still, wouldn't it suggest that the notion of human rights does imply some kind of sovereignty able to enforce its claims?

A: The Tea Party is not new in this regard. But the perfervid fantasies on the American right of "world government" over the years shouldn't lead one to think that circles supporting that move were ever large, let alone politically influential. You're absolutely correct about Wells, whose globalist dreams went back to long before he began to champion the rights of man as a World War II battle cry, and relate in interesting ways to his fiction. To my knowledge no one so far has offered a synthetic vision of the long campaign for world government, which in some sense still has not gotten off the ground.

In any case, the dream of world government is most revealing about human rights for helping show that the Universal Declaration came to a world of states and indeed empires and perhaps even helped stabilize it. After all, in the beginning the larger United Nations system was more about maintaining that world than overcoming it. This gets to a key theme of the book, in its attempt to demystify the founding of the United Nations and the role of universal rights in its origins.

By the same token, those who have agitated for world government have arguably understood something deep about human rights. As Hannah Arendt argued in the 1940s in her dismissive treatment of the concept, rights presuppose a bounded citizenship space. Her challenge to votaries of human rights is that their declaration is meaningless unless there is a real plan to incorporate "humans" as citizens. And this may have been exactly what the partisans of world government, however few in number, have wanted to accomplish all along.

Q: The term "utopia" has a range of connotations -- some clearly disparaging, others much more honorific. What do you mean by the utopianism of human rights activism? And what's the overtone? Sometimes your characterization sounds a bit dismissive, while at other points it seems as if there's an implication that utopian desire is a necessary thing.

A: Thanks for giving the opportunity to answer this question, because the book evidently gives rise to some confusion on this score. I am a utopian, and -- as I say in the book -- I admire human rights activists for trying to make the world better rather than doing nothing or trying to make it worse.

One of my critics, Gary Bass, has argued that human rights movements follow a "liberalism of fear" which merely tries to stave off terrible evil rather than construct the good life for the world. As the book shows, however, that wasn't true in the 1970s, since several of those who most bitterly scorned prior utopias transmuted their idealism into new forms associated with human rights, rather than dropping idealism altogether. And it certainly isn't true now, when human rights have expanded -- notably in the global south -- far beyond their original antitotalitarianism to embrace a host of causes of improvement, intersecting humanitarianism and development.

If human rights are utopian, however, they are only one version of that commitment. I associate them with utopianism in order to ask the right questions of the movement. How much difference has it made? Were the utopias human rights replaced more or less plausible in light of their successor? Is it time to reorient human rights as an energizing agenda, or replace it?

I'm not sure of the answer. The title The Last Utopia could mean that human rights are the final idealistic cause -- or simply the most recent.

Q: Was your book inspired by any particular sense of frustration, disillusionment, or disappointment? How would you characterize your own political stance?

A: Probably my own trajectory simply reflects the collective learning of many Americans who in the past 10 years evolved away from a somewhat naive belief in the transformative implications of human rights for the post-Cold War world order. While a young law student, I actually worked in the White House during the Kosovo bombing campaign, and vividly remember that it was a time when it seemed possible for universal justice to be implemented by American power. Then 9/11, and Iraq, happened.

But while many fewer people put stock in the meaning of America's leadership today, the past decade has also seen a profusion of professional scholarship on the history of human rights, and it was my main goal to synthesize and comment on what is known so far, beyond any immediate political agenda. I actually don't get in the book much beyond the crucial turning point of the 1970s, so only a brief epilogue comments on how my analysis provides a way to think about the past couple of decades as contemporary history -- though I hope someday a sequel to the book will go much further!

This sequel would have to fit together three things: hope and then disillusionment about human rights in and through the unipolar world America has been leading (though perhaps not even for the foreseeable future!), the imaginative and institutional crystallization of human rights as a specifically European language of self-understanding and governance, and the uptake in global humanitarianism of human rights as the language with which both helpers and opponents address fragile postcolonial states and their periodic "crises." Definitely, human rights have transformed our world in recent decades, but in ways that make it no less problematic.

Q: How did you manage to write such a short book on such a big subject?

A: In complete honesty, it helps to have signed a contract whose fine print your editor later enforces! More seriously, I wanted to try to write a book that summarizes what historians have unearthed so far, and shows what they still need to figure out, in short compass. And since I am the sort of historian who spends less time in archives looking for new information than in my armchair reading philosophy and political theory, I thought I could best prioritize different ways of integrating and rethinking existing evidence. It made most sense, in other words, to use the book mainly to make distinctions and pose questions. Definitive answers would have taken a lot more space -- and a different author.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com
