During the last few years, my interests as a writing teacher and American Studies scholar have turned to the relationship between rhetoric and democratic practices and, in particular, to how I might use deliberative democracy techniques -- problem-solving strategies based on public consensus building rather than debate, partisanship, and polarization -- for teaching writing and critical thinking. These disciplinary and pedagogical interests came bundled with closely related concerns about how to better involve my students in the life of the university and in the civic affairs of Michigan State University’s neighbor, the state capital. I wanted to find ways, in short, for students to develop their public voices. Deeper down, I was also looking to renew my energies as a teacher and ratchet up the relevance of the humanities classroom by trying to connect the usual and venerable fare of the humanities -- principles, ideas, and critical reflection -- to the crucible of lived community problems where ordinary citizens conduct the extraordinary work of democratic citizenship.
Little did I realize that this interest in deliberation as a teaching resource would completely alter my experience of the classroom and profoundly disrupt my role and self-image as a teacher/scholar.
I began with modest experiments connecting the rhetorical and critical thinking requirements of Michigan State's general education writing course to deliberative problem-solving techniques. My students, for example, studied the rhetorical processes of deliberation, examined the history of deliberative practice, and tried out deliberative arguments based on local civic and campus issues. We also conducted in-class forums based on the particular methodology of public deliberation and grass-roots problem solving practiced in hundreds of National Issues Forums taking place across the country. National Issues Forums are structured public forums about often-contested social issues that have national impact and local resonance -- for example, immigration reform and alcohol use and abuse. Perspectives on a given topic and a rhetorical framework for deliberation are laid out in issue booklets prepared by Public Agenda and the Charles F. Kettering Foundation. Each booklet presents three (sometimes four) perspectives on resolving an issue.
These early and partial efforts in my classes gave way to more sustained experiments when my colleague Eric Fretz and I designed a pair of closely related experimental writing courses in the general education sequence that would provide students with opportunities to study techniques of deliberation and to practice both public dialogue and public problem solving throughout the entire semester. These two courses were not team taught in the traditional sense. Eric was scheduled to teach a writing section with a focus on "Race and Ethnicity," and I was assigned a "Public Life in America" class with a special emphasis on education and youth issues. We each designed our own syllabus, although there was a good deal of overlapping of required texts, learning strategies, and writing assignments.
Our classes incorporated three active learning components -- a fairly traditional service experience for our students, a collaboration of both classes on a public forum on youth violence, and student-moderated deliberative study circles in class -- that we designed to link the academic issues of the separate courses, foster a strong learning community between our classes and among our students, and practice democratic skills of deliberation, collaboration, and participation. The experience of moderating a small study circle would give even the most reticent of our students the chance to practice habits of deliberation such as critical listening, asking leading questions, generating and sustaining discussion, staying neutral, and leading a group toward consensus.
Eric and I also tried to weave a deliberative pedagogy into just about every facet of the classes. Students practiced public dialogue and public problem solving at the very beginning of the semester by conducting in-class forums on topics that resonated, sometimes in discordant ways, in the public arena in our state and our university at the time, including the future of affirmative action and the quality of public education. In an effort to find out how far I could push deliberative practices into the life of the classroom, my students even framed and deliberated a class attendance policy.
Next, students gained important insights into public problems related to youth issues through question and answer sessions with invited guests (including a judge and a local police officer) and by working and learning in community settings with a number of community partners, including several Neighborhood Network Centers located in Lansing.
Our students then collaborated in small teams to research, organize, and host the public forum on “Violent Kids: Can We Change the Trend?” Students designed and drafted a discussion guide for forum participants along with worksheets and instructions for moderator assistants. They self-selected into committees that worked on timetables and deadlines for various stages of forum organization, communications, publicity, and background research on such things as children’s television, media violence, and effects of video games. After the forum, one of the work groups assembled and organized all of the forum work from each project team into a comprehensive portfolio. Eric and I drafted and circulated to all of our students an extensive portfolio assessment and evaluation memo that critically addressed the contribution of each work group -- all of which led to a deliberation we had not anticipated.
Our students were generally ruffled by our C+ evaluation of the portfolio, primarily because the grade was assigned to each student and counted for a portion of everyone's final grade. We took advantage of our students' dissatisfaction and invited them to put together a small deliberative forum to take a closer look at the evaluation memo and to present point-by-point arguments in favor of a higher grade. A small student work group agreed to frame the issue and prepare three choices for deliberation. Another work group took responsibility for moderating the joint-class forum, another for “post-forum reflections,” etc.
Here is the discussion guide they prepared:
Choice 1: The NIF Forum collaborative grade of C+ is fair and equitable. Professor Fretz and Professor Cooper’s evaluation memo is thorough, well argued, and reasonable. While some students may nit-pick with details, overall the judgment is sound and the conclusions are justified. All the students in [each class] clearly knew well in advance that the forum work would be evaluated with a common grade. Sure, some students may have worked harder than others. But to insure the integrity and honesty of the forum project as an exercise in democracy and public life, students must be willing to accept the common grade.
Choice 2: Working groups that excelled deserve a better grade than C+. On the other hand, the evaluation memo suggests that other working groups may deserve less than a C+. The working groups should be evaluated on a group-by-group basis. Professor Fretz and Professor Cooper should grade each group according to the arguments made in the separate committee sections of the evaluation memo. This grading procedure is ideal because it takes into consideration both collaborative work and individual effort. It is also more fair. The downside: all the work groups knew from the outset that the portfolio would be graded collaboratively. Is it OK to change that policy after the fact?
Choice 3: The common grade for the NIF forum work should be higher. The evaluation memo grade is simply too low. Granted, the points are well argued. No one claims Professor Fretz and Professor Cooper are being overly unfair. However, the forum was hard work for all students. It took up almost a third of the course work. It was a successful public deliberation. The portfolio, measured by even the toughest standards, was an excellent piece of work. No one disputes these points. Professor Fretz and Professor Cooper need to raise the grade, and the class will accept without question the higher common grade.
We were generally pleased that our students had gained enough understanding, experience, and confidence in democratic deliberation to bring it to bear on a controversy and a complaint that hit closer to home. By registering their objections in a democratic fashion and by seeing their objections taken seriously, our students navigated one of the most critical thresholds of democratic life: "We have a problem; we need to talk about it." Eric and I were persuaded by Choice 3, and we raised the common grade to a B.
Our students’ turnabout confronted us with turnabouts of our own brought on by new roles and practices that deliberation introduced into the classroom.
Eric and I discovered that learning strategies that promote public work through deliberative pedagogy offer teachers rewards and fresh perspectives as well as posing difficult challenges. No longer the "sage on the stage," teachers become facilitators, "guides on the side," and, in many ways, co-learners with students -- and co-workers, too. We no longer directed from the sidelines or articulated abstractions behind a podium. We found ourselves doing work right alongside our students.
As our roles shifted, we had to give up some expectations about what should happen in a college classroom. In the process, we found new ways of thinking about those questions that all of us in higher education ponder: Where does the learning take place? How can I steepen the learning curve? What do I want my students to take away with them? Through practicing democracy in the classroom, we were able to answer these questions in different and more interesting ways than we could have in a more traditional classroom setting. Students learned disciplinary knowledge (in this case, writing rhetorical arguments, thinking critically, connecting written argument to concrete public problem solving) through experience and practice. In addition, they began to experiment with ways of operating and effecting change in the public sphere of the classroom itself.
For our part, we learned that the role of professor is both bigger and smaller than the ones articulated by traditions and expectations of our academic disciplines. Our most challenging and prosaic role, for example, was that of project manager. We helped our students anticipate snags, identify community and university resources, solve problems, develop networking skills, and lay out efficient workflows -- skills we felt were basic to the toolkit of citizenship. We also fetched envelopes and department letterhead, provided campus contacts to facilitate logistics for the forum, arranged for the use of printers, fax machines, office phones and computers.
For me, a striking and lasting consequence of adopting and adapting to a deliberative pedagogy was that I no longer considered myself a "teacher" in the conventional sense in which my colleagues understood, practiced, and peer-reviewed the role. Rather, I became an architect of my students’ learning experiences or maybe a midwife of their practices to become better writers and more-active citizens -- or, perhaps more to the point, I became something like a forum moderator. In a public forum, successful deliberation is often inversely related to the visibility and presence -- indeed, the knowledge and issue expertise -- of the moderator. The same applies to a teacher in a deliberative classroom: You spend a great deal of creative intellectual energy listening to students and learning to get out of their way so that they can take ownership of the subject, in the same way that forum participants must "own" an issue.
That fundamental role shift totally changed my experience of the writing classroom, from mundane matters like the physical arrangement of desks and the venues where learning takes place to epistemological underpinnings, ethical practices and boundaries, not to mention problematic relationships with more traditionally minded colleagues who felt that I was cutting my students too much slack. In the annual department review, one of my colleagues criticized me, for example, for comments repeated on several narrative evaluations from students that "it was like the students were teaching the class." In the future, obviously, I need to do a better job of articulating a philosophy of deliberative pedagogy so my colleagues can translate statements like that as observations of practice and not criticisms of my teaching style.
The deliberative pedagogy that we employed demands a great deal of preparation and planning, but at the same time requires spontaneity and flexibility -- and a certain degree of uncertainty. Our students’ learning experiences encompassed complex and interlocking community groups, constituencies, organizations, and several offices and units at my university. Grounded in multiple learning partnerships, action research, and real world contexts, learning became a dynamic social process -- emergent, messy, edgy, relational, sometimes inconclusive, occasionally (not often) painful and confused, frequently full of entanglements, and always, I hope, challenging. I found myself constantly pushing the class to a point of agitation, churn, and controlled chaos because that was where the real learning took place -- at that threshold where students became present in, and took ownership of, their own learning experience.
David D. Cooper
David D. Cooper is professor of writing, rhetoric, and American cultures at Michigan State University. He is director of the College of Arts and Letters’ Public Humanities Collaborative and university senior outreach and engagement fellow. Cooper explores deliberative democracy and pedagogy in more detail in a chapter in Deliberation and the Work of Higher Education, forthcoming from the Charles Kettering Foundation.
Among the most striking phenomena associated with Barack Obama’s successful bid for the Democratic nomination has been his ability to attract young people to the political process. Youthful volunteers have staffed his campaign. They have used Internet skills to advance his candidacy and build his organization. They have even been among the thousands of small donors who have contributed to his record-breaking fund-raising efforts. In state after state, their support for Obama during the primaries significantly exceeded his margins among voters from other age groups.
The success of the Obama campaign refutes the oft-repeated notion that young people today are uninterested in national politics and are less ready than older generations of Americans to become responsible stewards of our democratic institutions. This resurgence of youthful activism delivers an important message for our colleges and universities.
The disengagement of young people from our country’s political processes after the 1960s has been well documented. Many studies have shown that during the last three decades of the 20th century, young Americans demonstrated less interest in public affairs than had previous generations, and also were less well informed about political and public policy matters, and less likely to vote.
The withdrawal of young people from active interest in public affairs paralleled reduced attention to citizenship by our colleges and universities. While higher education has long claimed as a core mission preparing students for democratic participation, it is a mission honored in recent years mainly in the rhetoric of college catalogs. Few campuses today provide organized or explicit programming with this focus, either inside or outside the curriculum. Most of our academic institutions address this matter only indirectly, by fostering the intellectual skills and qualities -- critical thinking, habits of reading and information gathering, broad interest in the social world -- that studies have shown relate to heightened levels of political participation.
It was not always this way. In the years after World War II, when patriotic sentiment was strong, academics paid extensive attention to the ways in which the undergraduate curriculum could promote appreciation of the ideas, values and experiences that constituted the shared cultural heritage of the country, a movement symbolized by Harvard’s famous report, "General Education in a Free Society." Many institutions established requirements in American history and Western political and social thought. A new emphasis on international studies reflected the country’s emergence as a global power. Outside the curriculum, there was a heightened focus on the ways in which student government could be a vehicle for teaching undergraduates the ways of democratic decision making.
During the 1960s, however, attention to active citizenship fell victim to the anti-governmental impulses inspired by the war in Vietnam. By the end of that decade, academe was far more concerned with promoting the kind of intellectual independence associated with dissent than with helping students understand the workings of democracy. In curricular terms, not much has changed since the 1960s. Indeed, the emphases of recent years on multiculturalism and world history have rendered special attention to a shared American culture or to American history passé or even objectionable from the perspective of many academics.
It would be unfair to blame academe entirely for the disengagement of young people from our political life. Many factors have been involved, not least the many unappealing qualities of contemporary political practice. But higher education, by abandoning attention to preparation for citizenship, has been an enabler of this pattern. In recent years, however, a growing number of educators have expressed concern about our continuing inattention to this matter. Individuals such as Derek Bok, former president of Harvard, and academic organizations including the Association of American Colleges and Universities have argued that we need to revitalize our traditional concern for citizenship education. Organizations such as Campus Compact and indeed the whole service learning movement are promoting civic engagement among college students, although these efforts are typically focused on community service rather than electoral politics.
The students who have responded so enthusiastically to Senator Obama’s campaign are making it clear that they are ready for renewed attention to our democratic institutions by our colleges and universities. It is inevitable, whatever the final outcome of the election, that the heightened interest in politics shown by young people will translate into a heightened receptivity to programming by colleges and universities focused on these matters. Higher education should seize this opportunity.
Not everyone will welcome renewed efforts by colleges and universities to promote political participation. Some, mainly outside academe, worry that higher education’s tendency toward liberal politics is already turning many college classes into indoctrination sessions; those who harbor such worries will not readily trust our campuses to avoid partisanship. Others, mainly inside academe, worry that an explicit focus on strengthening democracy will quickly devolve into nationalistic boosterism.
But recent work by thoughtful academics, most notably through the Political Engagement Project, sponsored by the Carnegie Foundation for the Advancement of Teaching, has shown that collegiate programs focused on active citizenship can heighten political awareness and foster greater understanding and participation without greatly affecting the political inclinations of participating students or abandoning an appropriately critical perspective on our country’s policies. Bok’s study outlines a number of ways -- through required course work, extracurricular activities, sponsored events and speakers, and presidential leadership -- that colleges and universities can responsibly promote thoughtful political participation.
Individual institutions should craft their own responses to this moment of opportunity. Institutional characteristics such as scale, complexity, mission, location and educational philosophy will suggest the most fruitful approach to citizenship education in particular contexts. The first requirement of progress, therefore, must be engagement of the campus community -- faculty and staff -- in thinking about how citizenship education can most effectively be pursued. But local approaches will need to address some shared objectives.
The first of these is understanding. It is hard to imagine how an institution can claim to prepare its students for active citizenship if they are allowed to graduate with no knowledge of American history or of our political and economic institutions. The widespread absence of requirements in these areas is an embarrassment to higher education. A second challenge is motivation. Campus plans should seek ways to foster an abiding sense of the value and importance of civic engagement. In this area we have much to build on, given the inspiring surge in social activism among many young people. A final challenge is skill. We need to help students develop the capacity to use the vehicles available to citizens to influence the political process effectively. And we need to think about how to use the entire institution -- both the curriculum and the extra-curriculum -- to meet these challenges.
These will not be easy discussions. They will compel us to think about things we have found difficult, such as requiring students to study certain subjects and treating extracurricular life in a systemic way as part of the educational process. But if we can’t find ways to address these issues, we should perhaps abandon the pretense that our mission includes the preparation of citizens. I hope we will not do that. The country needs us to respond differently. It is time for academe to reassert its historic role in preserving and strengthening our democracy by helping our students appreciate what it is about and how it works. The young people turning out in droves to vote in the 2008 primaries are calling us to pay attention to this issue.
Richard M. Freeland
Richard Freeland is the Jane and William Mosakowski Distinguished Professor of Higher Education at Clark University and president emeritus of Northeastern University.
All the horse-race drama of the primary season is now but a memory. And there is plenty of time to kill before the spectacle of the party conventions. (In the fullest Guy Debord-esque sense of “spectacle,” no doubt about it.) So lately we have had a chance to contemplate the less heavily vetted aspects of political discourse – the rattle of whatever tin can may be tied to the tail of each presidential campaign. Is Barack Obama really the Antichrist? Is John McCain really Grandpa Simpson? Discuss.
Lest anyone think this is the silly season, we turn with relief to the August issue of the journal Political Theory, where a major article finally addresses an issue too long ignored by candidates and scholars alike. “If academics’ first responsibility is to tell the truth,” write two political scientists, “then the truth is that after 60 years of modern UFOs, human beings still have no idea what they are, and are not even trying to find out. That should surprise and disturb us, and cast doubt on the structure of rule that requires and sustains it.”
How true! Hold a press conference on immigration here in Washington, D.C., and the reporters will come. Hold one about the other aliens, and you’re lucky if a couple of smartasses turn up to ask questions about “The X Files.”
I don’t know if they read much political theory over at the National Press Club. Probably not. But if Alexander Wendt (professor of international security at Ohio State University) and Raymond Duvall (chairman of the poli sci department at the University of Minnesota) can’t get a serious hearing for their paper “Sovereignty and the UFO,” then the cause is truly hopeless, and the incessant rectal probing by our reptilian overlords will never end.
The paper came to my attention via blog postings by a couple of political scientists. One took it as an example of the kind of thing you can get away with once you have tenure. The other grappled with the argument itself and found it wanting on its own terms. (The authors of the paper have replied here.)
That argument boils down to a claim that UFO research has never achieved legitimacy because the very possibility of visitation by extraterrestrials poses too many problems for the implicit metaphysics of the nation-state.
Contemporary ideas about national sovereignty are quite thoroughly anthropocentric. That was not always the case. In the age of kings who ruled by divine right, the ultimate sovereign authority was embedded in God Himself. And if you lived in a community where shamans communicated regularly with bears or fish or the spirit of the mountain, then you would tend to think of nature itself as having, in effect, the franchise.
The modern sense of the nation-state rests on the assumption that politics is a strictly human process. Sovereignty – the ultimate authority to make decisions within a territory – is embodied in human agents.
Furthermore, a nation-state tends to develop mechanisms for keeping track of its own population – a series of institutions and bodies of knowledge devoted to monitoring the people who live within its borders, create its wealth, and obey its laws. (Or don’t, as the case may be.) The result is a grid of power and expertise sometimes designated by the rather unwieldy expression “governmentality,” coined by Michel Foucault.
Sovereignty and governmentality are related though not identical concepts. But they converge on one point that Wendt and Duvall consider a kind of blind spot: In the modern nation-state, sovereignty and governmentality are, by default, completely anthropocentric.
Even after countless thousands of UFO reports from all over the world – often by members of the military – the nearly 200 nation-states “have been notably uninterested” in the phenomenon. “One may speak of a ‘UFO taboo,’” write Wendt and Duvall, “a prohibition in the authoritative public sphere on taking UFOs seriously, or ‘thou shalt not try very hard to find out what UFOs are.’”
Their argument is, in important respects, the exact opposite of – well, “The X Files.” It is not a conspiracy theory. “We are not saying the authorities are hiding The Truth about UFOs,” write W&D. Nor are they even suggesting that The Truth would necessarily involve extraterrestrial visitation. “We are saying that [the authorities] cannot ask the question.”
This claim places their inquiry within a field of study described in this column a couple of months ago: the discipline of agnotology. (Just as epistemology considers the genesis and structure of knowledge, agnotology examines the sources and inner logic of ignorance.) Wendt and Duvall’s endnotes neglect the agnotological literature – but they have added to it, even so.
“Our puzzle,” they explain, “is not the familiar question of ufology, ‘What are UFOs?’ but, ‘Why are they dismissed by the authorities?’ Why is human ignorance not only unacknowledged, but so emphatically denied? In short, why a taboo?”
The pattern of avoidance is, they answer, “akin to denial in psychoanalysis: the sovereign represses the UFO out of fear about what it would reveal about itself. There is therefore nothing for the sovereign to do but turn its gaze away from – to ignore, and hence be ignorant of – the UFO, making no decision at all.”
One knows better than to argue with a psychoanalyst, of course. Disagreement only proves that the insight was valid; otherwise, it would not generate so much anxiety. Likewise, the call for UFO research triggers the “extraordinarily resilient” forces of modern sovereignty and its metaphysics: “Those who attempt it will have difficulty funding and publishing their work,” write Wendt and Duvall, “and their reputations will suffer.”
Wendt and Duvall mention a few cases where governments have, in fact, conducted studies of UFO sightings. But clearly they were efforts to dismiss the question. The decades-long Search for Extra-Terrestrial Intelligence (SETI) project does not undermine the point that anthropocentric sovereignty cannot bear contemplating any challenge. Indeed, the authors write, “SETI advocates have been at the forefront of UFO skepticism.”
Theories are always most beautiful when they cannot be falsified. “Sovereignty and the UFO” is a thing of beauty. Anyone who is already a little tired of McCain and Obama should check out Wendt and Duvall. And keep watching the skies....
One minor casualty of the recent conflict in Georgia was the doctrine of peace through McGlobalization -- a belief first elaborated by Thomas Friedman in 1999, and left in ruins on August 8, when Russian troops moved into South Ossetia. “No two countries that both had McDonald’s had fought a war against each other since each got its McDonald’s,” wrote Friedman in The Lexus and the Olive Tree (Farrar, Straus, and Giroux).
Not that the fast-food chain itself had a soothing effect, of course. The argument was that international trade and modernization -- and the processes of liberalization and democratization created in their wakes -- would knit countries together in an international civil society that made war unnecessary. There would still be conflict. But it could be contained -- made rational, and even profitable, like the rivalry between Ronald and his counterparts over at Burger King. (Thomas Friedman does not seem like a big reader of Kant, but his thinking here bears some passing resemblance to the philosopher’s “Idea for a Universal History from a Cosmopolitan Perspective,” an essay from 1784.)
McDonald’s opened in Russia in 1990 -- a milestone of perestroika, if ever there were one. And Georgia will celebrate the tenth anniversary of its first Mickey D’s early next year, assuming anybody feels up for it. So much for Friedman's theory. Presumably it could be retooled ex post facto (“two countries with Pizza Huts have never had a thermonuclear conflict,” anyone?) but that really seems like cheating.
Ever since a friend pointed out that the golden arches no longer serve as a peace sign, I have been wondering if some alternative idea would better fit the news from Georgia. Is there a grand narrative that subsumes recent events? What generalizations seem possible, even necessary and urgent, now? What, in short, is the Big Idea?
Reading op-ed essays, position papers, and blogs over the past two weeks, one finds a handful of approaches emerging. The following survey is not exhaustive -- and I should make clear that describing these ideas is not the same as endorsing them. Too many facts about what actually happened are still not in; interpretation of anything is, at this point, partly guesswork. (When the fog of war intersects a gulf stream of hot air, you do not necessarily see things more clearly.) Be that as it may, here are some notes on certain arguments being made about what it all means.
The New Cold War: First Version. A flashback to the days of Brezhnev would have been inevitable in any case -- even if this month were not the 40th anniversary of Soviet tanks rolling into what was then Czechoslovakia.
With former KGB man Vladimir Putin as head of state (able to move back and forth between the offices of the president and of the prime minister, as term limits require) and the once-shellshocked economy now growing at a healthy rate thanks to international oil prices, Russia has entered a period of relative stability and prosperity -- if by no means one of liberal democracy. The regime can best be described as authoritarian-populist. There have been years of frustration at seeing former Soviet republics and erstwhile Warsaw Pact allies become members of NATO. Georgia (like Ukraine) has recently been invited to do so as well. So the invasion of South Ossetia represents a forceful reassertion of authority within Russia’s former sphere of influence.
We have reached "the end of the end of the Cold War,” goes this interpretation. Pace Fukuyama, it was a mistake to believe that historical progress would culminate in liberal, democratic, constitutional republicanism. The West needs to recognize the emergence of a neo-Soviet menace, and prepare accordingly.
This perspective was coming together even before the conflict between Russia and Georgia took military form. For some years now, the French philosopher Andre Glucksmann (whose musings on Solzhenitsyn’s The Gulag Archipelago were influential in the mid-1970s) has been protesting the rise of the new Russian authoritarianism, quoting with dismay Putin’s comment that “the greatest geopolitical disaster of the twentieth century is the dissolution of the Soviet Union.”
Vaclav Havel, the playwright and former president of the Czech Republic, has done likewise. In a recent interview, Havel said, “Putin has revealed himself as a new breed of dictator, a highly refined version. This is no longer about communism, or even pure nationalism.... It is a closed system, in which the first person to break the rules of the game is packed off to Siberia."
Why be skeptical of this perspective? Certainly the authoritarianism of the Putin regime itself is not in doubt. But the specter of a new Red Army poised to assert itself on the world stage needs to be taken with a grain of salt. A report prepared by the Congressional Research Service in late July notes that budget cuts have forced “hundreds of thousands of officers out of the ranks” of the Russian military, and reduced troop strength to 1.2 million men (compared to 4.3 million in the Soviet military in 1986).
“Weapons procurement virtually came to a halt in the 1990s,” the report continues, “and is only slowly reviving. Readiness and morale remain low, and draft evasion and desertion are widespread.” Raw nationalist fervor will carry an evil empire only so far.
The New Cold War: Take Two. Another version of the old template regards an East/West standoff as inevitable, not because Putinist Russia is so vigorous, but because such a conflict is in the interests of the United States.
We're not talking here about the more familiar sort of argument about the U.S. needing access to oil in the Caucasus region. Nor does it hinge on strategic concerns about nuclear cooperation between Russia and Iran. It has less to do with economic interest, or geopolitical advantage, than it does the problem of ideological vision (or lack of it) among ruling elites in the West. A renewal of superpower conflict would help to prop up societies that otherwise seem adrift.
This thesis is argued by a British think tank called the Institute of Ideas, which takes much of its inspiration from the work of Frank Furedi, a professor of sociology at the University of Kent. Having started out decades ago as Marxists of a rather exotic vintage, writers associated with the institute have moved on to a robustly contrarian sort of libertarianism. Their perspective is that state and civil society alike in the industrialized world are now prone to waves of fear and a pervasive sense of aimlessness.
“It is difficult,” writes Furedi in a recent essay, “to discover clear patterns in the working of twenty-first-century global affairs....The U.S. in particular (but also other powers) is uncertain of its place in the world. Wars are being fought in faraway places against enemies with no name. In a world where governments find it difficult to put forward a coherent security strategy or to formulate their geo-political interests, a re-run of the Cold War seems like an attractive proposition. Compared to the messy world we live in, the Cold War appears to some to have been a stable and at least comprehensible interlude.”
Hence the great excitement at recent events -- so rich are they with promise of a trip backwards in time.
There is something at least slightly plausible in this idea. A quick look at Google shows that people have been announcing “the end of the end of the Cold War” for quite a while now. The earliest usage of that phrase I’ve seen comes from 1991. A kind of nostalgia, however perverse, is probably at work.
But Furedi's larger argument seems another example of an idea so capacious that no counterevidence will ever disprove it. If leaders are concerned about what’s happening in the Caucasus, it is because anxiety has made them long for the old verities. But if they ignored those events -- well, that would imply that the culture has left them incapable of formulating a response. Heads, he wins. Tails, you lose.
The End of ... Something, Anyway. Revitalizing the Cold War paradigm keeps our eyes focused on the rearview mirror. But other commentary on events in Russia and Georgia points out something you might not see that way -- namely, that this stretch of paved road has just run out.
The Duck of Minerva – an academic group blog devoted to political science – has hosted a running discussion of the news from South Ossetia. In a post there, Peter Howard, an assistant professor of international service at American University, noted that the most salient lesson of the invasion was that it exposed the limits of U.S. influence.
“Russia had a relatively free hand to do what it did in Georgia,” he writes, “and there was nothing that the U.S. (or anyone else for that matter) was going to do about it.... In a unipolar world, there is only one sphere of influence -- the whole world is the U.S.’s sphere of influence. Russia’s ability to carve any sphere of influence effectively ends unipolarity, if there ever was such a moment.”
Howard points to a recent article in Foreign Affairs by Richard Haass, the president of the Council on Foreign Relations, about the emergence of “nonpolarity: a world dominated not by one or two or even several states but rather by dozens of actors possessing and exercising various kinds of power.”
This will, it seems, be confusing. Countries won’t classify one another simply as friends or foes: “They will cooperate on some issues and resist on others. There will be a premium on consultation and coalition building and on a diplomacy that encourages cooperation when possible and shields such cooperation from the fallout of inevitable disagreements. The United States will no longer have the luxury of a ‘You're either with us or against us’ foreign policy.” (One suspects the country is going to afford itself that luxury from time to time, even so.)
A recent op-ed in The Financial Times does not explicitly use the term “nonpolarity,” yet takes the concept as a given. Kishore Mahbubani, dean of the public policy school of the National University of Singapore, sees the furor over Georgia as a last gasp of old categories. The rise of Russia is “not even close” to being the most urgent concern facing the west.
“After the collapse of the Soviet Union,” he writes, “western thinkers assumed the west would never need to make geopolitical compromises. It could dictate terms. Now it must recognise reality. The combined western population in North America, the European Union and Australasia is 700m, about 10 per cent of the world’s population. The remaining 90 per cent have gone from being objects of world history to subjects.”
Framing his argument in terms borrowed from Chairman Mao, Mahbubani nonetheless sounds for all the world like an American neoconservative in a particularly thoughtful mood. “The real strategic choice” facing the wealthy 10 percent “is whether its primary challenge comes from the Islamic world or China,” he writes. “If it is the Islamic world, the U.S. should stop intruding into Russia’s geopolitical space and work out a long-term engagement with China. If it is China, the U.S. must win over Russia and the Islamic world and resolve the Israel-Palestine issue. This will enable Islamic governments to work more closely with the west in the battle against al-Qaeda.”
From this perspective, concern with the events in Georgia seems, at best, a distraction. Considering it a development of world importance, then, would be as silly as thinking that the spread of fast-food franchises across the surface of the globe will make everyone peaceful (not to mention fat and happy).
Well, I’m not persuaded that developments in the Caucasus are as trivial as all that. But we’re still a long way from knowing what any of it means. It’s usually best to keep in mind a comment by Zhou Enlai from the early 1970s. Henry Kissinger asked for his thoughts about the significance of the French Revolution. “It is,” Zhou replied, “too soon to say.”
This past weekend, a comic playing Bill Clinton on Saturday Night Live told the world’s leaders not to pull anything on Hillary when she becomes Secretary of State. It's not even worth trying, he indicated, because she’ll see right through you. But he offered some reassuring advice on how to finesse things, if necessary. “The only words you’re gonna need when Hillary shows up: ‘I ... am ... sorry.’ It don’t work all the time, but it’s a good place to start.”
A friend recounted this skit to me when he saw the galleys of Susan Wise Bauer’s new book The Art of the Public Grovel: Sexual Sin and Public Confession in America (Princeton University Press). Its cover shows the former president in a posture of contrition: hands in front of his face, as if to pray; his eyes both wide and averted. But Bauer’s point is that effective public groveling requires a lot more than just assuming the position, let alone saying “I am sorry.”
There is (so her argument goes) a specific pattern for how a public figure must behave in order to save his hide when caught in a scandal. It is not sufficient to apologize for the pain, or offense to public sensibility, that one has caused. Still less will it do to list the motivating or extenuating circumstances of one’s actions. Full-scale confession is required, which involves recognizing and admitting the grievous nature of one’s deeds, accepting responsibility, and making a plea for forgiveness and asking for support (divine or communal, though preferably both).
The process corresponds to a general pattern that Bauer traces back to the Puritan conversion narratives of the 17th century. Confession started out as a way to deal with Calvinist anxieties over the precarious nature of any given believer’s status in the grand scheme of predestination. Revealing to fellow believers an awareness of the wickedness in one’s own life was, at very least, evidence of a profound change in heart, possibly signaling the work of God’s grace.
Secularized via pop psychology and mass media, public confession now serves a different function. In the 20th century, it became “a ceremonial laying down of power,” writes Bauer, “made so that followers can pick that power up and hand it back. American democratic expectations have woven themselves into the practice of public confession, converting it from a vertical act between God and a sinner into a primarily horizontal act, one intended to re-balance the relationship between leaders and their followers. We both idolize and hate our leaders; we need and resent them; we want to submit, but only once we are reassured that the person to whom we submit is no better than we are. Beyond the demand that leaders publicly confess their sins is our fear that we will be overwhelmed by their power.”
Leaders who follow the pattern may recover from embarrassing revelations about their behavior. Major examples of this that Bauer considers are Jimmy Swaggart (with his hobby of photographing prostitutes in hotel rooms) and Bill Clinton (intern, humidor, etc.). Because they understood and accepted the protocol for a “ceremonial laying down of power” through confession, they were absolved and returned to their positions of authority.
By contrast, public figures who neglect the proper mode of groveling will suffer a loss of support. Thus Edward Kennedy’s evasive account of what happened at Chappaquiddick cost him a shot at the presidency. The empire of televangelist Jim Bakker collapsed when he claimed that he was entrapped into extramarital canoodling. And Bernard Cardinal Law, the archbishop overseeing the Catholic community in Boston, declined to accept personal responsibility for assigning known pedophile priests to positions where they had access to children. Cardinal Law did eventually grovel a bit – more or less along the lines Bauer suggests – but only after first blaming the scandal on the Boston Globe, his own predecessors, and earlier church policy. The pope accepted his resignation six years ago.
It’s one thing to suspect that a set of deep continuities exist between evangelical religion, group psychotherapy, and “performances of self” in an age of mass media. Many of us found ourselves positing this quite often during the late 1990s, usually while yelling at the TV news.
But it’s a much tougher prospect to establish that such continuities really exist – or that they add up to an ethos that is accepted by something called “the American public” (a diverse and argumentative conglomeration, if ever there were one). At the very least, it seems necessary to look at how scandals unfold in nations shaped by a different religious matrix. Bauer doesn’t make such comparisons, unfortunately. And her case studies of American scandals don’t always clinch the argument nearly so well as it may appear.
The discussions of Jim Bakker and Bill Clinton form a center of gravity for the whole book. The chapters on them are of almost equal length. (This may testify less to the historical significance of Jim Bakker’s troubles than to their very considerable entertainment value.) And in keeping with Bauer’s analysis, the men’s responses to embarrassment form a neat contrast in approaches to the demand for confession.
Having been exposed for using church funds to pay blackmail to cover up an affair with a church secretary, Bakker has always presented himself as more sinned against than sinning – the victim of a wicked conspiracy by jealous rivals. In other words, he never performed the sort of confession prescribed by the cultural norms that Bauer identifies. He never handed over his power through suitable groveling, and so his followers punished him.
“Refusing to confess, unable to show his one-ness with his followers,” she writes, “Bakker remains unable to return to ministry.” Which is inaccurate, actually. He has been televangelizing for the past five years, albeit on a less grandiose scale than was once his wont. Bakker’s inability to reclaim his earlier power may have something to do with his failure to follow the rules for confessing his sins and begging forgiveness. But he still owes the IRS several million dollars, which would be something of a distraction.
Bakker’s claims to have been lured into immorality and disgrace are self-serving, of course. Yet Bauer’s account makes clear that his competitors in the broadcast-holiness business wasted little time in turning on him – the better to shore up their own reputational capital and customer base, perhaps. The critical reader may suspect that Bakker’s eclipse had more to do with economics than with the reverend's failures of rhetorical efficacy.
Former president Clinton, by contrast, is rhetorical efficacy incarnate. Bauer’s chapter on l’affaire Lewinsky attributes his survival to having met the demand for confession.
Of course, he did not exactly make haste to do so. Bauer includes a set of appendices reprinting pertinent statements by the various figures she discusses. The section on Clinton is the longest of any of them. More than a third of the material consists of deceptive statements and lawyerly evasions. But the tireless investigative pornographers of the Starr investigation eventually cornered the president and left him with no choice. “In Bill Clinton’s America,” writes Bauer, “the intersection of Protestant practice, therapeutic technique, and talk-show ethos was fully complete. In order to survive, he had to confess.”
He pulled out all the stops – quoting from the Bible on having a “broken spirit,” as well as a Yom Kippur liturgy on the need to turn “from callousness to sensitivity, from pettiness to purpose” (and so forth). It worked. “Against all odds,” writes Bauer, “his confessions managed to convince a significant segment of the American public that he was neither a predator nor an evildoer, and that he was fighting the good fight against evil. Most amazingly, this white, male lawyer, this Rhodes Scholar, who held the highest elected office in the land, persuaded his followers that he was just like the country’s poorest and most oppressed.”
That is one way to understand how things unfolded ten years ago. According to Bauer's schema, Clinton underwent a “ceremonial laying down of power,” only to have it handed back with interest. No doubt that description accounts for some people’s experience of the events. But plenty of others found the whole thing to be sordid, cynical, and cheesy as hell – with the confession as less a process that strengthened social bonds than a moment of relief, when it seemed like the soap opera might end.
So it did, eventually. But there will always be another one, perhaps involving some politician we've never heard of before. That is why The Art of the Public Grovel ought to be kept in stock at Trover’s, the bookshop on Capitol Hill, from now on. While not entirely persuasive in its overall analysis, it might still have non-scholarly applications.
On April 15, tens of thousands of people attended “tea parties” to denounce Obama’s economic policies – dressed up, some of the protesters were, like refugees from a disaster at a Colonial theme park. “No taxation without representation!” they demanded, having evidently hibernated through the recent election cycle. The right-wing publicity machine dutifully ground out its message that a mass movement was being born.
Suppose we grant the claim (however generous, however imaginative) that the tea parties drew 250,000 supporters. Compare that with the turnout, not quite three years ago, for the “Day Without an Immigrant” rallies, which involved somewhere between 1 and 1.5 million workers – many of them undocumented, which meant that their decision to attend involved some risk of losing a job or being deported. By contrast, last week’s anti-Obama protest made no real demands on its participants, and came after weeks of free and constant publicity by a major television network. Teabaggery also enjoyed the support of prominent figures in the conservative establishment. Yet with all this backing, the entire nationwide turnout for the tea parties involved fewer people than attended the immigrant rallies in a single large city.
The events of April 15 may not have marked the death agonies of the Republican Party. But they certainly amounted to a case of profound rhetorical failure: a moment when old modes of persuasion lost their power. The claim to speak for the concerns of “ordinary Americans” choked on its own pseudo-populist bile. The tea bags were less memorable than the cracked pots. It was hard to watch the footage without thinking that the next Timothy McVeigh must be a face in the crowd – and wondering if his victims ought to bring a class-action suit against Fox News.
Only just so much of the failure of the teabagging movement can be attributed to its instigators’ unfamiliarity with contemporary slang. A new book from the University of Chicago Press helps to clarify why alarmist denunciations of higher taxation and (shudder!) “redistribution of the wealth” just won’t cut it.
The publication of Class War? What Americans Really Think About Economic Inequality by Benjamin I. Page and Lawrence R. Jacobs could not be better timed. Page is a professor of political science at Northwestern University, while Jacobs directs the Center for the Study of Politics and Governance at the University of Minnesota. The authors conducted a national public-opinion survey during the summer of 2007 – just before the global economic spasms started – and they also draw on several decades’ worth of polling data in framing their analysis.
The question mark in the title is no accident. Page and Jacobs are not radicals. They insist that there is no class war in the United States. (This, in spite of quoting Warren Buffett’s remark that there actually is one, and that his class has been winning.) They provide evidence that “even Democrats and lower-income workers harbor rather conservative views about free enterprise, the value of material incentives to motivate work, individual self-reliance, and a generalized suspicion of government waste and unresponsiveness.” Their survey found that 58 percent of Democrats and 62 percent of low-income earners agreed that “large differences in pay are probably necessary to get people to work hard.”
But at the same time, they report a widespread concern that the gap between extremes of wealth and poverty is growing and poses a danger. “Although Americans accept the idea that unequal pay motivates hard work,” they find, “a solid majority (59 percent) disagree with the proposition that large differences in income are ‘necessary for America’s prosperity.’”
Not quite three quarters of those polled agreed that “differences in income in America are too large,” and more than two-thirds reject the idea that “the current distribution of money and wealth is ‘fair.’ ” The proposition that “the money and wealth in this country should be more evenly distributed among a larger percentage of the people” was supported by a large majority of respondents.
While inequality may sound like a Democratic talking point (at least during campaign seasons), the authors note that “solid majorities of Republicans (56 percent) and of high income earners (60 percent) agree that income differences are ‘too large’ in the United States. ... Majorities of Republicans (52 percent) and of the affluent (51 percent) favor more evenly distributing money and wealth.” A footnote indicates that the category of “high income” or “affluent” applied to “the 25.2 percent of our respondents who reported family incomes of $80,000 or more per year.”
While informed sources tell me that sales of small left-wing newspapers are up lately, Page and Jacobs are doubtless correct to describe the default setting of American public opinion as a kind of “conservative egalitarianism.” Citizens “want opportunities for economic success,” they write, “and want individuals to take care of themselves when possible. But they also want genuine opportunity for themselves and others, and a measure of economic security to pursue opportunity and to insure themselves and their neighbors against disasters beyond their control.”
And to make this possible, they are reconciled to taxation. “There is not in fact a groundswell of sentiment for cutting taxes. When asked about tax levels in general, only a small minority favored lowering them; most wanted to keep them about the same. Asked to choose among a range of estate-tax rates on very large ($100 million) estates, only a very small minority of Americans – just 13 percent of them – picked a rate of zero. The average American favors an estate-tax range of about 25 percent. ... Most Americans say the government should rely a lot on taxes they see as progressive, like corporate income taxes, rather than on regressive measures like payroll taxes. To our surprise, a majority of Americans even say that our government should ‘redistribute wealth by heavy taxes on the rich,’ a sentiment that has grown markedly over the past seventy years.”
And all of this data was gathered, mind you, well before jobs, housing, and retirement savings began to vaporize.
Nothing in Class War? quite answers the question of what political consequences logically follow from the polling data. Perhaps none do, in particular. What people want (or say that they want) is notoriously distinct from what they will actually bestir themselves to do. But it’s worth noting that Page and Jacobs found broad support for increasing the pay of low-income jobs, and drastically reducing the income of those who earn a lot.
“Sales clerks and factory workers should earn $5,000 more a year (about 23 percent more), according to the median responses of those we interviewed,” they write. At the same time, people “want to cut the income of corporate titans by more than half – from the perceived $500,000 to a desired $200,000. Imagine the reaction of ordinary working Americans if they learned that the CEOs of major national corporations actually pulled in $14 million a year.” Yes, imagine. Then something other than tea might start brewing.
Thanks to an edition now available online from the University of Michigan Library, you can easily look up the word "Revolt" in the great Encyclopedia that Diderot and d'Alembert compiled in the 18th century as part of their challenge to the pre-Enlightenment order of things. A revolt is an "uprising of the people against the sovereign" resulting from (here the entry borrows from Fénelon) "the despair of mistreated people" and "the severity, the loftiness of kings."
That certainly counts as fair warning -- and indeed, the Encyclopedia then shifts into wonkish mode, advising any monarch who happened to be reading that he could best control his subjects "by making himself likable to them... by punishing the guilty, and by relieving the unhappy." Plus he should remember to fund education. Won't someone please, please think of the children? While Louis XVI was by no means a total dullard, it seems this advice was wasted on him. (See also "Regicide.")
Scores of other occasions when "the despair of mistreated people" collided with severe, lofty, and unlikable authority are covered in The International Encyclopedia of Revolution and Protest, 1500 to the Present, just published by Wiley-Blackwell. With seven volumes of text, plus one of index, it covers upheavals on every continent except Antarctica, which tends to be pretty quiet. A digital edition is also available.
The work of an international team, the Encyclopedia is edited by Immanuel Ness, a professor of political science at Brooklyn College of the City University of New York. I have been reading around in it (as is my wont) for the past few weeks, when not following the latest tweets of resistance from within Iran, and wanted to ask Ness a few questions about his project. He responded to them by email. A transcript of the exchange follows.
Q: The title of this reference work raises a question. Protests do sometimes lead to revolution, of course, but none that I've ever attended has. Although both activities involve departures from (and opposition to) the routines of a given society, revolution and protest seem to be rather distinct processes. Why bring them together like this?
A: Revolution and social transformation are ultimate goals of protests that have arisen from collective grievances when leaders of states and societies are unable or unwilling to come to terms with abject inequality or injustice. Undeniably not all protests lead to revolutionary change and most protesters will not live to see the results of their actions. But when successful, they are the culmination of waves of social grievances against authoritarianism, social and economic inequality and injustices frequently expressed over decades and even centuries.
In the project, we document when protests lead to revolution as well as demonstrations that are manifestations of systemic injustices, even if a revolution did not result immediately thereafter. Thus, the Bolshevik Revolution consolidated the mass peasant and worker movements that peaked in the early 20th century. By contrast, in the Philippines, the powerful peasant protest movements have failed to lead to a transformation of society. While the goal of creating a democratic and equitable society remains unfulfilled there, mass protests persist against injustice.
Despotic systems of rule can stave off resistance, but the history of the last 500 years demonstrates that revolutionary change is an ineluctable process.
Q: OK, but that assumes revolution and protest are means to the ends of justice and equality. I'm not sure the entries in the Encyclopedia all confirm that notion. There is one on fascism, for example: a movement that regarded itself as revolutionary, as the sociologist Michael Mann has emphasized. And come to think of it, the "Tea Party" events in the United States earlier this year were protests, of a sort -- but for the most part they were just media stunts. How do you square this messy reality with what sounds like a base-line conception of revolution and protest as the midwives of progress?
A: In developing the work, we debated whether to include fascism and totalitarianism as social movements, and decided they were necessary to maintain a definitive unbiased understanding of the history of protests and revolutions. In many instances, demagogues across the political spectrum have used populist rhetoric and forces to defend violence and repression.
We were cognizant of the manipulative use of revolutionary rhetoric and symbols by repressive leaders to maintain and achieve power. But these entries also examine the popular discontent and resistance to injustice and oppression. For example, throughout Europe, we focused on the proliferation of partisan opposition to Francoism, Nazism, Fascism, and Stalinism. Similarly in the Global South, we documented popular opposition that emerged in response to dominant religious, ethnic, class, and oligarchic rulers that have relied on violence to repress the powerless.
As sociologist James Scott showed in his work on guerrilla movements, armed resistance often redounds against the most powerless, who are frequently caught, reluctantly, in the crosshairs of conflicts; we documented such cases as well. But, in researching modern history, while we may disparage the motivations of some reactionary movements that were cynically manipulated by leaders, the vast majority of social protest was engaged in by ordinary people seeking justice, equality, and social inclusion.
Q: Your project is ambitious; it seems to cover the whole world. Is this a matter of some editorial orientation towards the new global or transnational history, or was it simply a matter of the various movements and uprisings seeming to be interconnected and to influence each other (as the cross-references tend to show)? And why does the period it covers start in 1500?
A: In crafting the project, from the outset, we were mindful of utilizing an approach rooted in world history, which seeks a broader examination of human civilization rather than the geographically parochial and theoretically circumscribed western civilization that I consider fairly indifferent to the majority of people who live throughout the world. Geographically, the project is framed from the perspective of world history, which appreciates the dominant processes of empire, migration, capitalism, environmental change, and political and social movements.
Using a world history frame, we found that many political movements are interconnected as arcs of resistance that appear through the processes of imperial expansion and resistance in various spheres of influence. For instance, Latin American resistance to Spanish colonial rule and then to the post-colonial order can be viewed through arcs of resistance against European dominance, slavery, and racism, and then through indigenous struggles for civil and equal rights -- arcs that have appeared throughout the last 500 years but emerge more decisively at various historical moments. The numerous essays on indigenous movements, for instance, reveal that resistance throughout the Americas is reaching a new apogee in the contemporary era.
The decision to begin with protests and revolutions at 1500 recognizes the important historical and social science research that identifies the beginning of the modern era with the dramatic expansion of European imperialism, emergent capitalism, and slavery that significantly emerged and rapidly expanded as major forces throughout the world. The temporal organization owes much to the path-breaking historical work of Fernand Braudel and the Annales School and Immanuel Wallerstein and subsequent World Systems Theorists.
Q: Any revolution is an interpretive minefield. Even nomenclature provokes arguments. (You can't refer to the Khmer Rouge in Cambodia or the Shining Path in Peru without somebody calling you a running dog lackey of the imperialist bourgeoisie for using those terms, since the respective organizations preferred to be called something else.) How did you contend with the need for balance and objectivity -- given that in some cases the very possibility of them is in dispute? Your introductory note for the Encyclopedia says that each article was examined by two members of the editorial board, and that more than half of the submitted pieces were rejected on various grounds. Did that mean you had to leave certain subjects out?
A: Achieving balance and objectivity in each entry was one of the greatest challenges of editing the work. In part this involved seeking to include editors with erudition in their respective fields who held a range of perspectives on the history of protest and revolution. While contributors were enthusiastic about this work, even writers with similar perspectives did not necessarily agree with all the interpretations and conclusions. It reminds me of the Italian adage on the divisions on the left: “amici nemici parenti serpenti” (friends can be enemies but families are like a nest of vipers). Of course, the editors engaged in a respectful exchange of views, but people had different interpretations of the events and organizations. The encyclopedia includes arguments reflecting a range of opinion, but through the referee process, I ensured that the historical facts were correct, even if people reached different conclusions.
The history of the Cold War demonstrates that the US and Soviet Union supported various movements for the purpose of maintaining influence, even if those movements engaged in horrible acts of genocidal violence and brutality. We document each of these cases candidly even if the facts are jarring to one’s political affinities. The US supported the Khmer Rouge in Cambodia even after the party killed some 2 million civilians and was deposed through armed intervention by Soviet-supported communist Vietnam. Even if the narrative histories are disturbing to Maoist supporters of the Khmer Rouge and other groups, it is crucial that we document the horrific unfolding of events.
Still, in the case of Cambodia, while it is easy to blame the Khmer Rouge for all the violence, history demonstrates that for more than 100 years, European and then U.S. colonialists bear responsibility for destroying a culture and society. So, I think that it is crucial to understand the imperialist antecedents that set the stage for militant movements such as the Khmer Rouge and the Shining Path.
Through peer review we selected the most erudite essays submitted on similar topics. Our goal was to have each entry provide an entry point into a historical field of enquiry by supplying extensive references. Even in an eight-volume work, our objective was to emphasize historical significance while remaining comprehensive. We are updating this work next year to include additional worthy essays.
Q: The situation in Iran has taken a dramatic turn over the past month. Is this a new stage of the revolution that began in 1978-79? A repudiation of it? Something provoked by the CIA? Part of a larger wave of protests stimulated (directly or indirectly) by the global economy? A predictable consequence of so much of the population being young and full of rising expectations?
A: Well, as a rule, we avoided entries on events of the last five to ten years, since the jury is still out and it is impossible at this point to gain more than a general sense of the social forces on the ground. Thus, while some recent events are included, the passage of time is essential to understanding the forces at play. As such we excluded some of the “color revolutions,” as it is too soon to discern the various groups engaged in the contestation for power. I have noticed that some in the West have already dubbed the Iranian protests the “Green Velvet Revolution,” almost as if it were part of a branding process.
It appears that some sort of democracy is in play today, irrespective of the forces manipulating the protests for their personal or factional benefit. In the Encyclopedia one can learn that the democracy movement in Iran is not a recent phenomenon but endures from the decisive electoral victory of Mohammad Mossadegh in 1951, which represented a repudiation of British interference in Iran, a theme in the unfolding of events today. But the CIA-supported Shah Mohammad Reza Pahlavi’s 1953 military putsch went on to annihilate all democratic opposition. With all democratic forces crushed by the Shah, the Shiite Islamic clerics gained currency in the wake of the Iranian Revolution of 1979, just as Napoleon consolidated power after the French Revolution. Nonetheless, the popular will for democracy, equality, and popular control remains a significant force in Iran, as it did in nineteenth-century France. I think that while foreign meddling may have occurred, last month’s elections also reveal that the vision of a democratic and egalitarian society remains unvanquished.
Submitted by Cary Nelson on September 15, 2009 - 3:00am
It’s not easy to find a country in the Middle East whose universities honor academic freedom as we know it in most Western countries. Syria is a police state, comparable in some ways to North Korea or Myanmar. Iran has substantially become one. Egypt’s security police maintain a chilling presence on campus. The one country that maintains academic freedom is Israel, though of course not in the occupied territories. The comparative climate for intellectual debate in the region is too often ignored or slighted in discussions promoted by the various boycott movements. Simple intellectual honesty and political accuracy require that every discussion of Israeli academic conduct be framed with a reminder of the regional context. Otherwise, inadequately informed audiences can become victims of demagoguery, and an exceptionalist fantasy of Israeli monstrosity can take hold.
But the dynamic of debate in the Israeli academy has suddenly changed, and part of the debate is now being conducted in American venues. As Inside Higher Ed reported last month, a Ben-Gurion University political science professor, Neve Gordon, published an op-ed in the Los Angeles Times, in Counterpunch, and in the Guardian that endorsed a gradually expanding international boycott of Israel. In her response, also published in the LA Times, Ben-Gurion University’s president, Rivka Carmi, ventured not only to castigate Gordon but also to redefine academic freedom in ways contrary to the traditions of the American Association of University Professors.
With these very troubling ideas circulating in the United States, a clear need for the AAUP to address the story has arisen. That need is underlined by the fact that several American scholars writing about the Middle East have either lost their jobs or had their tenure cases challenged because of their scholarly or extramural publications. Statements by Carmi and other Israeli administrators thus have the potential to help undermine academic freedom not only in Israel but elsewhere. These are in every sense worldwide debates.
As the Inside Higher Ed story points out, Gordon has been critical of Israeli conduct for some time. His protest columns regularly appear in The Nation here in the United States and in the Guardian in Britain, and he is the author of a 2008 book called Israel’s Occupation, published by the University of California Press. All this work, including the LA Times column, falls within his areas of academic specialization. It ranges from scholarly publication to extramural speech. It is all without question covered by academic freedom. Carmi’s assertion that the LA Times column “oversteps the boundaries of academic freedom — because it has nothing to do with it” is wholly unsupportable.
Gordon’s column, it is worth noting, adopts a somewhat different persona than a number of his other pieces about Israeli policy. It is not, for example, a straightforward protest against Israeli military actions, but rather a confessional staging of his anguished journey toward boycott advocacy: “as I watch my two boys playing in the yard, I am convinced that it is the only way that Israel can be saved from itself.” He has, he is suggesting, had a breakthrough amounting to a recovery of his humanity, something thereby that his opponents implicitly lack. Throughout his 2009 responses to the Gaza invasion he has been moving in that direction, suggesting earlier that he opposed Israel’s military action despite Hamas rockets falling near the home he shares with his children, and arguing that the invasion is distorting the humanity of Israeli children.
I am willing to believe that this tactic is both genuine and a calculated rhetorical strategy, but in either case it has probably contributed to the intensity of the response, since it frames the LA Times piece not as political polemic but as a personal narrative about, as he puts it, “the question that keeps me up at night.” It thus has special power to move ordinary readers, and many of those readers here and abroad have responded passionately. Publishing the column in the United States, rather than Israel, was, to be sure, a deliberate provocation. It moved the argumentative terrain to that of Israel’s major military and political ally, and to the home of many of Israel’s and his own university’s most important donors. The affront was not simply in what he said but where he said it, though it is hardly the first time Israeli scholars of both the Right and the Left have brought these debates to American shores. The response both here and in Israel has been intense. As we saw in the Ward Churchill case, academic freedom does not always fare well in a public firestorm.
The public response called for a principled defense of academic freedom by President Carmi. Instead, she made herself part of the public outcry against Gordon. Worse still, Carmi has sought to narrow academic freedom and undermine the protections it offers, calling Gordon’s column an effort “to advocate a personal opinion, which is really demagoguery cloaked in academic theory.” The notion that a political scientist cannot combine academic arguments with conclusions, theory with advocacy, strikes at the heart of the principle that academics have the right to advise the public and seek an impact on public policy. As Matthew Finkin and Robert Post argue effectively in their 2009 book For the Common Good: Principles of American Academic Freedom, faculty speech in scholarly venues and in the classroom cannot be protected (and cannot fully serve society) if faculty members are not also free to deploy their expertise in the public sphere without fear of government or university reprisal.
Gordon calls for a boycott of the state of Israel, thereby advocating something much more comprehensive than the focused boycott of academic institutions that the AAUP opposes. Some Israeli commentary claims Gordon’s remarks amount to treason, a dangerous and overheated accusation that responsible opinion must reject. Gordon is in fact performing his job as a political scientist and following reasoned moral and professional standards in doing so. Even if he were not a political scientist, he would have the right to say these things, but as a political scientist who writes about Israeli policy he has a disciplinary justification to offer advice and opinion in the public sphere. But academic freedom should protect still more extreme statements than those Gordon has made; it should hold harmless a faculty member who argues that his or her country has no moral or political legitimacy and thus no right to exist.
Extramural statements by faculty are especially vulnerable in times of national crisis. The United States can hardly be said to have protected them during World War I or in the McCarthy period. Many in the Middle East, including many Israelis, consider themselves to be in a permanent state of war. In many area countries Gordon would already be imprisoned or worse. In Israel his right to public speech is being eloquently defended by many both within and without the academy — but not, deplorably, by his own university administration. On several Israeli campuses petitions supporting Gordon have circulated, and a number of scholars have come to his defense. Once again, such robust debate hardly typifies all area countries.
Since Gordon is tenured and cannot be fired, Carmi instead bellowed that he “has forfeited his ability to work effectively within the university setting.” A few days before publishing her LA Times piece, Carmi had already urged Gordon to resign, a view endorsed by Ben-Gurion University’s rector and faculty member Jimmy Weinblatt.
On August 28th, Ilana Curiel reported in Israel News that Carmi and Weinblatt were also exploring options for removing Gordon as department head. There, it should be clear, Ben-Gurion administrators are on more secure ground. In the United States a faculty member serving as an administrator -- including a department chair -- is essentially an at-will employee, who can be removed from an administrative post and returned to the faculty for displeasing a supervisor. In a public case like this one, of course, Carmi will be contemplating public fallout from a decision to force Gordon out of his chairmanship, so a good deal more than simple line administrative authority is at stake.
Indeed it has been clear from the outset, as Carmi openly acknowledged in an August 27th letter to Ben-Gurion faculty, that donor anger is a major factor in her attacks on Gordon. Inside Higher Ed reported that Amos Drory, Ben-Gurion’s vice president for external affairs, wrote to complaining donors to say “the university is currently exploring the legal options to take disciplinary action.” It is not the first time fund-raising priorities, not principle, have shaped administrative understandings of academic freedom, but that does not blunt the lesson that this represents one of the most severe threats to academic freedom.
Carmi’s own academic freedom, one may note, would have allowed her to reject Gordon’s views while asserting his right to hold them. That is, in effect, what Gordon recommended: “She has to cater to the people that provide the money, so a strong letter of condemnation of my views would have been fine with me. But there’s a difference between saying you disagree with me, and threatening me.” Instead she mounted an international assault and sought to gut academic freedom in the process. While Gordon has job security, his vulnerability to myriad other forms of internal reprisal is obvious. There are many kinds of research support and institutional recognition that require administrative endorsement. More serious still is the message Carmi has sent to untenured and contingent faculty: exercise your academic freedom at your peril. The chilling effects at Ben-Gurion University have hardened into a deep freeze. There is reason for principled faculty to question the president’s ability to serve in her position.
Cary Nelson is national president of the American Association of University Professors.
The term “neoconservative” is now routinely applied to any right-wing policy wonk inside the Beltway or the mass media. This usage reflects no understanding of the movement's history – or, just as often, a largely delusional notion of it, based on third-hand guesses about the influence of Leo Strauss and Leon Trotsky. Such rumors tend to be circulated by people who would be hard pressed to name a single book by either of them, let alone to grasp that their ideas were utterly incompatible.
Properly used, the label applies to a rather small cohort of social scientists and journalists who, during the 1950s and ‘60s, became anxious about Communist influence abroad – but equally uneasy at movements for black power, women’s liberation, and (a bit later) gay rights within the United States. “We regarded ourselves originally as dissident liberals,” wrote Irving Kristol, who died last week at the age of 89.
Kristol is often, and rightly, called the godfather of neoconservatism (in something akin to the Marlon Brando sense). An editor at the CIA-funded journal Encounter during the 1950s, he was later one of the founders of the journal The Public Interest, and a columnist for The Wall Street Journal.
“We were skeptical of many of Lyndon Johnson’s Great Society initiatives," he wrote in Neoconservatism: The Autobiography of an Idea (Free Press, 1995), "and increasingly disbelieving of the liberal metaphysics, the view of human nature and of social and economic realities, on which those programs were based. Then, after 1965, our dissidence accelerated into a barely disguised hostility. As the ‘counterculture’ engulfed our universities and began to refashion our popular culture, we discovered that traditional ‘bourgeois’ values were what we believed all along, had indeed simply taken for granted.”
Translating this self-perception as the very guardians of civilization into a politically efficacious movement was not a swift or simple matter. Nor was the Republican Party its obvious or immediate destination. With Kristol as its helmsman, the movement built up a network of magazines, think tanks, and mass-media perches for punditry. These amounted to a counter-counterculture. Thirty years ago, Peter Steinfels’s intelligent and well-researched book The Neoconservatives: The Men Who Are Changing America’s Politics provided a group portrait of the movement on the eve of Ronald Reagan’s election – an ideological cohort with one foot planted in each party.
This wide stance cannot have been comfortable. And in any case, the political realignment of 1980 settled the matter. Reagan was, Kristol wrote in 1995, “the first Republican president since Theodore Roosevelt whose politics were optimistically future-oriented rather than bitterly nostalgic or passively adaptive. The Congressional elections of 1994 ratified this change, just as the person of Newt Gingrich exemplified it. As a consequence, neoconservatism today is an integral part of the new language of conservative politics.”
Indeed it is, for better or worse. But not through sheer intellectual firepower alone. As that flourish of tribute to “the person of Newt Gingrich” may suggest, the progress of neoconservatism has also involved cultivating the courtier’s grace. There is a knack for knowing just when to apply one’s lips to the fundament of power.
In the early 1990s, it was still possible for Gary Dorrien, now a professor of ethics at Union Theological Seminary and of religion at Columbia University, to write a critical but sympathetic book called The Neoconservative Mind: Politics, Culture, and the War of Ideology (Temple University Press, 1993) that treated it primarily as a movement of ideas, locked in struggle against the prevailing drift of American society. It would be difficult to write about the intervening years in similar terms. Neoconservatism itself became part of that prevailing drift.
Whatever elan its intellectuals once displayed in challenging accepted ideas and trends turned into the kind of second-hand energy available from just going with the flow. This is not good for anyone's critical faculties. A new book by Sam Tanenhaus called The Death of Conservatism, published by Random House, spells out some of what has happened.
Tanenhaus, the editor of The New York Times Book Review, might fairly be called a fellow-traveler of neoconservatism, if not a full-fledged member of its counter-counterculture. His criticism is presented, not in the spirit of polemic, but with the tone of someone grappling with home truths. “During the two terms of George W. Bush,” he writes, “conservative ideas were not merely tested but also pursued with dogmatic fixity, though few conservatives will admit it, just as few seem ready to think honestly about the consequences of a presidency that failed not because it ‘betrayed’ movement ideology but because it often enacted that ideology so rigidly: the aggressive unilateralist foreign policy; the blind faith in a deregulated, Wall Street-centric market; the harshly punitive ‘culture war’ waged against liberal enemies.”
There is a considerable nostalgia to Tanenhaus’s evocation of an earlier period, when argument “about the nature of government and society, and about the role of politics in binding the two” was conducted by “a small group of thinkers and writers” whose ideas “then ramified outward to become a broader quarrel that shaped, and at times defined, the political stakes of several generations.”
But now this is all just a memory. “Today’s conservatives resemble the exhumed figures of Pompeii,” he writes, “trapped in postures of frozen flight, clenched in the rigor mortis of a defunct ideology.”
I picture them clutching signs that read “Keep the government out of Medicare,” in Latin.
Important as it was, the campaign of Barack Obama was not the only history-making element of the 2008 presidential election. With Sarah Palin, we crossed another epochal divide. The boundary between reality television and American politics (already somewhat weakened by the continuous "American Idol" plebiscite) finally collapsed.
Her campaign's basic formula was familiar: members of an ordinary middle-class family turn into instantly recognizable national celebrities while competing for valuable prizes.
But like any contestant at this late stage of an already decadent genre, Palin seemed much less conscious of the stakes of the game (power) than of how it let her broadcast her own sense of herself.
At that level she could not lose – the ballot box notwithstanding. I’m not sure what Sarah Palin’s favorite work of postmodern theory might be (all of them, probably) but she seems to take her lead from Jean Baudrillard’s Seduction. Other political figures use the media as part of what JB calls “production.” That is, they generate signs and images meant to create an effect within politics. For the Baudrillardian “seducer,” by contrast, the power to create fascination is its own reward.
Watching Palin respond to questions about her book Going Rogue (or not respond to them, often enough) is, from this perspective, no laughing matter. She grows ever more comfortable talking about herself. If no more capable of simulating knowledge of public issues, she is getting her story straight, more or less. And this matters. For now she does not have to be accurate, just coherent. She is consolidating her presence, her "brand." Teams of professional ideologists can feed Palin her lines later.
Is this too cynical? I fear it may not be cynical enough. For it assumes that Palin will eventually be integrated into her party’s apparatus and turned into a mouthpiece of old-school Republican electoral politics -- a basic platform of tax cuts for the rich and unregulated handgun ownership for everybody else.
That is not the only possible outcome, however. Someone with Palin’s developing command of the arts of media seduction -- and whose knack on that score is largely a matter of her performative maverickiness -- has the potential to change the rules of the game.
The editors of a new collection of essays called Going Rouge – a punning title that belies its basic seriousness – recognize that in Palin we may have something more than a new celebrity. “No one speaks of McCainism or Doleism,” write Richard Kim and Betsy Reed in their introduction, “but Palinism signals not just a political position but a political style, a whole way of doing politics.”
The volume itself is the product of a whole new way of doing serious nonfiction. It is the first title from OR Books, which has a staff, so far, of two people. One of them is Colin Robinson, who roughly this time last year lost his job as an editor at Simon and Schuster. He tells me that OR now has two offices. One is the coffee shop where he and his partner John Oakes (co-founder of independent publisher Four Walls Eight Windows) work in the morning. The other is the bar they go to at night.
When we talked earlier this year, Robinson described his idea for a new kind of trade publishing. The usual approach is to print an enormous number of copies of a title to get an economy of scale, then give large discounts to chain bookstores – leaving almost no money to promote it. For serious nonfiction, this was a miserable system. Any money for advertising tended to go to publicize, say, The Stephen King Cookbook or suchlike. (Palin's autobiography is an example of a book enjoying just such heavy promotion.)
His plan, Robinson said, would be to publish a few titles that he thought were worthwhile, making them available as e-books and print-on-demand paperbacks -- and then concentrate on advertising them online, among other ways via video. So far you have to buy Going Rouge directly from the publisher (it sold about 4,000 copies before its official publication date on November 16), but it will be available for order from bookstores next month.
Most of the chapters are reprints from magazines such as The New Yorker, The New Republic, and The Nation; a few first appeared on Web sites. The list of contributors is a Who’s Who of left-leaning journalists and commentators: Max Blumenthal, Juan Cole, Naomi Klein, Rick Perlstein, and Katha Pollitt, among others. There are a few critical evaluations of Palin by her fellow Republicans, including one by a conservative columnist who suggests that she makes George W. Bush “sound like Cicero.” The editors also reprint a number of interviews with and public statements by Palin herself – among them, selections from her Twitter and Facebook writings.
A celebration, then, it is not. But Going Rouge does represent an acknowledgment of Palin’s importance, ambiguous though the precise nature of that importance may be. It cannot be reduced to her short-term plans. She remains circumspect about them, for now anyway. But she is busy demonstrating a strong intuitive grasp of how mass media can be used – among other things, to change the subject.
An example is the item Palin posted on Facebook in early August: “The America I know and love is not one in which my parents or my baby with Down Syndrome will have to stand in front of Obama’s ‘death panel’ so his bureaucrats can decide, based on a subjective judgment of their ‘level of productivity in society,’ whether they are worthy of health care. Such a system is downright evil.”
This was fantasy. But it was effective fantasy. To borrow again from Baudrillard, it seduced -- abolishing reality and replacing it with a delirious facsimile.
The editors of Going Rouge give Palin credit for the rhetorical power generated by her words, and perhaps also by her canny use of the social-networking venue: “With remarkable economy of prose, Palin cast health care reform as an assault on the country, put a face on its supposed victims (her baby Trig), coined the expression ‘death panel’ (linking it directly to Obama), raised the specter of euthanasia in the service of a state-run economy, and rallied the troops around a fight against ‘evil.’ In short, she personalized, popularized, and polarized the debate. Never mind that Democratic health care reform bills merely funded optional end-of-life consultations that had heretofore been almost universally acknowledged as a good. (Indeed, Palin herself once championed them in Alaska.)”
Well, consistency is, after all, the hobgoblin of tiny minds. Sarah Palin is playing the political game on a much grander scale -- with rules she may be rewriting as she goes.
With a first printing of 1.5 million copies of her book, I don’t know that the intervention of an upstart press can pose much of a challenge. But OR Books deserves credit for trying. Someone has to speak up for reality from time to time. Otherwise it will just disappear.