A variety of scholars have weighed in on the current debate about American political civility, noting brutal fights on the floor of Congress in the 19th century, nasty mud-slinging of U.S. presidential campaigns throughout history, and other less-than-impressive aspects of our cultural past. And of course, they are correct that incivility is nothing new. What makes incivility seem omnipresent is the communication environment of our day: the pressure on journalists to fill 24/7 airtime, new venues for citizens to state their opinions -- thoughtful or lunatic -- online, and a culture that encourages unabashed self-expression.
Who thought we would see the day when CNN news anchors would read incoming “Tweets” from viewers to us in serial fashion, opening an international information channel to faceless, opinionated people with no qualification for broadcasting except time on their hands?
It was difficult not to be appalled by the excesses of campaign rally crowds during the 2008 presidential election, the displays at some health care town hall meetings this past summer, and Congressman Joe Wilson’s outburst ("You lie!"). Students of American political history easily put these events in context, because incivility has been manifest in a variety of ways during different eras. But that scholarly response seems a very unsatisfying reaction to the ill-mannered eruptions, name-calling, and sheer meanness that we find on television and our favorite internet sites, now on a regular basis. The incivility is still worrisome, even if historically predictable, and we look for a way to cope with it.
The scholarly literature on trends in civility is mixed in its conclusions, with some arguing for either a bumpy or near-linear increase of incivility in both the United States and Western Europe, others arguing that we are actually more polite now than ever in public, and still others – like me – positing that civility and incivility are both timeless strategic rhetorical weapons. Some people are better than others at using these tools to achieve their goals, but a macro-historical argument about collective civility is probably a bit of a stretch and difficult to demonstrate empirically, to say the least.
The “incivility as strategy” approach fits our current circumstances, particularly the health care reform debate, fairly well. The political right now draws on Saul Alinsky’s mid-century tactics on behalf of the poor in Chicago for instruction on town meeting behavior, and the political left tries to come up with brutally effective broadcast advertisements, guided by the Republican “Harry and Louise” spots that undermined the Clintons in the 1990s. Civility and incivility are weapons, as are facts, logic, demonstrating, teaching, striking, and all the other means of persuasion one finds in the arsenal of public expression.
But perhaps the essential issue is that incivility is just more interesting than is measured, calm discussion. Incivility is intriguing, almost always. It can be downright exciting, as when blows are exchanged at a town meeting, and replayed like a train wreck on YouTube by millions of viewers. And who is not fascinated by citizens (apparently on the same side of the issues) marching with pictures of the president portrayed as both Lenin and Hitler? It is bizarre, and also hard to take our eyes off of.
As President Obama put it on a recent broadcast of 60 Minutes: "I will also say that in the era of 24-hour cable news cycles that the loudest, shrillest voices get the most attention. And so, one of the things I'm trying to figure out is, how can we make sure that civility is interesting. And, you know, hopefully, I will be a good model for the fact that, you know, you don't have to yell and holler to make your point, and to be passionate about your position."
Obama might, over the longer term, fight incivility in part by maintaining his own preternatural calm throughout incessant appearances on television. But my sense is that exciting nasty discourse needs to be matched by something that gets the blood boiling just as well, or incivility will indeed triumph in any given situation.
Soaring rhetoric from President Reagan in the past, Obama today, and others with their talents in the future may be passionate, but as rhetoric soars, it does not always argue. Great oratory gets steamed up when it expresses hopes and beliefs (e.g., Americans cannot always support other citizens financially, or health care is an inalienable “right”), not when it argues for, say, the "public option" or insurance cooperatives. So, the trick is to find mechanisms for public policy discussion that are exciting, passionate, creative, and thoughtful all at the same time.
From the ancient philosophers onward, a variety of academics across disciplines have tackled the questions of rhetoric, persuasion, political debate, and civility, and as a result, we can offer a tremendous amount of theorizing and empirical research on these topics. But that complex material simply will not penetrate or guide contemporary American public discourse any time soon. And pointing to our campuses as models -- underscoring the ways we debate and argue with respect for each other every day (or nearly every day), in classrooms, faculty meetings, symposiums, and beyond – doesn’t go very far either. It’s hard to explain unless you have lived it: Imploring political leaders or fellow citizens to look to universities as exemplars of "cultures of argument" will not work because it is too experiential in nature.
However, colleges and universities do offer American lawmakers, journalists, and interest group leaders practical ideas and tools that are far more helpful and productive. There is the wonderful work by Gerald Graff and others on teaching argument and conflict, demanding that our students know how to make an argument in class, in papers, and as they go about their lives. As the years pass, these scholars have made a difference, and my bet is that their impact will be even greater as a younger generation of faculty learns how to incorporate argument into their teaching, no matter the discipline or class size.
But even more accessible than these pedagogical paradigms and tools is formal debate itself, from policy debate modeled by national championship college and university teams, to Lincoln-Douglas-style debate, and a variety of other formats that have emerged across nations. While I was only a high school debater myself, and I'm now far outside both the high school and collegiate debate “circuits,” it is clear to me that if we can train our students – not only our student leaders and teams – in debate, and make it a stronger presence on campuses, we might build a more constructive public discourse with generational change. Anyone can debate – learn to make an argument, marshal evidence, rebut – with some instruction and practice. And these skills, once gained, can be translated into the sorts of forums our students will eventually find themselves in: workplace meetings, the PTA, community organizations, and in some cases, city halls and legislatures. We do not need to train a generation of lawyers, but we do need to train a generation of students who can simulate what attorneys and great debaters do as a matter of course.
There are many people, organizations and institutions that teach debate either for the classroom or for regional or national competitions, in the United States, abroad, and online. But the basic elements are the same across formats: Argument, evidence, forced reciprocity and dialogue, equal time, and mandatory listening. These are precisely the elements missing from much of the contemporary debate about health care reform, and I predict they will be absent as well from the worrisome debates coming next, immigration policy reform in particular. These aspects of communication are the very building blocks for civility, and at this point at least, we have a deficit of them.
Those of us who study political communication used to hope – and perhaps many scholars still do – that the best American journalists would educate the public on the quickly evolving policy issues before us, leading reasoned debate through newspapers and television programs. Some journalists give it an honest try when they hold jobs that allow it. And we can locate a few lone heroes among the Sunday morning talking heads, if we wade through all the worthless talk of presidential popularity polls, embarrassing gaffes, and who is spinning whom. But with the financial struggles and disappearance of so many news organizations, it is difficult for any journalist – no matter how talented – to get our attention.
They compete, for better or worse, with bloggers and Twitterers, and wise information “gate-keepers” are leaving us with every passing year. It may be up to academic leaders to take on unexpected and much greater responsibility in shaping citizens, not just in our conventional ways of teaching liberal arts or specialized disciplinary knowledge. Of course we shape citizens already, but we must also figure out how to train our students for the rough and tumble they will find after they leave our contemplative campuses. It’s a jungle out there in the world of American political discourse, and our students will need to give it all some logical structure, and simultaneously invent new forms of civility for their generation.
Many colleges and universities teach public speaking at present, and some have made introductory courses mandatory in core curriculums or as part of major requirements in fields like Communications. Why not, similarly, consider formal debate training, as a mandatory – or at least greatly encouraged – aspect of a college curriculum? To my mind, it should at least be a consideration of all educators watching our national political debate in the fall of 2009. We can shut off CNN in disgust and sit in awe of some truly horrendous town meetings. But we can help things somewhat, by teaching our students both how to argue and why it is exciting to do so. College and university faculty can enhance the long-term health of political communication by focusing on the development of argumentation, in whatever form fits their courses, disciplines, institutions, and community.
Along these lines, Model United Nations is another excellent tool for teaching students how to argue respectfully and take positions they would not normally take. These programs demand more of students in a course than debate might, but as with teaching debate (in person or online), there is extensive support for instructors available for free on the Web. As with debate, the general structure of Model U.N. can be altered to fit a particular curricular goal or theme. For example, in teaching the Middle East conflicts and issues, the National Council on U.S.-Arab Relations supports a network called "Model Arab League" at both the high school and college levels. And of course, more ambitious faculty can try to fashion entirely new stakeholder-based deliberation programs, using the general rules of more established activities like Model U.N.
Our students will not – no matter how compelling and well-trained – be able to demand that their local school board follow the tight structure and rules of policy debate or of Congress (on a good day). That is an absurdity. But they will have an ideal-typical model for what logical, evidence-based debate should look like, and will inevitably bring some elements of it with them to whatever table at which they find themselves. I have found in so many groups and organizations that people are generally starved for rules about how to conduct their discussions – a rationalized (in Weber’s sense) approach that might bring fairness, civility, and progress. The point is that we need to give students exemplars, somehow, so they can lead others toward structures for talking, listening, and constructive exchange, based on mutual respect and decency. And they might even bring civility to the internet, developing new ways to harness free communication in the service of democratic talk.
The truth is that while Americans pioneered a kind of democracy, we have never been particularly good at debate -- not during Alexis de Tocqueville’s era, and not today. We certainly don’t seem to have the patience for it. There have been some intriguing presidential campaign exchanges here and there, memorable moments in congressional hearings, and of course many moving orators in mainstream politics and outside of it. But we will never see the sort of civil, thoughtful, inventive debate that enables good public policy making until we inspire the young adults in our midst to pursue it themselves.
Susan Herbst is chief academic officer for the University System of Georgia and professor of public policy at Georgia Institute of Technology.
Each spring I have the privilege of hosting at my home a group of students who have been honored with what we call the Presidential Leadership Award. These are graduating seniors who have demonstrated extraordinary leadership in their academic, co-curricular, and service work at the college. Often these are students I have come to know pretty well, so I was struck last April when not one but two of them asked me exactly the same question: What is it that you do, anyway?
The question was asked not as a challenge but out of a genuine sense of curiosity. They knew that, being the president, I must do something, and that given the size of my office and my residence in a college-owned house, it must be something reasonably important. They knew that if they were in my general vicinity they were likely to get their picture taken; that I served as a kind of collegiate maître d’, welcoming everyone from new students to visiting dignitaries to campus; and that my name appeared in the Mac Weekly more often than most, particularly on the opinions page. But whereas they could define pretty precisely the jobs of their professors and their coaches and their residence hall directors, they could not define mine.
There is a myth about the evolution of the American college presidency that runs more or less like this: "Back in the day" college and university presidents were figures of towering intellect who spent comparatively little time worrying about such mundane and vaguely unsavory things as fund raising and balancing budgets but instead provided visionary leadership for their institutions and, even more broadly, spoke with effect to the great issues of the day. Like many myths, this one has embedded within it at least some small element of truth. There have been in fact a handful of college presidents who have functioned as visible public intellectuals, and as the business of running a college has become more complex, the need for presidents to attend to matters financial has grown accordingly.
If the past year has taught us anything, it is that not only college presidents, but business people and politicians and individuals of every stripe should pay very careful attention to the advice offered to Dickens’ David Copperfield by the irrepressible Mr. Micawber: "Annual income twenty pounds, annual expenditure nineteen nineteen and six, result happiness. Annual income twenty pounds, annual expenditure twenty pounds ought and six, result misery." It is a president’s job to avoid institutional misery.
But anyone who believes that this responsibility is new, or that college presidents used to be free of such concerns, is deeply mistaken. Here is one president lamenting the financial pressures of the job: "What I was sent here for is an inscrutable mystery. I am too diffident to wrestle with men about money or with financial problems so vast….If [a college president] can read and write, so much the better, but he must be able to raise money." The voice is that of James Wallace, Macalester’s fifth president, writing in 1895.
The reality is that college presidents have always had to be concerned with what someone has termed both the business of education, or the work of preparing students to be successful in their personal, professional, and civic lives, and the education business, or the work of ensuring that the institution can pay its bills. Bill Bowen, the former president of Princeton University, recalls being told by a Nobel Prize winning physicist on his faculty that "excellence can’t be bought … but it has to be paid for."
The question of the extent to which a college president should function as a public intellectual is more interesting and the answer, in my view, more nuanced. Few would argue with the assertion that within the college community the president should provide intellectual, ethical, and even temperamental leadership.
The faculty is responsible for shaping the curriculum and carrying out the core educational work of the college; the president can aid that work by articulating, clearly and repeatedly, the context within which it takes place and the ends to which it is directed.
Further, a college president should be expected to model those attributes that are to a learning community most essential, including clarity of language and thought, civility, scholarly curiosity and rigor, openness to views that are different from one’s own, and an unwavering commitment to ethical behavior: in other words, everything that we have not seen manifested at the recent town hall meetings on health care reform. Being human, college presidents will sometimes fail to meet these exalted standards, but every day and in every setting they should try. This is important because fairly or not, members of the community will extrapolate from the actions of the president a sense of what is valued and accepted by the college.
For instance, if the president attempts to demonstrate regularly that she or he is the smartest person in the room — a habit that most of us acquire quickly in graduate school — others will assume that this is the appropriate goal to chase in an educational setting, whereas for me a more appropriate goal is for each of us to behave as if we are the person in the room with the most to learn. It's amazing how much better that works if one’s goal is actually to learn something.
Things get trickier when the question becomes the following: what role should a college president play in relation to the many political and social questions that extend far beyond the borders of the campus and in many cases divide our communities and our culture? This is, I confess, perhaps the single most difficult dilemma with which I wrestle in my position. As those who know me well will confirm, I am by nature a person with strong opinions and a preference for expressing them directly: after all, I grew up in New York City, which is not a place known for its delicacy and decorum. At my family’s dinner table, if you weren’t shouting, someone would ask if you were feeling OK. I am also enormously frustrated by the absence of thoughtful public discourse in this country and believe that those who are educated and who embrace rather than mock the life of the mind have a responsibility to raise the level of that discourse.
And yet — fairly or unfairly, reasonably or not, the views expressed by the president are typically seen as the views of the college that she or he represents. My personal desire to express publicly my opinions on controversial issues often comes into direct conflict with my professional responsibility to preserve academic freedom and an atmosphere of openness to all reasonable perspectives that are civilly stated. And in the end that professional responsibility must take precedence. Again I turn to Bill Bowen, who wrote that "the university should be the home of the critic, welcoming and respectful of every point of view; it cannot serve this critically important function if it becomes the critic itself, coming down on one side or another of controversial issues…. It is the freedom of the individual to think and speak out that is of paramount importance, and safeguarding this freedom requires that the institution itself avoid becoming politicized."
There is no truth about Macalester in which I believe more deeply and, simultaneously, to which it is more challenging for me to adhere. But my conviction is that in agreeing to become a college president, a willingness to be measured and restrained in one’s public statements — to accept one’s status as a walking, talking logo — is part of the deal. There is no principle that has generated more debate on campus, whether about boycotting various corporations whose policies are controversial or taking a stand on the war in Iraq or actively supporting a reduction in the legal drinking age. It is to wrestle with such difficult matters that college communities exist, and it is through such discussion that we draw closer to some kind of wisdom.
Now, this does not mean that I believe that I should say nothing about anything, though I’m sure there are those who think I do a pretty darn good job of saying nothing about everything. It means that I believe that I need to pick my spots with great care. In general, when I speak to issues of public significance, I try to focus on those that I take to be so central to the educational mission of Macalester as to require the college to make a decision about its policies and practices. Admittedly the line here is very fuzzy, and what one person considers central to our educational mission, the next might consider irrelevant. But life is composed of such ambiguities.
My point might be made more clearly through the use of a few examples. It seems to me inappropriate for me in my role as president to endorse a particular party or candidate in the race for the governor of Minnesota. I have opinions — boy do I have opinions — but to express them very openly runs the risk of suggesting that Macalester is taking an official, institutional position and even of jeopardizing our status as a tax-exempt organization. I consider it my civic duty to vote and my right as an individual to contribute from time to time to the campaigns of particular candidates, but I am typically reluctant to make public endorsements. Similarly I do not believe that I should be staking out through my public remarks Macalester’s position on health care reform or cap and trade or military intervention in Afghanistan. These are however precisely the issues that all of you should be studying, arguing about, and taking action on through your lives as students, scholars, and global citizens. My job is to ensure that Macalester provides the environment within which you can do these things, rather than to delineate in each instance the proper "Macalester" stance.
On the other hand, I have spoken out both individually and on behalf of Macalester on issues including the importance of diversity to higher education and the necessity for all of us to practice and model environmental responsibility. For me, these issues are inseparable from and directly relevant to our work as a college and therefore ones that I can and should address. Some might contend that the latter topic is one that falls outside the standards I have defined; my response is that the reality of climate change has passed beyond the point of reasonable debate and has become an essential component of responsible citizenship, whose encouragement, at least at Macalester, lies at the core of our mission.
So we have taken such public actions as signing an amicus brief in the University of Michigan affirmative action case and becoming early signers of the College and University Presidents’ Climate Commitment. I would be prepared to contend that not to take stands on issues of this kind — stands whose particular form will rightly vary from institution to institution -- would actually impair our ability to carry out our educational work and therefore that they are issues to which I should speak, both individually and as a representative of Macalester.
Of course there are also issues such as genocide, the spread of poverty and disease, and the violation of basic human rights against which institutions such as ours can take emphatic stands, though even in these instances the articulation of a proper response can become problematic and is often better consigned to the open realm of public discourse than to the more restricted realm of a presidential decree.
Again, is the line between issues on which colleges and universities should take a position and those that they should leave open to communal debate perfectly clear? Absolutely not. Is it important for anyone in my position to recognize that such a line exists, to decide on which side of it any particular issue falls, and to be scrupulously careful in making the distinction? To that question my answer is yes.
Brian Rosenberg is president of Macalester College. This essay is adapted from this year's opening convocation address.
Last year, college students were the most fervent supporters of Obama’s bid for the presidency. Now, the U.S. Senate has taken up what Obama says is the defining legislation of his term: health care reform. Oddly, the voice of college students is nowhere to be found in the national debate -- most likely because the activist set does not realize how much is at stake for them personally.
It might seem that college students have little to worry about. Most full-time students in fact have health insurance right now. Two-thirds are covered through their parents’ insurance plans and another 7 percent are covered through a university plan, according to the Government Accountability Office.
But one thing is guaranteed: College students with the good fortune to have insurance right now will lose their current coverage soon after graduation. Those who are insured through their parents’ plans will be dropped after they leave school. And students on a university plan will soon learn that the loyalty of their alma mater has limits: It does not extend to a lifetime of affordable health care.
What is a student to do? The current answer, unfortunately, is to get a job. And not just any job: a stable, full-time job with an employer that will offer them health insurance. That, in fact, is the bizarre reality of health care in the United States. We currently live in a system that presumes “employer-sponsored insurance,” in which you must have a steady paycheck before you can get affordable health care.
As college students surely know, however, the prospect of steady full-time work is looking worse than ever. The unemployment rate for young adults is up from 10 percent last year to a whopping 15 percent this year. Recent grads who have the good fortune to land a job will be more likely than older workers to work for small companies. But small employers are also the least likely to offer health insurance, and more small companies have dropped health insurance for their workers every year since 2000.
The alternative is to buy insurance individually rather than to bother with an employer. Unfortunately for recent grads in particular, the cost of these plans is rising faster than wages. As workers just starting their careers, college students will most likely have the lowest earnings of their lifetimes. Short of a steady job or enough money and know-how to navigate the private insurance market, the Class of 2010 will get insurance under the current system only if they are poor or disabled. Only then would they get scooped up by a government safety net program: Medicaid. But it’s not clear that any college students aspire to that fate.
This scenario does not even take into account the existential question that college seniors may be pondering right now: whether they even want to follow the straight-and-narrow path from college to traditional career. Entrepreneurs, activists, travelers, farmers, parents, artists -- be warned: All of those opportunities would require verve, intelligence -- and the willingness to sacrifice good health if need be. It is little wonder that people in their 20s are more likely to be uninsured than any other age group in the U.S. today.
Right now, the U.S. Senate is debating a bill that could help change this situation for college students. But many senators are not yet convinced that Americans really want health care reform. Do college students?
It is a good time for students to think through their answers. For one thing, Obama is calling for a vote on the Senate bill before Christmas. No doubt, health care bills are complicated and boring -- not exactly end-of-term pleasure reading. But students might start with a blog by the director of the White House budget office, Peter Orszag.
Heading into winter break, students also have the chance to think through the health care debate on a more personal level. They can find out when their current coverage is going to end. For those on a parent’s plan, it may come as a shock to find that they will lose coverage on Commencement Day.
Over the holidays, college students can also chat up their grandparents and other older relatives. Polls consistently show that people over the age of 65 are the most resistant to health care overhaul -- in large part because they want to protect their Medicare coverage.
College students do have a major stake in the outcome of the health care debate. So whether on campuses or on their own, students would be wise to think through the issues -- not for Obama’s sake this time, but for their own.
Laura Stark is an assistant professor of sociology and science in society at Wesleyan University; she co-wrote this essay with several Wesleyan juniors and seniors: Suzanna Hirsch, Samantha Hodges, Gianna Palmer and Kim Segall.
When considering the political scene of the moment, it is difficult not to see how historical allegory plays an important role in the public spectacle known as the Tea Party movement. From the name itself, an acronym (Taxed Enough Already) that fuses current concerns to a patriotic historical moment, to the oral and written references by some of its members to Stalin and Hitler, the Tea Party appears to be steeped (sorry) in history. However, one has only to listen to a minute of ranting to know that what we really are talking about is either a deliberate misuse or a sad misunderstanding of history.
Misuse implies two things: first, that the Partiers themselves know that they are attempting to mislead, and second, that the rest of us share an understanding of what accurate history looks like. Would that this were true. Unfortunately, there is little indication that the new revolutionaries possess more than a rudimentary knowledge of American or world history, and there is even less reason to think that the wider public is any different. Such ignorance allows terms like communism, socialism, and fascism to be used interchangeably by riled-up protesters while much of the public, and, not incidentally, the media, nods with a fuzzy understanding of the negative connotations those words are supposed to convey (of course some on the left are just as guilty of too-liberally applying the “fascist” label to any policy of which they do not approve). It also allows the Tea Partiers to believe that their situation – being taxed with representation – somehow warrants use of "Don’t Tread On Me" flags and links their dissatisfaction with a popularly elected president to that of colonists chafing under monarchical rule.
While the specifics of the moment (particularly, it seems, the fact of the Obama presidency) account for some of the radical resentment, the intensity of feeling among the opposition these days seems built upon a total lack of historical perspective. Would someone who really understood the horrors of Stalin’s purges still believe that President Obama sought to emulate the Soviet leader? Or, a drier example, could you speak of a sudden government "takeover" of health care, replete with death panels, if you knew of the long and gradual approach to building the modern American welfare state? The problem, of course, is that many Americans have at best a shaky hold on the relevant historical facts and are therefore credulous when presented with distortions and fabrications. Even after college graduation, too many students lack understanding of key historical developments. And that’s just college students – let’s not forget the majority of Americans who last studied history in their high school years, perhaps in a state like Texas, where Thomas Jefferson was just erased from the past because he is now considered too radical and the word "capitalism" has been replaced by "free enterprise" to help smooth out its rough edges.
It is important to realize that ignorance about history allows falsehoods and distortions to be presented as facts, but it is also significant that Tea Partiers look to history to legitimize their endeavors. In other words, history is still seen as authoritative; the problem is that the authority is being abused. Such abuse can succeed only when the public’s collective historical memory has been allowed to atrophy.
In addition to a vague (at best) recollection of the pertinent facts, Tea Partier warnings of cataclysm are taken seriously because the skill of thinking historically has not been emphasized in high school and college curriculums. Teaching students to understand that things change over time because of particular actions taken or not taken, and that context matters -- a skill often called "critical thinking" -- gives them some perspective and helps them take the long view that can illuminate the emptiness of sky-is-falling scare tactics. The politics of our moment, focused solely on what's happening this minute and what it means for the next election (no matter how far off), cry out for skeptical appraisal by an electorate that unfortunately does not know how to think historically.
In recent years, conservative groups like the Intercollegiate Studies Institute and the American Council of Trustees and Alumni have been the loudest critics of the low status of history in colleges in the United States. They are especially upset with the lack of American history requirements at elite universities. But this should not be solely a conservative issue, nor can it be one that professional historians ignore. As the Tea Party movement is demonstrating, there are direct political consequences if the public is unable to perceive when history is used to mislead and confuse people.
Unfortunately, as budgets are being slashed at colleges and universities nationwide, history is seen by many as impractical and unimportant. Courses that focus on “career-building” and “real-world skills” are prioritized while history departments are unable to replace retiring faculty. One reason for this is that the case for history has not been made effectively. As ACTA has reported, none of the top 50 universities requires its students to take U.S. history – and 10 require no history course at all. Some students may take a history course that fulfills a broader core requirement, but many do not. And too often these core courses are deficient in teaching historical practice. Historians, whether just entering the field or preparing to retire, have an obligation as people with special knowledge of history's significance to make the case for a greater commitment to the discipline – to students, campus administrators, legislators, and the public. Indeed, anyone concerned about education who does not want to see our contemporary political discourse sink lower should be actively interested in promoting history education.
This is an uphill battle. There is no easy-to-measure market value for teaching history, no space race to gin up patriotic sentiment, no simplistic explanation to combat the perception that studying the subject offers no reward. Yet as the Tea Party "movement" has made apparent, history continues to float in the air of our political discourse, its authority ripe for sucking into every imaginable debate. There will always be divergent interpretations of the past and disagreements about what facts to emphasize, and individual schools and teachers will construct their courses as they see fit. But most of all, we must redouble our efforts to foster historical thinking. Teaching students how historians find and use evidence to construct their arguments develops the critical skills necessary for sorting through the various and often outlandish claims available 24 hours a day on cable TV and the Internet. As long as people reference past events while staking out their positions in the present – and that is unlikely to change – a functioning democracy demands a citizenry capable of spotting historical fantasy and hyperbolic misapplication of historical precedent.
Erik Christiansen and Jeremy Sullivan
Erik Christiansen teaches history at the University of Rhode Island and at Roger Williams University. Jeremy Sullivan is a Ph.D. candidate in history at the University of Maryland at College Park.
Last week, leaders from higher education gathered at the White House for a conference on Advancing Interfaith Service on College Campuses. Senior administration officials from the Department of Education, the Corporation for National and Community Service and two White House offices – of Faith-based and Neighborhood Partnerships, and of Social Innovation – addressed the crowd of university presidents, professors, chaplains and students.
That the White House would hold a conference on interfaith cooperation is no mystery; President Obama made the topic a theme of his presidency from the very beginning. But why a gathering that focuses on campuses? I think there are four reasons for this:
College campuses set the educational and civic agenda for the nation. By gathering higher education leaders, administration officials are signaling that they hope campuses make learning about religious diversity a mark of what it means to be an educated person. And just as campuses helped make volunteerism and multiculturalism a high priority on our nation’s civic agenda, staff in the Obama administration are hopeful that higher education can do the same for interfaith cooperation.
College campuses are social laboratories that can illustrate what success looks like. While there may be frigid relations between some religious groups in politics and the public square, a college campus has both the mission and the resources (chaplains, diversity offices, religion departments, resident advisers) to proactively cultivate positive relations between Muslims and Jews, Christians and Buddhists, Hindus and Humanists. They can demonstrate cooperation rather than conflict.
Campuses have the resources and mission to advance a knowledge paradigm – an orientation and body of knowledge that appreciates and positively engages religious diversity. From Samuel Huntington’s clash of civilizations theory to stories of religious conflict on the evening news to the recent spate of bestsellers by ‘the new atheists,’ we are increasingly subject to a knowledge paradigm about religions being the source for violence, bigotry and ignorance in the world. While this paradigm should certainly be acknowledged, another one can be advanced: that diverse religions share positive values like mercy and compassion that can be acted on across lines of faith for the common good.
Campuses train the next generation of leaders. Students who have a positive experience of the “religious other” on campus take that worldview into the broader society. Students who develop an appreciative knowledge of the world’s religions on campus educate their neighbors. Students who learn the skills to bring people from different faith backgrounds together to build understanding and cooperation on the quad apply those skills with their religiously diverse coworkers.
President Obama has shown the way in each of the above categories, and college campuses are uniquely positioned to follow his lead.
In his inaugural address, Obama lifted up America’s religious diversity and connected it to America’s promise: “Our patchwork heritage is a strength not a weakness, we are a nation of Christians and Muslims, Jews and Hindus, and nonbelievers...." The message: Educated citizens should know of our nation’s religious diversity, and it is a civic virtue to engage this diversity positively.
College presidents in America could sound a similar note in speeches to the incoming freshman class.
In the advisory council for the Faith-based Office, the president created his own laboratory that models positive relations between religiously diverse citizens. I had the honor of serving on the inaugural council (a new group of 25 is expected to be appointed soon). There were Orthodox and Reform Jews, Catholic and Protestant clergy, Sunni and Shia Muslims, Hindu civic leaders and Evangelical movement-builders. And that’s not all -- we were Republicans and Democrats, gay and straight, Mexican and Indian and white and African American. And we had to agree on a final report that went to the president.
College campuses could have an interfaith council that works on common projects.
In Cairo, the president advanced a new “knowledge paradigm” with respect to religious diversity. Eschewing the tired clash of civilizations theory, which falsely claims that religions have opposing values that put them in conflict, Obama highlighted the positive interactions between the West and Islam throughout the course of history, the many contributions Muslim Americans make to their nation, and the dimensions of Islam he admired such as the advancement of learning and innovation.
College campuses can have academic courses that do the same.
As a young adult, Obama was a community organizer working under a Jewish mentor, bringing together Catholic, Protestant and Muslim groups to start job training centers and tutoring programs on the South Side of Chicago. In this way, he acquired the competencies of leadership in a religiously diverse world. The president has signaled that he believes this is a valuable experience for today’s young adults, making interfaith cooperation through service a line in his Cairo address and a theme of the Summer of Service program.
College campuses, with the high value they place on service, leadership development and the positive engagement of diversity, are perfectly prepared to launch robust interfaith service initiatives.
Interfaith initiatives have been growing on campuses for several decades. The White House invited the vanguard of the movement to Washington, D.C., last week with a clear message: this administration appreciates what you have been doing, and we think you can do more. A movement goes from niche to norm when a vanguard recognizes its moment. For the movement of interfaith cooperation, this is the moment.
Eboo Patel is the founder and executive director of Interfaith Youth Core, an organization that works with college campuses on religious diversity issues.
Depending on your politics, Virginia Attorney General Ken Cuccinelli II’s “fraud investigation” involving the climate-change research of the former University of Virginia assistant professor Michael Mann is either a witch hunt or a long-overdue assault on the Ivory Tower.
But Mr. Cuccinelli’s demand last month for a decade’s worth of e-mails and scientific work papers from Professor Mann’s former employer, UVa, should give comfort to none of us. A fundamental principle is at stake, often described in shorthand as academic freedom. More to the point, it’s the understanding that government will not, without extraordinarily compelling reasons, intrude on the process of scientific discovery. It’s a principle on which liberals and conservatives alike can agree.
The ill-advised investigation in Charlottesville transgresses a long-honored boundary, with implications that extend far beyond the Albemarle County courthouse where the university has filed a petition to block the subpoena. That is why I, along with other higher education leaders, scientists and scholars (including even some of Professor Mann’s scientific detractors), support the university’s legal battle.
The stakes are high. Academic freedom protects scholars of every stripe from government repression or retaliation, especially when they take on controversial topics and espouse unpopular theories. Throughout history, nations that protect academic freedom have strong institutions of higher education. Where academic freedom is weak, governmental power goes unchecked.
The matter concerns not just the academy but all of us as citizens. We know that a thriving, independent, intellectually diverse higher education sector is best able to produce the scientific discoveries and advances in knowledge that make society better. The process that leads to innovation involves dialogue. Scholars debate hypotheses, examine data, and scrutinize each other’s ideas.
At their best, the exchanges are blunt and unstinting: thus theories are criticized, refuted, honed, and improved. The free marketplace of ideas in which this exchange takes place is the best engine known to mankind for producing innovation while weeding out discredited hypotheses. Society has a strong interest in ensuring that scholars can engage in dialogue without the chilling threat of government intrusion.
History shows that when governments interfere, science is stifled, and society suffers. For his theory that the sun was but one of an infinite number of stars, the 16th-century astronomer Giordano Bruno was burned at the stake -- setting back astronomical discovery by perhaps centuries. The Soviet government persecuted the plant geneticist Nikolai Vavilov for his contention that principles of genetics, not Marxist ideology, should inform agricultural policy -- while the Russian people starved. Here in the United States, McCarthy-era persecution chilled scholarship to the detriment of all.
Mr. Cuccinelli argues that he is trying to protect Virginia taxpayers from fraud. No doubt inquiry is appropriate in cases where there is real evidence of financial wrongdoing. But Professor Mann has been cleared of wrongdoing by numerous scientific and governmental bodies that have investigated Climategate. And the exceedingly broad “civil investigative demand” served on UVa sweeps in scientific papers and scholarly exchanges between colleagues -- exactly the kinds of exchanges that prosecutors should be chary to disturb.
What’s more, Mr. Cuccinelli, who is separately suing the EPA to block regulation of carbon emissions, has a legal stake in the climate change debate: he seeks to make the scientific validity of research like Professor Mann’s an issue in the EPA case. The attorney general’s positions, and the subpoenas themselves, have led many to question whether his investigation of Professor Mann is really about financial misfeasance, or whether it is about the politics of climate change.
Next month, an Albemarle County court is scheduled to hear the UVa subpoena case. If the attorney general’s request is granted, the chilling effect on important academic research will be felt at Thomas Jefferson’s university, throughout Virginia, and beyond. That prospect should give all of us pause, no matter whether our politics are blue, red or green.
Molly Corbett Broad
Molly Corbett Broad is president of the American Council on Education, the major coordinating body for more than 1,600 college and university presidents and more than 200 related associations, nationwide.
Even as Barack Obama became the presumptive Democratic presidential nominee last Tuesday, his continuing failure to win white working-class voters clouds his prospects for November. The inability to connect with non-college-educated whites also undercuts his claim to being a truly transformative candidate -- a Robert F. Kennedy figure -- who could significantly change the direction of the country. In the fall campaign, however, Obama's suggestion that he may be ready to change the focus of affirmative action policies in higher education -- away from race to economic class -- could prove pivotal in his efforts to reach working-class whites, and revive the great hopes of Bobby Kennedy's candidacy.
Affirmative action is a highly charged issue, which most politicians stay away from. But nothing could carry more potent symbolic value with Reagan Democrats than for Obama to end the Democratic Party's 40 years of support for racial preferences and to argue, instead, for preferences -- in college admissions and elsewhere -- based on economic status. Obama needs to do something dramatic. Right now, while people inside and outside the Obama campaign are making the RFK comparison, working-class whites aren't buying it. The results in Tuesday's Indiana primary are particularly poignant. Obama won handily among black Hoosiers, but lost the non-college educated white vote to Hillary Clinton by 66-34 percent. Forty years earlier, by contrast, Kennedy astonished observers by forging a coalition of blacks and working class whites, the likes of which we have rarely seen since then.
On May 6, 1968, the day before the Indiana primary, Kennedy participated in an iconic motorcade through industrial Lake County, with black mayor Richard Hatcher sitting on one side of Kennedy and boxer Tony Zale, the native-son hero of Gary's Slavic steelworkers, on the other. On primary election day, running against Eugene McCarthy and a stand-in for Hubert Humphrey, Kennedy swept the black vote but also carried white working-class wards which four years earlier had supported Alabama Governor George Wallace's presidential bid. Author Robert Coles told Kennedy, "There is something going on here that has to do with real class politics."
Of course, Obama's skin color may have made it more difficult for him to attract these voters than it had been for Kennedy. But in some ways RFK had it harder: The May 1968 primary came on the heels of widespread urban rioting spawned by Martin Luther King Jr.'s assassination in April. Blue-collar whites and blacks were at each other's throats, and Kennedy was the one national politician most closely associated with black America.
In Obama's campaign to win over working-class whites, pundits have pointed to two key obstacles: his 20-year association with the angry and race-obsessed Rev. Jeremiah Wright, and Obama's condescending comments about the bitterness of small-town white working-class voters. Some working-class whites appear to believe that Obama is not on their side -- worried that he may favor black interests over theirs, and at the same time that he looks down his nose at people like them. The image may be unfair, the result of a single comment he made, played up by his political opponents, but the notion could stick nonetheless.
Obama is right to talk about shared concerns of all working people, such as better health care and schools. But to catch the attention of working-class whites, he needs to do something striking, which further distances himself from the Rev. Wrights of the world, who view life through the lens of race, and also signals to working-class whites that he understands that they deserve a helping hand too. Switching the basis of affirmative action policies from race to class would do just that.
Thus far, Obama has hinted that he's ready for the shift. While Obama has in the past been a strong supporter of race-based affirmative action, in his debate in Philadelphia with Hillary Clinton, he said in response to a question that his own privileged daughters do not deserve affirmative action preferences, and that working-class students of all colors do. He needs to make this explicit, to spell out the new policy, and explain why he is shifting away from his traditional reliance on race-based policies.
Supporting a shift to class-based affirmative action would be the logical policy manifestation of his well-received speech on race in Philadelphia back in March. In the address, Obama made clear that this nation needs some form of affirmative action to address the legacy of discrimination in America. He noted that legalized discrimination in FHA loans, for example, prevented blacks from borrowing to purchase homes, leaving older blacks with little accumulated wealth to pass down to today's generations. And he observed that many African Americans continue to attend inferior, segregated schools, to live in neighborhoods with concentrated poverty, and to grow up in single-parent households, all of which are connected to some degree to discrimination.
On the other hand, Obama acknowledged many of the arguments made by opponents of affirmative action, who say that while such policies might have once made sense, it is now time to move on. Obama faulted Rev. Wright for failing to recognize that significant racial progress has been made, and he urged the country to "move beyond our old racial wounds." Then, amazingly for a Democratic politician, he observed: "Most working- and middle-class white Americans don't feel they have been particularly privileged by their race.... As far as they're concerned, no one handed them anything."
Resentment builds, Obama said, "when they hear that an African American is getting an advantage in landing a good job or a spot in a good college because of an injustice that they themselves never committed." These resentments, he said are not "misguided or even racist," but rather are "grounded in legitimate concerns."
Class-based affirmative action reconciles both points of view. It avoids the explicit use of race that working-class whites resent, moving us beyond the "racial stalemate" Obama described. But a carefully conceived economic affirmative action program would also try to capture the full legacy of discrimination of which Obama spoke. It would be colorblind but not blind to history. Discrimination has economic manifestations, and college admissions officers could give a leg up to smart students who overcome various obstacles which disproportionately affect African Americans: growing up in a low-income household, being raised by a single parent, coming from a family lacking accumulated wealth, residing in a neighborhood of concentrated poverty, and attending low-quality schools. Under such a program, low-income and working-class kids of all races would benefit -- people like the young Barack Obama or John Edwards -- but not students like Barack Obama's own children.
Moving to class-based preferences would at once remove a terrible source of division and instead reinforce the common interests of working-class voters. And it would do more than just help Obama get elected. Reviving the old RFK coalition would give Obama a mandate to enact the type of far-reaching change that hasn't been fully entertained since Kennedy's death.
No one has ever won a Nobel Peace Prize for education. Click here and look for yourself. I can’t be alone in finding this embarrassing for all of us in education. Here in the nation with the self-proclaimed “finest higher education system in the world,” why hasn’t the Big Ten won the Nobel Peace Prize? Or the Ivy League? Or even the Little Three? The opposite of peace would be war and conflict. Isn’t war the ultimate failure to solve a problem by other means? Isn’t our job in education to teach people to solve problems of all sizes?
I wonder because today, again, begins the dark time of year for the world-changing, approval-seeking wannabes with whom I cast my lot. Today’s the day the MacArthur Fellows are announced. As of press time, I have to conclude that no phone call or photo request so far means I missed again. I’ve been here before. I am resilient. What keeps me going in this season is the knowledge that the Nobels are on the way in October, starting Monday.
These past 12 months have been perplexing. Since President Obama won his Nobel Peace Prize last year, though, I haven’t made any progress on my annual post-MacArthur questions: What would a Nobel Peace Prize for education look like? What would a Norman Borlaug-ian accomplishment by an educator be? I don’t have an answer, and the Nobels will pass by again. Someone out there must have an idea for education.
As a sometime English teacher, I like analogies and metaphors. Norman Borlaug, the 1970 Prize winner, put more grains of wheat on shorter stalks. That was a big step to reducing hunger and starvation in Mexico and Pakistan. Yes, Borlaug’s Green Revolution, which no one disputes fed millions, has critics. So does whole milk. Before my own critics howl, I do not mean that stuffing more students into smaller classrooms is the Nobel idea to consider. Borlaug’s idea, though, is the kind of global game-changer that perhaps we educators and columnists need to ponder.
I can’t see that any winners are necessarily better thinkers than educators. In 2007, droning Al Gore won for just describing a problem – global warming. Winners from medicine, though, keep thinking beyond their laboratories and hospitals. The 1985 winner was International Physicians for the Prevention of Nuclear War. In 1999, the winner was Médecins Sans Frontières, "in recognition of the organization's pioneering humanitarian work on several continents." The 2005 winners were Mohamed ElBaradei and the International Atomic Energy Agency "for their efforts to prevent nuclear energy from being used for military purposes and to ensure that nuclear energy for peaceful purposes is used in the safest possible way." Why not a similar focused effort for education?
In 1947, the Quakers won. In 1917, the International Red Cross. What’s to stop a U.S. college or university from doing the same? Or at least a teacher with an education peace plan.
A pox on us all is the recurring Nobel Peace Prize theme of nuclear weapons. As if invention of the weapons in the first place weren’t already evidence that we teachers have room for improvement. In 1995, the winner was the Pugwash Conferences on Science and World Affairs -- "for their efforts to diminish the part played by nuclear arms in international politics and, in the longer run, to eliminate such arms." I keep thinking that we educators ought to have an answer by now, how to solve problems without even the threat of nuclear war.
My Nobel Peace Prize education contender is Nicholas Negroponte, for the scope and success of his project One Laptop Per Child. The mission statement has a peaceful ring: “To create educational opportunities for the world's poorest children by providing each child with a rugged, low-cost, low-power, connected laptop with content and software designed for collaborative, joyful, self-empowered learning.”
I’m not hoping. The world declared Negroponte crazy for saying he could build a laptop for $100. That same world has now discredited him for having missed by 100 percent and come in with a price of $200. I’ve tried one. It’s great. I’ll do what I can to move Negroponte out of the No Good Deed Goes Unpunished Hall of Fame. One Laptop Per Child, in my book (or hard drive) anyway, beats Al Gore any day. Negroponte is working on real solutions.
Thinking about all this last year in my office at Bunker Hill Community College, I looked up one day to find that 1992 Nobel Peace Prize winner Rigoberta Menchu Tum was arriving on campus that afternoon. After her talk, I asked what she made of the absence of any Nobel Peace Prizes for education. Through a translator, here’s what she said: “Everything in this world depends on education. All the people who have graduated, with formal educations, they are the ones who are the leaders. But with all the people who are harming the world, we need to take another look at how education focuses on the positive social mission. Part of the learning takes place in the classroom, but I’d move part of the learning out into the street, resolving conflicts, resolving conflicts, solving problems. If we have leaders who think only about war, well?”
I imagine the Nobel committee is wrapping up for this year. What can education have on the table for the 2011 Nobel Peace Prize? Suggestions welcome below.
Last week Inside Higher Ed published a column by Scott McLemee entitled “Rude Democracy,” which discussed Jon Stewart’s Rally to Restore Sanity and apparent trends indicating a lack of political engagement among young people. McLemee’s argument was both intelligent and important, but I believe there’s another side to the story of Stewart’s rally, political civility, and turnout among college students and young voters in the 2010 midterm election.
Unsurprisingly, Republicans were very successful in the midterm election, gaining control of the House of Representatives and cutting into the Democrats’ majority in the Senate. While the politically active on campuses across the country will surely devote much discussion to the results of the election and their implications in the days and weeks ahead, it’s less likely they’ll discuss the execrable turnout among 18- to 29-year-olds.
Early exit polling done by CBS News indicates that young people made up roughly 9 percent of all voters in the midterm election. In 2008, young people made up 18 percent of the electorate. Why?
Political scientists and campaign consultants offer several theories. Americans are more likely to vote when enthusiasm abounds for the candidates they support, and young people tend to support Democrats. Young people historically don’t turn out for midterms. Barack Obama’s candidacy in 2008 was uniquely galvanizing for young voters. The agenda of Congress and the president has not adequately addressed energy policy -- a very important issue to college students -- and media coverage of the health care reform bill (which did quietly include benefits for young people) focused mostly on the concerns of older voters. Thus, young people seem to have concluded that voting isn’t worth their time right now.
In her book Rude Democracy, Susan Herbst studied the views of young people and found that “72 percent of students agreed that it was very important for them to always feel comfortable in class.” Herbst argues that “Contrary to the image of college being a place to ‘find oneself’ and learn from others, a number of students saw the campus as just the opposite – a place where already formed citizens clash, stay with like-minded others, or avoid politics altogether.”
Based on Herbst’s study, McLemee, writing prior to the sanity rally, argued that, while Stewart’s rally was likely to draw lots of young people and provide them with a fun weekend, “the anti-ideological spirit of the event is a dead end. The attitude that it's better to stay cool and amused than to risk making arguments or expressing too much ardor -- this is not civility. It’s timidity.” Clearly, McLemee believes that the unwillingness of young people to engage in political debate – argument – is not a political virtue, but rather a democratically harmful form of indifference.
Before accepting McLemee’s assertion, though, I think several important questions need to be answered. Why do the students in Herbst's survey feel that it isn't possible to persuade others? Could it be that such a belief is the product of an uncivil political culture? If students had political role models who successfully persuaded others in civil and respectful ways, would they be more inclined to view the political arena – and the classroom – as a space in which the clash of ideas can occur and yield positive results?
Personally, I can think of two positive things Stewart does; first, he encourages young people to refuse to subscribe to the currently pervasive ultra-partisan view of politics that fosters incivility and acts as a barrier to progress; and second, and more basically, he brings some level of political awareness through humor to people who might otherwise be totally apathetic and ignorant. Stewart’s influence, in my view, doesn’t breed timidity (as McLemee asserts), but rather increased youth engagement of the type that rejects a toxic political culture.
It also seems possible that the “Obama Effect” I mentioned earlier, holding that young voters turned out in 2008 because of their admiration of the current president, is at play. I'm worried that young people, perhaps naively, viewed Candidate Obama as a post-partisan role model and that President Obama’s lack of success thus far may further discourage engagement among young people who believed he had the ability to catalyze change without acting like every other “scumbag politician” they've come to dislike.
Moving forward, two things are clear. First, the perspective of young people has the power to change the nature of partisanship; if we, as a generation, continue to subscribe to the ideals of the Rally to Restore Sanity, we have the potential to improve the tone of politics.
Second, however, the burden most certainly falls on us; politicians are not going to pander to a portion of the electorate they don’t believe will turn out to vote, so if we want to transform Stewart’s rally from a sunny Saturday on the Mall into a new political reality, we’ve got to make our voices heard.
Among Barack Obama’s distinguishing characteristics in the field of presidential hopefuls, four years ago, was his opposition to the Iraq war, which he had denounced at an antiwar rally in Chicago in October 2002, when invasion was still a gleam in the neocon eye. As Obama’s reelection campaign begins this week, his administration continues the military occupations of Afghanistan and Iraq, while making the down payment on a third in Libya.
This is not what people who supported Obama expected -- and public opinion polls suggest that opposition to the wars in Afghanistan and Iraq remains as high as it was during Bush’s second term, or higher. But the streets no longer fill with protesters. This coming weekend there will be antiwar demonstrations in New York (April 9) and San Francisco (April 10). They won’t be on the scale that became almost routine a few years ago, however, when hundreds of thousands of people attended such events. They will be one-tenth the size, more or less.
We can predict that with greater confidence than the weather this weekend. But why? And what would it take to change the situation?
Part of the answer might be found in a paper by Michael T. Heaney and Fabio Rojas called “The Partisan Dynamics of Contention: Demobilization of the Antiwar Movement in the United States, 2007-2009,” appearing in the latest issue of the social-science journal Mobilization. (It is available here in PDF.)
Heaney is an assistant professor of organizational studies and political science at the University of Michigan, while Rojas is associate professor of sociology at Indiana University.
Drawing on more than 5,300 surveys the authors conducted with people attending antiwar rallies in recent years, the paper is the latest in a series of studies of the relationship between social movements and political institutions -- in particular, American political parties, major and otherwise.
I first wrote about their work four years ago, as demonstrations against the Iraq war were at their peak. (See also this column.) My interest is not, as the expression goes, purely academic. Any activist develops certain hunches about the relationship between mass movements, on the one hand, and more established and durable political entities, on the other. Such intuitions tend not to be theorized, but you need them as maps of the terrain. Folks in the Tea Party are not likely ever to read Robert Michels, though I’d guess they’ve had a taste of the iron law of oligarchy by now. Sometimes you have to work these things out for yourself.
Years ago, a friend with long involvement in organizing against the Vietnam war explained how the national election cycle had affected the ebb and flow of the protest movement. “In an odd-numbered year,” he told me, “you’d have masses of people coming out to demonstrations. If it was an even-numbered year, lots of the same people would stay at home because they figured voting for Democrats who criticized the war was enough.” He had been one of those out marching no matter what, and you could hear the frustration in his voice.
My friend lacked the sophisticated statistical tools deployed by Heaney and Rojas (henceforth, H&R), whose understanding of organizational dynamics is also more subtle. But their paper largely corroborates his thumbnail analysis.
Mass movements and political parties are very different animals, at least in the United States. Sociologists and political scientists usually put them in separate cages, and activists and policy wonks would tend to agree. “Party activists may view movements as marginal and unlikely to achieve their goals,” write H&R. “Movement activists may reject parties as too willing to compromise principles and too focused on power as an end in itself.”
But dichotomizing things so sharply means overlooking a third cohort: what H&R call the “movement-partisans” or, a bit more trenchantly, “the party in the streets.” These are people who identify themselves as belonging to an electoral party but consider mass protest to be as valid as the more routine sorts of political action. They might march on Washington, if strongly enough motivated -- but will also make it a point, while there, to visit their Congressional representatives for a quick round of citizen-lobbying.
To movement-partisans, each approach seems a potentially effective way to express their concerns and try to change things. In deciding which one to use at a given moment -- or whether to combine them -- ideological consistency usually counts less than their ad hoc estimate of the respective costs and benefits.
According to H&R’s earlier research, movement-partisans are likely to be members of unions, civic organizations, and community groups. This makes them indispensable to building broad support for a cause. (They are also crucial to shoring up what party leaders call “the base” these days.) But the intensity of their involvement varies according to the degree of perceived threat they detect in the political environment.
“When the balance of power between the parties changes,” the social scientists write, movement-partisans will “reassess the benefits and costs of taking action. The rise of an unfriendly party may generate suspicions that the movement will be threatened by a wide range of hostile policies, while the rise of a friendly party may lead to a sense of relief that the threat has ended. Since people tend to work more aggressively to avoid losses than to achieve gains, grassroots mobilization is more likely to flow from the emergence of new threats than from the prospect of beneficial opportunities.”
From surveys conducted during national antiwar actions, the researchers found that people who self-identified as Democrats represented “a major constituency in the antiwar movement during 2007 and 2008,” accounting for 37 to 54 percent of participants. Those who identified as members of third parties represented 7 to 13 percent. (The rest indicated that they were independents, Republicans, or members of more than one party.)
In January 2007, an antiwar protest in Washington, D.C., drew hundreds of thousands of people. In H&R’s terms, a “perceived threat” from the Bush administration still existed among Democratic movement-partisans; so did their sense that it was worth putting pressure on Congress as it shifted from Republican to Democratic control following the midterm.
But as the presidential campaigns ramped up, the dynamic changed. By late 2008, turnout at demonstrations contracted “by an order of magnitude to roughly the tens of thousands” -- and kept shrinking over the following year. At the same time, the composition began to change. By November 2009, the portion of antiwar protesters identifying as Democrats had fallen to a low of 19 percent, while the involvement of third-party members grew to a peak of 34 percent (almost three times the share just a couple of years earlier).
In some cases, decreasing or ending their participation in antiwar protests was a matter of conscious decision-making by members of the Democratic “party in the street,” as H&R call it. They may have approved of Obama’s handling of the Iraq war or sensed that other issues, such as health care, required more attention. At the same time, movement-partisans of the Republican sort were beginning to mobilize. Even people strongly opposed to the wars often felt this as a disincentive to challenge the administration: they didn't want to risk seeming to join forces with the president's political enemies.
But shifts in attitudes and priorities among individual activists only explain so much. In the final analysis, organization is everything. H&R stress the role of coalitions “in enabling parties and movements to coordinate their actions and share resources.”
The largest and broadest national antiwar coalition, United for Peace and Justice, was also the one most likely to supplement mass demonstrations with messages linked to the electoral arena (“The voters want peace”) and lobbying efforts. Arguably, UFPJ even had influence over groups completely rejecting its approach, since it gave them an incentive to cooperate with each other in organizing alternative antiwar protests.
Following Obama’s election, UFPJ began to disintegrate as Democrats withdrew. (It still exists, but just barely, and now calls itself a "network" rather than a coalition.) The most important effect of its unraveling, according to the paper, is “the fragmentation of the movement into smaller coalitions” -- groupings that tended to act on their own initiative, without the capacity to coordinate work with one another. The number of antiwar demonstrations grew even as the turnout shrank. Another consequence was “the expression of more radical and anti-Obama attitudes by leading organizations.” And this had the predictable effect of narrowing the base of likely supporters.
At this point it seems worth mentioning an insight by another friend whose education in such matters took place in the laboratory of the 1960s. For many years, he said, being engaged in antiwar activism or civil rights work meant going to events where, after a while, you were able to recognize almost everybody. Then one day he attended a demonstration and saw that something had changed. There were some familiar faces, but he had no idea who most of the people were.
“That’s how you know that the cause has actually become a movement," he said. "You look around and see a lot of new faces. It’s no longer just the usual suspects.”
What H&R describe in their paper is, in effect, the film running in reverse. Beyond a certain point, fragmentation becomes self-reinforcing. I wondered about the implications of H&R’s work for what now remains the antiwar movement. Was there anything in their analysis that would suggest the possibility of its revival, on a broader basis, in the immediate future? Could it happen with Obama still in office? Or would it take the “perceived threat” of a Republican president?
I wrote Heaney to ask. His short answer was, simply, no -- the chances of a major revival in the short term are slim. The more nuanced version of his response went somewhat beyond my question, though, and seems of interest:
“As long as voters remain highly polarized along party lines,” he responded by e-mail, “self-identified Democrats are unlikely to protest against Obama's policies, even if they disagree with some of them strongly. A sudden end to the era of partisan polarization seems highly unlikely. So I would say that it is a very good bet that Obama will not confront large left-wing demonstrations. Of course, LBJ faced large left-wing demonstrations, but the party system was not polarized back then in the way that it is today.”
The same dynamics apply to the Tea Party: “Our analysis implies that the Tea Party will have a lower degree of organization and success in 2012 than it did in 2010. Because the Republicans won the House and made gains in the Senate, Tea Party activists feel much less threatened today than they did a year ago. So, while the Tea Party will obviously be around in 2012 -- and it will likely factor into the Republican presidential contest -- our analysis suggests that the Tea Party will not generate the same level of enthusiasm next year as it did last year.”
Well, you take what grounds for optimism you can find.
Heaney might be right about everything. It could be that the antiwar movement will remain in its doldrums until, say, the Gingrich-Palin ticket proves victorious. That’d put some teeth back into the concept of “perceived threat,” anyway.
But it hardly follows that resignation is the best course. I can’t make it to New York on Saturday (let alone San Francisco) but am buying a bus ticket for someone else who wants to go. And while finishing up this column, I got in touch with Ashley Smith, a member of the steering committee of the United National Antiwar Committee, who seems to have a pretty sober perspective on where things stand.
“The most important part of these demonstrations,” he told me, “is bringing together old and new forces to rebuild an antiwar movement that has been weak for the last several years. We have made a high priority of including demands that open up the movement to forces too often held at arm’s length from antiwar mobilizations in the past -- Palestinians, Muslims, South Asians and Arabs. We have also had some success in reaching out to labor unions like SEIU 1199 and TWU 100 in New York. It is crucial that we rebuild the antiwar movement now.”
To paraphrase Donald Rumsfeld just slightly, you go into the antiwar struggle with the forces you have, not the ones you want, or wish to have at a later time.