In November, Pew Research Center released a report discussing the level of belief in American exceptionalism in the United States. It gauged this by asking whether interviewees accepted the statement "Our people are not perfect, but our culture is superior to others." I have been interested in the history of theories of American exceptionalism for more than twenty years, and gave it a look. Formulating the idea that way struck me as obnoxious and fairly absurd. But then the Pew people are specialists in public opinion research -- and feelings of superiority (or rather, anxieties over it) do seem to be what is at stake as the expression American exceptionalism is used in U.S. politics lately. (It's worth noting that it's the authors of the report who make a connection between superiority and exceptionalism. The interviewers didn't explicitly ask about the latter.)
Republican candidates keep proclaiming their faith in American exceptionalism, or smiting Obama for his failure to believe in it. Not long ago somebody published a letter to the editor claiming that Obama hates American exceptionalism, which would seem to imply that he must believe in it, since hating something you don’t believe in sounds difficult and a real waste of time. But it’s probably best not to expect too much logical consistency at this point in the electoral season.
Obama himself is at least somewhat culpable for the whole situation. The furor all started in 2009 when, in response to a question, he said: “I believe in American exceptionalism, just as I suspect that the Brits believe in British exceptionalism and the Greeks believe in Greek exceptionalism.” That, too, is a misreading of the term, equating it with something like national self-esteem. But of a healthy sort -- well shy of narcissistic grandiosity, with plenty to go around. That's probably what got him into trouble.
Anyway, the Pew study yielded some interesting results. Pew's researchers have been asking whether people agreed with the sentiment "Our people are not perfect, but our culture is superior to others" for at least 10 years now. In 2002, 60 percent of the Americans polled said they did. The figure fell to 55 percent in 2007. Last year, just 49 percent of respondents agreed, with nearly as many (46 percent) saying they disagreed. “Belief in cultural superiority has declined among Americans across age, gender and education groups,” the Pew report said.
The same question was posed in surveys conducted in Britain, France, Germany, and Spain. The level of agreement was higher in the U.S. than elsewhere (Germany and Spain were fairly close), but the variations are less interesting than what held constant: “In the four Western European countries and in the U.S., those who did not graduate from college are more likely than those who did to agree that their culture is superior, even if their people are not perfect.”
Make of that what you will. For my part, the really odd thing about all the recent endorsements of American exceptionalism is that the very expression came into the world as the name for a Communist heresy.
The image of America as a city upon a hill -- uniquely favored by the Almighty and a light unto the heathens -- is older than the United States itself, of course. And it’s true that visitors to the country, including Alexis de Tocqueville, have long declared it “exceptional,” in one way or another, and not always for the better. Charles Dickens thought we were exceptionally prone to printing his books without permission, let alone paying him royalties. But the term "American exceptionalism" is more recent, and it took the Comintern to launch the Republican candidates' preferred way of recommending themselves these days.
Circa 1927-28, a group of American Communist Party leaders began arguing that, yes, the U.S. economy would undoubtedly succumb to the contradictions of capitalism, sooner or later, but it still had plenty of life in it, so the comrades abroad should keep that in mind, at least for a while. Their perspective was in accord with the ideas of the Bolshevik theorist Nikolai Bukharin concerning the world economic situation, and he was the one, after all, in charge of the Communist International. So all was copacetic, at least until the summer of 1928, when Stalin quit taking Bukharin’s phone calls.
Before long, the American leaders were called on the carpet by the authorities in Moscow, and found themselves denounced by Stalin himself for an ideological deviation: “American exceptionalism.” Stalin also told them, “When you get back to America, nobody will stay with you except your wives.” That turned out to be a slight exaggeration, but they were promptly expelled from the party when they got back home -- taking around a thousand fellow American exceptionalists with them.
As it happened, all of this was just a few months before the stock market crash on Black Tuesday, which made the whole debate seem rather moot. But a catchphrase was born. Stalin’s speeches blasting American exceptionalism were printed as a pamphlet in an enormous edition. The pro-American exceptionalism Communists went off to start their own group, which had a strange and complex history that deserves better scholarship than it has received. But that seems like enough esoterica for now.
David Levering Lewis puts the neologism into a wider context with his essay “Exceptionalism's Exceptions: The Changing American Narrative,” in the new issue of the American Academy of Arts and Sciences’ journal Daedalus. Lewis, now a professor of history at New York University, received a Pulitzer Prize for each of the two volumes of his biography of W.E.B. Du Bois.
“[I]ts Soviet originators defined American exceptionalism as the colossal historical fallacy that imagined itself exempt from the iron laws of economic determinism,” Lewis writes, “whereas most American academics and public intellectuals … avidly embraced a phrase they regarded as an inspired encapsulation of 160 years of impeccable national history.” One of the handful of figures to give the idea a careful, skeptical examination, Lewis says, was Du Bois. In his masterpiece Black Reconstruction (1935), he wrote that “two theories of the future of America clashed and blended just after the Civil War.” One was “abolition-democracy based on freedom, intelligence, and power for all men,” and the other was “a new industrial philosophy” with “a vision not of work but of wealth; not of planned accomplishment, but of power.”
American exceptionalism was, in effect, the happy belief that these tendencies reinforced each other. That was not a credible idea for an African-American who received his Ph.D. from Harvard one year before the Supreme Court ruling in Plessy v. Ferguson that endorsed “separate but equal” treatment of the races. For Du Bois, writes Lewis, “the cant of exceptionalism survived mainly to keep the Moloch of laissez-faire on life support even as its vital signs failed in the wake of the Great Crash of 1929.”
The doctrine of exceptionalism proved hardier than Du Bois imagined, as the years following World War II showed. Lewis mentions that Henry Luce “had already given the world its peacetime marching orders in ‘The American Century,’ a signature 1941 editorial in Life.” Eight other contributors, most of them historians, join Andrew J. Bacevich in assessing that line of march in The Short American Century: A Postmortem (Harvard University Press), a collection of essays spinning off from a lecture series Bacevich organized at Boston University in 2009-2010.
“By the time the seventieth anniversary of Luce’s famous essay rolled around in 2011,” the editor writes, “the gap between what he had summoned Americans to do back in 1941 and what they were actually willing or able to do had become unbridgeable.” Unfortunately the editorial is not reprinted, and it loses something in paraphrase -- a bracing tone of stern moral uplift, perhaps, inherited from his parents, who had been missionaries in China. Here’s a sample:
“[W]hereas their nation became in the 20th Century the most powerful and the most vital nation in the world, nevertheless Americans were unable [after World War One] to accommodate themselves spiritually and practically to that fact. Hence they have failed to play their part as a world power -- a failure which has had disastrous consequences for themselves and for all mankind. And the cure is this: to accept wholeheartedly our duty and our opportunity as the most powerful and vital nation in the world and in consequence to exert upon the world the full impact of our influence, for such purposes as we see fit and by such means as we see fit.”
And plenty more where that came from. “When first unveiled,” Bacevich notes, “Luce’s concept of an American Century amounted to little more than the venting of an overwrought publishing tycoon.” By the end of the war, that had changed: “Claims that in 1941 sounded grandiose became after 1945 unexceptionable.” The American Century brought “plentiful jobs, proliferating leisure activities, cheap energy readily available from domestic sources, and a cornucopia of consumer goods, almost all of them bearing the label ‘Made in the U.S.A.’ ” And all of it while, in Luce’s words, “exert[ing] upon the world the full impact of our influence, for such purposes as we see fit and by such means as we see fit.”
Well, and how did that turn out? The contributors are not of one mind. “As international regimes go, much of the American Century, despite the chronic tensions and occasional blunders of the Cold War (and especially the tragedy of Vietnam) was on the whole a laudably successful affair,” writes David M. Kennedy. For Emily S. Rosenberg, “the period of maximum U.S. power and influence” was “a precursor to a global Consumer Century” that “proved highly adaptive to local cultural variation,” so that equating globalization with Americanization is a misnomer.
In counterpoint, Walter LaFeber rebukes Luce’s vision all along the line. He writes that the American Century “never existed except as an illusion, but an illusion to which Americans, in their repeated willingness to ignore history, fell prey.”
T. J. Jackson Lears writes in praise of a “pragmatic realism” informed by the pluralism of William James and Randolph Bourne, and says it “requires a sense of proportionality between means and ends, as well as a careful consideration of consequences -- above all, the certain, bloody consequences of war.” But his essay does not exactly portray the American Century as a triumph of pragmatic realism. (C. Wright Mills’s description of the nuclear war strategists’ “crackpot realism” seems a little more apropos.)
Bacevich’s essay concluding the book brings us up to the moment by stressing how interconnected the American Century and American exceptionalism have become. “To liken the United States to any other country (Israel possibly excepted) is to defile a central tenet of the American civil religion. In national politics, it is simply impermissible.” Luce’s vision “encapsulate[s] an era about which some (although by no means all) Americans might wax nostalgic, a time, real or imagined, of common purpose, common values, and shared sacrifice.”
Such yearning is understandable, but nostalgia is bad for you: it makes the past seem simpler than it was. And the world has probably had as much exceptionalism as it can stand. As the American psychologist Harry Stack Sullivan put it, we are all much more simply human than anything else. And it seems like there must be a better use of a political figure's time than assuring people that they are all above average.
A set of three books landed on my desk last week: the opening salvos in a new series from Verso called Counterblasts. A notice across from each one’s title page announces the intention “to revive a tradition inaugurated by Puritan and Leveller pamphleteers in the 17th century when, in the words of one of their number, Gerard Winstanley, the old world was ‘running up like parchment in the fire.’” Given that Winstanley’s group, the Diggers, was the original Occupy movement, Verso’s timing is excellent -- though any revival of pamphleteering at this late date almost certainly demands a format suitable for rapid dissemination on portable devices. And at extremely low (and probably no) cost.
At least with Counterblasts you get a well-designed artifact for your money. Each volume singles out one of the “politicians, media barons, and their ideological hirelings” serving as “apologists of Capital and Empire,” as the series description calls them, in suitably Puritan-Jacobin tones. The cover is stark black. A photo of the book’s polemical target looms against the backdrop. The aesthetic here resembles "The Charlie Rose Show" (talking heads afloat in the depths of infinite space) although considerably less flattering to the guests. It seems appropriate, then, that the first two Counterblasts are directed at figures who have been prominent in the world of TV punditry.
One is the New York Times foreign affairs columnist Thomas Friedman, pictured scowling in concentration, like a bulldog who just swallowed a Styrofoam packing peanut and is now thinking that it might have been a bad idea. The other is Bernard-Henri Lévy, who, when not playing a philosopher on French television, serves as a celebrity thinker-in-residence at the Huffington Post. As always, he looks marvelous.
The third figure is Michael Ignatieff, whose picture will be familiar to the Canadian public but ring only the faintest of bells elsewhere. He spent the 1990s as one of England’s most prominent public intellectuals, preparing BBC documentaries and writing books on human rights, civil wars, and humanitarian intervention. He was also the authorized biographer of Isaiah Berlin, whose essays on the history of social and political thought defined a sort of Anglo-American liberal orthodoxy in recent decades.
In 2000, Ignatieff became the first director of the Carr Center for Human Rights Policy at Harvard University. A few months later, George W. Bush took office. Each man had barely settled into his new office before Ignatieff published the first of several efforts to clarify the ethico-political justification for preemptive war against Iraq, given the menace of Saddam Hussein’s weapons of mass destruction.
With mission accomplished yet no WMDs in sight, Ignatieff turned his mind to arguing for other reasons why the invasion of Iraq had been a good idea. His book The Lesser Evil: Political Ethics in an Age of Terror (Princeton University Press, 2004) argued, among other things, that torture must be condemned as morally wrong, but hey, what can you do? Desperate times mean desperate measures, and desperate measures require thoughtful casuistry.
The Lesser Evil appeared at just about the time the pictures from Abu Ghraib did. Nobody in those snapshots was agonizing over nuances of right and wrong, and it didn’t look like the US soldiers were extracting information about ticking time bombs either. They were just having an awful lot of fun. The images would have created an uproar, of course, even if the soldiers had worn expressions of pain and doubt. But the way they looked out at the viewer, as if expecting you to give them the high-five, threw Ignatieff’s work into a new context. However much his thinking might be rooted in the precepts of Sir Isaiah, its on-the-ground consequences were degrading for everyone involved.
In 2007, Ignatieff returned to the pages of The New York Times Magazine (where his most widely discussed articles in favor of the war had appeared a few years earlier) to say that he had been wrong ... or misled ... or too much the airy academic ... or not quite so right as he could have been. He admitted that some people had argued from the start that the war was a bad idea, but that didn't mean they were proven correct, since they had been right for the wrong reasons. He, at least, had been wrong for the right reasons, and clearly must not be expected to learn anything from them.
It was a strange essay, and it left the impression of a mind at the end of its tether, dangling in the wind. But Derrick O’Keefe’s Counterblast volume Michael Ignatieff: The Lesser Evil? suggests that his mea culpa was more coherent -- or at least more consistent with the rest of his career -- than it might look.
When Ignatieff returned to Canada in 2005 after almost three decades abroad, it seemed like he was stepping away from the work that had defined him as a public figure. After all, he had made some major interventions in the debates over liberal internationalism, or philanthropic militarism, that unfolded across a distinct period beginning with the first war of Yugoslavia’s disintegration (mid-1991) and ending, more or less, with the second battle of Fallujah (late 2004). He even had the confidence and authority needed to risk defining his position in terms as brutal as any that an opponent might attribute to him: “Imperialism used to be the white man’s burden. This gave it a bad reputation. But imperialism doesn’t stop being necessary just because it becomes politically incorrect.”
So wrote Ignatieff in 2003, full of beans. Declarations of imperial mission were not much wanted by 2005, when headhunters from the Liberal Party of Canada lured him away from Harvard. You could not fault him for wanting to reinvent himself. But there was more to it than that.
From the blinkered U.S.-centric perspective, Ignatieff’s departure did not look like forward motion, but the Liberal Party has long been at the very center of Canadian politics (flanked by the Conservatives on the right and the New Democrats to the left, and the dominant force among them). Ignatieff’s return to his homeland was the first step in a serious bid for power. And his mea culpa in the Times was part of it, since the U.S. occupations of Afghanistan and Iraq never enjoyed much support in Canada.
Besides distancing himself from policies he had once supported -- taking responsibility for them, but not too much responsibility -- Ignatieff also used the essay for another purpose. He explained that leaving the ivory tower behind had rendered him a tough-minded man of the world. In the future he would assume his positions, and choose his words, more carefully. In the meantime, he was making as many references to hockey as circumstances would permit.
Michael Ignatieff: The Lesser Evil? makes the case that this newfound discovery of measured responses and realpolitik is just an act, because Ignatieff has been practicing them all along. O'Keefe points out that the Times essay defines politicians as "actors who have to feign indignation and other emotions they do not feel,” while academics “merely play with words and pursue digressions with ideas for their own sake because they are detached from the real-world consequences.” Here, unwarranted generalization yields self-accusation: Ignatieff himself seems to have been one of the very few academics to champion "regime change" in ways "detached from the real-world consequences." O’Keefe wonders if Ignatieff ever had a moral compass to lose. “It’s not that one can never genuinely change one’s mind,” he writes, “it’s just that there is no trace at all of the humility or regret that would normally accompany such an about face.” It's the portrait of a man saying what the powerful want to hear, as the means to gain power for himself.
While largely persuasive, O'Keefe's indictment is a little too unrelenting. He can barely credit Ignatieff with anything, not even literary gifts: his books are the work of a “solipsistic cosmopolitan.” But even as a non-admirer of Isaiah Berlin, I’d say Ignatieff’s biography is decent. One of his novels was a finalist for the Booker Prize in the early 1990s. And Ignatieff has been called “Canada’s Obama,” which refers in part to their shared facility with a pen, rare among politicians. But the series is called Counterblasts, after all, and sometimes polemic involves taking no prisoners.
Ignatieff became the leader of the Liberal Party in 2009. Last May, he oversaw what O’Keefe calls “the most catastrophic electoral defeat in the history of the Liberal Party of Canada,” whereupon Ignatieff resigned. That underscores the other reservation I had about the book, which is that both the man and the era he helped shape are now part of history, rather than current events. The next two volumes in the series will address Christopher Hitchens and Tony Judt. The thought of them counter-counterblasting in reply is appealing, but a daydream now that they're gone. The old world, as Winstanley said, is "running up like parchment in the fire." The series editors should go find some active menaces to take down.
The National Endowment for the Humanities on Monday named Wendell Berry -- a poet, novelist and essayist with a focus on conservation issues -- as the 2012 Jefferson Lecturer. Berry will deliver the lecture on April 23 in Washington. The lecture, "It All Turns on Affection," will discuss human interaction with nature, as depicted in history, philosophy, and literature.