The Soiling of Old Glory

It was three months before the Bicentennial, and a group of high school students in Boston were saying the Pledge of Allegiance. One of them held a large American flag. But this was not the commonplace ritual of citizenship it might sound like. The teenagers, all of them white, were just as swept up as their parents in the protests over court-ordered desegregation of the Boston public schools; and much of the rhetoric swirling around the anti-busing movement appealed to the old patriotic tropes of resistance to tyranny, defense of the rights of the citizen, and so on. The kids who milled around in front of City Hall in Boston were enjoying their chance to share in the Spirit of ‘76 while also skipping class on a Monday morning.

The people in the anti-busing movement were not, they often insisted, racists. It so happened that a young African-American lawyer named Ted Landsmark had a meeting at City Hall that morning to discuss minority hiring in construction jobs. He turned the corner and walked into a scene that would be recorded for posterity by Stanley Forman, a photojournalist for The Boston Herald American. In a picture that won Forman his second Pulitzer Prize in as many years, we see Landsmark in the right-hand half of the image. Dressed in a three-piece suit, he is the only black person in the crowd. But what dominates the scene is the teenager who had been holding the American flag a little earlier, during the Pledge, and now wields it as a weapon, seeming to drive it like a lance into Landsmark’s body. Forman later titled his photograph “The Soiling of Old Glory.”

Frozen in mid-action, the image is brutal. But what makes it especially so is the expression on the kid’s face – a look of pure hatred and rage, his teeth showing, his upper lip curled in what seems to be (according to some research in affect theory) the universal physical manifestation of disgust. Other people in the crowd look on with what seems to be interest or even pleasure. It appears that nobody is ready to help Landsmark. A man standing just behind the lawyer seems to be holding him, so that the kid with the flagstaff can get a clear shot.

But according to Louis P. Masur in The Soiling of Old Glory: The Story of a Photograph that Shocked America, just published by Bloomsbury Press, the picture is misleading in that regard. Masur -- a professor of American institutions and values at Trinity College, in Connecticut -- analyzes the other images the photographer shot that day and finds a different story unfolding.

“The man who, in a previous image, is in motion racing to the scene has arrived and is grabbing Landsmark,” writes Masur. “It would appear that he has joined the fray to get in his punches and, worse yet, is pinioning Landsmark’s arms so that the flag bearer has a clear line of attack. In fact, the person holding Landsmark is Jim Kelly, one of the adult organizers of the protest, and he has raced in not to bind Landsmark but to save him from further violence.... He dashed in to try to break up the fight. In another photograph taken a moment later, he can be seen holding his arms out wide trying to keep the protesters back as Landsmark stumbles to safety.”

Knowing this, writes Masur, “changes our understanding of the photograph.” To a degree, perhaps, yes. But no degree of recontextualizing can gainsay the interpretation of the scene offered by the victim of the assault. “I couldn’t put my Yale degree in front of me to protect myself,” Landsmark told a newspaper reporter a few days after the attack. “The thing that is most troubling is that it happened not because I was somebody but because I was anybody....I was just a nigger they were trying to kill.”

The image is iconic. It does not simply reproduce an event; it crystallizes something out of life itself.

“The camera freezes time,” as Masur writes, “giving us always a moment, a fraction of a narrative that stretches before and after the isolated instant.” The Soiling of Old Glory reconstructs some of that narrative – drawing for the most part on published sources, especially the account of the Boston busing crisis given by the great American journalist J. Anthony Lukas in his book Common Ground: A Turbulent Decade in the Lives of Three American Families (1985).

Protesters often insisted that they hated busing, not African-Americans -- but as someone put it at the time, nobody went out to beat up a school bus. Masur treats the photograph itself as a turning point in the crisis. “However strenuously the anti-busing movement emphasized issues other than race,” he writes, “the photograph shattered the protesters’ claim that racism did not animate their cause and that they were patriotic Americans fighting for their liberties. The photograph had seared itself into the collective memory of the city and installed itself in the imagination of both blacks and whites.”

Masur understands the photograph as leading, “at first, to turmoil and self-scrutiny, and later, to progress and healing.” Such a perspective is bound to be appealing to many people, especially given the desire now to imagine a “post-racial” America. In any case, the photograph itself certainly sticks in one’s memory – and not just as a document of a particular conflict. An analyst must try to account for something of its power; and this task is not any easier given the relative neglect by scholars of photojournalism itself as a topic for critical study.

Unfortunately, in the course of meditating upon the image, Masur sometimes exhibits rather serious failures of what has sometimes been called “hermeneutic tact.” This is a feel for the limits of interpretation: a sense that ingenuity might, beyond a certain point, involve a kind of violation of what the process will bear. Hermeneutic tact, like the social kind, is impossible to codify. But you know when someone has violated it, because you wince.

At one point, the author claims that the mob member wielding the flag like a spear “suggests the stabbing of Christ in the side by the Roman soldier Longinus, who afterward was converted to Christianity and was canonized.” No, it doesn’t – not even if you squint really, really hard. Here, free association leads to something akin to a context salad. Likewise with some musings about the title Forman gave his photograph: “The verb soiling means defiling or staining,” writes Masur. “But with the root soil, it also suggests planting. Flags are thrust into the ground as statements of control, whether by explorers in the New World or by American astronauts on the moon. In an extreme act of desecration and possession, the protestor, it seems, is trying to implant the flag into the black man and claim ownership.”

We now have a precise technical term for this sort of thing, thanks to the efforts of Harry Frankfurt. A good editor would have found a way to remove such passages, or at least buried them in the quiet graveyard of the book’s apparatus. They are distracting yet take nothing away from the lasting power of the image itself. Looking at it, I felt the urge to reread Frederick Douglass’s “The Meaning of July 4th for the Negro” – a speech from 1852 that Masur, oddly enough, never cites, though it is hard to think of a more pointed and fitting account of how certain beloved symbols may serve as instruments of oppression. More was happening within the frame of Stanley Forman's action shot than any single analysis can quite exhaust.

Scott McLemee

All for Nought?

We are coming to the end of a decade with no name. As the 1990s wound down, there was a little head scratching over what to call the approaching span of 10 years, but no answer seemed obvious, and no consensus ever formed. A few possibilities were ventured -- for example, “the Noughties” -- but come the turn of the new millennium, they proved about as appealing and marketable as a Y2K survival kit. Before you knew it, we were deep into the present decade without any commonly accepted way of referring to it.

The lack seems more conspicuous now. We are in the penultimate year of the '00s (however you want to say it) and much of the urgent rhetoric about “change” in the presidential campaigns implies a demand for some quick way to refer to the era that will soon, presumably, be left behind.

It has become second nature to periodize history by decades, as if each possessed certain qualities, amounting almost to a distinct personality. To some degree this tendency was already emerging as part of the public conversation in the 19th century (with the 1840s in particular leaving a strong impression as an era of hard-hitting social criticism and quasi-proto-hippie experimentation) but it really caught on after the First World War. In The True and Only Heaven: Progress and Its Critics (Norton, 1991), the late Christopher Lasch wrote: “This way of thinking about the past had the effect of reducing history to fluctuations in public taste, to a progression of cultural fashions in which the daring advances achieved by one generation become the accepted norms of the next, only to be discarded in turn by a new set of styles.”

Thus we have come to speak of the Sixties as an era of rebellion and experimentation, much of it communal. And you could read books by 1840s guys like Marx, Kierkegaard, and Feuerbach in cheap paperbacks, if you weren’t too stoned. In the Seventies, the experimentation continued, but in a more privatized way (“finding myself”) and with a throbbing disco soundtrack. By the Eighties, greed was good; so were MTV, pastel, and mobile phones as big as your head. In the Nineties, it seemed for a little while as if maybe greed were bad. But then all the teenage Internet millionaires started ruling over the end of history from their laptops while listening to indie rock and wearing vintage clothes from the Sixties, Seventies, and Eighties.

Of course there are alternative points of emphasis for each decade. This kind of encapsulated history is not exactly nuance-friendly. But it’s no accident that bits of popular culture and lifestyle have become the default signifiers summing up each period of the recent past. “The concept of the decade,” wrote Lasch, “may have commended itself, as the basic unit of historical time, for the same reason the annual model change commended itself to Detroit: it was guaranteed not to last. Every ten years it had to be traded in for a new model, and this rapid turnover gave employment to scholars and journalists specializing in the detection and analysis of modern trends.”

Well, we do what we can. But it seems as if the effort has failed miserably over the past few years. The detectives and analysts have gone AWOL. There is no brand name for the decade itself, nor a set of clichés to clinch its inner essence.

While discussing the matter recently with Aaron Swartz, a programmer now working on the Open Library initiative, I found myself at least half agreeing with his impression of the situation. “This decade seems Zeitgeist-free,” he said. “It’s as if the Nineties never ended and we’re just continuing it without adding anything new.”

But then I remembered that my friend is all of 21 years old, meaning that roughly half his life so far was spent in the 1990s. Which could, in spite of Aaron’s brilliance, somewhat limit his ability to generalize. In that regard, being middle-aged offers some small advantage. My own archive of memory includes at least a few pre-literate recollections of the 1960s (the assassinations of 1968 interrupted "Captain Kangaroo") and my impression is that each new decade since then has gone through a phase of feeling like the continuation of (even a hangover from) the one that went before.

It usually takes the Zeitgeist a while to find a new t-shirt it prefers. On the other hand, I also suspect that Aaron is on to something -- because it sure seems like the Zeitgeist is having an unusually hard time settling down, this time around.

In the next column, I’ll sketch out a few ideas about what might, with hindsight, turn out to have been the distinguishing characteristics of the present decade. Perhaps the fact that we still don’t have a name for it is not an oversight, or a bit of bad semantic luck. According to Lasch, decade-speak is a way to understand the past as a story of progress involving the rise and fall of cultural styles and niches. If so, then we may have turned a corner somewhere along the way. The relationship between progress, nostalgia, and the cultural market may have changed in ways making it harder to come up with a plausible general characterization of the way we live now.

More on that in a week. Meanwhile: What do you call the current period? Does it have its own distinct “structure of feeling”? When we fit it into our thumbnail histories of the recent past, how are we likely to understand the spirit of the age?

Scott McLemee

The 2.0 Decade?

Last week Intellectual Affairs discussed the effects of an irresistible force on an immovable object. The force in question is our habit of referring to each recent decade as if it had a distinct quality or even personality: the '50s as an era of straitlaced conformity, for example, or the '70s as (in Tom Wolfe’s phrase) “the Me Decade.” This tendency has dubious effects. It flattens the complexity of historical reality into clichés. It manifests a sort of condescension to yesteryear, even. But decade-speak is a basic element in ordinary conversation -- and the habit is so well-established as to seem, again, irresistible.

If so, the past several years have been a period that won’t budge, if only because we lack a convenient and conventional way to refer to it. Expressions such as “the Aughties” or “the Noughties” are silly and unappealing. I ended the last column by asking what readers were calling this decade. Many of the responses involved expressions of disgust with the era itself, but a couple of people did propose terms for what to call it. One suggestion was, simply, “the Two Thousands,” which seems practical enough. Another was “the 2.0 Decade” -- an expression both witty and apropos, though unlikely to catch on.

Perhaps some expression may yet emerge within everyday speech -- but we’re winding down “the Zeroes” (my own last-ditch suggestion) without a way of referring to the decade itself. For now, it is an empty space in public conversation, a blind spot in popular memory. That may not be an accident. The past several years have not exactly been lacking in novelty or distinctiveness. But the tempo and the texture of the changes have made it difficult to take everything in -- to generalize about experience, let alone sum it up in a few compact stereotypes.

Rather than give an extensive chronology or checklist of defining moments from the present decade -- whatever we end up calling it -- I’d like to use the rest of this column to note a few aspects of what it has felt like to live through this period. The effort will reflect a certain America-centrism. But then, decade-speak is usually embedded in, and a reflection upon, a particular national culture. (German or Italian musings on “the Seventies,” for example, place more emphasis on the figure of the urban guerrilla than the disco dancer.)

These notes are meant neither as memoir nor as political editorial, though I doubt a completely dispassionate view of the period is quite possible. “Your mileage may vary,” as a typical expression of the decade goes. Or “went,” rather. For let’s imagine that the era is over, and that time has come to describe what things felt like, back then....

The decade as unit of meaning does not normally correspond to the exact chronological span marked by a change of digits. We sometimes think of the Eighties as starting with the election of Margaret Thatcher in 1979, for example, or Ronald Reagan’s inauguration in 1981. Conversely, that period can be said to end with the tearing down of the Berlin Wall in 1989, or as late as the final gasp of the Soviet state in 1991. The contours, like the meanings, tend to be established ex post facto; and they are seldom beyond argument.

In this case, there was a strong tendency to think of the decade as beginning on the morning of September 11, 2001 -- which meant that it started amid terror, disbelief, and profound uncertainty about what would happen next. Within a few weeks of the attacks, there would be a series of deaths from anthrax-laced envelopes sent through the U.S. mail. (Among the puzzles of the entire period is just why and how the public managed to lose interest in the latter attacks, even though no official finding was ever made about the source of the anthrax.)

Over the next two years or so, there would be a constantly fluctuating level of official “terror alerts.” Free-floating anxiety about the possibility of some new terrorist assault would become a more or less normal part of everyday life. Even after early claims by the administration of a connection between Saddam Hussein and the 9/11 terrorists were disproven, a large part of the public continued to believe that one must have existed. Elected officials and the mass media tended not to challenge the administration until several months into the Iraq War. The range of tolerated dissent shrank considerably for at least a few years.

Simultaneously, however, an expanding and wildly heterogeneous new zone of communication and exchange was emerging online -- and establishing itself so firmly that it would soon be difficult to recall what previous regimes of mass-media production and consumption had been like. The relationship between the transmitters of news, information, and analysis (on the one hand) and the audience for them (on the other) tended to be ever less one-way.

It proved much easier to wax utopian or dystopian over the effects of this change than to keep up with its pace, or the range of its consequences.

At the same time, screens and recording devices were -- ever more literally -- everywhere. Devices permitting almost continuous contact with the new media kept getting smaller, cheaper, and more powerful. They permeated very nearly the entire domain of public and private space alike. Blaise Pascal’s definition of the universe began to seem like an apt description of the cosmos being created by the new media: “an infinite sphere, the center of which is everywhere, the circumference nowhere.”

Quoting this passage, Jorge Luis Borges once noted that Pascal’s manuscript shows he did not originally describe the sphere as infinite. He wrote “frightful” instead, then scratched it out. Looking back on that unnamed (and seemingly unnameable) decade now, it seems like the right word. Whatever meaning it may yet prove to have had, it was, much of the time, frightful.

Scott McLemee

From Plymouth Rock to Plato's Retreat

Last week, Intellectual Affairs gave the recent cable TV miniseries “Sex: The Revolution” a nod of recognition, however qualified, for its possible educational value. The idea that sex has a history is not, as such, self-evident. The series covers the changes in attitudes and norms between roughly 1950 and 1990 through interviews and archival footage. Most of this flies past at breakneck speed, alas. The past becomes a hostage of the audience’s presumably diminished attention span.

Then again, why be ungrateful? Watching the series, I kept thinking of a friend who teaches history at Sisyphus University, a not-very-distinguished institution in the American heartland. For every student in his classroom who seems promising, there are dozens who barely qualify as sentient. (It sounds like Professor X, whose article “In the Basement of the Ivory Tower” appears in the latest issue of The Atlantic, teaches in the English department there.) Anything, absolutely anything, that might help stimulate curiosity about the past would be a godsend for the history faculty at Sisyphus U.

With that consideration in mind, you tend to watch “Sex: The Revolution” with a certain indulgence -- as entertainment with benefits, so to speak. Unfortunately, the makers stopped short. They neglected to interview scholars who might have provided more insight than a viewer might glean from soundbites by demi-celebrities. And so we end up with a version of history not too different from the one presented by Philip Larkin in the poem “Annus Mirabilis” --

Sexual intercourse began
In nineteen sixty-three
(Which was rather late for me) -
Between the end of the Chatterley ban
And the Beatles' first LP.

-- except without the irony. A belief that people in the old days must have been repressed is taken for granted. Was this a good thing or not? Phyllis Schlafly and reasonable people may disagree; but the idea itself is common coin of public discourse.

But suppose a television network made a different sort of program -- one incorporating parts of what one might learn from reading the scholarship on the history of sex. What sense of the past might then emerge?

We might as well start with the Puritans. Everybody knows how uptight they were -- hostile to sex, scared of it, prone to thinking of it as one of the Devil’s wiles. The very word “Puritan” now suggests an inability to regard pleasure as a good thing.

A case in point being Michael Wigglesworth -- early Harvard graduate, Puritan cleric, and author of the first American best-seller, The Day of Doom (1662), an exciting poem about the apocalypse. Reverend Wigglesworth found the laughter of children to be unbearable. He said it made him think of the agonies of the damned in hell. You can just imagine how he would respond to the sound of moaning. Somehow it is not altogether surprising to learn that the Rev’s journal contains encrypted entries mentioning the “filthy lust” he felt while tutoring male students.

In short, a typical Puritan -- right? Well, not according to Edmund Morgan, the prominent early-Americanist, whose many contributions to scholarship over the years included cracking the Wigglesworth code. (He is now professor emeritus of history at Yale.)

Far from being typical, Wigglesworth, it seems, was pretty high-strung even by the standards of the day. In a classic paper called “The Puritans and Sex,” published in 1942, Morgan assessed the evidence about how ordinary believers regarded the libido in early New England. He found that, clichés notwithstanding, the Puritans tended to be rather matter-of-fact about it.

Sermons and casual references in letters and diaries reveal that the Puritans took sexual pleasure for granted and even celebrated it -- so long, at least, as it was enjoyed within holy wedlock. Of course, the early colonies attracted many people of both sexes who were either too young to marry or in such tight economic circumstances that it was not practical. This naturally meant a fair bit of random carrying on, even in those un-Craigslist-ed days. All such activity was displeasing unto the Lord, not to mention His earthly enforcers; but the court records show none of the squeamishness one might expect, given the Puritans’ reputation. Transgressions were punished, but the hungers of the flesh were taken for granted.

And Puritan enthusiasm for pleasures of the marriage bed was not quite so phallocentric as you might suppose. As a more recent study notes, New Englanders believed that both partners had to reach orgasm in order for conception to occur. Many Puritan women must have had their doubts on that score. Still, the currency of that particular bit of misinformation would tend to undermine the assumption that everybody was a walking bundle of dammed-up desire -- finding satisfaction only vicariously, through witch trials and the like.

Our imagined revisionist documentary would be full of such surprises. Recent scholarship suggests that American mores were pretty wild long before Alfred Kinsey quantified things in his famous reports.

Richard Godbeer’s Sexual Revolution in Early America (Johns Hopkins University Press, 2002) shows that abstinence education was not exactly the norm in the colonial period. Illegitimate births were commonplace; so was the arrival of children six or seven months after the wedding day. For that matter, cohabitation without benefit of clergy was the norm in some places. And while there were statutes on the books against sodomy -- understood as nonprocreative sexual activity in general -- it’s clear that many early Americans preferred to mind their own business.

Enforcing prohibitions on “unnatural acts” between members of the same sex was a remarkably low priority. “For the entire colonial period,” noted historians in a brief filed a few years ago when Lawrence v. Texas went to the U.S. Supreme Court, “we have reports of only two cases involving two women engaged in acts with one another.... The trial of Nicholas Sension, a married man living in Wethersfield, Connecticut, in 1677, revealed that he had been widely known for soliciting sexual contacts with the town’s men and youth for almost forty years but remained widely liked. Likewise, a Baptist minister in New London, Connecticut, was temporarily suspended from the pulpit in 1757 because of his repeatedly soliciting sex with men, but the congregation voted to restore him to the ministry after he publicly repented.”

History really comes alive, given details like that -- and we’ve barely reached the Continental Congress. The point is not that the country was engaged in one big orgy from Plymouth Rock onwards. But common attitudes and public policies were a lot more ambivalent and contradictory in the past than we’re usually prone to imagine.

There was certainly repression. In four or five cases from the colonial era, sodomy was punished by death. But in a society where things tend to be fluid -- where relocation is an option, and where money talks -- there will always be a significant share of the populace that lives and acts by its own lights, and places where the old rules don't much matter. And so every attempt to enforce inhibition is apt to seem like too little, too late (especially to those making the effort).

You catch some of that frantic sense of moral breakdown in the literature of anti-Mormonism cited by Sarah Barringer Gordon in her study The Mormon Question: Polygamy and Constitutional Conflict in Nineteenth-Century America, published by the University of North Carolina Press in 2002. Novels about polygamous life in Utah were full of dark fascination with the lascivious excess being practiced in the name of freedom of religion – combined with fear that the very social and political order of the United States was being undermined. It was all very worrying, but also titillating. (Funny how often those qualities go together.)

The makers of “Sex: The Revolution” enjoyed the advantage of telling stories from recent history, which meant an abundance of film and video footage to document the past. Telling a revisionist story of American sexual history would suffer by visual comparison, tending either toward History Channel-style historical reenactments or Ken Burns-ish readings of documents over sepia-toned imagery.

But now, thanks to the efforts of phonographic archivists, we can at least listen to one part of the sexual discourse of long ago. A set of wax recordings from the 1890s -- released last year on a CD called “Actionable Offenses” -- preserves the kind of lewd entertainment enjoyed by some of the less respectable Americans of the Victorian era. And by “lewd,” I do not mean “somewhat racy.” The storytelling in dialect tends to be far coarser than anything that can be paraphrased in a family publication such as Inside Higher Ed. A performance called “Learning a City Gal How to Milk” is by no means the most obscene.

Anthony Comstock -- whose life’s work it was to preserve virtue by suppressing vice -- made every effort to wipe out such filth. It’s a small miracle that these recordings survived. The fact that they did gives us a hint at just how much of a challenge Comstock and associates must have faced.

When a popular program such as “Sex: The Revolution” recalls the past, it is usually an account of the struggle to free desire from inhibition. Or you can tell the same tale in a conservative vein: the good old days of restraint, followed by a decline into contemporary decadence.

Both versions are sentimental; both condescend to the past.

In the documentary I’d like to see, the forces of repression would be neither villains nor heroes. They would be hapless, helpless, confused -- and sinking fast in quicksand, pretty much from the start. It would be an eye-opening film. Not to mention commercially viable. After all, there would be a lot of sex in it.

Scott McLemee

Plenty to Go Around

Epistemology, as everyone around these parts is surely aware, is the study of the problems associated with knowledge – what it is, from whence it comes, and how it is you know that you know what you know (or think you do).

It gets recursive mighty fast. And questions about the relationship between epistemology and ethics are potentially even more so. Most of us just accept the wisdom of Emil Faber, the legendary founder of Faber College in bucolic Pennsylvania, who proclaimed, “Knowledge is good.” (At least that's what it says on the plaque in front of the campus library, as I recall, though it's been many years since my last viewing of "Animal House.")

But what about ignorance? Arguably there is more of it in the world than knowledge. Who studies it, though? Shouldn't epistemology have its equal but opposite counterpart?

A new book from Stanford University Press called Agnotology: The Making and Unmaking of Ignorance proposes that such a field of study is necessary – that we need rigorous and careful thinking about the structure and function and typology of cluelessness. The editors, Robert N. Proctor and Londa Schiebinger, are both professors of history of science at Stanford University. Their volume is a collection of papers by various scholars, rather than a systematic treatment of its (perhaps inexhaustible) subject. But the field of agnotology seems to cohere around a simple, if challenging, point: Ignorance, like knowledge, is both socially produced and socially productive.

This goes against the grain of more familiar ways of thinking. The most commonplace way of understanding ignorance, after all, is to define it as a deficit – knowledge with a minus sign in front of it.

A rather more sophisticated approach (which got Socrates in trouble) treats heightening the awareness of one’s own ignorance as the beginning of wisdom. And the emergence of modern scientific research, a few centuries back, treated ignorance as a kind of raw material: fuel for the engines of inquiry. As with any fuel, the prospect of a shortage seems catastrophic. “New ignorance must forever be rustled up to feed the insatiable appetite for science,” writes Proctor about the common trope of ignorance as resource. “The world’s stock of ignorance is not being depleted, however, since (by wondrous fortune and hydra-like) two new questions arise for every one answered....The nightmare would be if we were somehow to run out of ignorance, idling the engines of knowledge production.”

Each of these familiar perspectives on ignorance -- treating it as deficit, as Socratic proving ground, or as spur for scientific inquiry -- frames it as something outside the processes of knowledge-production and formal education. If those processes are carried on successfully enough, then ignorance will decline.

The agnotologists know better (if I can put it that way).

Ignorance is not simply a veil between the knower and the unknown. It is an active – indeed vigorous – force in the world. Ignorance is strength; ignorance is bliss. There is big money in knowing how to change the subject – by claiming the need for “more research” into whether tobacco contains carcinogens, for example, or whether the powerful jaws of dinosaurs once helped Adam and Eve to crack open coconuts.

Having a memory so spotty that it is a small miracle one can recall one’s own name is a wonderfully convenient thing, at least for Bush administration officials facing Congressional hearings. The Internet complicates the relationship between information and ignorance ceaselessly, and in ever newer ways. Poverty fosters ignorance. But affluence, it seems, does it no real harm.

This is, then, a field with much potential for growth. Most of the dozen papers in Agnotology are inquiries into how particular bodies of ignorance have emerged and reproduced themselves over time. Nobody quotes the remark by Upton Sinclair that Al Gore made famous: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.” Still, that line certainly applies to how blindspots have taken shape in the discourse over climate change, public health, and the history of racial oppression. (In a speech, Ronald Reagan once attributed the greatness of the United States to the fact that “it has never known slavery.”)

Any sufficiently rigorous line of agnotological inquiry must, however, recognize that there is more to ignorance than political manipulation or economic malfeasance. It also serves to foster a wide range of social and cognitive goods.

The paper “Social Theories of Ignorance” by Michael J. Smithson, a professor of psychology at the Australian National University, spells out some of the benefits. A zone of carefully cultivated ignorance is involved in privacy and politeness, for example. It is also intrinsic to specialization. “The stereotypical explanation for specialization,” writes Smithson, “is that it arises when there is too much for any one person to learn everything.” But another way of looking at it is to regard specialization as a means whereby “the risk of being ignorant about crucial matters is spread by diversifying ignorance.”

Smithson also cites the research of A.R. Luria (a figure something like the Soviet era’s equivalent to Oliver Sacks), who studied an individual with the peculiar ability to absorb and retain every bit of information he had encountered in his lifetime. Such a person would have no advantage over the garden-variety ignoramus. On the contrary, “higher cognitive functions such as abstraction or even mere classification would be extremely difficult,” writes Smithson. “Information acquired decades ago would be as vividly recalled as information acquired seconds ago, so older memories would interfere with more recent, usually more relevant, recollections.”

So a certain penumbra of haziness has its uses. Perhaps someone should contact the trustees of Faber College. The sign in front of the campus library could be changed to read “Diversifying Ignorance is Good.”

Scott McLemee

It's All About the Oil

A friend recently noted that this week’s column would probably run at just about the time the Chinese government was using the Olympic torch to burn down a Tibetan village. Perhaps, he said, this might be a good occasion to check out the latest edition of The Ancient Olympic Games by Judith Swaddling – first published by the British Museum in 1980 and now being reissued by the University of Texas Press.

The earlier version contained a succinct overview of how the Olympics (originally held every four years between 776 BC and 395 AD) were revived at the close of the 19th century. The new edition has been expanded to include an account of the past century or so – during which time the games often served as a venue for propaganda, a medium through which great powers conducted their hostilities. All this, of course, in spite of official rhetoric about how the spirit of sportsmanship transcends ideology.

The update is necessary, I suppose, but in some ways anticlimactic – even a distraction. Let modern times take care of themselves; the author’s heart really belongs to the ancient world. Swaddling is a curator at the British Museum, and conducts most of the book as an amiable and instructive tour of what has survived of the world of the original Olympic competitions. The text is heavily illustrated with photographs of the surviving architecture at Olympia and artwork portraying the games themselves.

The most intriguing image, at least to me, was a photograph of an artifact known as a strigil. This is a device that is often mentioned in accounts of the period, but is hard to picture. The strigil was an “oil scraper,” used to peel away the layer of grime that built up on an athlete’s skin in the course of events such as the pankration – a kind of no-holds-barred fighting match, not found in the modern Olympics, that sounds absolutely brutal and doubtless left many who fought in it crippled for life.

The strigil, it turns out, looks something like a windshield de-icer with a little bottle of olive oil conveniently attached by a chain. Having oil rubbed into the skin before competition was supposed to prevent sunburn and otherwise be good for the athlete’s health. Any excess oil was supposed to be strigil’d off before the competition began. But a wrestler sometimes “forgot” to do this quite as thoroughly as he should. This gave him a definite advantage by making it harder for the opponent to get a grip.

More flagrant forms of cheating must have been a serious problem. Hefty fines for it were given out – that is, if the malefactor were lucky. If he wasn’t, justice was dealt out by tough characters armed with whips. Any racer who started before the signal was given should probably have just kept on running. There were also cases of competitions being “fixed.”

Swaddling writes that “instances of bribery were relatively rare.” She quotes an ancient author asking who would be such a lowlife as to try to corrupt a sacred event. (Apart from being a sporting event, the Olympics were also major religious gatherings, with scores of oxen being sacrificed for the occasion.) But you have to wonder if piety really kept everyone in line.

The author does not mention the statues of Zeus in a heavily trafficked area of Olympia, portraying the god in a menacing aspect. Inscriptions at the base of each statue warned people not to attempt to bribe the judges. If you did, Zeus would presumably hurl one of the thunderbolts he was carrying in his fist. This suggests that the temptation to offer the judges a little something was fairly common. Why go to all the trouble if everyone was already reverent and restrained?

Then again, it is easy to imagine why the athletes themselves would want to cheat. Winning immortal glory was one incentive; but so was avoiding immortal shame. The author quotes one Olympic sports commentator whose put-downs still work after two thousand years: “Charmos, a long distance runner, finished seventh in a field of six. A friend ran alongside him shouting, ‘Keep going Charmos!’ and although fully dressed, beat him. And if he had had five friends, he would have finished twelfth.”

Nor was Charmos the only victim of ancient stand-up comedy. Although Swaddling doesn’t cite it, there was the case of a boxer whose “admirers” wanted to erect a monument to his humanitarianism. Why? Because he never hurt anybody.

Greek doctors occasionally expressed irritation when athletes set themselves up as medical advisers and, Swaddling notes, “even attempted to write books on the subject.” You can just picture them performing live infomercials in the agora. Such grumbling aside, it seems there was a close connection between the Olympics and progress in ancient medical science. The latter “virtually came to a standstill when the major games ceased in the late fourth century AD.”

The close connection between the two fields was expressed in mythology: “Asklepios, the Greek god of medicine, learned his skills from the centaur Cheiron, who was credited with the introduction of competitive gymnastics and of music from the double pipes to accompany exercise.” (Someone should mention this to the people who run Jazzercise.)

Married women were not allowed onto the grounds of the Olympic festivities, though they managed to sneak in from time to time. Was there some dubious medical theory to rationalize this? In any case, the exclusion did not apply to all women. Both virgins and prostitutes were permitted to attend the games.

That sounds like something out of a Freudian case study. Swaddling simply notes the matter without trying to interpret it. I have no theories, but will offer a bit of related speculation. One of these days an archaeologist is going to discover an inscription that reads: “What happens at the Olympics, stays at the Olympics.”

Scott McLemee

The End of the End of the End of History?

One minor casualty of the recent conflict in Georgia was the doctrine of peace through McGlobalization -- a belief first elaborated by Thomas Friedman in 1999, and left in ruins on August 8, when Russian troops moved into South Ossetia. “No two countries that both had McDonald’s had fought a war against each other since each got its McDonald’s,” wrote Friedman in The Lexus and the Olive Tree (Farrar, Straus, and Giroux).

Not that the fast-food chain itself had a soothing effect, of course. The argument was that international trade and modernization -- and the processes of liberalization and democratization created in their wakes -- would knit countries together in an international civil society that made war unnecessary. There would still be conflict. But it could be contained -- made rational, and even profitable, like the rivalry between Ronald and his competitors over at Burger King. (Thomas Friedman does not seem like a big reader of Kant, but his thinking here bears some passing resemblance to the philosopher’s “Idea for a Universal History from a Cosmopolitan Perspective,” an essay from 1784.)

McDonald’s opened in Russia in 1990 -- a milestone of perestroika, if ever there were one. And Georgia will celebrate the tenth anniversary of its first Mickey D’s early next year, assuming anybody feels up for it. So much for Friedman's theory. Presumably it could be retooled ex post facto (“two countries with Pizza Huts have never had a thermonuclear conflict,” anyone?) but that really seems like cheating.

Ever since a friend pointed out that the golden arches no longer serve as a peace sign, I have been wondering if some alternative idea would better fit the news from Georgia. Is there a grand narrative that subsumes recent events? What generalizations seem possible, even necessary and urgent, now? What, in short, is the Big Idea?

Reading op-ed essays, position papers, and blogs over the past two weeks, one finds a handful of approaches emerging. The following survey is not exhaustive -- and I should make clear that describing these ideas is not the same as endorsing them. Too many facts about what actually happened are still not in; interpretation of anything is, at this point, partly guesswork. (When the fog of war intersects a gulf stream of hot air, you do not necessarily see things more clearly.) Be that as it may, here are some notes on certain arguments being made about what it all means.

The New Cold War: First Version. A flashback to the days of Brezhnev would have been inevitable in any case -- even if this month were not the 40th anniversary of Soviet tanks rolling into what was then Czechoslovakia.

With former KGB man Vladimir Putin as head of state (able to move back and forth between the offices of the president and of the prime minister, as term limits require) and the once-shellshocked economy now growing at a healthy rate thanks to international oil prices, Russia has entered a period of relative stability and prosperity -- if by no means one of liberal democracy. The regime can best be described as authoritarian-populist. There have been years of frustration at seeing former Soviet republics and erstwhile Warsaw Pact allies become members of NATO. Georgia (like Ukraine) has recently been invited to do so as well. So the invasion of South Ossetia represents a forceful reassertion of authority within Russia’s former sphere of influence.

We have reached "the end of the end of the Cold War,” goes this interpretation. Pace Fukuyama, it was a mistake to believe that historical progress would culminate in liberal, democratic, constitutional republicanism. The West needs to recognize the emergence of a neo-Soviet menace, and prepare accordingly.

This perspective was coming together even before the conflict between Russia and Georgia took military form. For some years now, the French philosopher Andre Glucksmann (whose musings on Solzhenitsyn’s The Gulag Archipelago were influential in the mid-1970s) has been protesting the rise of the new Russian authoritarianism, quoting with dismay Putin’s comment that “the greatest geopolitical disaster of the twentieth century is the dissolution of the Soviet Union.”

Vaclav Havel, the playwright and former president of the Czech Republic, has done likewise. In a recent interview, Havel said, “Putin has revealed himself as a new breed of dictator, a highly refined version. This is no longer about communism, or even pure nationalism.... It is a closed system, in which the first person to break the rules of the game is packed off to Siberia."

Why be skeptical of this perspective? Certainly the authoritarianism of the Putin regime itself is not in doubt. But the specter of a new Red Army poised to assert itself on the world stage needs to be taken with a grain of salt. A report prepared by the Congressional Research Service in late July notes that budget cuts have forced “hundreds of thousands of officers out of the ranks” of the Russian military, and reduced troop strength to 1.2 million men (compared to 4.3 million in the Soviet military in 1986).

“Weapons procurement virtually came to a halt in the 1990s,” the report continues, “and is only slowly reviving. Readiness and morale remain low, and draft evasion and desertion are widespread.” Raw nationalist fervor will only make your empire just so evil.

The New Cold War: Take Two. Another version of the old template regards an East/West standoff as inevitable, not because Putinist Russia is so vigorous, but because such a conflict is in the interests of the United States.

We're not talking here about the more familiar sort of argument about the U.S. needing access to oil in the Caucasus region. Nor does it hinge on strategic concerns about nuclear cooperation between Russia and Iran. It has less to do with economic interest, or geopolitical advantage, than with the problem of ideological vision (or lack of it) among ruling elites in the West. A renewal of superpower conflict would help to prop up societies that otherwise seem adrift.

This thesis is argued by a British think tank called the Institute of Ideas, which takes much of its inspiration from the work of Frank Furedi, a professor of sociology at the University of Kent. Having started out decades ago as Marxists of a rather exotic vintage, writers associated with the institute have moved on to a robustly contrarian sort of libertarianism. Their perspective is that state and civil society alike in the industrialized world are now prone to waves of fear and a pervasive sense of aimlessness.

“It is difficult,” writes Furedi in a recent essay, “to discover clear patterns in the working of twenty-first-century global affairs.... The U.S. in particular (but also other powers) is uncertain of its place in the world. Wars are being fought in faraway places against enemies with no name. In a world where governments find it difficult to put forward a coherent security strategy or to formulate their geo-political interests, a re-run of the Cold War seems like an attractive proposition. Compared to the messy world we live in, the Cold War appears to some to have been a stable and at least comprehensible interlude.”

Hence the great excitement at recent events -- so rich are they with the promise of a trip backwards in time.

There is something at least slightly plausible in this idea. A quick look at Google shows that people have been announcing “the end of the end of the Cold War” for quite a while now. The earliest usage of that phrase I’ve seen comes from 1991. A kind of nostalgia, however perverse, is probably at work.

But Furedi's larger argument seems another example of an idea so capacious that no counterevidence will ever disprove it. If leaders are concerned about what’s happening in the Caucasus, it is because anxiety has made them long for the old verities. But if they ignored those events -- well, that would imply that the culture has left them incapable of formulating a response. Heads, he wins. Tails, you lose.

The End of ... Something, Anyway. Revitalizing the Cold War paradigm keeps our eyes focused on the rearview mirror. But other commentary on events in Russia and Georgia points out something you might not see that way -- namely, that this stretch of paved road has just run out.

The Duck of Minerva – an academic group blog devoted to political science – has hosted a running discussion of the news from South Ossetia. In a post there, Peter Howard, an assistant professor of international service at American University, noted that the most salient lesson of the invasion was that it exposed the limits of U.S. influence.

“Russia had a relatively free hand to do what it did in Georgia,” he writes, “and there was nothing that the U.S. (or anyone else for that matter) was going to do about it.... In a unipolar world, there is only one sphere of influence -- the whole world is the U.S.’s sphere of influence. Russia’s ability to carve any sphere of influence effectively ends unipolarity, if there ever was such a moment.”

Howard points to a recent article in Foreign Affairs by Richard Haass, the president of the Council on Foreign Relations, about the emergence of “nonpolarity: a world dominated not by one or two or even several states but rather by dozens of actors possessing and exercising various kinds of power.”

This will, it seems, be confusing. Countries won’t classify one another simply as friends or foes: “They will cooperate on some issues and resist on others. There will be a premium on consultation and coalition building and on a diplomacy that encourages cooperation when possible and shields such cooperation from the fallout of inevitable disagreements. The United States will no longer have the luxury of a ‘You're either with us or against us’ foreign policy.” (One suspects the country is going to afford itself that luxury from time to time, even so.)

A recent op-ed in The Financial Times does not explicitly use the term “nonpolarity,” yet takes the concept as a given. Kishore Mahbubani, dean of the public policy school of the National University of Singapore, sees the furor over Georgia as a last gasp of old categories. The rise of Russia is “not even close” to being the most urgent concern facing the west.

“After the collapse of the Soviet Union,” he writes, “western thinkers assumed the west would never need to make geopolitical compromises. It could dictate terms. Now it must recognise reality. The combined western population in North America, the European Union and Australasia is 700m, about 10 per cent of the world’s population. The remaining 90 per cent have gone from being objects of world history to subjects.”

Framing his argument in terms borrowed from Chairman Mao, Mahbubani nonetheless sounds for all the world like an American neoconservative in a particularly thoughtful mood. “The real strategic choice” facing the wealthy 10 percent “is whether its primary challenge comes from the Islamic world or China,” he writes. “If it is the Islamic world, the U.S. should stop intruding into Russia’s geopolitical space and work out a long-term engagement with China. If it is China, the U.S. must win over Russia and the Islamic world and resolve the Israel-Palestine issue. This will enable Islamic governments to work more closely with the west in the battle against al-Qaeda.”

From this perspective, concern with the events in Georgia seems, at best, a distraction. Considering it a development of world importance, then, would be as silly as thinking that the spread of fast-food franchises across the surface of the globe will make everyone peaceful (not to mention fat and happy).

Well, I’m not persuaded that developments in the Caucasus are as trivial as all that. But we’re still a long way from knowing what any of it means. It’s usually best to keep in mind a comment by Zhou Enlai from the early 1970s. Henry Kissinger asked for his thoughts about the significance of the French Revolution. “It is,” Zhou replied, “too soon to say.”

Scott McLemee

Turning a Page

Ideas have seldom been the currency of American politics. (Most of the time, currency is the currency of American politics.) But this seems like a moment in history when new thinking is a matter of some urgency.

Over the past few days, I've been conducting an utterly unscientific survey of academics, editors, and public intellectuals to find out how -- if given a chance -- they might try to influence the incoming occupant of the White House. The question was posed by e-mail as follows:

"Imagine you are invited to a sit-down with the president-elect and given the chance to suggest some recommended reading between now and the inauguration. Since we're trying to keep this fantasy of empowerment at least slightly plausible, I'd ask you to limit yourself to one book. (He will be busy.) Something not yet available in English is fine; we will assume a crack team of translators is standing by. Journal articles, historical documents, and dissertations also acceptable.

"What would you propose? Why? Is there a special urgency to recommending it to the attention of the next Chief Executive at this very moment? Remember, this is a chance to shape the course of history. Use your awesome power wisely...."

I tried to cast a wide net for potential respondents -- wider than my own political sympathies, in any case. Not all who were invited chose to participate. But everyone who did respond is included here. The suggestions were far-ranging, and the president-elect would no doubt benefit from time spent reading any of the nominated titles. (To make tracking things down easier on his staff, I have added the occasional clarifying note in brackets.)

In reality, of course, it's a long shot that the new president will take any of this advice. But the exercise is serious, even so -- for it is a matter of opening a wider discussion of what books and ideas should be brought to bear on public life at this pivotal instant. An election is a political process; but so, sometimes, is thinking.

Eric Rauchway is a professor of history at the University of California at Davis and author of The Great Depression and the New Deal: A Very Short Introduction, recently published by Oxford University Press.

If they were asking me I'd suppose they were familiar with my own modest works, so I'd try to point out a perhaps neglected or forgotten classic.

Suppose it's John McCain, who has often expressed admiration for Theodore Roosevelt. I'd humbly suggest President-elect McCain revisit the chapters in George Mowry's classic The Era of Theodore Roosevelt dealing with Roosevelt's first full term of office (1905-1909), when he worked hard with Congress to craft landmark legislation regulating business, affording protection to consumers, and providing for workers' compensation.

Suppose, conversely, it's Barack Obama, who would be the first northern Democrat elected since the party sloughed off the South in the Civil Rights era (i.e., since John Kennedy) and who would, like the greatest northern Democrat and perhaps the greatest president of all, Franklin Roosevelt, take office in a time of profound crisis. I would humbly remind him of Isaiah Berlin's classic essay on Roosevelt, in which he describes how much could be accomplished by a deft politician, sensitive even to minute ebbs and flows in political opinion, who while not lacking vision or integrity nevertheless understood—as Berlin wrote—"what to do and when to do it."

[The essay on Roosevelt can be found in the Berlin omnibus collection The Proper Study of Mankind: An Anthology of Essays, published ten years ago by Farrar, Straus and Giroux. Or here, while the link lasts.-SM]

Elvin Lim is an assistant professor of government at Wesleyan University and author of The Anti-Intellectual Presidency: The Decline of Presidential Rhetoric from George Washington to George W. Bush, published by Oxford University Press and discussed recently in this column.

The president-elect should read Preparing to be President: The Memos of Richard E. Neustadt (AEI Press, 2000), edited by Charles O. Jones. Richard Neustadt was a scholar-practitioner who advised Presidents Truman, Kennedy, Johnson, and Clinton, and, until his passing in 2003, also the dean of presidential studies. Most of the memos in this volume were written for president-elect John Kennedy, when the country was, as it is now, ready for change.

At the end of every election, "everywhere there is a sense of a page turning ... and with it, irresistibly, there comes the sense, 'they' couldn't, wouldn't, didn't, but 'we' will," Neustadt wrote years ago, reminding presidents-elect that it is difficult but imperative that they put the brake on a campaign while also starting the engine of a new administration. Campaigning and governing are two different things.

Buoyed by their recent victory, first-term presidents have often over-reached and under-performed, quickly turning hope into despair. If there is one common thread to Neustadt's memos, it is the reminder that there is no time for hubris or celebration. The entire superstructure of the executive branch -- the political appointees who direct the permanent civil service -- is about to be lopped off, and the first and most critical task of the president-elect is to surround himself with competent men and women he can work with and learn from.

In less than three months, the president-elect will no longer have the luxury of merely making promises on the campaign trail. Now he must get to work.

Jenny Attiyah is host and producer of Thoughtcast, an interview program devoted to writers and academics, and available via podcast.

We don't have to agree with everything we read in this country. Reading is not unpatriotic. So may I suggest that the future commander-in-chief actually read the speeches by Osama bin Laden? At a minimum, he can read between the lines. As Sun Tzu said, "know thine enemy". But we know so little about bin Laden. We don't even know where he lives. Supposedly, he "hates our freedoms" – but he would argue that what he hates is the freedom we take with our power.

After these videos were released, it usually took some effort to dig out a transcription. In the end, I had to go to Al Jazeera for a translation. What I remember most clearly is grainy video of the guy, holding his index finger aloft, but with the volume silenced, so our talking TV heads could impart their wisdom in peace. Let's hope the next president is willing to turn off the mute button on our enemy. Ignorance is no longer an excuse.

[Verso Press made this much easier three years ago with the collection Messages to the World: The Statements of Osama Bin Laden, which provides as much OBL as anyone should have to read.-SM]

Daniel Drezner is a professor of international relations at Tufts University. He also blogs.

I'd probably advise the president to read the uber-source for international relations, Thucydides' History of the Peloponnesian War. Too many people only read portions like the Melian Dialogue, which leads to a badly distorted view of world politics (the dialogue represents the high-water mark of Athenian power -- it all goes downhill after that). The entire text demonstrates the complex and tragic features of international politics, the folly of populism, the occasional necessity of forceful action, the temptations and dangers of empire, and, most importantly, the ways in which external wars can transform domestic politics in unhealthy ways.

Chris Matthew Sciabarra is a visiting scholar at New York University and a founding co-editor of The Journal of Ayn Rand Studies.

Given my own views of the corporatist state-generated roots of the financial crisis, I'd probably recommend The Theory of Money and Credit by Ludwig von Mises, so that he could get a quick education on how the credit policies of a central bank set the boom-bust cycle into motion. Perhaps this might shake the new President into a truly new course for US political economy.

Irving Louis Horowitz is professor emeritus of sociology and political science at Rutgers University and editorial director of Transaction Publications.

While I seriously and categorically doubt that any one book will shape the course of history, and even less do I feel touched by a sense of "awesome power," much less preternatural wisdom, I will recommend a book that the next president of the United States would, or better should, avail himself of: On Thermonuclear War by Herman Kahn. Released first by Princeton University Press in the dark days of the Cold War in 1960, and reissued by Transaction Publishers in 2007, this is the painful reminder that peace in our time is heavily dependent on the technology of war in our time. The howls of dismissal that greeted this book at first blush have been replaced by a sober appreciation that the global threat to our Earth is very much a man-made product.

Kahn's book can serve as a guide to the stages of diplomatic failure and its consequential turn to military activities at maximum levels. Kahn does not presume pure rationality as a deterrent to war, and in light of the nuclear devices in the hands of dangerous nation states such as Iran and North Korea, where notions of life and death may give way to Götterdämmerung and the preference for destruction and self-immolation, such presumed rational behavior may prove dangerous and even delusionary. The unenviable task of the next president will be to avoid taking the world to the proverbial brink -- and making sure others do not dare take the fatal step to do likewise. Oddly, for all of its dire scenarios, Kahn's classic is a curiously optimistic reading, rooted in realistic policy options. It deserves to be on the shelf of the next head of the American nation.

Dick Howard is a professor of history at the State University of New York at Stony Brook and editor of the Columbia University Press series Columbia Studies in Political Thought/Political History.

I'd have him read Polanyi's The Great Transformation. Why? It's short, clearly argued, and makes a simple but fundamental point: capitalism is not the natural way that people relate to one another (including in their "economic" relations). It is the result of several political decisions that create the framework within which it can emerge. The next president will have to recognize that he too will make political decisions with economic consequences (and should not deceive himself into thinking that his decisions are simply a reaction to economic "necessities").

To be noted as well: Polanyi, a former banker in Austria, was writing in the wake of the Great Depression, whose causes he was trying to understand. It was the inability of "economics" to understand what had happened to the world economy that led Polanyi to his pathbreaking and brilliant study.

A hubristic final note: I of course recommend this only because my own study of the history of political thought from the Greeks to the American and French revolutions, titled The Primacy of Politics, will not yet be on the market.

[ Primacy will be published by Columbia University Press in late '09.-SM]

James Marcus is the book-review editor for The Columbia Journalism Review and has translated several books from Italian.

It's not often that the POTUS asks me what to read next, and at first I thought I should rise to the occasion with something suitably canonical. I considered Democracy in America, The Federalist Papers, maybe even The Education of Henry Adams (although I'd allow the leader of the free world to skip the virgin-and-dynamo stuff at the end). Then I decided that it made more sense to submit a narrow-gauge production: a book that grappled with public issues through the prism of personal experience, not unlike Barack Obama's Dreams from My Father or John McCain's Faith of My Fathers. If, like the two titles I just mentioned, it included a dash of Oedipal ambivalence, so much the better.

What I came up with was Tobias Wolff's In Pharaoh's Army: Memories of the Lost War. As the next president ponders the best way to extract the United States from its Iraqi quagmire, a memoir of Vietnam seems like a useful reality check. The author, a self-confessed screw-up, spent part of his enlistment in the Mekong Delta, advising a Vietnamese artillery battalion. There are very few heroics in his book, and no argumentation about the wisdom of being there in the first place. What we do get is the endless confusion of fighting a popular insurgency. And an insistence that even the survivors of such a conflict are permanently marked: "It's the close call you have to keep escaping from, the unending doubt that you have a right to your own life. It's the corruption suffered by everyone who lives on, that henceforth they must wonder at the reason, and probe its justice."

Over the next four years, the president will almost certainly order U.S. troops into battle. In its modest, personal, anti-rhetorical manner, this book reminds us of the price to be paid.

Claire Potter is a professor of history and American studies at Wesleyan University, and is also known as Tenured Radical. She contributes to the history blog Cliopatria.

My contribution to President Obama's reading list is Nancy Cott's Public Vows: A History of Marriage and the Nation (2000). While the history of marriage has been augmented considerably since this book appeared, with important volumes on the history of interracial marriage, the demand for gay marriage, and the fraught relationship between Christianity and marriage, all other scholars have relied, more or less, on Cott's argument that marriage is first and foremost a contract with the state.

It's not primarily a contract with another person – although it is that; it is not a contract with your local community – regardless of their approval and disapproval; and it is in no way a contract with any religious hierarchy – although it can be critical to the terms of inclusion in a religious community.

Marriage, President Obama, is about citizenship. You, along with nearly everyone who hedges his bets on gay marriage, reiterate that the most important fact about marriage is that it is between "one man and one woman." But that's not true. In the United States, as Cott shows, marriage has been primarily about the qualifications of a man "to be a participating member of a state."

While over time political authorities in the United States have allowed marriage to "bear the impress of the Christian religion," if marriage is a public institution at all, its function is to mirror the political community and to be the arm of the state that functions to "shape the gender order." In other words, Mr. President, the history of marriage is a political history, not a religious one; and it is a history of inclusion or exclusion from political power.

George Scialabba is the author of What Are Intellectuals Good For?, a collection of essays forthcoming from Pressed Wafer in March 2009. He was profiled in this column two years ago.

Dear Citizen Obama (I'm afraid the overly deferential "Mr. President" encourages the aggrandizement of the Executive Branch):

More than thirty years ago, your predecessor Jimmy Carter described America's tax system as "a national disgrace." Since then, it's gotten much, much worse. It is now so complex and irrational that only two groups of Americans understand it: tax lawyers and readers of David Cay Johnston, Pulitzer-Prize-winning New York Times reporter and author of Perfectly Legal: The Covert Campaign to Rig Our Tax System to Benefit the Super-Rich -- and Cheat Everybody Else. The abuses and evasions detailed in Perfectly Legal (and its companion volume, Free Lunch: How the Wealthiest Americans Enrich Themselves at Government Expense – and Stick You with the Bill) may raise your blood pressure dramatically. You should read them, but only under a doctor's supervision.

Continued tax avoidance at current staggering levels by the wealthy is your mortal enemy. Unless the tax code is drastically reformed -- and effectively enforced -- there will simply not be enough money to accomplish your goals. It will take courage, persistence, and all your celebrated rhetorical skills to vanquish this dragon in your path. But unless you do, your hopes will be thwarted and your administration will be no more than a ripple on the surface of American history.

Good luck and Godspeed.

James Mustich is editor of The Barnes & Noble Review.

Since I have more than once in the past few months mourned the unkind timing of Norman Mailer's death this year -- What might the author of one of our finest war novels have made of the trials of Senator McCain on the campaign trail? How would the instigatory commentator on so much of our nation's cultural, political, and existential foment make sense of the long and disciplined loneliness of Senator Obama? And, last but by no means least, how would an imagination precocious and peculiar enough to have set a novel called Why Are We in Vietnam? in Alaska have illuminated the passage of Sarah Palin through the national psyche? -- I'd recommend to the new chief executive Mailer's piece on the 1960 Democratic convention, "Superman Comes to the Supermarket."

Coming out of the exhaustions of electoral combat, I might even give him a pass and ask him only to read the first paragraph -- forgive me, Norman -- if he promised to spend some time thinking about it:

"For once let us try to think about a political convention without losing ourselves in housing projects of fact and issue. Politics has its virtues, all too many of them -- it would not rank with baseball as a topic of conversation if it did not satisfy a great many things -- but one can suspect that its secret appeal is close to nicotine. Smoking cigarettes insulates one from one's life, one does not feel as much, often happily so, and politics quarantines one from history; most of the people who nourish themselves in the political life are in the game not to make history but to be diverted from the history that is being made."

Jodi Dean is a professor of political science at Hobart and William Smith Colleges and author of Democracy and Other Neoliberal Fantasies, forthcoming from Duke University Press.

I would recommend that President Obama read Our American King by David Lozell Martin.

First, Obama is already familiar with Marxist, feminist, structuralist, and post-colonial theory from his days as a student at Harvard. So there is already some coverage here. Second, Obama has lots of advisors providing lots of advice on policy matters. Anything added here would end up just another item in the mix. Third, the new President faces so many enormous challenges that it is highly unlikely he'll have much time to devote to pondering a complex text, no matter how important.

So I recommend a novel published last year, bedside reading that will provide the new President with food for thought. It captures, I think, the fears of many of us for the future of democracy in a time of extreme inequality, the sense that our country is leaning heavily on the wrong side of a precipice.

Our American King depicts what remains of the United States after a great economic calamity: the top .1 percent of Americans have appropriated all the wealth and goods for themselves and left the rest of the country to fend for itself. As the super-rich live in heavily defended enclaves, the suburbs and cities descend into violence, starvation, and death. Social order collapses. The President and Vice President who oversaw the calamity, who presided over the great transfer of wealth from the many to the few, are hung upside down and backwards on the White House gates. The central drama of the novel involves the man who comes to power next. He is set up as a king, a uniter, the great hope of the people. Through him, they begin to work together, to imagine again the possibility of collective responsibility. The new king's authority draws from the people's fear and desperate longing for hope, a fear and a longing that, as Martin makes clear, may not always lead to the best outcomes.

My hope is that President Obama will read this book and recognize that people's longing for a leader, the One, is powerful but precisely because of that power should be redirected toward common projects, toward faith in each other and belief in equality, toward a renewed conviction that the conditions of the least well off--not the best--tell us who we are.

Richard Byrne is the editor of UMBC Magazine. His play Burn Your Bookes premiered last year in Prague. He blogs at Balkans via Bohemia.

As a playwright, I want the next president to read a play. Plays are perfect fodder for the chief executive-to-be: they are short, can be digested in one sitting, and offer the advantage of distilling larger currents of thought into character, dialogue and action. And such an opportunity should not be wasted on agit-prop (Bertolt Brecht, Clifford Odets) or classics that should already have been imbibed by the civilized soul. (So let’s shelve Henry V and Major Barbara for now.) The play should talk to the president about the human cost of tough times, the dignities and foibles of ordinary citizens, and the dire alternatives to forceful and human courses of action.

For such times, the German playwright Odon von Horvath is just the ticket. Before his tragic death on the cusp of World War II, Horvath offered a window on the brutalities of economic collapse and the roots of fascism in desperation and human folly. But which Horvath to select? Tales from the Vienna Woods is Horvath’s masterpiece, but I’d worry that its deep subtleties and epic canvas of pre-war Austria would confound a reader pressed for time. So I’d opt instead for Horvath’s tiny jewel of human desolation: Faith, Hope and Charity.

In a mere 52 pages, the play follows Elisabeth, an ordinary young woman down on her luck, as she is hounded to death by close encounters with unfeeling bureaucracy and casual cruelty. It is a succinct and powerful play with a simple lesson: if our political institutions are not suffused with the moral values of the play’s title, they can be perverted into engines of personal annihilation. It is a message the new president should consider as sweeping changes in government and its powers are proposed and enacted.

Scott McLemee

The Forgotten Virtue of Gratitude

It was a typical 1970s weekday evening. The sky was growing dark and I, an elementary school student, was sitting at the kitchen table of a modest North Jersey Cape Cod putting the finishing touches on the day’s homework. The back door opened -- a telltale sign that my father was home from work. As he did every day, Dad stopped in the laundry room to take off his muddied work boots. As usual, he was tired. He could have been covered with any number of substances, from dirt to paint to dried spackle. His hands were rough and gnarled. I kissed him hello, he went to the bathroom to “wash up,” and my family sat down to eat dinner.

I always knew how hard my father worked each day in his job as a general contractor. When I got older I spent summers working with him. I learned the virtues of this kind of working class life, but I also experienced the drudgery that came with laying concrete footings or loading a dumpster with refuse. I worked enough with my father to know that I did not want to do this for the rest of my life. Though he never told me so, I am sure that Dad probably didn't want that for me, either.

I eventually became only the second person in my extended family to receive a college degree. I went on to earn a Ph.D. (a “post-hole digger” to my relatives) in history and settled into an academic life. As I enter my post-tenure years, I am grateful for what I learned from my upbringing and for the academic vocation I now pursue. My gratitude inevitably stems from my life story. The lives that my parents and brothers (one is a general contractor and the other is a plumber) lead are daily reminders of my roots.

It is not easy being a college professor from a working-class family. Over the years I have had to explain the geographic mobility that comes with an academic life. I have had to invent creative ways to make my research understandable to aunts and uncles. My parents read my scholarly articles, but rarely finish them. My father is amazed that some semesters I go into the office only three days a week. As I write this I am coming off of my first sabbatical from teaching. My family never quite fathomed what I possibly did with so much time off. (My father made sense of it all by offering to help me remodel my home office, for which I am thankful!) “You have the life,” my brother tells me. How can I disagree with him?

Gratitude is a virtue that is hard to find in the modern academy, even at Thanksgiving time. In my field of American history, Thanksgiving provides an opportunity to set the record straight, usually in op-ed pieces, about what really happened in autumn 1621. (I know because I have done it myself!) Granted, as public intellectuals we do have a responsibility to debunk the popular myths that often pass for history, but I wonder why we can’t also use the holiday, as contrived and invented and nostalgic and misunderstood as it is, to stop and be grateful for the academic lives we get to lead.

Thanksgiving is as good a time as any to do this. We get a Thursday off from work to take a few moments to reflect on our lives. And since so many academics despise the shopping orgy known as “Black Friday,” the day following Thanksgiving presents a wonderful opportunity not only to reject consumer self-gratification, but also to practice a virtue that requires us to forget ourselves.

I am not sure why we are such an unthankful bunch. When we stop and think about it we enjoy a very good life. I can reference the usual perks of the job -- summer vacation, the freedom to make one’s own schedule, a relatively small amount of teaching (even those with the dreaded 4-4 load are in the classroom less than the average high school teacher). Though we complain about students, we often fail to remember that our teaching, when we do it well, makes a contribution to society that usually extends far beyond the dozens of people who have read our most recent monograph. And speaking of scholarship, academics get paid to spend a good portion of their time devoted to the world of ideas. No gnarled hands here.

Inside Higher Ed recently reported that seventy-eight percent of all American professors express “overall job satisfaction.” Yet we remain cranky. As Immanuel Kant put it, “ingratitude is the essence of vileness.” I cannot tell you how many times I have wandered into a colleague’s office to whine about all the work my college expects of me.

Most college and university professors live in a constant state of discontentment, looking for the fast track to a better job and making excuses as to why they have not landed one yet. Academia can be a cutthroat and shallow place to spend one’s life. We are too often judged by what is written on our conference name badges. We say things about people behind their backs that we would never say to their faces. We become masters of self-promotion. To exhibit gratefulness in this kind of a world is countercultural.

The practice of gratitude may not change our professional guilds, but it will certainly relieve us of our narcissism long enough to realize that all of us are dependent people. Our scholarship rests upon the work of those scholars that we hope to expand upon or dismantle. Our careers are made by the generosity of article and book referees, grant reviewers, search committees, and tenure committees. We can all name teachers and mentors who took the time to encourage us, offer advice, and write us letters. Gratitude may even do wonders for our mental health. Studies have shown that grateful people are usually less stressed, anxious, and depressed.

This Thanksgiving take some time to express gratitude. In a recent study the Harvard University sociologist Neil Gross concluded that more college and university professors believe in God than most academics ever realized. If this is true, then for some of us gratitude might come in the form of a prayer. For others it may be a handwritten note of appreciation to a senior scholar whom we normally contact only when we need a letter of recommendation. Or, as the semester closes, it might be a kind word to a student whose academic performance and earnest pursuit of the subject at hand has enriched our classroom or our intellectual life. Or perhaps a word of thanks to the secretary or assistant who makes our academic life a whole lot easier.

As the German theologian and Christian martyr Dietrich Bonhoeffer explained, “gratitude changes the pangs of memory into a tranquil joy.”

John Fea

John Fea teaches American history at Messiah College, in Grantham, Pa. He is the author of The Way of Improvement Leads Home: Philip Vickers Fithian and the Rural Enlightenment in America (University of Pennsylvania Press, 2008).


This past weekend, a comic playing Bill Clinton on Saturday Night Live told the world’s leaders not to pull anything on Hillary when she becomes Secretary of State. It's not even worth trying, he indicated, because she’ll see right through you. But he offered some reassuring advice on how to finesse things, if necessary. “The only words you’re gonna need when Hillary shows up: ‘I ... am ... sorry.’ It don’t work all the time, but it’s a good place to start.”

A friend recounted this skit to me when he saw the galleys of Susan Wise Bauer’s new book The Art of the Public Grovel: Sexual Sin and Public Confession in America (Princeton University Press). Its cover shows the former president in a posture of contrition: hands in front of his face, as if to pray; his eyes both wide and averted. But Bauer’s point is that effective public groveling requires a lot more than just assuming the position, let alone saying “I am sorry.”

There is (so her argument goes) a specific pattern for how a public figure must behave in order to save his hide when caught in a scandal. It is not sufficient to apologize for the pain, or offense to public sensibility, that one has caused. Still less will it do to list the motivating or extenuating circumstances of one’s actions. Full-scale confession is required, which involves recognizing and admitting the grievous nature of one’s deeds, accepting responsibility, and making a plea for forgiveness and asking for support (divine or communal, though preferably both).

The process corresponds to a general pattern that Bauer traces back to the Puritan conversion narratives of the 17th century. Confession started out as a way to deal with Calvinist anxieties over the precarious nature of any given believer’s status in the grand scheme of predestination. Revealing to fellow believers an awareness of the wickedness in one’s own life was, at very least, evidence of a profound change in heart, possibly signaling the work of God’s grace.

Secularized via pop psychology and mass media, public confession now serves a different function. In the 20th century, it became “a ceremonial laying down of power,” writes Bauer, “made so that followers can pick that power up and hand it back. American democratic expectations have woven themselves into the practice of public confession, converting it from a vertical act between God and a sinner into a primarily horizontal act, one intended to re-balance the relationship between leaders and their followers. We both idolize and hate our leaders; we need and resent them; we want to submit, but only once we are reassured that the person to whom we submit is no better than we are. Beyond the demand that leaders publicly confess their sins is our fear that we will be overwhelmed by their power.”

Leaders who follow the pattern may recover from embarrassing revelations about their behavior. Major examples of this that Bauer considers are Jimmy Swaggart (with his hobby of photographing prostitutes in hotel rooms) and Bill Clinton (intern, humidor, etc.) Because they understood and accepted the protocol for a “ceremonial laying down of power” through confession, they were absolved and returned to their positions of authority.

By contrast, public figures who neglect the proper mode of groveling will suffer a loss of support. Thus Edward Kennedy’s evasive account of what happened at Chappaquiddick cost him a shot at the presidency. The empire of televangelist Jim Bakker collapsed when he claimed that he was entrapped into extramarital canoodling. And Bernard Cardinal Law, the bishop overseeing the Catholic community in Boston, declined to accept personal responsibility for assigning known pedophile priests to positions where they had access to children. Cardinal Law did eventually grovel a bit – more or less along the lines Bauer suggests – but only after first blaming the scandal on the Boston Globe, his own predecessors, and earlier church policy. The pope accepted his resignation six years ago.

It’s one thing to suspect that a set of deep continuities exist between evangelical religion, group psychotherapy, and “performances of self” in an age of mass media. Many of us found ourselves positing this quite often during the late 1990s, usually while yelling at the TV news.

But it’s a much tougher prospect to establish that such continuities really exist – or that they add up to an ethos that is accepted by something called “the American public” (a diverse and argumentative conglomeration, if ever there were one). At the very least, it seems necessary to look at how scandals unfold in nations shaped by a different religious matrix. Bauer doesn’t make such comparisons, unfortunately. And her case studies of American scandals don’t always clinch the argument nearly so well as it may appear.

The discussions of Jim Bakker and Bill Clinton form a center of gravity for the whole book. The chapters on them are of almost equal length. (This may testify less to the historical significance of Jim Bakker’s troubles than to their very considerable entertainment value.) And in keeping with Bauer’s analysis, the men’s responses to embarrassment form a neat contrast in approaches to the demand for confession.

Having been exposed for using church funds to pay blackmail to cover up an affair with a church secretary, Bakker has always presented himself as more sinned against than sinning – the victim of a wicked conspiracy by jealous rivals. In other words, he never performed the sort of confession prescribed by the cultural norms that Bauer identifies. He never handed over his power through suitable groveling, and so his followers punished him.

“Refusing to confess, unable to show his one-ness with his followers,” she writes, “Bakker remains unable to return to ministry.” Which is inaccurate, actually. He has been televangelizing for the past five years, albeit on a less grandiose scale than was once his wont. Bakker’s inability to reclaim his earlier power may have something to do with his failure to follow the rules for confessing his sins and begging forgiveness. But he still owes the IRS several million dollars, which would be something of a distraction.

Bakker’s claims to have been lured into immorality and disgrace are self-serving, of course. Yet Bauer’s account makes clear that his competitors in the broadcast-holiness business wasted little time in turning on him – the better to shore up their own reputational capital and customer base, perhaps. The critical reader may suspect that Bakker’s eclipse had more to do with economics than with the reverend's failures of rhetorical efficacy.

Former president Clinton, by contrast, is rhetorical efficacy incarnate. Bauer’s chapter on l’affaire Lewinsky attributes his survival to having met the demand for confession.

Of course, he did not exactly make haste to do so. Bauer includes a set of appendices reprinting pertinent statements by the various figures she discusses. The section on Clinton is the longest of any of them. More than a third of the material consists of deceptive statements and lawyerly evasions. But the tireless investigative pornographers of the Starr Commission eventually cornered the president and left him with no choice. “In Bill Clinton’s America,” writes Bauer, “the intersection of Protestant practice, therapeutic technique, and talk-show ethos was fully complete. In order to survive, he had to confess.”

He pulled out all the stops – quoting from the Bible on having a “broken spirit,” as well as a Yom Kippur liturgy on the need to turn “from callousness to sensitivity, from pettiness to purpose” (and so forth). It worked. “Against all odds,” writes Bauer, “his confessions managed to convince a significant segment of the American public that he was neither a predator nor an evildoer, and that he was fighting the good fight against evil. Most amazingly, this white, male lawyer, this Rhodes Scholar, who held the highest elected office in the land, persuaded his followers that he was just like the country’s poorest and most oppressed.”

That is one way to understand how things unfolded ten years ago. According to Bauer's schema, Clinton underwent a “ceremonial laying down of power,” only to have it handed back with interest. No doubt that description accounts for some people’s experience of the events. But plenty of others found the whole thing to be sordid, cynical, and cheesy as hell – with the confession as less a process that strengthened social bonds than a moment of relief, when it seemed like the soap opera might end.

So it did, eventually. But there will always be another one, perhaps involving some politician we've never heard of before. That is why The Art of the Public Grovel ought to be kept in stock at Trover’s, the bookshop on Capitol Hill, from now on. While not entirely persuasive in its overall analysis, it might still have non-scholarly applications.

Scott McLemee

