History

Disciplinary Associations Should Start Treating Job Seekers With Respect

As has become the annual tradition, the American Historical Association is out with its report lauding the health of the academic job market in history. The report, culled exclusively from job listings in Perspectives (an AHA publication) and Ph.D. completion statistics reported by history departments, shows that there are more available positions than there are historians produced. Other disciplines issue similar reports. While the AHA report may be viewed favorably by some, such as scholars in Asian history, the most underpopulated field for historians, for others it reflects a general lack of concern from the association for the untenured and the graduate student. And the problems discussed here apply to many other disciplines as well.

As a national organization and the most powerful entity in the historical job market, the AHA has done surprisingly little to help the newest members of its profession. On the whole, historians pride themselves on their concern for social justice. In 2005, for example, the Organization of American Historians uprooted its annual conference and moved it to another city in a show of solidarity with hotel workers. When it comes to the plight of the discipline’s own working class, the unemployed job seeker, this compassion and concern are absent. In its place is an annual report from the AHA talking about how good things are for some. For others, there isn’t much the AHA can do. I find this lack of action, especially when compared to what is normally shown for the less fortunate, disheartening.

While the AHA can do nothing to overcome the dearth of tenure-track positions (a reality that deans, trustees, and legislators control), the association has a great deal of control over two things: job market statistics and the interview process. These areas, which some might say are of secondary concern, have made the job market a very inhospitable place. For one, the association could conduct a statistically sound study of the job market based on an actual survey of departments and job seekers. Drawing attention to the total number of jobs and the number of Ph.D.’s produced in the past year overlooks the fact that visiting faculty and independent scholars are also on the market. A more thorough census would provide better information to AHA members and possibly even a snapshot of many other employment concerns, including how the positions stack up in terms of pay, tenure-track status, and other key factors.

More importantly, the organization could do a number of things to reform the poorly designed hiring process that leaves applicants floating in a limbo of uncertainty throughout much of November and December. The lack of communication between search committees and job seekers is so common that it is now taken for granted along with death and taxes. Job applicants no longer expect any professional courtesy. While this results in a good bit of anxiety for anyone on the market, it can also lead to undue financial hardships that could easily be avoided. As a former editor of the H-Grad listserv and someone currently searching for a tenure-track position, I can safely say that these concerns press on the minds of most applicants.

The key to these job market reforms is the AHA itself. As the group with the greatest stake in the hiring process, it has done little to actively rectify some of the more egregious problems with the job market. I have compiled a short list of changes that could be adopted with one vote at a business meeting. And most of these changes would benefit not just the AHA, but other disciplinary associations, especially in the humanities, where good, tenure-track jobs are not widely available. While this will absolutely not correct the disparity between job seekers and open positions, it will go a long way toward making the process more fair and equitable.

1. Take a more accurate census of the job-seeking population annually. There is a glut of history Ph.D.'s. Everyone knows this. Yet for the past three years, the AHA has been trumpeting the idea that the job market is improving based solely on data that have no correlation with the actual situation. The AHA, like other associations, bases its data on job applicants solely on the number of new Ph.D.’s, ignoring the fact that so many of the past few years’ new doctorates remain either unemployed or in temporary positions, off the tenure track and with low pay and benefits. By counting only the new Ph.D.’s, the figures for job seekers come out significantly lower than they should be. The research produced by the AHA needs to be more accurate in order to guide job applicants and graduate students about their chances of finding a position. Since candidates who use the AHA Job Register at the annual meeting have to be registered as meeting attendees, the AHA should include a census form with the conference registration. Questions such as "Are you a job seeker?," "What is your area of specialty?," "Have you ever had a tenure-track position?" and “How many years have you been on the job market?” would give a more accurate picture of just how dire the job market is. A follow-up survey every April would round out the study and enable applicants to assess their position in the market for the following fall. Job seekers could then make career choices based on tangible facts, rather than hearsay and propaganda.

2. Make the Job Register service a privilege that has to be earned. The AHA has a good deal of influence on the job market but has yet to use it in any significant way. Since most tenure-track positions are advertised in the AHA Perspectives and interviews are conducted at the AHA annual meeting, the AHA should mandate certain conditions that must be met before interviewing and advertising space is sold. If those conditions are not met, the AHA should deny departments the right to use its facilities and its ad space, thus adding substantial cost to the interviewing institutions. University HR departments and academic deans, often cited as the reason search committees are unable to communicate with applicants, would either allow the departments to comply with these provisions or foot the bill for a more expensive interview process. Lack of communication and the posting of identical positions without a hire for three or more years are two of the problems that stand out at the moment, but the conditions could be expanded in future years to address new situations as the AHA sees fit.

3. Require that search committees inform applicants of their interview status via e-mail 30 days before the annual meeting. Graduate students, visiting lecturers, and independent scholars are, on the whole, not independently wealthy. Traveling across the country to stay at an upscale hotel in a major city just after the holiday season is a lot to ask, especially if a candidate has no interviews. Applicants, though, are at the mercy of the search committees, some of whom notify interviewees a week or less before the annual meeting. Applicants are forced either to keep their rooms and plane tickets past the cancellation date in the hope that their phone will ring or to pay higher airfares and hotel rates for last-minute bookings. Letting candidates know their interview status a month in advance would alleviate that situation and prevent the least paid members of the profession from shouldering the heaviest travel costs. The AHA should set guidelines requiring search committees to let all job applicants know, 30 days before the annual meeting, whether they will have an interview at the AHA, or face Job Register sanctions, up to suspension from its benefits for a period determined by the AHA.

4. Establish a general listserv for search committees and job seekers. Search committees are notorious for their lack of communication. Job seekers have pooled their resources into a number of academic career wikis, but these can be misused and depend on the truthfulness of the poster. The AHA could alleviate this uncertainty by creating a listserv and mandating that those who use the Job Register agree to notify the AHA by e-mail at important phases of the job search process. Which steps those are would be open for negotiation, but everyone, committees and candidates alike, would know the benchmarks ahead of time. The AHA, and this is the critical step, would aggregate these notifications and send them out via a daily listserv to all job applicants who choose to subscribe. Under this system, for example, all who applied for the position in Pre-Modern China at Boise Valley State could know that the search committee has extended AHA interview invitations, has made invitations for on-campus interviews, or that Dr. Damon Berryhill has accepted the position. Job applicants, who usually have no idea how searches are progressing, would be better informed when fielding other offers and would no longer need to contact each institution directly for updates. Participation would also be in the hiring institution’s best interest, as it would reduce the need to communicate one on one with job candidates (a very time-consuming task for search committee members) while still creating a much more open system of communication for job seekers.

It is frustrating to me when scholars who have spent years examining the forces of reform and progress take no action to better the lives of their fellow historians. Individuals who have studied the great reformers and crusaders of the past simply throw up their hands and exclaim “it’s just the market!” when confronted with horror stories of graduate students and visiting faculty on the hunt for tenure-track jobs. These people do nothing, as if, by some sort of divine incantation, the injustices of the hiring process were set in stone and beyond human control. This is the attitude that needs to change the most.

It is worth mentioning that graduate students make up the largest constituency group of the AHA membership. As the H-Grad listserv and the academic careers wiki continue to gain popularity, it will not be much longer before job seekers figure out how to organize themselves and make their voices heard one way or another. One anonymous poster on the job wiki for American historians has already suggested that all job seekers flood this year’s business meeting and vote no on every provision until the AHA takes up job market reform. The leadership of the AHA should adopt these reforms, or at the very least make a reasonable effort to study them, in order to make the job market a more tolerable place for the profession’s newest members and to take the first steps toward a more equitable and open hiring process.

Author/s: 
Michael Bowen
Author's email: 
info@insidehighered.com

Michael Bowen is assistant director of the Bob Graham Center for Public Service and a visiting lecturer in the history department at the University of Florida.

Blue Skies Ahead?

Hurricane Katrina is a disaster so recent that it is, in a sense, still underway. It isn’t just that large parts of New Orleans are in ruins after two and a half years, or that many of its residents are gone for good. According to one survey, something like a third of the population now living in the city intends to leave. The Journal of American History has just published a special issue, “Through the Eye of Katrina: The Past as Prologue?” that presents 20 papers covering New Orleans’s unique place in the American social and cultural landscape. It is also a reminder of just how much of the city’s historical ambiance is now well and truly in the past, lost for good.

The issue is available online, in an edition that includes both audio slideshows and a set of maps and timelines. Rather than try to describe everything available in “Through the Eye of Katrina,” I’d like to take a short look here at two papers from it in particular. They frame the events of August 2005 as part of an unfinished story.

The final paper, “What Does American History Tell Us About Katrina and Vice Versa?” is by Lawrence N. Powell – a professor of history at Tulane University and one of the issue’s editors. That seems an odd place for it to appear. The paper feels very much like an introduction, and it can be recommended as a good place for the reader to start.

My impression is that Powell would like very much to believe that something like Arthur Schlesinger, Jr.’s argument in The Cycles of American History (1986) is true. The idea is that our political life and national temper go through more or less regular swings of the pendulum, whether between liberalism and conservatism, or between public-mindedness and an emphasis on private interests.

This idea has been around for at least 100 years. The first widely circulated version was presented by Henry Adams’s somewhat eccentric younger brother Brooks in the 1890s. Another cyclical theory was put forward by Arthur Schlesinger Sr., who predicted from it that a long period of conservative dominance would begin in the United States around 1978. Given that he died one year after Barry Goldwater’s run for president in 1964, this seems unusually farsighted. The model advanced in the mid-1980s by Junior (who passed away in 2007) held that U.S. history tends to go through cycles lasting 30 years each – with 15 years of expansive public-mindedness being followed by 15 of conservative retrenchment.

Powell doesn’t endorse any particular timetable in his paper, which is probably for the best. (Historians should avoid poaching on the professional expertise of astrologers.) But he does suggest that Katrina might yet prove to be a “detonating event” in setting off whatever seismic changes in public life are now underway.

Assuming, that is, that any such shifts really are underway. “American politics have been in a conservative phase for so long,” writes Powell, “thirty-five years and counting, that it is hard to imagine an abrupt shift in the national Zeitgeist happening anytime soon.” He notes that the spirit of privatized effort “has been hard to miss in the Katrina recovery” – with the preferred mode of public assistance taking the form of tax credits for developers and “multimillion-dollar debris-removal contracts that were sole-sourced to politically connected corporations.”

But however locked into trickle-down thinking the current recovery efforts have been, the experience of Katrina has raised concerns about the state of the country’s infrastructure and emergency-preparedness. That may, in turn, require national action at the level of policy and budget – no matter what the talk-radio Zeitgeist says about it.

"We have probably gone too far down the road of states’ rights federalism,” writes Powell, “for some sort of New Deal–type agency such as the Tennessee Valley Authority to arise any time soon. The need for such a regional entity, with the capacity to coordinate relief and recovery activity across multiple levels of government and between public and private actors, has never been more apparent than now. All the same, the pendulum does seem to be swinging back toward the public sphere. When conservative southern Republican politicians start intervening in the insurance market or call for regulating outsourcing (as the governor of Florida has recently done), it is a sign the country is overdue for a conversation about the role of government.”

Geographers make an important distinction between “site” and “situation,” as Ari Kelman points out in his paper “Boundary Issues: Clarifying New Orleans’s Murky Edges.” And in the history of the city, it has been a difference that’s made a difference. “Site” refers to the concrete geographical specifics of where an urban area is located. By contrast, “situation” is a matter of human attitude and need – what advantages it offers, relative to other places.

Site and situation are related, of course. But they can be at odds – and in New Orleans, are they ever. Kelman notes its “near-perfect situation” as a city built “on the east bank of the continent’s greatest river, near its outlet, in an era before technologies began circumventing the vagaries of geography.... Just downstream from the city lay the Gulf of Mexico, which provided access to the Atlantic world of trade.” But the site is near-perfect in its potential for disaster: a city built on sediment, with the riverfront as its highest point of elevation, with most of it below sea level and no natural drainage.

Kelman, who is an associate professor of history at the University of California at Davis, describes three centuries of attempts by the city to compensate for its “site” problems through engineering – a series of efforts that have had complicated implications for its economy and social structure. It’s a valuable narrative that helps to place the events of August 2005 in a long-term perspective.

But when Kelman reaches the point of addressing the post-Katrina future, something puzzling happens. In the final lines of the paper, he writes: “It now appears New Orleans will try again to engineer itself out of harm’s way, once more attempting to improve its levees and drainage system. The various committees seem captivated by the notion that it is possible to separate the city from its surroundings, a myth that will not die, no matter how many of New Orleans’s residents do.”

This left me scratching my head. Just what alternative is there to improving the levees and drainage system? Is the implication that site has so trumped situation that repopulating New Orleans is a bad idea? While Powell’s article expresses ambivalence about prospects for the future, I found Kelman’s remarks inscrutable – so decided to ask him for clarification.

It turns out that confusion is a reasonable response, because Kelman himself intended his final remarks to be ambiguous. He has been studying the city’s history and its reconstruction efforts and has written about Katrina for The Nation and The Christian Science Monitor. He isn't against either hydraulic engineering or repopulating the city, as such. But at this point, he sees no cut-and-dried policy option that recommends itself as a real solution to the city's current problems.

"There’s no good solution to what ails New Orleans," he told me by email. "That was true before Katrina; it’s true now. Higher levees and better drainage conjure the illusion of safety without offering any real guarantees that people will be safe, particularly if the next storm comes soon, if the money runs out (a sure thing), or if global warming isn’t a hoax. The illusion of safety, then, makes it easy for people to live in harm’s way without reckoning with the danger. But, on the other hand, making people leave their communities heaps social insult atop environmental injury – not that I see the social and the environmental as being easily separable.”

So what is to be done?

"I have no idea," he writes. "Better leadership – a coordinated effort between federal, state, and local authorities, with real input from the grassroots, like Common Ground and others – would have been nice. But Katrina had the bad manners to come calling during the Bush years.... So we suffered through our first libertarian catastrophe. And we seem not to have learned much from it. That’s pretty grim, I know.... But we need to do more for the city. Maybe we will after this coming November."

Well, OK. Keep hope alive, etc. But in the final analysis it seems as if both Powell and Kelman are counting on the idea that a swing of the pendulum must be coming – and that when it does, things will start to change. Whole campaigns are being run on the principle that revitalized optimism can fundamentally transform the American situation.

Perhaps it can. But if Katrina taught us nothing else, it proved that situation isn’t everything. Anyone who wants lasting change needs to think about engineering some improvements to the site. That won't come easily, and it takes a lot of optimism to think that will happen at all.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

The Whole World Was Watching

"Chicago 10," which opened in theaters a few days ago, is one of the most exciting movies ever made about any aspect of the 1960s. It is also among the most frustrating; for it turns out that excitement is only just so much of a virtue in a documentary film. Sensation minus context is desensitizing. And in a number of ways "Chicago 10" marks an almost complete triumph of visual intensity over historical memory.

Yet it is also the product of some impressive digging in the video archives. Much of the footage in it was shot by television crews roaming the streets during the Democratic National Convention in August 1968. Thousands of people had gathered in Chicago to protest the Vietnam war, by any means necessary; and thousands of cops marched in formation to prevent them from doing so, also by any means necessary. Guns were not actually fired in the process, which is something of a miracle. In one harrowing scene, a group of demonstrators taunts the police, challenging them to shoot.

You see Walter Cronkite on the evening news (a sober and avuncular anchorman, like nobody now in broadcasting: the living voice of the middle of the American road) comparing Mayor Daley’s city to a police state. There is a shot of a placard in the streets of Chicago saying “Welcome to Prague.” For this was one of those moments in history when the whole world was not just watching but, it seemed, performing from a common script. At the same time as Chicago was turning into an armed camp, Soviet troops were busy putting down Czechoslovakia’s experiment in reforming its own regime.

The confrontation in Chicago between the protesters and the forces of law and order is chronicled, day by day; and the tensions build up to a frenzy when the police go on a rampage. One unnamed and seemingly apolitical Chicagoan describes sitting in a bar, minding his own business, when the cops stormed in, making everybody leave under the threat of being pounded senseless. Anyone with long hair did not get a choice in the matter.

The street scenes are intercut with animated sequences based on transcripts of the government’s prosecution of those it accused of organizing the demonstrations. This is not the first time the court case has been put on screen (both the BBC and HBO have done so in previous decades) nor will it be the last, since a film called "The Trial of the Chicago 7" is in production this year. That title is something of a misnomer, since the Black Panther Party chairman Bobby Seale was the eighth defendant until his case was severed from that of his alleged co-conspirators. In "Chicago 10," the figure is bumped up to include the two lawyers who represented Abbie Hoffman, Bobby Seale, et al. -- since they, like the defendants, were cited for contempt of court.

As hyperkinetic eye candy, “Chicago 10" is as good as it gets. A soundtrack with Eminem rapping his fantasy of an anti-Bush insurrection is, I suppose, one way to make history come alive. Likewise with making the courtroom antics of the revolutionaries look like something out of a video game. But the mash-up between archival footage and music-video aesthetics has the effect of stripping the events of any sort of historical context. In making their work as up-to-date as possible in style and tone, the moviemakers seem never to have asked whether they might also be doing a disservice to the past.

All the fast cuts and visual tricks here might be justified by reference to the presumed demands of Today’s Youth, with their supersaturated yet shrinking attention spans. But if kids born in the 1990s really are the intended audience, why give that quick shot of the sign reading “Welcome to Prague” without any explanation for it? (My apologies, of course, if it turns out that Today’s Youth are completely up to speed on postwar Eastern European history.)

How is it that the film never mentions that student protests in France a few months earlier led to a general strike that almost brought down the government there? The “May events” in Paris were still on many people’s minds as the summer wound down; they help to make sense of what might otherwise look like plain craziness, at times, in the streets of Chicago. But no hint of the outside world ever breaks into any frame of the film. It revisits the past in an almost isolationist, if not solipsistic way -- quite as Tom Brokaw did in a recent TV program that treated the year 1968 as if it had unfolded almost entirely within the United States. (The convulsions in Paris, Prague, and Peking that year were dispatched in about two minutes.)

In the case of "Chicago 10," the perspective is shallow as well as narrow. Events are not simply yanked out of the past and detached from their contemporary global significance.They are shown without concern for long-term causes or effects. Incidents and images are presented without any reference at all to a larger narrative in which they might have some meaning. No effort is made to discuss the effects of the Chicago protests and the conspiracy trial in American politics. And that really takes some doing.

When we talk about the “culture war” now, the expression is usually just a very tired metaphor. But what happened outside the Democratic convention was an early battle in it, and a very literal one.

The turmoil gave many people a sense that the whole country was hurtling towards a much greater showdown. That prospect has dimmed for the protesters who marched in the streets, then, but it never really did for the “silent majority,” as the winner of the presidential campaign later that year put it.

In his book Chicago ’68 -- first published 20 years ago by the University of Chicago Press, which is now reissuing it -- David Farber, now a professor of history at Temple University, quotes a position paper that Richard Nixon wrote as a candidate: “The first right of every American, to be free from domestic violence, has become the forgotten civil right of the American people.”

Obviously Nixon did not mean freedom from having your head massaged by a policeman’s billy club. The Republican candidate’s complaint was that the government was abdicating its responsibility to protect the individual’s right to be left in peace. “Instead,” writes Farber in his paraphrase of Nixon’s argument, “the state has pledged itself to a policy of inclusion, a policy that insists that the state has the right to intrude in local affairs and order private citizens to accept the rights of other citizens -- the blacks, the Latinos, the poor, the protestors -- to intrude on their privacy. Such a policy, Nixon is implying, naturally leads to a situation in which certain citizens would intrude violently into other people’s lives, marching and sitting in and taking over streets and even burning and destroying private property.”

The upheaval in Chicago consolidated that feeling. But it also added something else -- an element still lingering in the mix of resentments that fuels so much of American political culture. “Chicago 10" exists because there were so many TV cameras in the streets. And it’s clear that the police gave members of the press extra special attention -- beating them with the gusto they would otherwise have reserved for, say, student radicals carrying the Vietcong flag. Sympathy for the police and contempt for the news media were, for the “silent majority,” two sides of a common rage.

“Both stem from a mistrust of disembodied authority,” writes Farber. “Both feelings come from a suspicion that some outside, elite power has taken control of what should be commonsensical and local.”

A better movie would have found some way to portray that suspicion, and to address how much of it is still in the air, four decades later -- a legacy that has outlasted any dream in the streets that year.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

The Soiling of Old Glory

It was three months before the Bicentennial and a group of high school students in Boston were saying the Pledge of Allegiance. One of them held a large American flag. But this was not the commonplace ritual of citizenship that it might sound. The teenagers, all of them white, were just as swept up as their parents in the protests over court-ordered desegregation of the Boston public schools; and much of the rhetoric swirling around the anti-busing movement appealed to the old patriotic tropes of resistance to tyranny, defense of the rights of the citizen, and so on. The kids who milled around in front of City Hall in Boston were enjoying their chance to share in the Spirit of ‘76 while also skipping class on a Monday morning.

The people in the anti-busing movement were not, they often insisted, racists. It so happened that a young African-American lawyer named Ted Landsmark had a meeting at City Hall that morning to discuss minority hiring in construction jobs. He turned the corner and walked into a scene that would be recorded for posterity by Stanley Forman, a photojournalist for The Boston Herald American. In a picture that won Forman his second Pulitzer Prize in as many years, we see Landsmark in the right-hand half of the image. Dressed in a three-piece suit, he is the only black person in the crowd. But what dominates the scene is the teenager who had been holding the American flag a little earlier, during the Pledge, and now wields it as a weapon, seeming to drive it like a lance into Landsmark’s body. Forman later titled his photograph “The Soiling of Old Glory.”

Frozen in mid-action, the image is brutal. But what makes it especially so is the expression on the kid’s face – a look of pure hatred and rage, his teeth showing, his upper lip curled in what seems to be (according to some research in affect theory) the universal physical manifestation of disgust. Other people in the crowd look on with what seems to be interest or even pleasure. It appears that nobody is ready to help Landsmark. A man standing just behind the lawyer seems to be holding him, so that the kid with the flagstaff can get a clear shot.

But according to Louis P. Masur in The Soiling of Old Glory: The Story of a Photograph that Shocked America, just published by Bloomsbury Press, the picture is misleading in that regard. Masur -- a professor of American institutions and values at Trinity College, in Connecticut -- analyzed the other images the photographer shot that day and finds a different story unfolding.

“The man who, in a previous image, is in motion racing to the scene has arrived and is grabbing Landsmark,” writes Masur. “It would appear that he has joined the fray to get in his punches and, worse yet, is pinioning Landsmark’s arms so that the flag bearer has a clear line of attack. In fact, the person holding Landsmark is Jim Kelly, one of the adult organizers of the protest, and he has raced in not to bind Landsmark but to save him from further violence.... He dashed in to try to break up the fight. In another photograph taken a moment later, he can be seen holding his arms out wide trying to keep the protesters back as Landsmark stumbles to safety.”

Knowing this, writes Masur, “changes our understanding of the photograph.” To a degree, perhaps, yes. But no degree of recontextualizing can gainsay the interpretation of the scene offered by the victim of the assault. “I couldn’t put my Yale degree in front of me to protect myself,” Landsmark told a newspaper reporter a few days after the attack. “The thing that is most troubling is that it happened not because I was somebody but because I was anybody....I was just a nigger they were trying to kill.”

The image is iconic. It does not simply reproduce an event; it crystallizes something out of life itself.

“The camera freezes time,” as Masur writes, “giving us always a moment, a fraction of a narrative that stretches before and after the isolated instant.” The Soiling of Old Glory reconstructs some of that narrative – drawing for the most part on published sources, especially the account of the Boston busing crisis given by the great American journalist Anthony Lukas in his book Common Ground: A Turbulent Decade in the Lives of Three American Families (1985).

Protesters often insisted that they hated busing, not African-Americans -- but as someone put it at the time, nobody went out to beat up a school bus. Masur treats the photograph itself as a turning point in the crisis. “However strenuously the anti-busing movement emphasized issues other than race,” he writes, “the photograph shattered the protesters’ claim that racism did not animate their cause and that they were patriotic Americans fighting for their liberties. The photograph had seared itself into the collective memory of the city and installed itself in the imagination of both blacks and whites.”

Masur understands the photograph as leading, “at first, to turmoil and self-scrutiny, and later, to progress and healing.” Such a perspective is bound to be appealing to many people, especially given the desire now to imagine a “post-racial” America. In any case, the photograph itself certainly sticks in one’s memory – and not just as a document of a particular conflict. An analyst must try to account for something of its power; and this task is not made any easier by scholars' relative neglect of photojournalism itself as a topic for critical study.

Unfortunately, in the course of meditating upon the image, Masur sometimes exhibits rather serious failures of what has been called “hermeneutic tact.” This is a feel for the limits of interpretation: a sense that ingenuity might, beyond a certain point, involve a kind of violation of what the process will bear. Hermeneutic tact, like the social kind, is impossible to codify. But you know when someone has violated it, because you wince.

At one point, the author claims that the mob member wielding the flag like a spear “suggests the stabbing of Christ in the side by the Roman soldier Longinus, who afterward was converted to Christianity and was canonized.” No, it doesn’t – not even if you squint really, really hard. Here, free association leads to something akin to a context salad. Likewise with some musings about the title Forman gave his photograph: “The verb soiling means defiling or staining,” writes Masur. “But with the root soil, it also suggests planting. Flags are thrust into the ground as statements of control, whether by explorers in the New World or by American astronauts on the moon. In an extreme act of desecration and possession, the protestor, it seems, is trying to implant the flag into the black man and claim ownership.”

We now have a precise technical term for this sort of thing, thanks to the efforts of Harry Frankfurt. A good editor would have found a way to remove such passages, or at least buried them in the quiet graveyard of the book’s apparatus. They are distracting yet take nothing away from the lasting power of the image itself. Looking at it, I felt the urge to reread Frederick Douglass’s “The Meaning of July 4th for the Negro” – a speech from 1852 that Masur, oddly enough, never cites, though it is hard to think of a more pointed and fitting account of how certain beloved symbols may serve as instruments of oppression. More was happening within the frame of Stanley Forman's action shot than any single analysis can quite exhaust.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

All for Nought?

We are coming to the end of a decade with no name. As the 1990s wound down, there was a little head scratching over what to call the approaching span of 10 years, but no answer seemed obvious, and no consensus ever formed. A few possibilities were ventured -- for example, “the Noughties” -- but come the turn of the new millennium, they proved about as appealing and marketable as a Y2K survival kit. Before you knew it, we were deep into the present decade without any commonly accepted way of referring to it.

The lack seems more conspicuous now. We are in the penultimate year of the '00s (however you want to say it) and much of the urgent rhetoric about “change” in the presidential campaigns implies a demand for some quick way to refer to the era that will soon, presumably, be left behind.

It has become second nature to periodize history by decades, as if each possessed certain qualities, amounting almost to a distinct personality. To some degree this tendency was already emerging as part of the public conversation in the 19th century (with the 1840s in particular leaving a strong impression as an era of hard-hitting social criticism and quasi-proto-hippie experimentation) but it really caught on after the First World War. In The True and Only Heaven: Progress and Its Critics (Norton, 1991), the late Christopher Lasch wrote: “This way of thinking about the past had the effect of reducing history to fluctuations in public taste, to a progression of cultural fashions in which the daring advances achieved by one generation become the accepted norms of the next, only to be discarded in turn by a new set of styles.”

Thus we have come to speak of the Sixties as an era of rebellion and experimentation, much of it communal. And you could read books by 1840s guys like Marx, Kierkegaard, and Feuerbach in cheap paperbacks, if you weren’t too stoned. In the Seventies, the experimentation continued, but in a more privatized way (“finding myself”) and with a throbbing disco soundtrack. By the Eighties, greed was good; so were MTV, pastel, and mobile phones as big as your head. In the Nineties, it seemed for a little while as if maybe greed were bad. But then all the teenage Internet millionaires started ruling over the end of history from their laptops while listening to indie rock and wearing vintage clothes from the Sixties, Seventies, and Eighties.

Of course there are alternative points of emphasis for each decade. This kind of encapsulated history is not exactly nuance-friendly. But it’s no accident that bits of popular culture and lifestyle have become the default signifiers summing up each period of the recent past. “The concept of the decade,” wrote Lasch, “may have commended itself, as the basic unit of historical time, for the same reason the annual model change commended itself to Detroit: it was guaranteed not to last. Every ten years it had to be traded in for a new model, and this rapid turnover gave employment to scholars and journalists specializing in the detection and analysis of modern trends.”

Well, we do what we can. But it seems as if the effort has failed miserably over the past few years. The detectives and analysts have gone AWOL. There is no brand name for the decade itself, nor a set of clichés to clinch its inner essence.

While discussing the matter recently with Aaron Swartz, a programmer now working on the Open Library initiative, I found myself at least half agreeing with his impression of the situation. “This decade seems Zeitgeist-free,” he said. “It’s as if the Nineties never ended and we’re just continuing it without adding anything new.”

But then I remembered that my friend is all of 21 years old, meaning that roughly half his life so far was spent in the 1990s. Which could, in spite of Aaron’s brilliance, somewhat limit the ability to generalize. In that regard, being middle aged offers some small advantage. My own archive of memory includes at least a few pre-literate recollections of the 1960s (the assassinations of 1968 interrupted "Captain Kangaroo") and my impression is that each new decade since then has gone through a phase of feeling like the continuation of (even a hangover from) the one that went before.

It usually takes the Zeitgeist a while to find a new t-shirt it prefers. On the other hand, I also suspect that Aaron is on to something -- because it sure seems like the Zeitgeist is having an unusually hard time settling down, this time around.

In the next column, I’ll sketch out a few ideas about what might, with hindsight, turn out to have been the distinguishing characteristics of the present decade. Perhaps the fact that we still don’t have a name for it is not an oversight, or a bit of bad semantic luck. According to Lasch, decade-speak is a way to understand the past as a story of progress involving the rise and fall of cultural styles and niches. If so, then we may have turned a corner somewhere along the way. The relationship between progress, nostalgia, and the cultural market may have changed in ways that make it harder to come up with a plausible general characterization of the way we live now.

More on that in a week. Meanwhile: What do you call the current period? Does it have its own distinct “structure of feeling”? When we fit it into our thumbnail histories of the recent past, how are we likely to understand the spirit of the age?

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

The 2.0 Decade?

Last week Intellectual Affairs discussed the effects of an irresistible force on an immovable object. The force in question is our habit of referring to each recent decade as if it had a distinct quality or even personality: the '50s as an era of straightlaced conformity, for example, or the '70s as (in Tom Wolfe’s phrase) “the Me Decade.” This tendency has dubious effects. It flattens the complexity of historical reality into clichés. It manifests a sort of condescension to yesteryear, even. But decade-speak is a basic element in ordinary conversation -- and the habit is so well-established as to seem, again, irresistible.

If so, the past several years have been a period that won’t budge, if only because we lack a convenient and conventional way to refer to it. Expressions such as “the Aughties” or “the Noughties” are silly and unappealing. I ended the last column by asking what readers were calling this decade. Many of the responses involved expressions of disgust with the era itself, but a couple of people did propose terms for what to call it. One suggestion was, simply, “the Two Thousands,” which seems practical enough. Another was “the 2.0 Decade” -- an expression both witty and apropos, though unlikely to catch on.

Perhaps some expression may yet emerge within everyday speech -- but we’re winding down “the Zeroes” (my own last-ditch suggestion) without a way of referring to the decade itself. For now, it is an empty space in public conversation, a blind spot in popular memory. That may not be an accident. The past several years have not exactly been lacking in novelty or distinctiveness. But the tempo and the texture of the changes have made it difficult to take everything in -- to generalize about experience, let alone sum it up in a few compact stereotypes.

Rather than give an extensive chronology or checklist of defining moments from the present decade -- whatever we end up calling it -- I’d like to use the rest of this column to note a few aspects of what it has felt like to live through this period. The effort will reflect a certain America-centrism. But then, decade-speak is usually embedded in, and a reflection upon, a particular national culture. (German or Italian musings on “the Seventies,” for example, place more emphasis on the figure of the urban guerrilla than the disco dancer.)

These notes are meant neither as memoir nor as political editorial, though I doubt a completely dispassionate view of the period is quite possible. “Your mileage may vary,” as a typical expression of the decade goes. Or “went,” rather. For let’s imagine that the era is over, and that time has come to describe what things felt like, back then....

The decade as unit of meaning does not normally correspond to the exact chronological span marked by a change of digits. We sometimes think of the Eighties as starting with the election of Margaret Thatcher in 1979, for example, or Ronald Reagan’s inauguration in 1981. Conversely, that period can be said to end with the tearing down of the Berlin Wall in 1989, or as late as the final gasp of the Soviet state in 1991. The contours, like the meanings, tend to be established ex post facto; and they are seldom beyond argument.

In this case, there was a strong tendency to think of the decade as beginning on the morning of September 11, 2001 -- which meant that it started amid terror, disbelief, and profound uncertainty about what would happen next. Within a few weeks of the attacks, there would be a series of deaths from anthrax-laced envelopes sent through the U.S. mail. (Among the puzzles of the entire period is just why and how the public managed to lose interest in the latter attacks, even though no official finding was ever made about the source of the anthrax.)

Over the next two years or so, there would be a constantly fluctuating level of official “terror alerts.” Free-floating anxiety about the possibility of some new terrorist assault would become a more or less normal part of everyday life. Even after early claims by the administration of a connection between Saddam Hussein and the 9/11 terrorists were disproven, a large part of the public continued to believe that one must have existed. Elected officials and the mass media tended not to challenge the administration until several months into the Iraq War. The range of tolerated dissent shrank considerably for at least a few years.

Simultaneously, however, an expanding and wildly heterogeneous new zone of communication and exchange was emerging online -- and establishing itself so firmly that it would soon be difficult to recall what previous regimes of mass-media production and consumption had been like. The relationship between the transmitters of news, information, and analysis (on the one hand) and the audience for them (on the other) tended to become ever less one-way.

It proved much easier to wax utopian or dystopian over the effects of this change than to keep up with its pace, or the range of its consequences.

At the same time, screens and recording devices were -- ever more literally -- everywhere. Devices permitting almost continuous contact with the new media kept getting smaller, cheaper, and more powerful. They permeated very nearly the entire domain of public and private space alike. Blaise Pascal’s definition of the universe began to seem like an apt description of the cosmos being created by the new media: “an infinite sphere, the center of which is everywhere, the circumference nowhere.”

Quoting this passage, Jorge Luis Borges once noted that Pascal’s manuscript shows he did not originally describe the sphere as infinite. He wrote “frightful” instead, then scratched it out. Looking back on that unnamed (and seemingly unnameable) decade now, it seems like the right word. Whatever meaning it may yet prove to have had, it was, much of the time, frightful.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

From Plymouth Rock to Plato's Retreat

Last week, Intellectual Affairs gave the recent cable TV miniseries “Sex: The Revolution” a nod of recognition, however qualified, for its possible educational value. The idea that sex has a history is not, as such, self-evident. The series covers the changes in attitudes and norms between roughly 1950 and 1990 through interviews and archival footage. Most of this flies past at a breakneck speed, alas. The past becomes a hostage of the audience’s presumably diminished attention span.

Then again, why be ungrateful? Watching the series, I kept thinking of a friend who teaches history at Sisyphus University, a not-very-distinguished institution in the American heartland. For every student in his classroom who seems promising, there are dozens who barely qualify as sentient. (It sounds like Professor X, whose article “In the Basement of the Ivory Tower” appears in the latest issue of The Atlantic, teaches in the English department there.) Anything, absolutely anything, that might help stimulate curiosity about the past would be a godsend for the history faculty at Sisyphus U.

With that consideration in mind, you tend to watch “Sex: The Revolution” with a certain indulgence -- as entertainment with benefits, so to speak. Unfortunately, the makers stopped short. They neglected to interview scholars who might have provided more insight than a viewer might glean from soundbites by demi-celebrities. And so we end up with a version of history not too different from the one presented by Philip Larkin in the poem “Annus Mirabilis” --

Sexual intercourse began

In nineteen sixty-three

(Which was rather late for me) -

Between the end of the Chatterley ban

And the Beatles' first LP.

-- except without the irony. A belief that people in the old days must have been repressed is taken for granted. Was this a good thing or not? Phyllis Schlafly and reasonable people may disagree; but the idea itself is common coin of public discourse.

But suppose a television network made a different sort of program -- one incorporating parts of what one might learn from reading the scholarship on the history of sex. What sense of the past might then emerge?

We might as well start with the Puritans. Everybody knows how up-tight they were -- hostile to sex, scared of it, prone to thinking of it as one of the Devil’s wiles. The very word “Puritan” now suggests an inability to regard pleasure as a good thing.

A case in point being Michael Wigglesworth -- early Harvard graduate, Puritan cleric, and author of the first American best-seller, The Day of Doom (1662), an exciting poem about the apocalypse. Reverend Wigglesworth found the laughter of children to be unbearable. He said it made him think of the agonies of the damned in hell. You can just imagine how he would respond to the sound of moaning. Somehow it is not altogether surprising to learn that the Rev’s journal contains encrypted entries mentioning the “filthy lust” he felt while tutoring male students.

In short, a typical Puritan -- right? Well, not according to Edmund Morgan, the prominent early-Americanist, whose many contributions to scholarship over the years included cracking the Wigglesworth code. (He is now professor emeritus of history at Yale.)

Far from being typical, Wigglesworth, it seems, was pretty high-strung even by the standards of the day. In a classic paper called “The Puritans and Sex,” published in 1942, Morgan assessed the evidence about how ordinary believers regarded the libido in early New England. He found that, clichés notwithstanding, the Puritans tended to be rather matter-of-fact about it.

Sermons and casual references in letters and diaries reveal that the Puritans took sexual pleasure for granted and even celebrated it -- so long, at least, as it was enjoyed within holy wedlock. Of course, the early colonies attracted many people of both sexes who were either too young to marry or in such tight economic circumstances that marriage was not practical. This naturally meant a fair bit of random carrying on, even in those un-Craigslist-ed days. All such activity was displeasing unto the Lord, not to mention His earthly enforcers; but the court records show none of the squeamishness about it that one might expect, given the Puritans’ reputation. Transgressions were punished, but the hungers of the flesh were taken for granted.

And Puritan enthusiasm for pleasures of the marriage bed was not quite so phallocentric as you might suppose. As a more recent study notes, New Englanders believed that both partners had to reach orgasm in order for conception to occur. Many Puritan women must have had their doubts on that score. Still, the currency of that particular bit of misinformation would tend to undermine the assumption that everybody was a walking bundle of dammed-up desire -- finding satisfaction only vicariously, through witch trials and the like.

Our imagined revisionist documentary would be full of such surprises. Recent scholarship suggests that American mores were pretty wild long before Alfred Kinsey quantified things in his famous reports.

Richard Godbeer’s Sexual Revolution in Early America (Johns Hopkins University Press, 2002) shows that abstinence education was not exactly the norm in the colonial period. Illegitimate births were commonplace; so was the arrival of children six or seven months after the wedding day. For that matter, cohabitation without benefit of clergy was the norm in some places. And while there were statutes on the books against sodomy -- understood as nonprocreative sexual activity in general -- it’s clear that many early Americans preferred to mind their own business.

Enforcing prohibitions on “unnatural acts” between members of the same sex was a remarkably low priority. “For the entire colonial period,” noted historians in a brief filed a few years ago when Lawrence v. Texas went to the U.S. Supreme Court, “we have reports of only two cases involving two women engaged in acts with one another.... The trial of Nicholas Sension, a married man living in Wethersfield, Connecticut, in 1677, revealed that he had been widely known for soliciting sexual contacts with the town’s men and youth for almost forty years but remained widely liked. Likewise, a Baptist minister in New London, Connecticut, was temporarily suspended from the pulpit in 1757 because of his repeatedly soliciting sex with men, but the congregation voted to restore him to the ministry after he publicly repented.”

History really comes alive, given details like that -- and we’ve barely reached the Continental Congress. The point is not that the country was engaged in one big orgy from Plymouth Rock onwards. But common attitudes and public policies were a lot more ambivalent and contradictory in the past than we’re usually prone to imagine.

There was certainly repression. In four or five cases from the colonial era, sodomy was punished by death. But in a society where things tend to be fluid -- where relocation is an option, and where money talks -- there will always be a significant share of the populace that lives and acts by its own lights, and places where the old rules don't much matter. And so every attempt to enforce inhibition is apt to seem like too little, too late (especially to those making the effort).

You catch some of that frantic sense of moral breakdown in the literature of anti-Mormonism cited by Sarah Barringer Gordon in her study The Mormon Question: Polygamy and Constitutional Conflict in Nineteenth-Century America, published by the University of North Carolina Press in 2002. Novels about polygamous life in Utah were full of dark fascination with the lascivious excess being practiced in the name of freedom of religion – combined with fear that the very social and political order of the United States was being undermined. It was all very worrying, but also titillating. (Funny how often those qualities go together.)

The makers of “Sex: The Revolution” enjoyed the advantage of telling stories from recent history, which meant an abundance of film and video footage to document the past. Telling a revisionist story of American sexual history would suffer by visual comparison, tending either toward History Channel-style historical reenactments or Ken Burns-ish readings of documents over sepia-toned imagery.

But now, thanks to the efforts of phonographic archivists, we can at least listen to one part of the sexual discourse of long ago. A set of wax recordings from the 1890s -- released last year on a CD called “Actionable Offenses” -- preserves the kind of lewd entertainment enjoyed by some of the less respectable Americans of the Victorian era. And by “lewd,” I do not mean “somewhat racy.” The storytelling in dialect tends to be far coarser than anything that can be paraphrased in a family publication such as Inside Higher Ed. A performance called “Learning a City Gal How to Milk” is by no means the most obscene.

Anthony Comstock -- whose life’s work it was to preserve virtue by suppressing vice -- made every effort to wipe out such filth. It’s a small miracle that these recordings survived. The fact that they did gives us a hint at just how much of a challenge Comstock and associates must have faced.

When a popular program such as “Sex: The Revolution” recalls the past, it is usually an account of the struggle to free desire from inhibition. Or you can tell the same tale in a conservative vein: the good old days of restraint, followed by a decline into contemporary decadence.

Both versions are sentimental; both condescend to the past.

In the documentary I’d like to see, the forces of repression would be neither villains nor heroes. They would be hapless, helpless, confused -- and sinking fast in quicksand, pretty much from the start. It would be an eye-opening film. Not to mention commercially viable. After all, there would be a lot of sex in it.

Plenty to Go Around

Epistemology, as everyone around these parts is surely aware, is the study of the problems associated with knowledge – what it is, from whence it comes, and how it is you know that you know what you know (or think you do).

It gets recursive mighty fast. And questions about the relationship between epistemology and ethics are potentially even more so. Most of us just accept the wisdom of Emil Faber, the legendary founder of Faber College in bucolic Pennsylvania, who proclaimed, “Knowledge is good.” (At least that's what it says on the plaque in front of the campus library, as I recall, though it's been many years since my last viewing of "Animal House.")

But what about ignorance? Arguably there is more of it in the world than knowledge. Who studies it, though? Shouldn't epistemology have its equal but opposite counterpart?

A new book from Stanford University Press called Agnotology: The Making and Unmaking of Ignorance proposes that such a field of study is necessary – that we need rigorous and careful thinking about the structure and function and typology of cluelessness. The editors, Robert N. Proctor and Londa Schiebinger, are both professors of history of science at Stanford University. Their volume is a collection of papers by various scholars, rather than a systematic treatment of its (perhaps inexhaustible) subject. But the field of agnotology seems to cohere around a simple, if challenging, point: Ignorance, like knowledge, is both socially produced and socially productive.

This goes against the grain of more familiar ways of thinking. The most commonplace way of understanding ignorance, after all, is to define it as a deficit – knowledge with a minus sign in front of it.

A rather more sophisticated approach (which got Socrates in trouble) treats heightening the awareness of one’s own ignorance as the beginning of wisdom. And the emergence of modern scientific research, a few centuries back, treated ignorance as a kind of raw material: fuel for the engines of inquiry. As with any fuel, the prospect of a shortage seems catastrophic. “New ignorance must forever be rustled up to feed the insatiable appetite for science,” writes Proctor about the common trope of ignorance as resource. “The world’s stock of ignorance is not being depleted, however, since (by wondrous fortune and hydra-like) two new questions arise for every one answered....The nightmare would be if we were somehow to run out of ignorance, idling the engines of knowledge production.”

Each of these familiar perspectives on ignorance -- treating it as deficit, as Socratic proving ground, or as spur for scientific inquiry -- frames it as something outside the processes of knowledge-production and formal education. If those processes are carried on successfully enough, then ignorance will decline.

The agnotologists know better (if I can put it that way).

Ignorance is not simply a veil between the knower and the unknown. It is an active – indeed vigorous – force in the world. Ignorance is strength; ignorance is bliss. There is big money in knowing how to change the subject – by claiming the need for “more research” into whether tobacco contains carcinogens, for example, or whether the powerful jaws of dinosaurs once helped Adam and Eve to crack open coconuts.

Having a memory so spotty that it is a small miracle one can recall one’s own name is a wonderfully convenient thing, at least for Bush administration officials facing Congressional hearings. The Internet complicates the relationship between information and ignorance ceaselessly, and in ever newer ways. Poverty fosters ignorance. But affluence, it seems, does it no real harm.

This is, then, a field with much potential for growth. Most of the dozen papers in Agnotology are inquiries into how particular bodies of ignorance have emerged and reproduced themselves over time. Nobody quotes the remark by Upton Sinclair that Al Gore made famous: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.” Still, that line certainly applies to how blind spots have taken shape in the discourse over climate change, public health, and the history of racial oppression. (In a speech, Ronald Reagan once attributed the greatness of the United States to the fact that “it has never known slavery.”)

Any sufficiently rigorous line of agnotological inquiry must, however, recognize that there is more to ignorance than political manipulation or economic malfeasance. It also serves to foster a wide range of social and cognitive goods.

The paper “Social Theories of Ignorance” by Michael J. Smithson, a professor of psychology at the Australian National University, spells out some of the benefits. A zone of carefully cultivated ignorance is involved in privacy and politeness, for example. It is also intrinsic to specialization. “The stereotypical explanation for specialization,” writes Smithson, “is that it arises when there is too much for any one person to learn everything.” But another way of looking at it is to regard specialization as a means whereby “the risk of being ignorant about crucial matters is spread by diversifying ignorance.”

Smithson also cites the research of A.R. Luria (a figure something like the Soviet era’s equivalent to Oliver Sacks), who studied an individual with the peculiar ability to absorb and retain every bit of information he had encountered in his lifetime. Such a person would have no advantage over the garden-variety ignoramus. On the contrary, “higher cognitive functions such as abstraction or even mere classification would be extremely difficult,” writes Smithson. “Information acquired decades ago would be as vividly recalled as information acquired seconds ago, so older memories would interfere with more recent usually more relevant recollections.”

So a certain penumbra of haziness has its uses. Perhaps someone should contact the trustees of Faber College. The sign in front of the campus library could be changed to read “Diversifying Ignorance is Good.”

It's All About the Oil

A friend recently noted that this week’s column would probably run at just about the time the Chinese government was using the Olympic torch to burn down a Tibetan village. Perhaps, he said, this might be a good occasion to check out the latest edition of The Ancient Olympic Games by Judith Swaddling – first published by the British Museum in 1980 and now being reissued by the University of Texas Press.

The earlier version contained a succinct overview of how the Olympics (originally held every four years between 776 BC and 395 AD) were revived at the close of the 19th century. The new edition has been expanded to include an account of the past century or so – during which time the games often served as a venue for propaganda, a medium through which great powers conducted their hostilities. All this, of course, in spite of official rhetoric about how the spirit of sportsmanship transcends ideology.

The update is necessary, I suppose, but in some ways anticlimactic – even a distraction. Let modern times take care of themselves; the author’s heart really belongs to the ancient world. Swaddling is a curator at the British Museum, and conducts most of the book as an amiable and instructive tour of what has survived of the world of the original Olympic competitions. The text is heavily illustrated with photographs of the surviving architecture at Olympia and artwork portraying the games themselves.

The most intriguing image, at least to me, was a photograph of an artifact known as a strigil. This is a device that is often mentioned in accounts of the period, but is hard to picture. The strigil was an “oil scraper,” used to peel away the layer of grime that built up on an athlete’s skin in the course of events such as the pankration -- a kind of no-holds-barred wrestling match, not found in the modern Olympics, that sounds absolutely brutal and doubtless left many who fought in it crippled for life.

The strigil, it turns out, looks something like a windshield de-icer with a little bottle of olive oil conveniently attached by a chain. Having oil rubbed into the skin before competition was supposed to prevent sunburn and otherwise be good for the athlete’s health. Any excess oil was supposed to be strigil’d off before the competition began. But a wrestler sometimes “forgot” to do this quite as thoroughly as he should. This gave him a definite advantage by making it harder for the opponent to get a grip.

More flagrant forms of cheating must have been a serious problem. Hefty fines for it were given out – that is, if the malefactor were lucky. If he wasn’t, justice was dealt out by tough characters armed with whips. Any racer who started before the signal was given should probably have just kept on running. There were also cases of competitions being “fixed.”

Swaddling writes that “instances of bribery were relatively rare.” She quotes an ancient author asking who would be such a lowlife as to try to corrupt a sacred event. (Apart from being a sporting event, the Olympics were also major religious gatherings, with scores of oxen being sacrificed for the occasion.) But you have to wonder if piety really kept everyone in line.

The author does not mention the statues of Zeus in a heavily trafficked area of Olympia, portraying the god in a menacing aspect. Inscriptions at the base of each statue warned people not to attempt to bribe the judges. If you did, Zeus would presumably hurl one of the thunderbolts he was carrying in his fist. This suggests that the temptation to offer the judges a little something was fairly common. Why go to all the trouble if everyone was already reverent and restrained?

Then again, it is easy to imagine why the athletes themselves would want to cheat. Winning immortal glory was one incentive; but so was avoiding immortal shame. The author quotes one Olympic sports commentator whose put-downs still work after two thousand years: “Charmos, a long distance runner, finished seventh in a field of six. A friend ran alongside him shouting, ‘Keep going Charmos!’ and although fully dressed, beat him. And if he had had five friends, he would have finished twelfth.”

Nor was Charmos the only victim of ancient stand-up comedy. Although Swaddling doesn’t cite it, there was the case of a boxer whose “admirers” wanted to erect a monument to his humanitarianism. Why? Because he never hurt anybody.

Greek doctors occasionally expressed irritation when athletes set themselves up as medical advisers and, Swaddling notes, “even attempted to write books on the subject.” You can just picture them performing live infomercials in the agora. Such grumbling aside, it seems there was a close connection between the Olympics and progress in ancient medical science. The latter “virtually came to a standstill when the major games ceased in the late fourth century AD.”

The close connection between the two fields was expressed in mythology: “Asklepios, the Greek god of medicine, learned his skills from the centaur Cheiron, who was credited with the introduction of competitive gymnastics and of music from the double pipes to accompany exercise.” (Someone should mention this to the people who run Jazzercise.)

Married women were not allowed onto the grounds of the Olympic festivities, though they managed to sneak in from time to time. Was there some dubious medical theory to rationalize this? In any case, the exclusion did not apply to all women. Both virgins and prostitutes were permitted to attend the games.

That sounds like something out of a Freudian case study. Swaddling simply notes the matter without trying to interpret it. I have no theories, but will offer a bit of related speculation. One of these days an archaeologist is going to discover an inscription that reads: “What happens at the Olympics, stays at the Olympics.”

The End of the End of the End of History?

One minor casualty of the recent conflict in Georgia was the doctrine of peace through McGlobalization -- a belief first elaborated by Thomas Friedman in 1999, and left in ruins on August 8, when Russian troops moved into South Ossetia. “No two countries that both had McDonald’s had fought a war against each other since each got its McDonald’s,” wrote Friedman in The Lexus and the Olive Tree (Farrar, Straus, and Giroux).

Not that the fast-food chain itself had a soothing effect, of course. The argument was that international trade and modernization -- and the processes of liberalization and democratization created in their wakes -- would knit countries together in an international civil society that made war unnecessary. There would still be conflict. But it could be contained -- made rational, and even profitable, like competition between Ronald and his competitors over at Burger King. (Thomas Friedman does not seem like a big reader of Kant, but his thinking here bears some passing resemblance to the philosopher’s “Idea for a Universal History from a Cosmopolitan Perspective,” an essay from 1784.)

McDonald’s opened in Russia in 1990 -- a milestone of perestroika, if ever there were one. And Georgia will celebrate the tenth anniversary of its first Mickey D’s early next year, assuming anybody feels up for it. So much for Friedman's theory. Presumably it could be retooled ex post facto (“two countries with Pizza Huts have never had a thermonuclear conflict,” anyone?) but that really seems like cheating.

Ever since a friend pointed out that the golden arches no longer serve as a peace sign, I have been wondering if some alternative idea would better fit the news from Georgia. Is there a grand narrative that subsumes recent events? What generalizations seem possible, even necessary and urgent, now? What, in short, is the Big Idea?

Reading op-ed essays, position papers, and blogs over the past two weeks, one finds a handful of approaches emerging. The following survey is not exhaustive -- and I should make clear that describing these ideas is not the same as endorsing them. Too many facts about what actually happened are still not in; interpretation of anything is, at this point, partly guesswork. (When the fog of war intersects a gulf stream of hot air, you do not necessarily see things more clearly.) Be that as it may, here are some notes on certain arguments being made about what it all means.

The New Cold War: First Version. A flashback to the days of Brezhnev would have been inevitable in any case -- even if this month were not the 40th anniversary of Soviet tanks rolling into what was then Czechoslovakia.

With former KGB man Vladimir Putin as head of state (able to move back and forth between the offices of the president and of the prime minister, as term limits require) and the once-shellshocked economy now growing at a healthy rate thanks to international oil prices, Russia has entered a period of relative stability and prosperity -- if by no means one of liberal democracy. The regime can best be described as authoritarian-populist. There have been years of frustration at seeing former Soviet republics and erstwhile Warsaw Pact allies become members of NATO. Georgia (like Ukraine) has recently been invited to do so as well. So the invasion of South Ossetia represents a forceful reassertion of authority within Russia’s former sphere of influence.

We have reached "the end of the end of the Cold War,” goes this interpretation. Pace Fukuyama, it was a mistake to believe that historical progress would culminate in liberal, democratic, constitutional republicanism. The West needs to recognize the emergence of a neo-Soviet menace, and prepare accordingly.

This perspective was coming together even before the conflict between Russia and Georgia took military form. For some years now, the French philosopher Andre Glucksmann (whose musings on Solzhenitsyn’s The Gulag Archipelago were influential in the mid-1970s) has been protesting the rise of the new Russian authoritarianism, quoting with dismay Putin’s comment that “the greatest geopolitical disaster of the twentieth century is the dissolution of the Soviet Union.”

Vaclav Havel, the playwright and former president of the Czech Republic, has done likewise. In a recent interview, Havel said, “Putin has revealed himself as a new breed of dictator, a highly refined version. This is no longer about communism, or even pure nationalism.... It is a closed system, in which the first person to break the rules of the game is packed off to Siberia."

Why be skeptical of this perspective? Certainly the authoritarianism of the Putin regime itself is not in doubt. But the specter of a new Red Army poised to assert itself on the world stage needs to be taken with a grain of salt. A report prepared by the Congressional Research Service in late July notes that budget cuts have forced “hundreds of thousands of officers out of the ranks” of the Russian military, and reduced troop strength to 1.2 million men (compared to 4.3 million in the Soviet military in 1986).

“Weapons procurement virtually came to a halt in the 1990s,” the report continues, “and is only slowly reviving. Readiness and morale remain low, and draft evasion and desertion are widespread.” Raw nationalist fervor will make your empire only so evil.

The New Cold War: Take Two. Another version of the old template regards an East/West standoff as inevitable, not because Putinist Russia is so vigorous, but because such a conflict is in the interests of the United States.

We're not talking here about the more familiar sort of argument about the U.S. needing access to oil in the Caucasus region. Nor does it hinge on strategic concerns about nuclear cooperation between Russia and Iran. It has less to do with economic interest, or geopolitical advantage, than it does with the problem of ideological vision (or lack of it) among ruling elites in the West. A renewal of superpower conflict would help to prop up societies that otherwise seem adrift.

This thesis is argued by a British think tank called the Institute of Ideas, which takes much of its inspiration from the work of Frank Furedi, a professor of sociology at the University of Kent. Having started out decades ago as Marxists of a rather exotic vintage, writers associated with the institute have moved on to a robustly contrarian sort of libertarianism. Their perspective is that state and civil society alike in the industrialized world are now prone to waves of fear and a pervasive sense of aimlessness.

“It is difficult,” writes Furedi in a recent essay, “to discover clear patterns in the working of twenty-first-century global affairs....The U.S. in particular (but also other powers) is uncertain of its place in the world. Wars are being fought in faraway places against enemies with no name. In a world where governments find it difficult to put forward a coherent security strategy or to formulate their geo-political interests, a re-run of the Cold War seems like an attractive proposition. Compared to the messy world we live in, the Cold War appears to some to have been a stable and at least comprehensible interlude.”

Hence the great excitement at recent events -- so rich are they with promise of a trip backwards in time.

There is something at least slightly plausible in this idea. A quick look at Google shows that people have been announcing “the end of the end of the Cold War” for quite a while now. The earliest usage of that phrase I’ve seen comes from 1991. A kind of nostalgia, however perverse, is probably at work.

But Furedi's larger argument seems another example of an idea so capacious that no counterevidence will ever disprove it. If leaders are concerned about what’s happening in the Caucasus, it is because anxiety has made them long for the old verities. But if they ignored those events -- well, that would imply that the culture has left them incapable of formulating a response. Heads, he wins. Tails, you lose.

The End of ... Something, Anyway. Revitalizing the Cold War paradigm keeps our eyes focused on the rearview mirror. But other commentary on events in Russia and Georgia points out something you might not see that way -- namely, that this stretch of paved road has just run out.

The Duck of Minerva – an academic group blog devoted to political science – has hosted a running discussion of the news from South Ossetia. In a post there, Peter Howard, an assistant professor of international service at American University, noted that the most salient lesson of the invasion was that it exposed the limits of U.S. influence.

“Russia had a relatively free hand to do what it did in Georgia,” he writes, “and there was nothing that the U.S. (or anyone else for that matter) was going to do about it.... In a unipolar world, there is only one sphere of influence -- the whole world is the U.S.’s sphere of influence. Russia’s ability to carve any sphere of influence effectively ends unipolarity, if there ever was such a moment.”

Howard points to a recent article in Foreign Affairs by Richard Haass, the president of the Council on Foreign Relations, about the emergence of “nonpolarity: a world dominated not by one or two or even several states but rather by dozens of actors possessing and exercising various kinds of power.”

This will, it seems, be confusing. Countries won’t classify one another simply as friends or foes: “They will cooperate on some issues and resist on others. There will be a premium on consultation and coalition building and on a diplomacy that encourages cooperation when possible and shields such cooperation from the fallout of inevitable disagreements. The United States will no longer have the luxury of a ‘You're either with us or against us’ foreign policy.” (One suspects the country is going to afford itself that luxury from time to time, even so.)

A recent op-ed in The Financial Times does not explicitly use the term “nonpolarity,” yet takes the concept as a given. Kishore Mahbubani, dean of the public policy school of the National University of Singapore, sees the furor over Georgia as a last gasp of old categories. The rise of Russia is “not even close” to being the most urgent concern facing the west.

“After the collapse of the Soviet Union,” he writes, “western thinkers assumed the west would never need to make geopolitical compromises. It could dictate terms. Now it must recognise reality. The combined western population in North America, the European Union and Australasia is 700m, about 10 per cent of the world’s population. The remaining 90 per cent have gone from being objects of world history to subjects.”

Framing his argument in terms borrowed from Chairman Mao, Mahbubani nonetheless sounds for all the world like an American neoconservative in a particularly thoughtful mood. “The real strategic choice” facing the wealthy 10 percent “is whether its primary challenge comes from the Islamic world or China,” he writes. “If it is the Islamic world, the U.S. should stop intruding into Russia’s geopolitical space and work out a long-term engagement with China. If it is China, the U.S. must win over Russia and the Islamic world and resolve the Israel-Palestine issue. This will enable Islamic governments to work more closely with the west in the battle against al-Qaeda.”

From this perspective, concern with the events in Georgia seems, at best, a distraction. Considering it a development of world importance, then, would be as silly as thinking that the spread of fast-food franchises across the surface of the globe will make everyone peaceful (not to mention fat and happy).

Well, I’m not persuaded that developments in the Caucasus are as trivial as all that. But we’re still a long way from knowing what any of it means. It’s usually best to keep in mind a comment by Zhou Enlai from the early 1970s. Henry Kissinger asked for his thoughts about the significance of the French Revolution. “It is,” Zhou replied, “too soon to say.”
