History

Essay on Death of Eugene Genovese

Intellectual Affairs

An ancient and corny joke of the American left tells of a comrade who was surprised to learn that the German radical theorist Kautsky’s first name was Karl and not, in fact, “Renegade.” He’d seen Lenin’s polemical booklet The Proletarian Revolution and the Renegade Kautsky but only just gotten around to reading it.

Eavesdropping on some young Marxist academics via Facebook in the week following the historian Eugene Genovese’s death on September 26, I’ve come to suspect that there is a pamphlet out there somewhere about the Renegade Genovese. Lots of people have made the trek from the left to the right over the past couple of centuries, of course, but in recent memory no major American intellectual of comparable substance has done so apart from Genovese. People may throw out a couple of names to challenge this statement, but the operative term here is “substance.” Genovese published landmark studies like Roll, Jordan, Roll: The World the Slaves Made (1974) and -- with the late Elizabeth Fox-Genovese, his wife -- Fruits of Merchant Capital: Slavery and Bourgeois Property in the Rise and Expansion of Capitalism, not score-settling memoirs and suchlike.

As for the term “renegade,” well… The author of the most influential body of Marxist historiography in the United States from the past half-century turned into one more curmudgeon denouncing “the race, class, gender swindle.” And at a meeting of the Conservative Political Action Conference, no less. The scholar who did path-breaking work on the political culture of the antebellum South -- developing a Gramscian analysis of how slaves and masters understood one another, at a time when Gramsci himself was little more than an intriguing rumor within the American left -- ended up referring to the events of 1861-65 as “the War of Southern Independence.”

Harsher words might apply, but “renegade” will do.

He is listed as “Genovese, Gene” in the index to the great British historian Eric Hobsbawm’s autobiography Interesting Times: A Twentieth-Century Life (2002). Actually, make that “the late, great British historian”: Hobsbawm died on October 1.

The two of them belonged to an extremely small and now virtually extinct species: the cohort of left-wing intellectuals who pledged their allegiance to the Soviet Union and other so-called “socialist” countries, right up to that system’s very end. How they managed to exhibit such critical intelligence in their scholarship and so little in their politics is an enigma defying rational explanation. But they did: Hobsbawm remained a dues-paying member of the Communist Party of Great Britain until it closed up shop in 1991.

The case of Genovese is a little more complicated. He was expelled from the American CP in 1950, at the age of 20, but remained close to its politics long after that. In the mid-1960s, as a professor of history at Rutgers University, he declared his enthusiasm for a Vietcong victory. It angered Richard Nixon at the time, and I recall it being mentioned with horror by conservatives well into the 1980s. What really took the cake was that he’d become the president of the Organization of American Historians in 1978-79. Joseph McCarthy and J. Edgar Hoover had to be spinning in their graves.

When such a sinner repents, the angels do a dance. With Eric Hobsbawm, they didn’t have much occasion to celebrate. Though he wrote off the Russian Revolution and all that followed in its wake as more or less regrettable when not utterly disastrous, he didn’t treat the movement he’d supported as a God that failed. He could accept the mixture of noble spirits and outright thugs, of democratic impulses and dictatorial consequences, that made up the history he'd played a small part in; he exhibited no need to make either excuses or accusations.

Genovese followed a different course, as shown in the landmark statement of his change in political outlook, an article called “The Question” that appeared in the social-democratic journal Dissent in 1994. The title referred to the challenge of one disillusioned communist to another: “What did you know and when did you know it?” Genovese never got around to answering that question about himself, oddly enough. But he was anything but reluctant about accusing more or less everybody who’d ever identified as a leftist or a progressive of systematically avoiding criticism of the Soviets. He kept saying that “we” had condoned this or that atrocity, or were complicit with one bloodbath or another, but in his hands “we” was a very strange pronoun, for some reason chiefly meaning “you.”

What made it all even odder was that Genovese mentioned, almost in passing, that he’d clung to his support for Communism “to the bitter end.” If decades of fellow-traveling showed a failure of political judgment, “The Question” was no sign of improvement. His ferocious condemnation seemed to indicate that everyone from really aggressive vegans to Pol Pot belonged to one big network of knowing and premeditated evil. You hear that on talk radio all the time, but never from a winner of the Bancroft Prize for American history. Or almost never.

Recognizing that Genovese’s “open letter to the left [was] intended to provoke,” Dissent’s editors “circulated it to people likely to be provoked” and published their responses, and Genovese’s reply, in later issues. The whole exchange is available online in PDF form.

Unfortunately it did not occur to the editors to solicit a response from either Phyllis or Julius Jacobson, the founders of New Politics, a small journal of the anti-Stalinist left, which has somehow managed to stay afloat since their deaths in recent years. (Full disclosure: I’m on its editorial board.) They read “The Question” as soon as it came out. If my memory can be trusted, one or the other of them (possibly both: they finished each other’s sentences) called it “blockheaded.” Coming as it did from septuagenarian Trotskyists, “blockheaded” was a temperate remark.

But Julius, at least, had more to say. He’d served as campus organizer for the Young Socialist League at Brooklyn College in the late 1940s and early ‘50s, when Genovese was there. They crossed paths – how could they not? – and Julius remembered him as a worthy opponent. Genovese could defend the twists and turns in Stalin’s policies with far more skill than most CP members and supporters, whose grasp of their movement’s history and doctrine boiled down to the sentiment that the Soviet Union was, gosh, just swell.  

Julius was not prone to losing debates, but it’s clear that these ideological boxing matches went into overtime. Picturing the young Genovese in battle, I find the expression “more Stalinist than Stalin” comes to mind. But that’s only part of it. He was also -- what’s much rarer, and virtually paradoxical -- an independent Stalinist. He brought intelligent cynicism, rather than muddled faith, to making his arguments. An article by the American historian Christopher Phelps demonstrates that Genovese “knew full well and openly acknowledged the undemocratic nature and barbaric atrocities of the Communist states” but refused to “condemn their crimes unequivocally in his writings” and denounced anyone who did. “It serves no purpose,” Genovese wrote, “to pretend that ‘innocent’ -- personally inoffensive and politically neutral -- people should be spared” from revolutionary violence. (Phelps was a graduate student when he published the commentary in 1994. Today he teaches in the American and Canadian Studies program at the University of Nottingham.)

Genovese wasn’t a political hack; his opinions had the veneer of serious thought, thanks in no small part to the fact that he also became an extremely cogent analyst of the history of American slavery.  When he no longer had a tyranny to support, he “discovered” how complicit others had been, and began warning the world about the incipient totalitarianism of multiculturalism. His studies of the intellectual life of the slaveholding class began to show ever more evident sympathy for them – a point discussed some years ago in “Right Church, Wrong Pew: Eugene Genovese & Southern Conservatism,” an article by Alex Lichtenstein, an associate professor of history at Indiana University, which I highly recommend. Genovese’s scholarship has been influential for generations, and it will survive, but anyone in search of political wisdom or a moral compass should probably look elsewhere.

 


Essay on James Harvey Robinson centennial and 'Doing Recent History'

This year is the centenary of James Harvey Robinson’s book The New History: Essays Illustrating the Modern Historical Outlook, which made a case for teaching and writing about the past as something other than the record of illustrious men gaining power and then doing things with it.

“Our bias for political history,” he wrote, “led us to include a great many trifling details of dynasties and military history which merely confound the reader and take up precious space that should be devoted to certain great issues hitherto neglected.” The new breed of historians, such as the ones Robinson was training at Columbia University, would explore the social and cultural dimensions of earlier eras -- “the ways in which people have thought and acted in the past, their tastes and their achievements in many fields” – as well as what he called “the intricate question of the role of the State in the past.”

One hundred years and several paradigm shifts later, this “new history” is normal history; it’s not obvious why Robinson’s effort was so provocative at the time. You can see how it might have upset turf-protecting experts concerned with, say, whether or not Charles the Bald was actually bald. But it also promised to make connections between contemporary issues and knowledge of the past -- or threatened to make those connections, to put it another way.

Hold that thought for now, though. Jumping from 1912 to the present, let me point out a new collection of papers from the University of Georgia Press called Doing Recent History, edited by Claire Bond Potter and Renee C. Romano. (Potter is professor of history at the New School, Romano an associate professor of history at Oberlin College.)

There’s something puzzlingly James Harvey Robinson-ish about it, even though none of the contributors give the old man a nod. It must be a total coincidence that the editors are publishing the collection just now, amidst all the centennial non-festivities. And some of Robinson’s complaints about his colleagues would sound bizarre in today’s circumstances – especially his frustration at their blinkered sense of what should count as topics and source materials for historical research. “They exhibit but little appreciation of the vast resources upon which they might draw,” he wrote, “and unconsciously follow for the most part, an established routine in their selection of facts.”

As if in reply, the editors of Doing Recent History write: “We have the opportunity to blaze trails that have not been marked in historical literature. We have access to sources that simply do not exist for earlier periods: in addition to living witnesses, we have unruly evidence such as video games and television programming (which has expanded exponentially since the emergence of cable), as well as blogs, wikis, websites, and other virtual spaces.”

No doubt cranky talk-show hosts and unemployed Charles the Bald scholars will take umbrage at Jerry Saucier’s paper “Playing the Past: The Video Game Simulation as Recent American History” – and for what it’s worth, I’m not entirely persuaded that Saucier’s topic pertains to historiography, rather than ethnography. But that could change at some point. In “Do Historians Watch Enough TV? Broadcast News as a Primary Source,” David Greenberg makes the forceful argument that political historians tend to focus on written material to document their work: a real anachronism given TV’s decisive role in public life for most of the period since World War II. He gives the example of a sweeping history of the Civil Rights movement that seemed to draw on every imaginable source of documentation -- but not the network TV news programs that brought the struggle into the nation's living room. (The historian did mention a couple of prime-time specials, but with no details or reason to suppose he'd watched them.) Likewise, it’s entirely possible that historians of early 21st-century warfare will need to know something about video games, which have had their part in recruiting and training troops.

Besides the carefully organized, searchable databases available in libraries, historians have to come to terms with the oceans of digital text created over the past quarter-century or so -- tucked away on countless servers for now, but posing difficult questions about archiving and citation. The contributors take these issues up, along with related problems about intellectual property and the ethical responsibility of the historian when using documents published in semi-private venues online, or deposited in research collections too understaffed to catch possible violations of confidentiality.

In “Opening Archives on the Recent Past: Reconciling the Ethics of Access and the Ethics of Privacy,” Laura Clark Brown and Nancy Kaiser discuss a number of cases of sensitive information about private citizens appearing in material acquired by the Southern Historical Collection of the University of North Carolina at Chapel Hill. For example, there's the author whose papers include torrid correspondence with a (married) novelist who wouldn't want his name showing up in the finding aid. Brown and Kaiser also raise another matter for concern: “With the full-text search capabilities of Google Books and electronic journals, scholarly works no longer have practical obscurity, and individuals could easily find their names and private information cited in a monograph with even a very small press run.”

The standard criticism of James Harvey Robinson’s work among subsequent generations of professional historians is that his “new history” indulges in “presentism” – the sin of interpreting the past according to concerns or values of the historian’s own day. In Robinson’s case, he seems to have been a strong believer in the virtues of scientific progress, in its continuing fight against archaic forms of thought and social organization. With that in mind, it’s easier to understand his insistence that social, cultural, and intellectual history were at least as important as the political and diplomatic sort (and really, more so). Students and the general public were better off learning about “the lucid intervals during which the greater part of human progress has taken place,” rather than memorizing the dates of wars and coronations.

None of the contributors to Doing Recent History are nearly that programmatic. Their main concern is with the challenge of studying events and social changes from the past few decades using the ever more numerous and voluminous sources becoming available. Robinson’s “new history” tried to make the past interesting and relevant to the present. The “recent history” people want to generate the insights and critical skills that become possible when you learn to look at the recent past as something much less familiar, and more puzzling, than it might otherwise appear. I'm struck less by the contrast than the continuity.

Robinson would have loved it. In fact, he even anticipated their whole project. “In its normal state,” he wrote one hundred years ago, “the mind selects automatically, from the almost infinite mass of memories, just those things in our past which make us feel at home in the present. It works so easily and efficiently that we are unconscious of what it is doing for us and of how dependent we are upon it.”

Our memory -- personal and cultural alike -- “supplies so promptly and so precisely what we need from the past in order to make the present intelligible that we are beguiled into the mistaken notion that the present is self-explanatory and quite able to take care of itself, and that the past is largely dead and irrelevant, except when we have to make a conscious effort to recall some elusive fact.” That passage would have made a good epigraph for Doing Recent History, but it’s too late now.


Appeals court rejects researchers' bid to protect oral history confidentiality


U.S. appeals court bars researchers’ bid to quash subpoena seeking oral history records at Boston College.

Irish historian considers significance of fight over papers at Boston College

Irish historians have watched the legal case relating to the witness statements from participants in the conflict in Northern Ireland held by Boston College with great interest and with no little trepidation.

Regardless of the ultimate outcome of the case, there are real fears that the controversy has already jeopardized the collection and preservation of historical material relating to the conflict in Northern Ireland.

One friend, who was instrumental in helping University College Dublin Archives to secure a significant collection of private papers that includes material relating to the Northern Ireland peace process, remarked recently that it would have been more difficult to convince the donor to preserve his papers and donate them to an archive had the Boston College controversy already come to light.

The great difficulty here is that any comprehensive history of the Northern Ireland conflict will be very dependent on statements from the men and women who were directly engaged in the events: republicans, loyalist paramilitaries, police, British army personnel, politicians, public servants, and the ordinary people whose lives were shaped by the conflict. The nature of the conflict in Northern Ireland was such that no existing archive can be expected to provide sufficient sources for the writing of a plausible history; the words of the people who lived through (and participated in) the conflict need to be preserved to allow for the creation of a more meaningful historical record.

The Boston College interviews are one of several series of interviews that currently exist, or are now being collected. Oral history is especially important if we are to tell the story of everyday life during these years, and the motivations and reflections of men and women who did not hold positions of leadership.

Irish historians are very conscious of the importance of such testimonies, because a comparable archive exists relating to the 1916 Rising and the Irish war of independence. In the 1940s and early 1950s the Bureau of Military History – funded by the Irish government – collected statements from men and women who participated in these events. Some of those men and women engaged in violence or other acts about which they might not have been willing to speak publicly. The statements were finally released in 2004, 50 years after they were collected, when all the witnesses had died.

Although this delay has been criticized, it shows a respect for the witnesses and indeed for all who were affected by the events narrated in these testimonies. These statements, and the anticipated release shortly of thousands of Military Pension Files, containing further firsthand statements from those involved in the War of Independence, provide a permanent and valuable record of a critical period in the emergence of contemporary Ireland.

These firsthand accounts have transformed the understanding of these years, bringing the period to life in a manner that more formal records cannot.

The oral statements of participants in the conflict in Northern Ireland offer a similar potential to provide a rounded account of these years.  This will only happen, however, if those making statements can trust the record-taker, and trust the place where these records are deposited.  

This trust requires firm assurances that the statements will not be released prematurely, or divulged other than under the terms agreed. The witness statements should be collected with the intent of creating a long-term historical record; while there may be an understandable eagerness to gain access to them in order to be first with the story, they are best left undisturbed for a significant period of time. Essentially, they should be collected and protected for posterity -- not for the present.

University College Dublin (UCD), in common with other research universities, has a clear code of ethics that applies to all material that relates to identifiable individuals; securing their consent to any use that permits them to be identified is a key requirement.

In addition, researchers and archivists must observe the requirements of the Data Protection Act, which precludes the revealing of personal information relating to matters such as health, family circumstances, or financial records -- and these regulations are strictly enforced. Many of the private collections deposited in UCD Archives can only be accessed with the permission of the donor.

While testimonies relating to paramilitary activities are obviously of a particularly sensitive nature, there are recognized laws and procedures in place that protect the witness, the archive, the archivist and the researcher – provided that they are observed.

The issue may become more complex when records are transferred from one country to another, if the legal framework relating to data protection and disclosure is different, but again, a robust protocol and clearly-determined governance – agreed before any records are compiled – should reduce these risks.

Oral histories are extremely valuable sources for posterity, and they are becoming of still greater importance in an age when communication increasingly takes the form of telephone conversations, e-mails, texts, tweets and other means; these are obviously less easily preserved than letters or written memorandums.  

Ultimately, there will be lessons to be learned from the specifics of the Boston College case. The overarching ambition must remain unchanged: to ensure that a trusted record of the past can be compiled and preserved for posterity.

Mary E. Daly is professor of modern Irish history at University College Dublin.

Review of Nancy K. Bristow, "American Pandemic: The Lost Worlds of the 1918 Influenza Epidemic"

Intellectual Affairs

It was a classic instance of blaming the messenger: Spanish newspapers carried the earliest reports of a new illness that spread across the globe in the final months of World War I, and so it became known as “Spanish influenza,” although its real point of origin will never be known. It was virulent and highly communicable. A paper appearing in the Centers for Disease Control and Prevention journal Emerging Infectious Diseases a few years ago estimated that 500 million people, almost a third of the world’s population, were stricken with it. By the end of its reign of terror in the final months of 1920, there were 50 million fatalities -- more than three times as many as died from the war itself. These figures may be on the low side.

In her two long essays on illness, Susan Sontag grappled with the strong and longstanding tendency to treat certain diseases as meaningful: the vehicle for metaphors of social or cultural disturbance. “Feelings about evil are projected onto a disease,” she wrote. “And the disease (so enriched with meanings) is projected onto the world." Just so, one would imagine, with a pandemic. Something in a plague always hints at apocalypse.

But the striking thing about Spanish influenza is how little meaning stuck to it. Plenty of sermons must have figured the Spanish flu as one of the Four Horsemen, at the time, but the whole experience was quickly erased from collective memory, at least in the United States. In 1976, the historian Alfred W. Crosby published a monograph called Epidemic and Peace: 1918 that Cambridge University Press later issued as America’s Forgotten Pandemic (1989). Apart from being snappier, the new title underscored the pandemic’s almost total disappearance from anything but the specialist’s sense of history. One person in four in the U.S. suffered from an attack of Spanish flu, and it killed some 675,000 of them. The catastrophe seems never to have interested Hollywood, though, and the only work of fiction by an author who lived through the outbreak, so far as I know, is Katherine Anne Porter’s novella “Pale Horse, Pale Rider.” (Biblical imagery seems just about unavoidable.)

The title of Nancy K. Bristow’s American Pandemic: The Lost Worlds of the 1918 Influenza Epidemic (Oxford University Press) is an echo of Crosby’s America’s Forgotten Pandemic. I don’t want to read too much into the one-word difference, but it does seem that the influenza crisis of almost a century ago has been working its way back into public awareness in recent years. Several more books on the subject have appeared since Gina Kolata’s best-seller Flu: The Story of the Great Influenza Pandemic of 1918 and the Search for the Virus That Caused It came out in 1999. The Public Broadcasting Service has done its part with an excellent documentary as well as an episode of Downton Abbey in which pestilence hits a country house in England during a dinner party.

So “forgotten” is no longer quite the right word for the nightmare. But it remains almost impossible to imagine the ferocity of the pandemic, much less its scale. The contemporary accounts that Bristow draws on retain their horror. Doctors wrote of patients changing from “an apparently well condition to almost prostration within one or two hours,” with raging fevers and severe pain in even the milder cases – and the worst involving a “bloody exudate” coughed up from the “peculiar and intense congestion of the lungs with [a] hemorrhage,” so that it was “simply a struggle for air until they suffocate.”

Morgues were overrun. In poor households, several delirious family members might be crowded into the same bed along with someone who had died. Those who made it to the hospital could lie unattended for days at a time. The authorities were issuing “don’t worry, it’s just another flu”-type pronouncements well into the catastrophic phase of the epidemic. Quarantines and bans on public gatherings were easier to proclaim than to enforce. Having absorbed the relatively new idea that disease was spread by germs, people donned surgical masks to protect themselves – to no avail, since influenza was a virus. The epidemic went through three waves of contagion in as many years, and it wore down whatever patience or civic-mindedness people had when the disaster hit.

A pandemic, by definition, puts everyone at risk. But access to medical help – inadequate as it proved – was far less egalitarian. (As is still the case, of course.) Much of the historical scholarship on disease in recent decades has stressed how the interaction between medical professionals and their clientele tends to reinforce the social hierarchies already in place. Bristow’s work follows this well-established course, combining it with a familiar emphasis on the changes in medicine’s public role in the wake of Progressive Era reforms.

She writes about how poor, immigrant, or Native American sufferers were assumed guilty “of dishonesty and laziness, and of attempting to take advantage of others’ generosity” until proven otherwise, while the African-American population was forced “to continue relying on their own too limited community resources as they sought to provide sufficient care for their sick neighbors.” And while the U.S. Public Health Service had been created in 1912, its capacity to respond to the influenza crisis was limited, given how poorly the disease was understood. Even gathering reliable statistics on it proved almost impossible while the virus was on its rampage.

The most interesting chapter of American Pandemic considers how doctors and nurses responded to the crisis. Although they often worked side by side, their experiences stood in marked contrast.

“Ignorant of the disease’s etiology, uncertain of the best methods of treatment, and unable to ease the suffering of their patients,” Bristow writes, “physicians often expressed a sense of helplessness as individuals and humility as members of a profession.” (You know something is catastrophic when it reduces doctors to humility.)

Belonging to an almost entirely male profession, they “gauged their work against the masculine standards of skill and expertise” – and the inevitable military metaphor of going to battle against the disease became that much more intense given the example of actual soldiers fighting and dying in the trenches. But the influenza virus was stronger. “Like a hideous monster,” one physician wrote, “he went his way, and none could hinder.” Doctors’ letters and diaries from the period reflect a sense of bewilderment and failure.

For a while the authority of the profession itself was undermined. Patent medicines and related quackery proved no more effective in treating or preventing the disease than anything the doctors could offer. But they weren’t any less effective, either.

The nurses could not have responded more differently. Caring for patients was “a terrific test” and “high privilege,” they wrote, “a most horrible  and yet most beautiful experience.” As with doctors, many lost their lives while tending to the sick. But one nurses’ report said that the work was “one of the most immediately satisfactory experiences of our lives” for those who survived it, “and this is true even though we were borne down with the knowledge that, do all we might, the pressing, tragic need for nursing was much greater than could possibly be met.”

And this, too, was a matter of gauging their skill by socially sanctioned gender norms. “Women working as nurses aspired to what they viewed as the uniquely feminine qualities of domesticity, compassion, and selflessness,” writes Bristow. “To measure up to these standards nurses needed only to care for their patients, not cure them, and this they proved able to do.”

A few hours after choosing American Pandemic for this week’s column, I attended a public event at which every third person seemed to be coughing, with a congested wheeze usually audible. Synchronicity is not always your friend. For the past several days I have been reading about influenza, and writing about it, while suffering from one of its milder variants. The experience is not to be recommended.

Two quick points might be worth making before the medication kicks in. Bristow’s final assessment is that the horror and devastation of the pandemic could not be reconciled with the preferred national narrative of progress and redemption, “with its upbeat tenor and its focus on a bright future.” At most, its memory was preserved as part of local history, or through family stories.

The argument is plausible, to a degree, but it overlooks the element of trauma involved – not just the suffering and death during that period, but the feeling of being rendered helpless by an event that came out of nowhere.

And what sense does it make to think of the events of 1918-20 as “America’s pandemic,” forgotten or otherwise? Deaths from influenza in the United States during that period -- some 675,000 out of roughly 50 million worldwide -- represent something like 1.4 percent of the world’s fatalities from the disease. How was it remembered -- or erased from public memory -- elsewhere? Diseases don’t respect borders, and it’s hard to see why the historiography of disease should, either.
 

Historians start effort to define what graduates should be able to do


In first effort of its kind in the U.S., a discipline works to define what graduates of its programs should be able to do -- from associate degree through the Ph.D.

Review of Liette Gidlow, "Obama, Clinton, Palin"

Intellectual Affairs

It's hard to think of three living figures in American politics who generate more passion than the ones named in the title of Obama, Clinton, Palin: Making History in Election 2008, a collection of essays edited by Liette Gidlow and published by the University of Illinois Press. The word “passion” here subsumes both ardor and loathing. I doubt it is intentional, but the photographs on the book’s cover are arrayed such that they seem almost attached to one another, like Siamese triplets perhaps, or some beast with multiple heads in one of the more psychedelic passages of Biblical prophecy. If the 2012 campaign doesn’t give you nightmares, that image still might.

Gidlow, the editor, is an associate professor of history at Wayne State University, and the 11 other contributors are all historians as well. Their essays frame the 2008 campaign as a late episode in the country’s uneven progress toward incorporating anybody other than white men into elected government. Every so often we hear that the United States entered the “post-feminist” or “post-racial” era at some unspecified point in the (presumably) recent past. But reality has a way of asserting itself, and the next thing you know there are people demonstrating against the president with signs that show him as a cannibal with a bone through his nose, or a politician responds to a female heckler by hinting that she should perform a sexual service for him.

Debates about race and gender came up often during the 2008 primaries and the election season that followed, so the book’s emphasis is hardly misplaced. Much discussed at the time was each campaign’s groundbreaking status in the history of presidential contests -- with Palin being the first woman to run on the Republican ticket, while Obama was the first African-American, and Clinton the first woman, to have a serious shot at the Democratic nomination.

That is true, but it is blinkered. If the essays in Obama, Clinton, Palin could be reduced to a single theme, it might be that the history-making campaigns of 2008 were also products of history, or echoes of it. The most interesting chapter in that regard is Tera W. Hunter’s “The Forgotten Legacy of Shirley Chisholm,” which recalls the African-American Congresswoman’s presidential bid in 1972.

Chisholm had no hope of winning, and knew it, but paved the way for Hillary Clinton and Barack Obama. She was, Hunter says, “antiracist, antisexist, pro-choice, pro-labor, antiwar, fiercely independent, and, above all, principled.” But the point of invoking her memory is hardly to treat the Clinton or Obama campaigns as rightful heirs. In Hunter’s reading, the Democratic primaries of 2008 were a travesty of Chisholm’s effort.

“Hillary Clinton never spoke openly, critically, or engagingly about the status of women in our society, the problems of gender discrimination, and what we should do about it,” writes Hunter. As the competition heated up, Clinton “became more forthright in claiming to be the victim of gender discrimination,” while Obama “continued to be reluctant to dwell on issues related to race and racism.” By contrast, Chisholm “challenged the racist attitudes and practices in the women’s movement,” Hunter writes, “as much as she challenged sexism among African-Americans and the broader society.”

A cynic might reply that she could afford to do that precisely because she was not trying to get elected. To win, you pander, and when somebody complains, you try to figure out how to pander to them, too. But running a winning campaign involves neutralizing reservations as much as enlisting allegiance. On that score Obama “faced the challenge of calming white fears,” Hunter writes, “of reassuring the populace that he was not an ‘angry black man’ seeking racial retribution.” (The point is also made in “Barack Obama and the Politics of Anger” by Tiffany Ruby Patterson, who recalls how the candidate navigated the controversy over Rev. Jeremiah Wright’s fire-next-time sermons.)

Susan M. Hartmann’s “Hillary Clinton’s Candidacy in Historical and Global Context” offers one of the book’s analyses of how gender stereotypes and media sexism created obstacles for the candidate – even as she “benefited not only from her husband’s name and popularity, but also from the access he afforded” to sundry political resources. “By contrast,” Hartmann says, “Republican vice presidential candidate Sarah Palin escaped much of the gender hostility that Clinton faced,” largely because of “her strong right-wing credentials, importantly including opposition to much of the feminist agenda.”

Indirectly challenging that claim is Catherine E. Rymph’s “Political Feminism and the Problem of Sarah Palin.” Rymph makes the case for regarding Palin as the legatee of a strain of G.O.P. feminism going back to the 1940s, when “Republicans made up a greater number of women serving in Congress” than did Democrats. Their party platform endorsed the Equal Rights Amendment in 1940 – four years before the Democrats did. (See also the abundant scholarship on the role of women activists on the right, which Kim Phillips-Fein discusses in “Conservatism: A State of the Field,” her thorough survey of recent historiography.)

Clinton and Palin were both “presented in sexualized ways” during their campaigns, Rymph points out: “Clinton was a castrating bitch, while Palin was a ‘MILF’ (presumably more flattering, but equally degrading).” Opponents were relentless in mocking Palin’s hair, clothes, days as a beauty-pageant competitor and the like, which Rymph cites as evidence that “Americans of all stripes can tolerate and even embrace sexism when it is used as a weapon against women with whom they disagree or whom they see as representing the wrong picture of womanhood.”

Thanks to Google, several contributors are able to document the racist and misogynistic rage churning throughout the primaries and campaigns. This is the third or fourth academic press publication I’ve read in the past few months to quote extensively from blog posts, comments sections, Facebook dialogs, and so forth. The effect is sometimes informative or illuminating, but usually it isn’t.  You get used to seeing chunks of semiliterate ranting online, but it’s still mildly disconcerting to find it in cold type, properly cited, with scholarly apparatus.

The poisonous material quoted in “Michelle Obama, the Media Circus, and America’s Racial Obsession” by Mitch Kachun makes it perhaps the most horrifying chapter in the book. It is undoubtedly necessary to the job of showing the double dose of stereotyping (“angry black woman,” “Jezebel,” “baby mama”) that emerged in the 2008 campaign and set the tone for much that’s followed. But in the future, historians might do well to focus on computerized content analysis of digital chatter, rather than exhibiting samples, because it does not take that long to reach the threshold marked ad nauseam.

I’ve discussed a few papers in Obama, Clinton, Palin, not attempted a comprehensive review. But one general impression bears mentioning, and a look through the index confirms it: there are no entries for Afghanistan, Iraq, Lehman Brothers, terrorism, torture, or the Troubled Asset Relief Program, nor any other major issue at stake in 2008. All are mentioned at some point in the book. But the index-maker can't be faulted, because they are always quite peripheral to the project.

What we get, then, is political history at a considerable remove from questions of governance. It’s certainly possible to argue that combat over race or gender in a presidential campaign may serve as a proxy for debates over social or economic policy. But that argument has to be made. Otherwise it seems as if the only issue in an election is whether the most powerful elected offices in the country should or should not be more demographically representative.

In any case, the 2012 presidential race has been pretty uneventful in the politics-of-difference department -- so far, anyway. I contacted Liette Gidlow, the editor of the book, to ask what she made of the contrast with four years ago.

“I do think that the 2008 campaigns expanded leadership opportunities for African-Americans, women, and others in a lasting way,” she responded by e-mail. “The political contests so far this year would seem to suggest otherwise; though Michele Bachmann and Herman Cain had their moments, ultimately their campaigns failed to win broad support among Republicans. But for the past 40 years, it has been the Democratic party, not the Republican party, that has been the driving force behind diversity in political representation, and with the primary contests limited to the Republicans this year, we shouldn't be surprised to see that the field has been dominated by white men. Which doesn't mean that in future presidential contests the Democrats will offer a slate that ‘looks like America’ or that the Republicans won't. But every time a candidate who departs from our usual expectations succeeds, it expands our ability to imagine, and ultimately to accept, different kinds of people as leaders.”

That seems fair enough, all in all. But it leaves open the question of what difference it makes, if any, after that. It certainly felt like something was changing on election night in 2008, but four years later, I often wonder what it was.

Essay on how a university responded to criticism of one of its heroes

Sadly, almost any topic in our modern society becomes politicized, forcing us into a corner where we must choose to be for or against. Opinion comes first, interpretation comes later, if at all. Simplicity is the order of the day. Dealing with complexity is inconvenient.

So it is with the irresistible urge to judge historical figures such as Thomas Jefferson, Robert E. Lee and even George Washington — deciding whether we are pro or con, and then injecting them into our contemporary partisan conflicts. It overwhelms any inclination to embrace the study of the past for something other than debating points. We know, or should know, that our own history is complicated. We would appreciate it if those judging us in the future would respect that. We owe the same courtesy to those who came before us.

When Joseph Ellis wrote American Sphinx, his masterful study of Thomas Jefferson, he ran headlong into precisely this problem. As he ventured into his research, his youthful fascination with Jefferson gave way to a mature appreciation for the man with all his contradictions, faults and strengths, failures and accomplishments. History rarely presents us with simple morality tales. And the fortunes of Jefferson in the contemporary age look so much like the dreaded approval ratings in volatile public opinion polls. One day, he is everyone’s hero. The next, he is a hypocrite and a devious politician.

It was a puzzle for Ellis. As commentaries on Jefferson devolved into point-counterpoint volleys, it seemed "impossible to steer an honorable course between idolatry and evisceration," he wrote. The historian concluded wisely that "affection and criticism toward Jefferson are not mutually exclusive positions," and that "all mature appraisals of mythical figures are destined to leave their most ardent admirers somewhat disappointed." Influential individuals who live in turbulent times do things that call attention to the strength of their character, and they do other things that point to their human qualities, which is to say their imperfections. "Anyone who confines his research to one side of the moral equation," wrote Ellis, "is destined to miss a significant part of the story."

There are occasions when I feel these tensions in a direct and personal way.

I serve as president of a college named after two influential and consequential figures. One of them is George Washington, who made our college the beneficiary of his only significant gift to higher education: $20,000 in James River Canal stock. He wanted to support an institution located in an area of the country he considered the "Western frontier."

The other is Robert E. Lee. After the Civil War, he became president of what was then Washington College. He and several members of his family are buried in Lee Chapel, an iconic building on campus where we hold many of our formal ceremonies. My wife, my son and I live in Lee House — the house on campus built for him and his family that has served as the home for all the university's presidents. The dining room is where he died. The building in the driveway was a stall for Traveller, Lee’s horse; it remains preserved as it was back in Lee’s day, and by custom its doors remain forever open. It is the second-most-visited tourist spot in a small town with many historical sites.

We commemorate both men during our annual Founders’ Day Convocation on campus, held on Lee’s birthday each year. Our convocation speaker this year was Ron Chernow, author of the Pulitzer-winning Washington: A Life.

I also am a graduate of the institution I now serve as president, and I proudly and forcefully call upon the traditions of the university in the service of preparing students for lives of integrity and responsibility. My own stance toward Lee is one of respect, especially for what Michael Sandel, the Harvard political theorist, refers to as "the quality of character" of his deliberation when he confronted impossible choices.

But it is not idolatry. Neither is it evisceration. It is instead an honest attempt to understand the man and his times, which included slavery, secession and civil war. I take this stance not for purposes of reaching a final judgment on whether he was destined for heaven or hell — which would be the height of arrogance, as if I, and not Providence, could make such a call — but to appreciate the complexity of history and those who live it. Like Ellis with Jefferson, I have come to the conclusion that affection for and criticism of Lee are not mutually exclusive.

There are times, though, when that is easier said than done.

What should the university do when a Washington Post columnist condemns Lee as a traitor who chose the wrong side when it came to the great moral question of his time? How should we respond when a PBS documentary, which otherwise portrayed the man with all the respect history requires, got it wrong when it came to the chapter in his life that profoundly affected the university? He did not, as the documentary claimed, live out his final years "in hiding" at a small college in the mountains of Virginia. Rather, he fulfilled a pledge. "I have a self-imposed task which I must accomplish," he wrote. "I have led the young men of the South in battle; I have seen many of them die in the field; I shall devote my remaining energies to training young men to do their duty in life." Forsaking far more lucrative offers, he came to a nearly bankrupt college to prepare young men from the North and the South for a dramatically different world in the wake of the Civil War.

Beyond our campus, the story of Lee the general overshadows the story of Lee the educator; understandably so. But those years of curriculum reform and lessons in integrity are inseparable from the man’s biography, and they add a deeper appreciation, especially to our understanding of Lee’s refined sense of duty. Many students and alumni of this university cannot recognize the man when a profile casts aside the effect he had on an educational institution that, in turn, had a direct effect on our own lives.

For that reason and many others, criticism of Lee, even an unintended oversight, triggers the reflex to rush to the defense. As one thoughtful, dedicated alumnus wrote to me in the wake of the PBS documentary and the Washington Post column, neither of which mentioned the university, "we" have been attacked, and the institution “has done nothing to respond.”

But the university is not synonymous with the man. It is an institution of values, to be sure. And to illustrate its values, it often invokes the stories of individuals who have made it what it is. But it is first and foremost an institution of learning and study, of critical reflection on all matters. Its primary mission is to seek truth. It cannot do so if it closes its own history to examination.

Lee was a dignified, humble man. His sense of duty and honor would cause him to cringe if he ever became the subject of idolatry or the embodiment of myth. Blindly, superficially and reflexively rushing to his defense is no less an affront to history than blindly, superficially and reflexively attacking him. What he needs, what he deserves, and what his record can withstand is the honest appraisal of those who have not made up their minds, who can appreciate the man with all his complexities and contradictions. History is indeed not kind enough to present us with simple morality tales.

More to the point, a university serves its students best by not imposing an orthodox point of view about the past and certainly not the future. Higher education, no less than other institutions, is a victim of our politicized society. The things we do — the courses we teach, the values we espouse, the faculty we hire — should not be subjected to ideological litmus tests.

John Henry Cardinal Newman, the 19th-century British educator, remains a powerful influence on how we think about a college education. His words remind us to keep our bearings. The faculty should “learn to respect, to consult, to aid each other.” In so doing, they create a culture of learning. "A habit of mind is formed which lasts through life, of which the attributes are freedom, equitableness, calmness, moderation, and wisdom." Newman concludes, "This is the main purpose of a university in its treatment of its students."

Ellis has it right when it comes to the role of history. Newman has it right when it comes to the role of the university. But it remains a challenge in our highly divisive times. If any institution should resist the harsh, polarized and emotional discourse of today’s society, and if any institution should model the virtues of calmness, moderation and wisdom, it is a university, especially one named in honor of two individuals who personified precisely those virtues.

Kenneth P. Ruscio is a 1976 graduate of Washington and Lee University. He took office as the university's 26th president in 2006.

Study says colleges pay a price for humanities support


Study suggests that private colleges with many humanities programs may pay a price in tuition and research revenues.

Review of Teofilo Ruiz, "The Terror of History"

Intellectual Affairs

You can’t judge a book by its cover, as the old saw goes, but every so often the cover art may stun you into long contemplation. Or horror, in the case of Teofilo F. Ruiz’s The Terror of History: On the Uncertainties of Life in Western Civilization (Princeton University Press), which greets the prospective reader by way of Goya’s “Saturn Devouring His Son.”

The ghastliness of the painting never came through when I’d come across it before, in much smaller reproductions, often in black and white. It was inspired by a myth that Freud could have made up, if antiquity had not already obliged. Saturn, having castrated his father (long story), decided to eat his own offspring, which seems like a reasonable precaution given the circumstances. In a familiar version of the story, he swallows the children whole. Tricked into drinking a purgative, Saturn vomits them up and they go on to better things.

Not so in Goya’s rendition. In it, Saturn grasps a headless corpse, pulling away chunks of flesh, bite by bite, with the red insides of the body visible. He is bearded, and he stares at the viewer with a deranged expression while tearing off an arm with his teeth.

The image is unsettling because its grotesquery cuts through gentler versions of the myth to expose a layer of savagery otherwise concealed. Goya’s "Saturn" is part Theodore Kaczynski, part Jeffrey Dahmer. And, by implication, vice versa: The killers are his avatars. Malevolent craziness seems primordial.

Putting the book down after staring at its cover again, I turn by habit to Google News. It is not a comfort.

In The Terror of History, Ruiz, professor of history, Spanish, and Portuguese at the University of California at Los Angeles, describes and reflects on how people have tried to escape the unending pageant of catastrophe, violence, suffering, and random disorder making up the human condition. Referring to them as “the uncertainties of life in Western civilization” is odd, since things are not appreciably less messed-up elsewhere. Gautama Buddha had pertinent things to say on the matter, some of which Ruiz discusses. (Plus the subtitle inevitably calls to mind the comment Gandhi is supposed to have made when asked for his opinion of Western civilization: “I think it would be a good idea.”)

But most of Ruiz’s cultural references come from European and American sources, and the image from Goya serves to anchor his meditations in the Greco-Roman past. “It is a pictorial representation  of one of Ancient Greece’s most telling myths,” he writes. “The god Chronos (time) devours his children out of fear that, as fate has predicted, one of them will overthrow him. And so does Chronos devour all of us.”

It is a very old interpretation of the myth, albeit one resting on an etymological mistake. The Greek counterpart of Saturn was named Kronos, who was not the god of time Chronos, although they probably got each other’s mail a lot. To judge by the work of the Victorian mythologist Thomas Bulfinch, this has been going on for a while. Bulfinch also suggests that the mix-up may account for the confusing way Saturn’s reign is depicted in ancient sources. On the one hand, it was supposed to be the Golden Age. On the other hand, there was the constant patriarchal cannibalism. It is hard to reconcile them.

The contradiction runs through Ruiz's book. “We live, as it were, always on the edge of the abyss,” he writes, “and when we think we are happy and at peace, as individuals and as communities, awful things may be waiting just around the corner.” But it is difficult, and maybe impossible, to reconcile ourselves to this; and while hope for a Golden Age is hard to come by, it is our nature “to cling to life, to hope against hope” and create meaning. Drawing on the great Dutch medievalist Johan Huizinga’s work, Ruiz organizes his musings around three grand strategies for finding happiness, or at least mitigating total dread: “through belief (in a whole variety of orthodox and heterodox forms), [through] the life of the senses, and/or through culture and the pursuit of the beautiful.”

Under each of these headings, he arrays quotations from and reflections on a kaleidoscopic array of ancient and modern authors and phenomena: Sophocles, Proust, utopian communes, witch-burning crazes, The Decameron, an insurrection in Brazil in the 1890s, the Marquis de Sade, and The Epic of Gilgamesh, to give a representative sampling. Plus there are memoiristic bits. He mentions teaching “a class on world history from the Big Bang to around 400 C.E.” The book seems more ambitious still.

Insofar as an argument emerges, it is that each strategy to escape “the terror of history” has a powerful appeal, but all of them have a tendency to go off the rails, creating more misery, whether individual or social. For every mystic realizing the oneness of being, you get twenty fanatics who treat homicide as a sacrament. Romantic love is sublime, but it has no warranty. People experiment with utopian communes in spite of their track record and not because of it.

For that matter, authorship is no bower of bliss, either:

“As I sit at my computer, churning out one book or article after another, a suspicion gnaws at my mind. Almost like an alarm clock unpleasantly ringing in the morning’s early hours, it tells me that, as serious as I am about the reconstruction of the past, both the projects themselves and my seriousness are forms of escape, of erasing meaninglessness. It is all a bit delusional. Does my work really amount to anything? Does it really matter? Does it fulfill any meaningful purpose? Early in my career, it meant tenure, promotion, recognition, but now what?”

It bears mentioning here that Saturn also presides over melancholy, often named, along with pride, “the disease of scholars.”

As for the experience of reading The Terror of History, I will report less melancholy than dismay. For a short book displaying enormous erudition, it is awfully repetitive. It stops every so often to tell you what it is about, and every point is restated with some frequency. “I am, of course, not saying anything new here,” Ruiz comments at a couple of points. This invites the distracting question of why it’s being said at all.

In spite of my best efforts to see all of this as deliberate -- even thematic (history repeats itself but we forget what it said the first time, etc.) -- the preponderance of evidence suggests otherwise. It appears not to have been edited very much. If it had been, “Santillana’s famous dictum that those who do not know history are condemned to repeat it” would have been repaired to give George Santayana the credit. The reference to “Russell Jacobi’s forthcoming book on fraternal violence” would not have made me laugh from imagining an 18th-century German philosopher wandering the UCLA campus.

Ruiz twice calls Afro-Cuban music “enervating.” Either he regards the word as a synonym for “energizing” (it means precisely the opposite) or else conga drums fill him with ennui.

He refers to “Nietsche’s elegant dichotomy between the Dionysian (a form of Carnivalesque intoxication) and the Apollonian,” and equates the latter term with “rational individualism.” Actually the philosopher applies the word to “the beautiful appearance of the inner fantasy world,” which is not someplace where a lot of rational choice takes place.

A good editor would have caught all of these problems (the list is not exhaustive) while gently helping the author through as many revisions as necessary to subdue the redundancies and unknot the thickets. Consider the following:

“The conundrum here is whether to ignore history – though history most certainly does not ignore us, and it is often unforgiving of our neglect – may not be after all far less demanding of our time and strength and lead to far less grief and more pleasure.”

It is possible to render such a sentence into something coherent on first reading. (I have seen this done.) But books are being cranked out by even the most prestigious of university presses without the red pencil ever touching a manuscript, or whatever the current equivalent might be. A gifted editor adds value to the final product. A capable copy editor does as well. Their numbers are thinning; it seems a matter of time before they disappear from the face of the earth. If Saturn isn’t crunching their bones, then Mammon, god of budget decisions, undoubtedly is.
