History

Irish historian considers significance of fight over papers at Boston College

Irish historians have watched the legal case relating to the witness statements from participants in the conflict in Northern Ireland held by Boston College with great interest and with no little trepidation.

Regardless of the ultimate outcome of the case, there are real fears that the controversy has already jeopardized the collection and preservation of historical material relating to the conflict in Northern Ireland.

One friend, who was instrumental in helping University College Dublin Archives to secure a significant collection of private papers that includes material relating to the Northern Ireland peace process, remarked recently that it would have been more difficult to convince the donor to preserve his papers and donate them to an archive if the controversy at Boston College had previously been known.

The great difficulty here is that any comprehensive history of the Northern Ireland conflict will be very dependent on statements from the men and women who were directly engaged in the events: republicans, loyalist paramilitaries, police, British army personnel, politicians, public servants, and the ordinary people whose lives were shaped by the conflict. The nature of the conflict in Northern Ireland was such that no existing archive can be expected to stand as a sufficient source for the writing of plausible history; the words of the people who lived through (and participated in) the conflict need to be preserved to allow for the creation of a more meaningful historical record.

The Boston College interviews are one of several series of interviews that currently exist, or are now being collected. Oral history is especially important if we are to tell the story of everyday life during these years, and the motivations and reflections of men and women who did not hold positions of leadership.

Irish historians are very conscious of the importance of such testimonies, because a comparable archive exists relating to the 1916 Rising and the Irish war of independence. In the 1940s and early 1950s the Bureau of Military History – funded by the Irish government – collected statements from men and women who participated in these events. Some of those men and women engaged in violence or other acts about which they might not have been willing to speak publicly. The statements were finally released in 2004, 50 years after they were collected, when all the witnesses had died.

Although this delay has been criticized, it shows a respect for the witnesses and indeed for all who were affected by the events narrated in these testimonies. These statements, and the anticipated release shortly of thousands of Military Pension Files, containing further firsthand statements from those involved in the War of Independence, provide a permanent and valuable record of a critical period in the emergence of contemporary Ireland.

These firsthand accounts have transformed our understanding of these years, bringing the period to life in a manner that more formal records cannot.

The oral statements of participants in the conflict in Northern Ireland offer a similar potential to provide a rounded account of these years.  This will only happen, however, if those making statements can trust the record-taker, and trust the place where these records are deposited.  

This trust requires firm assurances that the statements will not be released prematurely, or divulged other than under the terms agreed.  The witness statements should be collected with the intent of creating a long-term historical record; while there may be an understandable eagerness to gain access to them, in order to be first with the story – they are best left undisturbed for a significant period of time.  Essentially, they should be collected and protected for posterity – not for the present.

University College Dublin (UCD), in common with other research universities, has a clear code of ethics that applies to all material that relates to identifiable individuals; securing their consent to any use that permits them to be identified is a key requirement.

In addition, researchers and archivists must observe the requirements of the Data Protection Act, which precludes revealing personal information relating to matters such as health, family circumstances or financial records; these regulations are strictly enforced. Many of the private collections deposited in UCD Archives can be accessed only with the permission of the donor.

While testimonies relating to paramilitary activities are obviously of a particularly sensitive nature, there are recognized laws and procedures in place that protect the witness, the archive, the archivist and the researcher – provided that they are observed.

The issue may become more complex when records are transferred from one country to another, if the legal framework relating to data protection and disclosure is different, but again, a robust protocol and clearly-determined governance – agreed before any records are compiled – should reduce these risks.

Oral histories are extremely valuable sources for posterity, and they are becoming of still greater importance in an age when communication increasingly takes the form of telephone conversations, e-mails, texts, tweets and other means; these are obviously less easily preserved than letters or written memorandums.  

Ultimately, there will be lessons to be learned from the specifics of the Boston College case. The overarching ambition must remain unchanged: to ensure that a trusted record of the past can be compiled and preserved for posterity.

Mary E. Daly is professor of modern Irish history at University College Dublin.

Review of Nancy K. Bristow, "American Pandemic: The Lost Worlds of the 1918 Influenza Epidemic"

Intellectual Affairs

It was a classic instance of blaming the messenger: Spanish newspapers carried the earliest reports of a new illness that spread across the globe in the final months of World War I, and so it became known as “Spanish influenza,” although its real point of origin will never be known. It was virulent and highly communicable. A paper appearing in the Centers for Disease Control and Prevention journal Emerging Infectious Diseases a few years ago estimated that 500 million people, almost a third of the world’s population, were stricken with it. By the end of its reign of terror in the final months of 1920, there were 50 million fatalities -- more than three times as many as died from the war itself. These figures may be on the low side.

In her two long essays on illness, Susan Sontag grappled with the strong and longstanding tendency to treat certain diseases as meaningful: the vehicle for metaphors of social or cultural disturbance. “Feelings about evil are projected onto a disease,” she wrote. “And the disease (so enriched with meanings) is projected onto the world.” Just so, one would imagine, with a pandemic. Something in a plague always hints at apocalypse.

But the striking thing about Spanish influenza is how little meaning stuck to it. Plenty of sermons must have figured the Spanish flu as one of the Four Horsemen, at the time, but the whole experience was quickly erased from collective memory, at least in the United States. In 1976, the historian Alfred W. Crosby published a monograph called Epidemic and Peace: 1918 that Cambridge University Press later issued as America’s Forgotten Pandemic (1989). Apart from being snappier, the new title underscored the pandemic’s almost total disappearance from anything but the specialist’s sense of history. One person in four in the U.S. suffered from an attack of Spanish flu, and it killed some 675,000 of them. The catastrophe seems never to have interested Hollywood, though, and the only work of fiction by an author who lived through the outbreak, so far as I know, is Katherine Anne Porter’s novella “Pale Horse, Pale Rider.” (Biblical imagery seems just about unavoidable.)

The title of Nancy K. Bristow’s American Pandemic: The Lost Worlds of the 1918 Influenza Epidemic (Oxford University Press) is an echo of Crosby’s America’s Forgotten Pandemic. I don’t want to read too much into the one-word difference, but it does seem that the influenza crisis of almost a century ago has been working its way back into public awareness in recent years. Several more books on the subject have appeared since Gina Kolata’s best-seller Flu: The Story of the Great Influenza Pandemic of 1918 and the Search for the Virus That Caused It came out in 1999. The Public Broadcasting Service has done its part with an excellent documentary as well as an episode of Downton Abbey in which pestilence hits a country house in England during a dinner party.

So “forgotten” is no longer quite the right word for the nightmare. But it remains almost impossible to imagine the ferocity of the pandemic, much less its scale. The contemporary accounts that Bristow draws on retain their horror. Doctors wrote of patients changing from “an apparently well condition to almost prostration within one or two hours,” with raging fevers and severe pain in even the milder cases – and the worst involving a “bloody exudate” coughed up from the “peculiar and intense congestion of the lungs with [a] hemorrhage,” so that it was “simply a struggle for air until they suffocate.”

Morgues were overrun. In poor households, several delirious family members might be crowded into the same bed along with someone who had died. Those who made it to the hospital could lie unattended for days at a time. The authorities were issuing “don’t worry, it’s just another flu”-type pronouncements well into the catastrophic phase of the epidemic. Quarantines and bans on public gatherings were easier to proclaim than to enforce. Having absorbed the relatively new idea that disease was spread by germs, people donned surgical masks to protect themselves – to no avail, since influenza was a virus. The epidemic went through three waves of contagion in as many years, and it wore down whatever patience or civic-mindedness people had when the disaster hit.

A pandemic, by definition, puts everyone at risk. But access to medical help – inadequate as it proved – was far less egalitarian. (As is still the case, of course.) Much of the historical scholarship on disease in recent decades has stressed how the interaction between medical professionals and their clientele tends to reinforce the social hierarchies already in place. Bristow’s work follows this well-established course, combining it with a familiar emphasis on the changes in medicine’s public role in the wake of Progressive Era reforms.

She writes about how poor, immigrant, or Native American sufferers were assumed guilty “of dishonesty and laziness, and of attempting to take advantage of others’ generosity” until proven otherwise, while the African-American population was forced “to continue relying on their own too limited community resources as they sought to provide sufficient care for their sick neighbors.” And while the U.S. Public Health Service had been created in 1912, its capacity to respond to the influenza crisis was limited, given how poorly the disease was understood. Even gathering reliable statistics on it proved almost impossible while the virus was on its rampage.

The most interesting chapter of American Pandemic considers how doctors and nurses responded to the crisis. Although they often worked side by side, their experiences stood in marked contrast.

“Ignorant of the disease’s etiology, uncertain of the best methods of treatment, and unable to ease the suffering of their patients,” Bristow writes, “physicians often expressed a sense of helplessness as individuals and humility as members of a profession.” (You know something is catastrophic when it reduces doctors to humility.)

Belonging to an almost entirely male profession, they “gauged their work against the masculine standards of skill and expertise” – and the inevitable military metaphor of going to battle against the disease became that much more intense given the example of actual soldiers fighting and dying in the trenches. But the influenza virus was stronger. “Like a hideous monster,” one physician wrote, “he went his way, and none could hinder.” Doctors’ letters and diaries from the period reflect a sense of bewilderment and failure.

For a while the authority of the profession itself was undermined. Patent medicines and related quackery proved no more effective in treating or preventing the disease than anything the doctors could offer. But they weren’t any less effective, either.

The nurses could not have responded more differently. Caring for patients was “a terrific test” and “high privilege,” they wrote, “a most horrible  and yet most beautiful experience.” As with doctors, many lost their lives while tending to the sick. But one nurses’ report said that the work was “one of the most immediately satisfactory experiences of our lives” for those who survived it, “and this is true even though we were borne down with the knowledge that, do all we might, the pressing, tragic need for nursing was much greater than could possibly be met.”

And this, too, was a matter of gauging their skill by socially sanctioned gender norms. “Women working as nurses aspired to what they viewed as the uniquely feminine qualities of domesticity, compassion, and selflessness,” writes Bristow. “To measure up to these standards nurses needed only to care for their patients, not cure them, and this they proved able to do.”

A few hours after choosing American Pandemic for this week’s column, I attended a public event at which every third person seemed to be coughing, with a congested wheeze usually audible. Synchronicity is not always your friend. For the past several days I have been reading about influenza, and writing about it, while suffering from one of its milder variants. The experience is not to be recommended.

Two quick points might be worth making before the medication kicks in. Bristow’s final assessment is that the horror and devastation of the pandemic could not be reconciled with the preferred national narrative of progress and redemption, “with its upbeat tenor and its focus on a bright future.” At most, its memory was preserved as part of local history, or through family stories.

The argument is plausible, to a degree, but it overlooks the element of trauma involved – not just the suffering and death during that period, but the feeling of being rendered helpless by an event that came out of nowhere.

And what sense does it make to think of the events of 1918-20 as “America’s pandemic,” forgotten or otherwise? Deaths from influenza in the United States during that period represent something like 1.4 percent of the world’s fatalities from the disease. How was it remembered -- or erased from public memory -- elsewhere? Diseases don’t respect borders, and it’s hard to see why the historiography of disease should, either.

Historians start effort to define what graduates should be able to do

In first effort of its kind in the U.S., a discipline works to define what graduates of its programs should be able to do -- from associate degree through the Ph.D.

Review of Liette Gidlow, "Obama, Clinton, Palin"

Intellectual Affairs

It's hard to think of three living figures in American politics who generate more passion than the ones named in the title of Obama, Clinton, Palin: Making History in Election 2008, a collection of essays edited by Liette Gidlow and published by the University of Illinois Press. The word “passion” here subsumes both ardor and loathing. I doubt it is intentional, but the photographs on the book’s cover are arrayed such that they seem almost attached to one another, like Siamese triplets perhaps, or some beast with multiple heads in one of the more psychedelic passages of Biblical prophecy. If the 2012 campaign doesn’t give you nightmares, that image still might.

Gidlow, the editor, is an associate professor of history at Wayne State University, and the 11 other contributors are all historians as well. Their essays frame the 2008 campaign as a late episode in the country’s uneven progress toward incorporating anybody other than white men into elected government. Every so often we hear that the United States entered the “post-feminist” or “post-racial” era at some unspecified point in the (presumably) recent past. But reality has a way of asserting itself, and the next thing you know there are people demonstrating against the president with signs that show him as a cannibal with a bone through his nose, or a politician responds to a female heckler by hinting that she should perform a sexual service for him.

Debates about race and gender came up often during the 2008 primaries and the election season that followed, so the book’s emphasis is hardly misplaced. Much discussed at the time was each campaign’s groundbreaking status in the history of presidential contests -- with Palin being the first woman to run on the Republican ticket, while Obama was the first African-American, and Clinton the first woman, to have a serious shot at the Democratic nomination.

That is true, but it is blinkered. If the essays in Obama, Clinton, Palin could be reduced to a single theme, it might be that the history-making campaigns of 2008 were also products of history, or echoes of it. The most interesting chapter in that regard is Tera W. Hunter’s “The Forgotten Legacy of Shirley Chisholm,” which recalls the African-American Congresswoman’s presidential bid in 1972.

Chisholm had no hope of winning, and knew it, but paved the way for Hillary Clinton and Barack Obama. She was, Hunter says, “antiracist, antisexist, pro-choice, pro-labor, antiwar, fiercely independent, and, above all, principled.” But the point of invoking her memory is hardly to treat the Clinton or Obama campaigns as rightful heirs. In Hunter’s reading, the Democratic primaries of 2008 were a travesty of Chisholm’s effort.

“Hillary Clinton never spoke openly, critically, or engagingly about the status of women in our society, the problems of gender discrimination, and what we should do about it,” writes Hunter. As the competition heated up, Clinton “became more forthright in claiming to be the victim of gender discrimination,” while Obama “continued to be reluctant to dwell on issues related to race and racism.” By contrast, Chisholm “challenged the racist attitudes and practices in the women’s movement,” Hunter writes, “as much as she challenged sexism among African-Americans and the broader society.”

A cynic might reply that she could afford to do that precisely because she was not trying to get elected. To win, you pander, and when somebody complains, you try to figure out how to pander to them, too. But running a winning campaign involves neutralizing reservations as much as enlisting allegiance. On that score Obama “faced the challenge of calming white fears,” Hunter writes, “of reassuring the populace that he was not an ‘angry black man’ seeking racial retribution.” (The point is also made in “Barack Obama and the Politics of Anger” by Tiffany Ruby Patterson, who recalls how the candidate navigated the controversy over Rev. Jeremiah Wright’s fire-next-time sermons.)

Susan M. Hartmann’s “Hillary Clinton’s Candidacy in Historical and Global Context” offers one of the book’s analyses of how gender stereotypes and media sexism created obstacles for the candidate – even as she “benefited not only from her husband’s name and popularity, but also from the access he afforded” to sundry political resources. “By contrast,” Hartmann says, “Republican vice presidential candidate Sarah Palin escaped much of the gender hostility that Clinton faced,” largely because of “her strong right-wing credentials, importantly including opposition to much of the feminist agenda.”

Indirectly challenging that claim is Catherine E. Rymph’s “Political Feminism and the Problem of Sarah Palin.” Rymph makes the case for regarding Palin as the legatee of a strain of G.O.P. feminism going back to the 1940s, when “Republicans made up a greater number of women serving in Congress” than did Democrats. Their party platform endorsed the Equal Rights Amendment in 1940 – four years before the Democrats did. (See also the abundant scholarship on the role of women activists on the right, which Kim Phillips-Fein discusses in “Conservatism: A State of the Field,” her thorough survey of recent historiography.)

Clinton and Palin were both “presented in sexualized ways” during their campaigns, Rymph points out: “Clinton was a castrating bitch, while Palin was a ‘MILF’ (presumably more flattering, but equally degrading).” Opponents were relentless in mocking Palin’s hair, clothes, days as a beauty-pageant competitor and the like, which Rymph cites as evidence that “Americans of all stripes can tolerate and even embrace sexism when it is used as a weapon against women with whom they disagree or whom they see as representing the wrong picture of womanhood.”

Thanks to Google, several contributors are able to document the racist and misogynistic rage churning throughout the primaries and campaigns. This is the third or fourth academic press publication I’ve read in the past few months to quote extensively from blog posts, comments sections, Facebook dialogs, and so forth. The effect is sometimes informative or illuminating, but usually it isn’t.  You get used to seeing chunks of semiliterate ranting online, but it’s still mildly disconcerting to find it in cold type, properly cited, with scholarly apparatus.

The poisonous material quoted in “Michelle Obama, the Media Circus, and America’s Racial Obsession” by Mitch Katchum makes it perhaps the most horrifying chapter in the book. It is undoubtedly necessary to the job of showing the double dose of stereotyping (“angry black woman,” “Jezebel,” “baby mama”) that emerged in the 2008 campaign, and set the tone for much that’s followed. But in the future, historians might do well to focus on computerized content analysis of digital chatter, rather than exhibiting samples, because it does not take that long to reach the threshold marked ad nauseam.

I’ve discussed a few papers in Obama, Clinton, Palin rather than attempted a comprehensive review. But one general impression bears mentioning, and a look through the index confirms it: there are no entries for Afghanistan, Iraq, Lehman Brothers, terrorism, torture, or the Troubled Asset Relief Program, nor any other major issue at stake in 2008. All are mentioned at some point in the book. But the index-maker can't be faulted, because they are always quite peripheral to the project.

What we get, then, is political history at a considerable remove from questions of governance. It’s certainly possible to argue that combat over race or gender in a presidential campaign may serve as a proxy for debates over social or economic policy. But that argument has to be made. Otherwise it seems as if the only issue in an election is whether the most powerful elected offices in the country should or should not be more demographically representative.

In any case, the 2012 presidential race has been pretty uneventful in the politics-of-difference department -- so far, anyway. I contacted Liette Gidlow, the editor of the book, to ask what she made of the contrast with four years ago.

“I do think that the 2008 campaigns expanded leadership opportunities for African-Americans, women, and others in a lasting way,” she responded by e-mail. “The political contests so far this year would seem to suggest otherwise; though Michele Bachmann and Herman Cain had their moments, ultimately their campaigns failed to win broad support among Republicans. But for the past 40 years, it has been the Democratic party, not the Republican party, that has been the driving force behind diversity in political representation, and with the primary contests limited to the Republicans this year, we shouldn't be surprised to see that the field has been dominated by white men. Which doesn't mean that in future presidential contests the Democrats will offer a slate that ‘looks like America’ or that the Republicans won't. But every time a candidate who departs from our usual expectations succeeds, it expands our ability to imagine, and ultimately to accept, different kinds of people as leaders.”

That seems fair enough, all in all. But it leaves open the question of what difference it makes, if any, after that. It certainly felt like something was changing on election night in 2008, but four years later, I often wonder what it was.

Essay on how a university responded to criticism of one of its heroes

Sadly, almost any topic in our modern society becomes politicized, forcing us into a corner where we must choose to be for or against. Opinion comes first, interpretation comes later, if at all. Simplicity is the order of the day. Dealing with complexity is inconvenient.

So it is with the irresistible urge to judge historical figures such as Thomas Jefferson, Robert E. Lee and even George Washington — deciding whether we are pro or con, and then injecting them into our contemporary partisan conflicts. It overwhelms any inclination to embrace the study of the past for something other than debating points. We know, or should know, that our own history is complicated. We would appreciate it if those judging us in the future would respect that. We owe the same courtesy to those who came before us.

When Joseph Ellis wrote American Sphinx, his masterful study of Thomas Jefferson, he ran headlong into precisely this problem. As he ventured into his research, his youthful fascination with Jefferson gave way to a mature appreciation for the man with all his contradictions, faults and strengths, failures and accomplishments. History rarely presents us with simple morality tales. And the fortunes of Jefferson in the contemporary age look so much like the dreaded approval ratings in volatile public opinion polls. One day, he is everyone’s hero. The next, he is a hypocrite and a devious politician.

It was a puzzle for Ellis. As commentaries on Jefferson devolved into point-counterpoint volleys, it seemed "impossible to steer an honorable course between idolatry and evisceration," he wrote. The historian concluded wisely that "affection and criticism toward Jefferson are not mutually exclusive positions," and that "all mature appraisals of mythical figures are destined to leave their most ardent admirers somewhat disappointed." Influential individuals who live in turbulent times do things that call attention to the strength of their character, and they do other things that point to their human qualities, which is to say their imperfections. "Anyone who confines his research to one side of the moral equation," wrote Ellis, "is destined to miss a significant part of the story."

There are occasions when I feel these tensions in a direct and personal way.

I serve as president of a college named after two influential and consequential figures. One of them is George Washington, who made our college the beneficiary of his only significant gift to higher education: $20,000 in James River Canal stock. He wanted to support an institution located in an area of the country he considered the "Western frontier."

The other is Robert E. Lee. After the Civil War, he became president of what was then Washington College. He and several members of his family are buried in Lee Chapel, an iconic building on campus where we hold many of our formal ceremonies. My wife, my son and I live in Lee House — the house on campus built for him and his family that has served as the home for all the university's presidents. The dining room is where he died. The building in the driveway was a stall for Traveller, Lee’s horse; it remains preserved as it was back in Lee’s day, and by custom its doors remain forever open. It is the second-most-visited tourist spot in a small town with many historical sites.

We commemorate both men during our annual Founders’ Day Convocation on campus, held on Lee’s birthday each year. Our convocation speaker this year was Ron Chernow, author of the Pulitzer-winning Washington: A Life.

I also am a graduate of the institution I now serve as president, and I proudly and forcefully call upon the traditions of the university in the service of preparing students for lives of integrity and responsibility. My own stance toward Lee is one of respect, especially for what Michael Sandel, the Harvard political theorist, refers to as "the quality of character" of his deliberation when he confronted impossible choices.

But it is not idolatry. Neither is it evisceration. It is instead an honest attempt to understand the man and his times, which included slavery, secession and civil war. I take this stance not for purposes of reaching a final judgment on whether he was destined for heaven or hell — which would be the height of arrogance, as if I, and not Providence, could make such a call — but to appreciate the complexity of history and those who live it. Like Ellis with Jefferson, I have come to the conclusion that affection for and criticism of Lee are not mutually exclusive.

There are times, though, when that is easier said than done.

What should the university do when a Washington Post columnist condemns Lee as a traitor who chose the wrong side when it came to the great moral question of his time? How should we respond when a PBS documentary, which otherwise portrayed the man with all the respect history requires, got it wrong when it came to the chapter in his life that profoundly affected the university? He did not, as the documentary claimed, live out his final years "in hiding" at a small college in the mountains of Virginia. Rather, he fulfilled a pledge. "I have a self-imposed task which I must accomplish," he wrote. "I have led the young men of the South in battle; I have seen many of them die in the field; I shall devote my remaining energies to training young men to do their duty in life." Forsaking far more lucrative offers, he came to a nearly bankrupt college to prepare young men from the North and the South for a dramatically different world in the wake of the Civil War.

Beyond our campus, the story of Lee the general overshadows the story of Lee the educator; understandably so. But those years of curriculum reform and lessons in integrity are inseparable from the man’s biography, and they deepen our appreciation, especially of Lee’s refined sense of duty. Many students and alumni of this university cannot recognize the man in a profile that casts aside the effect he had on an educational institution that, in turn, had a direct effect on our own lives.

For that reason and many others, criticism of Lee, even the unintended oversight, triggers the reflex to rush to the defense. As one thoughtful, dedicated alumnus wrote to me in the wake of the PBS documentary and the Washington Post column, neither of which mentioned the university, "we" have been attacked, and the institution “has done nothing to respond.”

But the university is not synonymous with the man. It is an institution of values, to be sure. And to illustrate its values, it often invokes the stories of individuals who have made it what it is. But it is first and foremost an institution of learning and study, of critical reflection on all matters. Its primary mission is to seek truth. It cannot do so if it closes its own history to examination.

Lee was a dignified, humble man. His sense of duty and honor would cause him to cringe if he ever became the subject of idolatry or the embodiment of myth. Blindly, superficially and reflexively rushing to his defense is no less an affront to history than blindly, superficially and reflexively attacking him. What he needs, what he deserves, and what his record can withstand is the honest appraisal of those who have not made up their minds, who can appreciate the man with all his complexities and contradictions. History is indeed not kind enough to present us with simple morality tales.

More to the point, a university serves its students best by not imposing an orthodox point of view about the past and certainly not the future. Higher education, no less than other institutions, is a victim of our politicized society. The things we do — the courses we teach, the values we espouse, the faculty we hire — should not be subjected to ideological litmus tests.

John Henry Cardinal Newman, the 19th-century British educator, remains a powerful influence on how we think about a college education. His words remind us to keep our bearings. The faculty should “learn to respect, to consult, to aid each other.” In so doing, they create a culture of learning. "A habit of mind is formed which lasts through life, of which the attributes are freedom, equitableness, calmness, moderation, and wisdom." Newman concludes, "This is the main purpose of a university in its treatment of its students."

Ellis has it right when it comes to the role of history. Newman has it right when it comes to the role of the university. But it remains a challenge in our highly divisive times. If any institution should resist the harsh, polarized and emotional discourse of today’s society, and if any institution should model the virtues of calmness, moderation and wisdom, it is a university, especially one named in honor of two individuals who personified precisely those virtues.

Kenneth P. Ruscio is a 1976 graduate of Washington and Lee University. He took office as the university's 26th president in 2006.

Study says colleges pay a price for humanities support


Study suggests that private colleges with many such programs may pay a price in tuition and research revenues.

Review of Teofilo Ruiz, "The Terror of History"

Intellectual Affairs

You can’t judge a book by its cover, as the old saw goes, but every so often the cover art may stun you into long contemplation. Or horror, in the case of Teofilo F. Ruiz’s The Terror of History: On the Uncertainties of Life in Western Civilization (Princeton University Press), which greets the prospective reader by way of Goya’s “Saturn Devouring His Son.”

The ghastliness of the painting never came through when I’d come across it before, in much smaller reproductions, often in black and white. It was inspired by a myth that Freud could have made up, if antiquity had not already obliged. Saturn, having castrated his father (long story), decided to eat his own offspring, which seems like a reasonable precaution given the circumstances. In a familiar version of the story, he swallows the children whole. Tricked into drinking a purgative, Saturn vomits them up and they go on to better things.

Not so in Goya’s rendition. In it, Saturn grasps a headless corpse, pulling away chunks of flesh, bite by bite, with the red insides of the body visible. He is bearded, and he stares at the viewer with a deranged expression while tearing off an arm with his teeth.

The image is unsettling because its grotesquery cuts through gentler versions of the myth to expose a layer of savagery otherwise concealed. Goya’s "Saturn" is part Theodore Kaczynski, part Jeffrey Dahmer. And, by implication, vice versa: The killers are his avatars. Malevolent craziness seems primordial.

Putting the book down after staring at its cover again, I turn by habit to Google News. It is not a comfort.

In The Terror of History, Ruiz, professor of history, Spanish, and Portuguese at the University of California at Los Angeles, describes and reflects on how people have tried to escape the unending pageant of catastrophe, violence, suffering, and random disorder making up the human condition. Referring to them as “the uncertainties of life in Western civilization” is odd, since things are not appreciably less messed-up elsewhere. Gautama Buddha had pertinent things to say on the matter, some of which Ruiz discusses. (Plus the subtitle inevitably calls to mind the comment Gandhi is supposed to have made when asked for his opinion of Western civilization: “I think it would be a good idea.”)

But most of Ruiz’s cultural references come from European and American sources, and the image from Goya serves to anchor his meditations in the Greco-Roman past. “It is a pictorial representation  of one of Ancient Greece’s most telling myths,” he writes. “The god Chronos (time) devours his children out of fear that, as fate has predicted, one of them will overthrow him. And so does Chronos devour all of us.”

It is a very old interpretation of the myth, albeit one resting on an etymological mistake. The Greek counterpart of Saturn was named Kronos, who was not the god of time Chronos, although they probably got each other’s mail a lot. To judge by the work of Victorian mythologist Thomas Bulfinch, this has been going on for a while. Bulfinch also suggests that the mix-up may account for the confusing way Saturn’s reign is depicted in ancient sources. On the one hand, it was supposed to be the Golden Age. On the other hand, there was the constant patriarchic cannibalism. It is hard to reconcile them.

The contradiction runs through Ruiz's book. “We live, as it were, always on the edge of the abyss,” he writes, “and when we think we are happy and at peace, as individuals and as communities, awful things may be waiting just around the corner.” But it is difficult, and maybe impossible, to reconcile ourselves to this; and while hope for a Golden Age is hard to come by, it is our nature “to cling to life, to hope against hope” and create meaning. Drawing on the great Dutch medievalist Johan Huizinga’s work, Ruiz organizes his musings around three grand strategies for finding happiness, or at least mitigating total dread: “through belief (in a whole variety of orthodox and heterodox forms), [through] the life of the senses, and/or through culture and the pursuit of the beautiful.”

Under each of these headings, he arrays quotations from and reflections on a kaleidoscopic array of ancient and modern authors and phenomena: Sophocles, Proust, utopian communes, witch-burning crazes, The Decameron, an insurrection in Brazil in the 1890s, the Marquis de Sade, and The Epic of Gilgamesh, to give a representative sampling. Plus there are memoiristic bits. He mentions teaching “a class on world history from the Big Bang to around 400 C.E.” The book seems more ambitious still.

Insofar as an argument emerges, it is that each strategy to escape “the terror of history” has a powerful appeal, but all of them have a tendency to go off the rails, creating more misery, whether individual or social. For every mystic realizing the oneness of being, you get twenty fanatics who treat homicide as a sacrament. Romantic love is sublime, but it has no warranty. People experiment with utopian communes in spite of their track record and not because of it.

For that matter, authorship is no bower of bliss, either:

“As I sit at my computer, churning out one book or article after another, a suspicion gnaws at my mind. Almost like an alarm clock unpleasantly ringing in the morning’s early hours, it tells me that, as serious as I am about the reconstruction of the past, both the projects themselves and my seriousness are forms of escape, of erasing meaninglessness. It is all a bit delusional. Does my work really amount to anything? Does it really matter? Does it fulfill any meaningful purpose? Early in my career, it meant tenure, promotion, recognition, but now what?”

It bears mentioning here that Saturn also presides over melancholy, often named, along with pride, “the disease of scholars.”

As for the experience of reading The Terror of History, I will report less melancholy than dismay. For a short book displaying enormous erudition, it is awfully repetitive. It stops every so often to tell you what it is about, and every point is restated with some frequency. “I am, of course, not saying anything new here,” Ruiz comments at a couple of points. This invites the distracting question of why it’s being said at all.

In spite of my best efforts to see all of this as deliberate -- even thematic (history repeats itself but we forget what it said the first time, etc.) -- the preponderance of evidence suggests otherwise. It appears not to have been edited very much. If it had been, “Santillana’s famous dictum that those who do not know history are condemned to repeat it” would have been repaired to give George Santayana the credit. The reference to “Russell Jacobi’s forthcoming book on fraternal violence” would not have made me laugh from imagining an 18th-century German philosopher wandering the UCLA campus.

Ruiz twice calls Afro-Cuban music “enervating.” Either he regards the word as a synonym for “energizing” (it means precisely the opposite) or else conga drums fill him with ennui.

He refers to “Nietsche’s elegant dichotomy between the Dionysian (a form of Carnivalesque intoxication) and the Apollonian,” and Ruiz equates the latter term with “rational individualism.” Actually the philosopher applies the word to “the beautiful appearance of the inner fantasy world,” which is not someplace where a lot of rational choice takes place.

A good editor would have caught all of these problems (the list is not exhaustive) while gently helping the author through as many revisions as necessary to subdue the redundancies and unknot the thickets. Consider the following:

“The conundrum here is whether to ignore history – though history most certainly does not ignore us, and it is often unforgiving of our neglect – may not be after all far less demanding of our time and strength and lead to far less grief and more pleasure.”

It is possible to render such a sentence into something coherent on first reading. (I have seen this done.) But books are being cranked out by even the most prestigious of university presses without the red pencil ever touching a manuscript, or whatever the current equivalent might be. A gifted editor adds value to the final product. A capable copy editor does as well. Their numbers are thinning; it seems a matter of time before they disappear from the face of the earth. If Saturn isn’t crunching their bones, then Mammon, god of budget decisions, undoubtedly is.

Historians continue debate about career tracks for Ph.D.s

Leaders of association issue follow-up to their call for rethinking career paths for those who earn doctorates.

Documentary on social critic Paul Goodman

Intellectual Affairs

The title of Paul Goodman's Growing Up Absurd (1960) has taken on a life of its own -- mimicked or alluded to so often (e.g., Growing Up Amish, Growing Up Digital, and Growing Up Dead) that it seems familiar to people who not only haven't read the book, but have no idea there ever was one by that name. As for the subtitle, "Problems of Youth in Organized Society," it named one of the decisive questions of the decade that followed. One of the people interviewed in Jonathan Lee’s "Paul Goodman Changed My Life" -- a documentary released by Zeitgeist Films and screening around the country over the next couple of months -- recalls that for many years it was the one book found in every dormitory. Another says that you couldn't pick up a major magazine without finding Goodman mentioned, or as author of an article.

Within the limits of exaggeration-for-effect, that is actually a fair way to indicate how much of a public presence the author had during the Kennedy administration, and he remained in great demand as a speaker, especially on campuses, for some while after that.

Goodman's political stance was unusual -- “anarcho-pacifist communitarianism” about covers it -- and certainly kept him on the sidelines during the 1950s. But his approach to social criticism was only occasionally one of declamatory denunciation. Much of the time, he preferred to make helpful suggestions toward the public good, in a spirit of responsible citizenship. Imagine the benefits of banning cars from Manhattan, for example, or ending the arms race immediately. Of course, trying to do most of the things he proposed would involve radical change, but so what? A famous piece of graffiti from the 1960s said "Be reasonable, demand the impossible." That might as well have been his slogan.

Goodman was anything but a one-book author, and social commentary was by no means his primary concern. The huge audiences he drew after Growing Up Absurd became a bestseller meant that publishers could not wait to re-issue his earlier work -- his novels and poetry, his University of Chicago dissertation on neo-Aristotelian literary criticism, his volume of psychoanalytic reflections on Kafka, you name it.

Ditto for anything new he wrote. Between 1960 and his death in 1972, he published three or four books a year. He was easily one of the best-known and most-read figures in the country, and "Paul Goodman Changed My Life" is an excellent tribute to his memory and reminder of his influence. It should go a long way toward generating more interest in him than has been evident over the last two or three decades -- when nobody, nobody at all, has been reading him.

An exaggeration for effect, of course. I've been reading him for most of that time, for one. Presumably a few other people have, as well. But still, close enough. Considering the scale of public response to Goodman’s work in final years of his life, the eclipse has been astonishing and all but total. The output of scholarly and critical literature on him has been thin in quantity -- and, for the most part, quality. The most important exception is Here Now Next: Paul Goodman and the Origins of Gestalt Therapy by Taylor Stoehr, a professor emeritus  of English at the University of Massachusetts at Boston, who is Goodman's literary executor. It was published by Jossey-Bass in 1994, and is more far-ranging than the title may suggest. Before fame overtook him, Goodman was involved in a number of academic, psychoanalytic, artistic, and political circles, and Stoehr's monograph is the only attempt, so far, to chart some of his webs of influence and affiliation, at least to my knowledge.

And here is where Jonathan Lee’s documentary gives hope. It does an excellent job of evoking Goodman’s peripatetic and ramshackle career -- the stints teaching at Chicago and Black Mountain College, the years as a lay psychotherapist, the role he played with the off-Broadway Living Theater group, both as playwright and house philosopher. The composer Ned Rorem recounts setting his friend’s poems to music. We get a glimpse of how contemporary students respond to one of Goodman’s essays in a class taught by the adjunct English instructor Zeke Finkelstein at the City College of New York. And, best of all, there are numerous clips of Goodman being interviewed or speaking.

While not charismatic, exactly, he is certainly fearless, an admirable quality in an intellectual and particularly valuable for its scarcity. The interview on William F. Buckley's show "Firing Line" in 1966 is a case in point. The documentary begins with Buckley introducing his guest as  “a pacifist, a bisexualist, a poverty cultist, an anarchist, and a few other distracting things.” Before responding to Buckley’s first question, Goodman objects to how he has been described. “I’m not a poverty cultist," he says. "I do think it's a sign of a good society that it is possible to live in decent poverty, especially if you so choose, that is, if you have more important things to do than to make money.” (He goes on to correct Buckley for misusing the word “axiomatic,” as the host concedes.) But what Goodman doesn’t respond to at all -- noticeably enough -- is the reference to his sexuality. He was candid about it to the point of losing at least a couple of teaching positions. It also got him beaten up.

Goodman could be prickly, egocentric, and not shy about communicating the assumption that he was a genius. Plus he made passes at everybody. He must have been difficult company at times. Some of this comes through in the documentary, and it serves as needed balance to any hagiographic impulse. On the other hand, there was never a valid criticism of Goodman that he hadn't made about himself in a poem or essay somewhere.

The film ends with a suggestion that Goodman's influence and example might revive. Fair enough: some of his work has been reprinted of late, and The Paul Goodman Reader, edited by Stoehr and published by PM Press, is a representative sampling of his work in several fields and genres.

But the possibility of a revival does not explain why his influence and example waned in the first place. During an e-mail discussion with Lee, I asked him why he thought Goodman's star had faded. One thing the director stressed is that Goodman “wasn't a specialist, and therefore did not become a star in any specific academic discipline. His brother Percival told me that if he had written only in one discipline, he would have become famous as an author in that discipline, say psychology, for example, and there would have been an academic constituency to carry him forward.”

At the same time Goodman’s work “is more intellectual, more rationalist, than say a Jack Kerouac, whose [On the Road] is in print. Paul Goodman challenges the reader to think, to act, and reading him is not a dumbed-down experience. I think that we've been continuing a dumbing-down of our public life -- certainly what's available and popular in the mainstream media -- and Goodman is too smart to satisfy the demand for easy, non-challenging material.”

Valid points, as far as they go, though they don’t exhaust the question. Not all of the failings are on the side of the public. The range of subjects in Goodman’s work is great, but so is the range of quality. You have to read a great deal of his work to see how parts of it hold together. He seems to have cobbled together a kind of intellectual framework from elements of Aristotle, Kant, Freud, Dewey, and Kropotkin -- an interesting list, but a slightly odd one. And Susan Sontag’s description of Goodman’s prose is exactly right: “What he wrote was a nervy mixture of syntactical stiffness and verbal felicity; he was capable of writing sentences of a wonderful purity of style and vivacity of language, and also capable of writing so sloppily and clumsily that one imagined he must be doing it on purpose.” (Then again, only a mediocrity is always at his best.)

But there is a passage from the introduction to his book Utopian Essays and Practical Proposals (Random House, 1962) in which Goodman explains himself as clearly as anyone could want, and with a kind of eloquence.

“As my books and essays have appeared," Goodman wrote, "I have been severely criticized as an ignorant man who spreads himself thin on a wide variety of subjects, on sociology and psychology, urbanism and technology, education, literature, esthetics, and ethics. It is true that I don't know much, but it is false that I write about many subjects. I have only one, the human beings I know in their man-made scene. I do not observe that people are in fact subdivided in ways to be conveniently treated by the ‘wide variety’ of separate disciplines. If you talk separately about their group behavior or their individual behavior, their environment or their characters, their practicality or their sensibility, you lose what you are talking about. We are often forced, for analytic purposes, to study a problem under various departments — since everybody can't discuss everything at once, but woe if one then plans for people in these various departments! One will never create a community, and will destroy such community as exists.... I make the choice of what used to be called a Man of Letters, one who relies on the peculiar activity of authorship -- a blending of memory, observation, criticism, reasoning, imagination, and reconstruction -- in order to treat the objects in the world concretely and centrally.”

There are worse models of intellectual activity than this, and Jonathan Lee has done a useful thing by reminding us what it looked like in person.

Scott McLemee is an essayist and critic and the Intellectual Affairs columnist for Inside Higher Ed.

Mad -- or Just Angry?

Few people go down in history by their childhood nicknames, which is probably for the best. But such was the destiny of Gaius Caesar Germanicus, the emperor of Rome from 37 to 41 A.D. and the son of a much-loved commander of the Roman forces stationed in Germany. The father dressed young Gaius up in a kid-sized legionnaire’s uniform, to the delight of the troops, who dubbed him Caligula, meaning “Little Boots.”

The moniker stuck, although the last thing anyone remembers about Caligula is the cuteness. A couple of on-screen depictions of his reign are indicative. It was presented as the height of decadence in Caligula (1979), the big-budget, pornographic bio-pic produced by Penthouse founder Bob Guccione, with Malcolm McDowell as the emperor, featuring numerous Penthouse Pets-of-the-Month, smouldering in lieu of dialogue. (Also, Helen Mirren, minus toga.) I have promised the editors not to embed any video clips from it in this column. Suffice it to say that the film was terrible, and Gore Vidal, who wrote the script, seems to have disowned it just as soon as the check cleared.

Better by far -- indeed, unforgettable -- was John Hurt’s turn as the mad tyrant in “I, Claudius,” the BBC miniseries from 1976. He portrayed Caligula as terrifying and monstrous, yet also strangely pitiful. Power corrupts, and absolute power sounds even more enjoyable. But having every whim met without hesitation does not make the descent into insanity any less agonizing, even for Caligula himself. By the time the emperor is assassinated (at the age of 28, after not quite four years in power), Hurt makes his death seem almost a mercy killing.

The BBC program was adapted from two novels by Robert Graves, who drew in turn from the accounts left by Roman historians -- in particular, The Twelve Caesars by Suetonius. (It was also an influence on Guccione’s film, if not quite as much as Deep Throat.) Most of the really lurid charges about Caligula come down to us via Suetonius: the incest, the cross-dressing, the plan to name his favorite horse to an important position, his effort to pay soldiers with sea-shells….

And, most damaging of all, Suetonius records that Caligula proclaimed himself to be a god. He had altars to himself set up around the empire so that the public could worship him. Other sources confirm this, including the Jewish writers Josephus and Philo. They indicate that Roman officials put up statues of Caligula in synagogues, and that the emperor even tried (unsuccessfully) to plant his idol in the most sacred part of the Temple in Jerusalem.

According to Suetonius, the emperor walked around the palace chatting with the other gods. He would ask people whether they thought he was greater than Jupiter. You didn’t have to be a monotheist to find that sort of thing revolting.

But what if all of these claims about Caligula were wrong, or at least overblown? What if he was, in fact, completely sane -- his awful reputation the product of a smear campaign?

In 2003, Aloys Winterling, a professor of ancient history at the University of Basel, in Switzerland, published a book arguing that the emperor’s strange behavior was, in effect, normal Roman politics carried to extremes. Caligula played hardball with his enemies, giving them every reason to exact posthumous revenge. But the truth could be separated out from the slanders. The volume is now available in English translation as Caligula: A Biography, from the University of California Press.

Winterling’s reassessment of the legend of the mad emperor is hardly as contrarian as it may sound. By the 19th century, classicists had enough fresh material to work with (inscriptions on public buildings, for example, and documents of everyday governance) to feel less dependent on the accounts left by Roman authors. They were learning to take the ancient chronicles with a grain of salt. Suetonius, for example, reports things with all the confidence of an eyewitness, but in fact was writing 80 years after Caligula’s death. Evidently he never heard a rumor about the emperor he didn’t record. That makes his tell-all biographies very entertaining, and even useful in a way, but not exactly reliable.

So there were grounds for reasonable doubt. Revisionist accounts of Caligula have appeared from time to time, suggesting that his reign was not wildly different from that of other emperors. When Winterling published his book in 2003, it coincided with the centennial of the landmark study by Hugo Willrich that first made the case for Caligula as rational politician. (This is unlikely to be a total coincidence, but the translated edition says nothing about it one way or the other.) Winterling even expresses concern that some modern accounts have “gone too far in transforming a ruler depicted as immoral and insane into a good one whose actions were rational.”

The figure portrayed in Caligula: A Biography was a rational and competent leader, but “good” is not a word that comes to mind. He was capable, when pushed, of extreme viciousness, ranging from savage humiliation to torture and execution. Making him angry was never a good idea, but neither was trying to flatter him. The targets of his wrath were almost always his fellow aristocrats – which, according to Winterling’s analysis, is a crucial bit of context to keep in mind.

The core of his argument is that even Caligula’s wildest behavior reflected the instability of the political order, not of his mind. The transition from republic to empire in the decades prior to his reign had generated a rather convoluted system of signals between the Senate (the old center of authority, with well-established traditions) and the emperor (a position that emerged only after civil war).

The problem came from deep uncertainty over how to understand the role that Julius Caesar had started to create for himself, and that Augustus later consolidated. The Romans had abolished their monarchy hundreds of years earlier. So regarding the emperor as a king was a total non-starter. And yet his power was undeniable – even as its limits were undefined.

The precarious arrangement held together through a strange combination of mutual flattery and mutual suspicion, with methods of influence-peddling ranging from strategic marriages to murder. And there was always character assassination via gossip, when use of an actual dagger seemed inconvenient or excessive.

Even those who came to despise Caligula thought that his first few months in power did him credit. He undid some of the sterner measures taken by his predecessor, Tiberius, and gave a speech making clear that he knew he was sharing power with the Senate. So eloquent and wonderful was this speech, the senators decided, it ought to be recited each year.

An expression of good will, then? Of bipartisan cooperation, so to speak?

On the contrary, Winterling interprets the flattering praise for Caligula’s speech as a canny move by the aristocrats in the Senate: “It shows they knew power was shared at the emperor’s pleasure and that the arrangement could be rescinded at any time…. Yet they could neither directly express their distrust of the emperor’s declaration that he would share power, nor openly try to force him to keep his word, since either action would imply that his promise was empty.” By “honoring” the speech with an annual recitation, the Senate was giving a subtle indication to Caligula that it knew better than to take him at his word. “Otherwise,” says Winterling, “it would not have been necessary to remind him of his obligation in this way.”

The political chess match went smoothly enough for a while. One version of what went wrong is, of course, that Caligula became deranged from a severe fever when he fell ill for two months. Another version has it that the madness was a side-effect of the herbal Viagra given to him by his wife.

But Winterling sees the turning point in Caligula’s reign as strictly political, not biomedical. It came when he learned of a plot to overthrow him that involved a number of senators. This was not necessarily paranoia. Winterling quotes a later emperor’s remark that rulers’ “claims to have uncovered a conspiracy are not believed until they have been killed.”

In any event, Caligula responded with a vengeance, which inspired at least two more plots against him (not counting the final one that succeeded); and so things escalated. Most of the evidence of Caligula’s madness can actually be taken, in Winterling's interpretation, as ways he expressed contempt for the principle of shared power -- and, even more, for the senators themselves.

Giving his horse a palace and a staff of servants and announcing that the beast would be made consul, for example, can be understood as a kind of taunt. “The households of the senators,” writes Winterling, “represented a central manifestation of their social status…. Achieving the consulship remained the most important goal of an aristocrat’s career.” To put his horse in the position of a prominent aristocrat, then, was a deliberate insult. It implied that the comparison could also be made in the opposite direction.

So Caligula was crazy … like a fox. Winterling reads even Caligula’s self-apotheosis as a form of vengeance, rather than a symptom of mental illness. Senators had to pretend to believe that he conversed with the gods as an equal. Declaring himself divine gave him ever more humiliating ways to make them grovel -- to rub their noses in the reality of his brute and unchecked power.

It was one-upsmanship on the grandest possible scale. Beyond a certain point, I’m not sure where anger ends and madness begins. But Winterling makes a plausible case that his reputation was worse than his behavior. The memory of their degradation by Caligula gave the aristocracy every reason to embellish his real cruelties with stories that were contrived later. In the period just after the emperor's death, even his worst enemies never accused him of incest; that charge came decades afterwards.

So his reign may not have been as surreal as it sounded, but rather a case of realpolitik at its nastiest. Still, it won't be Winterling's portrait that flashes before my mind's eye the next time anyone mentions Caligula. It's a fascinating book, but it can't displace those indelible images of John Hurt in the grip of his delusions, screaming in pain from the voices in his head, and doing terrible things to his sister.

Scott McLemee, scott.mclemee@insidehighered.com
