Irish historians have watched the legal case relating to the witness statements from participants in the conflict in Northern Ireland held by Boston College with great interest and with no little trepidation.
Regardless of the ultimate outcome of the case, there are real fears that the controversy has already jeopardized the collection and preservation of historical material relating to the conflict in Northern Ireland.
One friend, who was instrumental in helping University College Dublin Archives to secure a significant collection of private papers that includes material relating to the Northern Ireland peace process, remarked recently that it would have been more difficult to convince the donor to preserve his papers and donate them to an archive if the controversy at Boston College had previously been known.
The great difficulty here is that any comprehensive history of the Northern Ireland conflict will be very dependent on statements from the men and women who were directly engaged in the events: republicans, loyalist paramilitaries, police, British army personnel, politicians, public servants, and the ordinary people whose lives were shaped by the conflict. The nature of the conflict in Northern Ireland was such that no existing archive can expect to stand as a sufficient source for the writing of plausible history; the words of the people who lived through (and participated in) the conflict need to be preserved to allow for the creation of a more meaningful historical record.
The Boston College interviews are one of several series of interviews that currently exist, or are now being collected. Oral history is especially important if we are to tell the story of everyday life during these years, and the motivations and reflections of men and women who did not hold positions of leadership.
Irish historians are very conscious of the importance of such testimonies, because a comparable archive exists relating to the 1916 Rising and the Irish war of independence. In the 1940s and early 1950s the Bureau of Military History – funded by the Irish government – collected statements from men and women who participated in these events. Some of those men and women engaged in violence or other acts about which they might not have been willing to speak publicly. The statements were finally released in 2004, 50 years after they were collected, when all the witnesses had died.
Although this delay has been criticized, it shows a respect for the witnesses and indeed for all who were affected by the events narrated in these testimonies. These statements, and the anticipated release in the near future of thousands of Military Pension Files containing further firsthand statements from those involved in the War of Independence, provide a permanent and valuable record of a critical period in the emergence of contemporary Ireland.
These firsthand accounts have transformed the understanding of these years, bringing it to life in a manner that more formal records cannot do.
The oral statements of participants in the conflict in Northern Ireland offer a similar potential to provide a rounded account of these years. This will only happen, however, if those making statements can trust the record-taker, and trust the place where these records are deposited.
This trust requires firm assurances that the statements will not be released prematurely, or divulged other than under the terms agreed. The witness statements should be collected with the intent of creating a long-term historical record; while there may be an understandable eagerness to gain access to them in order to be first with the story, they are best left undisturbed for a significant period of time. Essentially, they should be collected and protected for posterity – not for the present.
University College Dublin (UCD), in common with other research universities, has a clear code of ethics that applies to all material that relates to identifiable individuals; securing their consent to any use that permits them to be identified is a key requirement.
In addition, researchers and archivists must observe the requirements of the Data Protection Act, which precludes the revealing of personal information relating to matters such as health, family circumstances or financial records; these regulations are strictly enforced. Many of the private collections deposited in UCD Archives can only be accessed with the permission of the donor.
While testimonies relating to paramilitary activities are obviously of a particularly sensitive nature, there are recognized laws and procedures in place that protect the witness, the archive, the archivist and the researcher – provided that they are observed.
The issue may become more complex when records are transferred from one country to another, if the legal framework relating to data protection and disclosure is different, but again, a robust protocol and clearly-determined governance – agreed before any records are compiled – should reduce these risks.
Oral histories are extremely valuable sources for posterity, and they are becoming of still greater importance in an age when communication increasingly takes the form of telephone conversations, e-mails, texts, tweets and other means; these are obviously less easily preserved than letters or written memorandums.
Ultimately, there will be lessons to be learned from the specifics of the Boston College case. The overarching ambition must remain unchanged: to ensure that a trusted record of the past can be compiled and preserved for posterity.
Mary E. Daly is professor of modern Irish history at University College Dublin.
It was a classic instance of blaming the messenger: Spanish newspapers carried the earliest reports of a new illness that spread across the globe in the final months of World War I, and so it became known as “Spanish influenza,” although its real point of origin will never be known. It was virulent and highly communicable. A paper appearing in the Centers for Disease Control and Prevention journal Emerging Infectious Diseases a few years ago estimated that 500 million people, almost a third of the world’s population, were stricken with it. By the end of its reign of terror in the final months of 1920, there were 50 million fatalities -- more than three times as many as died from the war itself. These figures may be on the low side.
In her two long essays on illness, Susan Sontag grappled with the strong and longstanding tendency to treat certain diseases as meaningful: the vehicle for metaphors of social or cultural disturbance. “Feelings about evil are projected onto a disease,” she wrote. “And the disease (so enriched with meanings) is projected onto the world.” Just so, one would imagine, with a pandemic. Something in a plague always hints at apocalypse.
But the striking thing about Spanish influenza is how little meaning stuck to it. Plenty of sermons at the time must have figured the Spanish flu as one of the Four Horsemen, but the whole experience was quickly erased from collective memory, at least in the United States. In 1976, the historian Alfred W. Crosby published a monograph called Epidemic and Peace: 1918 that Cambridge University Press later issued as America’s Forgotten Pandemic (1989). Apart from being snappier, the new title underscored the pandemic’s almost total disappearance from anything but the specialist’s sense of history. One person in four in the U.S. suffered from an attack of Spanish flu, and it killed some 675,000 of them. The catastrophe seems never to have interested Hollywood, though, and the only work of fiction by an author who lived through the outbreak, so far as I know, is Katherine Anne Porter’s novella “Pale Horse, Pale Rider.” (Biblical imagery seems just about unavoidable.)
The title of Nancy K. Bristow’s American Pandemic: The Lost Worlds of the 1918 Influenza Epidemic (Oxford University Press) is an echo of Crosby’s America’s Forgotten Pandemic. I don’t want to read too much into the one-word difference, but it does seem that the influenza crisis of almost a century ago has been working its way back into public awareness in recent years. Several more books on the subject have appeared since Gina Kolata’s best-seller Flu: The Story of the Great Influenza Pandemic of 1918 and the Search for the Virus That Caused It came out in 1999. The Public Broadcasting Service has done its part with an excellent documentary as well as an episode of Downton Abbey in which pestilence hits a country house in England during a dinner party.
So “forgotten” is no longer quite the right word for the nightmare. But it remains almost impossible to imagine the ferocity of the pandemic, much less its scale. The contemporary accounts that Bristow draws on retain their horror. Doctors wrote of patients changing from “an apparently well condition to almost prostration within one or two hours,” with raging fevers and severe pain in even the milder cases – and the worst involving a “bloody exudate” coughed up from the “peculiar and intense congestion of the lungs with [a] hemorrhage,” so that it was “simply a struggle for air until they suffocate.”
Morgues were overrun. In poor households, several delirious family members might be crowded into the same bed along with someone who had died. Those who made it to the hospital could lie unattended for days at a time. The authorities were issuing “don’t worry, it’s just another flu”-type pronouncements well into the catastrophic phase of the epidemic. Quarantines and bans on public gatherings were easier to proclaim than to enforce. Having absorbed the relatively new idea that disease was spread by germs, people donned surgical masks to protect themselves – to no avail, since influenza was a virus. The epidemic went through three waves of contagion in as many years, and it wore down whatever patience or civic-mindedness people had when the disaster hit.
A pandemic, by definition, puts everyone at risk. But access to medical help – inadequate as it proved – was far less egalitarian. (As is still the case, of course.) Much of the historical scholarship on disease in recent decades has stressed how the interaction between medical professionals and their clientele tends to reinforce the social hierarchies already in place. Bristow’s work follows this well-established course, combining it with a familiar emphasis on the changes in medicine’s public role in the wake of Progressive Era reforms.
She writes about how poor, immigrant, or Native American sufferers were assumed guilty “of dishonesty and laziness, and of attempting to take advantage of others’ generosity” until proven otherwise, while the African-American population was forced “to continue relying on their own too limited community resources as they sought to provide sufficient care for their sick neighbors.” And while the U.S. Public Health Service had been created in 1912, its capacity to respond to the influenza crisis was limited, given how poorly the disease was understood. Even gathering reliable statistics on it proved almost impossible while the virus was on its rampage.
The most interesting chapter of American Pandemic considers how doctors and nurses responded to the crisis. Although they often worked side by side, their experiences stood in marked contrast.
“Ignorant of the disease’s etiology, uncertain of the best methods of treatment, and unable to ease the suffering of their patients,” Bristow writes, “physicians often expressed a sense of helplessness as individuals and humility as members of a profession.” (You know something is catastrophic when it reduces doctors to humility.)
Belonging to an almost entirely male profession, they “gauged their work against the masculine standards of skill and expertise” – and the inevitable military metaphor of going to battle against the disease became that much more intense given the example of actual soldiers fighting and dying in the trenches. But the influenza virus was stronger. “Like a hideous monster,” one physician wrote, “he went his way, and none could hinder.” Doctors’ letters and diaries from the period reflect a sense of bewilderment and failure.
For a while the authority of the profession itself was undermined. Patent medicines and related quackery proved no more effective in treating or preventing the disease than anything the doctors could offer. But they weren’t any less effective, either.
The nurses could not have responded more differently. Caring for patients was “a terrific test” and “high privilege,” they wrote, “a most horrible and yet most beautiful experience.” As with doctors, many lost their lives while tending to the sick. But one nurses’ report said that the work was “one of the most immediately satisfactory experiences of our lives” for those who survived it, “and this is true even though we were borne down with the knowledge that, do all we might, the pressing, tragic need for nursing was much greater than could possibly be met.”
And this, too, was a matter of gauging their skill by socially sanctioned gender norms. “Women working as nurses aspired to what they viewed as the uniquely feminine qualities of domesticity, compassion, and selflessness,” writes Bristow. “To measure up to these standards nurses needed only to care for their patients, not cure them, and this they proved able to do.”
A few hours after choosing American Pandemic for this week’s column, I attended a public event at which every third person seemed to be coughing, with a congested wheeze usually audible. Synchronicity is not always your friend. For the past several days I have been reading about influenza, and writing about it, while suffering from one of its milder variants. The experience is not to be recommended.
Two quick points might be worth making before the medication kicks in. Bristow’s final assessment is that the horror and devastation of the pandemic could not be reconciled with the preferred national narrative of progress and redemption, “with its upbeat tenor and its focus on a bright future.” At most, its memory was preserved as part of local history, or through family stories.
The argument is plausible, to a degree, but it overlooks the element of trauma involved – not just the suffering and death during that period, but the feeling of being rendered helpless by an event that came out of nowhere.
And what sense does it make to think of the events of 1918-20 as “America’s pandemic,” forgotten or otherwise? Deaths from influenza in the United States during that period represent something like 1.4 percent of the world’s fatalities from the disease. How was it remembered -- or erased from public memory -- elsewhere? Diseases don’t respect borders, and it’s hard to see why the historiography of disease should, either.
In November, Pew Research Center released a report discussing the level of belief in American exceptionalism in the United States. It gauged this by asking whether interviewees accepted the statement "Our people are not perfect, but our culture is superior to others." I have been interested in the history of theories of American exceptionalism for more than twenty years, and gave it a look. Formulating the idea that way struck me as obnoxious and fairly absurd. But then the Pew people are specialists in public opinion research -- and feelings of superiority (or rather, anxieties over it) do seem to be what is at stake as the expression American exceptionalism is used in U.S. politics lately. (It's worth noting that it's the authors of the report who make a connection between superiority and exceptionalism. The interviewers didn't explicitly ask about the latter.)
Republican candidates keep proclaiming their faith in American exceptionalism, or smiting Obama for his failure to believe in it. Not long ago somebody published a letter to the editor claiming that Obama hates American exceptionalism, which would seem to imply that he must believe in it, since hating something you don’t believe in sounds difficult and a real waste of time. But it’s probably best not to expect too much logical consistency at this point in the electoral season.
Obama himself is at least somewhat culpable for the whole situation. The furor all started in 2009 when, in response to a question, he said: “I believe in American exceptionalism, just as I suspect that the Brits believe in British exceptionalism and the Greeks believe in Greek exceptionalism.” That, too, is a misreading of the term, equating it with something like national self-esteem. But of a healthy sort -- well shy of narcissistic grandiosity, with plenty to go around. That's probably what got him into trouble.
Anyway, the Pew study yielded some interesting results. Pew's researchers have been asking whether people agreed with the sentiment "Our people are not perfect, but our culture is superior to others" for at least 10 years now. In 2002, 60 percent of the Americans polled said they did. The figure fell to 55 percent in 2007. Last year, just 49 percent of respondents agreed, with nearly as many (46 percent) saying they disagreed. “Belief in cultural superiority has declined among Americans across age, gender and education groups,” the Pew report said.
The same question was posed in surveys conducted in Britain, France, Germany, and Spain. The level of agreement was higher in the U.S. than elsewhere (Germany and Spain were fairly close), but the variations are less interesting than what held constant: “In the four Western European countries and in the U.S., those who did not graduate from college are more likely than those who did to agree that their culture is superior, even if their people are not perfect.”
Make of that what you will. For my part, the really odd thing about all the recent endorsements of American exceptionalism is that the very expression came into the world as the name for a Communist heresy.
The image of America as a city upon a hill -- uniquely favored by the Almighty and a light unto the heathens -- is older than the United States itself, of course. And it’s true that visitors to the country, including Alexis de Tocqueville, have long declared it “exceptional,” in one way or another, and not always for the better. Charles Dickens thought we were exceptionally prone to printing his books without permission, let alone paying him royalties. But the term "American exceptionalism" is more recent, and it took the Comintern to launch the Republican candidates' preferred way of recommending themselves these days.
Circa 1927-28, a group of American Communist Party leaders began arguing that, yes, the U.S. economy would undoubtedly succumb to the contradictions of capitalism, sooner or later, but it still had plenty of life in it yet, so the comrades abroad should keep that in mind, at least for a while. Their perspective was in accord with the ideas of the Bolshevik theorist Nikolai Bukharin concerning the world economic situation, and he was the one, after all, in charge of the Communist International. So all was copacetic, at least until the summer of 1928, when Stalin quit taking Bukharin’s phone calls.
Before long, the American leaders were called on the carpet by the authorities in Moscow, and found themselves denounced by Stalin himself for an ideological deviation: "American exceptionalism.” Stalin also told them, "When you get back to America, nobody will stay with you except your wives." That turned out to be a slight exaggeration, but they were promptly expelled from the party when they got back home -- taking around a thousand fellow American exceptionalists with them.
As it happened, all of this was just a few months before the stock market crash on Black Tuesday, which made the whole debate seem rather moot. But a catchphrase was born. Stalin’s speeches blasting American exceptionalism were printed as a pamphlet in an enormous edition. The pro-American exceptionalism Communists went off to start their own group, which had a strange and complex history that deserves better scholarship than it has received. But that seems like enough esoterica for now.
David Levering Lewis puts the neologism into a wider context with his essay “Exceptionalism's Exceptions: The Changing American Narrative,” in the new issue of the American Academy of Arts and Sciences’ journal Daedalus. Lewis, now a professor of history at New York University, received a Pulitzer Prize for each of the two volumes of his biography of W.E.B. Du Bois.
“[I]ts Soviet originators defined American exceptionalism as the colossal historical fallacy that imagined itself exempt from the iron laws of economic determinism,” Lewis writes, “whereas most American academics and public intellectuals … avidly embraced a phrase they regarded as an inspired encapsulation of 160 years of impeccable national history.” One of the handful of figures to give the idea a careful, skeptical examination, Lewis says, was Du Bois. In his masterpiece Black Reconstruction (1935), he wrote that “two theories of the future of America clashed and blended just after the Civil War.” One was “abolition-democracy based on freedom, intelligence, and power for all men,” and the other was “a new industrial philosophy” with “a vision not of work but of wealth; not of planned accomplishment, but of power.”
American exceptionalism was, in effect, the happy belief that these tendencies reinforced each other. That was not a credible idea for an African-American who received his Ph.D. from Harvard one year before the Supreme Court ruling in Plessy v. Ferguson that endorsed “separate but equal” treatment of the races. For Du Bois, writes Lewis, “the cant of exceptionalism survived mainly to keep the Moloch of laissez-faire on life support even as its vital signs failed in the wake of the Great Crash of 1929.”
The doctrine of exceptionalism proved hardier than Du Bois imagined, as the years following World War II showed. Lewis mentions that Henry Luce “had already given the world its peacetime marching orders in ‘The American Century,’ a signature 1941 editorial in Life.” Eight other contributors, most of them historians, join Andrew J. Bacevich in assessing that line of march in The Short American Century: A Postmortem (Harvard University Press), a collection of essays spinning off from a lecture series Bacevich organized at Boston University in 2009-2010.
“By the time the seventieth anniversary of Luce’s famous essay rolled around in 2011,” the editor writes, “the gap between what he had summoned Americans to do back in 1941 and what they were actually willing or able to do had become unbridgeable.” Unfortunately the editorial is not reprinted, and it loses something in paraphrase -- a bracing tone of stern moral uplift, perhaps, inherited from Luce’s parents, who had been missionaries in China. Here’s a sample:
“[W]hereas their nation became in the 20th Century the most powerful and the most vital nation in the world, nevertheless Americans were unable [after World War One] to accommodate themselves spiritually and practically to that fact. Hence they have failed to play their part as a world power -- a failure which has had disastrous consequences for themselves and for all mankind. And the cure is this: to accept wholeheartedly our duty and our opportunity as the most powerful and vital nation in the world and in consequence to exert upon the world the full impact of our influence, for such purposes as we see fit and by such means as we see fit.”
And plenty more where that came from. “When first unveiled,” Bacevich notes, “Luce’s concept of an American Century amounted to little more than the venting of an overwrought publishing tycoon.” By the end of the war, that had changed: “Claims that in 1941 sounded grandiose became after 1945 unexceptionable.” The American Century brought “plentiful jobs, proliferating leisure activities, cheap energy readily available from domestic sources, and a cornucopia of consumer goods, almost all of them bearing the label ‘Made in the U.S.A.’ ” And all of it while, in Luce’s words, “exert[ing] upon the world the full impact of our influence, for such purposes as we see fit and by such means as we see fit.”
Well, and how did that turn out? The contributors are not of one mind. “As international regimes go, much of the American Century, despite the chronic tensions and occasional blunders of the Cold War (and especially the tragedy of Vietnam) was on the whole a laudably successful affair,” writes David M. Kennedy. For Emily S. Rosenberg, “the period of maximum U.S. power and influence” was “a precursor to a global Consumer Century” that “proved highly adaptive to local cultural variation,” so that “Americanization” is a misnomer for globalization.
In counterpoint, Walter LaFeber rebukes Luce’s vision all along the line. He writes that the American Century “never existed except as an illusion, but an illusion to which Americans, in their repeated willingness to ignore history, fell prey.”
T. J. Jackson Lears writes in praise of a “pragmatic realism” informed by the pluralism of William James and Randolph Bourne, and says it “requires a sense of proportionality between means and ends, as well as a careful consideration of consequences – above all, the certain, bloody consequences of war.” But his essay does not exactly portray the American Century as a triumph of pragmatic realism. (C. Wright Mills’s description of the nuclear war strategists’ “crackpot realism” seems a little more apropos.)
Bacevich’s essay concluding the book brings us up to the moment by stressing how interconnected the American Century and American exceptionalism have become. “To liken the United States to any other country (Israel possibly excepted) is to defile a central tenet of the American civil religion. In national politics, it is simply impermissible.” Luce’s vision “encapsulat[es] an era about which some (although by no means all) Americans might wax nostalgic, a time, real or imagined, of common purpose, common values, and shared sacrifice.”
Such yearning is understandable, but nostalgia is bad for you: it makes the past seem simpler than it was. And the world has probably had as much exceptionalism as it can stand. As the American psychologist Harry Stack Sullivan put it, we are all much more simply human than anything else. And it seems like there must be a better use of a political figure's time than assuring people that they are all above average.