History

Essay on how historians are defining what students need to learn

Giants can move. So can venerable, cautious scholarly organizations like the American Historical Association. In a recent New York Times op-ed, Kevin Carey of the New America Foundation asked "Who Will Hold Colleges Accountable?" As a professor at Colorado College, and faculty chair of the AHA’s Tuning Project, I can answer: we will. At a moment when college education and the value it provides students, their families, and American society in general seem continuously under attack, the American Historical Association has been quietly helping its members define and promote the value of history. Carey’s piece, pointing out the outdated notion of credit hours that grant students "credit" and eventually degrees for the act of sitting in chairs or staring at screens, thoughtfully calls for scholarly societies to "define and update what it means to be proficient in a field."

The AHA is developing just such a set of definitions. As a group of professional teachers and scholars of history, we do have standards and expectations for what it means to learn to think historically. We should be able to explain what college students who take history courses and major in history have gained from their effort. This might be risky because scholarly organizations generally avoid telling people what they should know, teach, or research in a given discipline. But with the help of a grant from the Lumina Foundation and 70 history departments and programs, the AHA Tuning Project is moving toward a "discipline core."

Now a "discipline core" is not your grandmother’s set of facts that all history students should know – the dreaded lists of dates, czars, emperors, wars, presidents or their wives – but a set of skills and habits of mind that college-educated students should have. And, it turns out, historians can agree about what people with a history degree should be able to do. The 14,000 members of the AHA don’t and won’t ever agree about what facts students should know, but we can agree about the importance of evidence in generating interpretation and the imperative of developing a rich context around those facts. For example, someone with a history major probably could have saved the Gap some money and bad PR by explaining the historical context of "Manifest Destiny" and why that phrase might not be an ideal T-shirt slogan.

History students need to be able to find and sift information, read with a critical eye, assess evidence from the past, write with precision, and tell stories that analyze and narrate the past effectively. We can also agree about a variety of ways students can demonstrate such skills. None of these can be assessed with fill-in-the-bubble tests or any national standardized test; all require meaningful assignments, student responses, and attentive faculty feedback. Take number 8 from the AHA’s discipline core: "Explore multiple historical and theoretical viewpoints that provide perspective on the past." Students could demonstrate this very simply by presenting, in written or oral form, a range of descriptions of a specific event. To use the Manifest Destiny example again, a student could describe how this concept emerged in the 1840s and how people in Mexico, in Washington, and in Texas or California might have perceived it as imperial ambition or dangerous racism. Understanding that these different descriptions represent different points of view is great practice in perspective-taking, a tremendously important skill.

The collaborative process that is central to "tuning" means that this set of professional standards will not be prescriptive, but rather will provide reference points to guide history departments and history teachers. Each college and university will read and use a core of professional standards to design courses and degrees that reflect the varied missions and contexts of educational institutions. Core values and standards that define history as a discipline and the value of historical thinking can and will build programs that do far more than require students to be present for a set of credit-bearing hours.

To learn these skills, students have to practice them -- a fact that will immediately ratchet up what goes on in and out of classrooms. Time in a classroom is not, as some skeptics suggest, a waste of money and effort, but essential to real learning. Students have to speak, write, and communicate in a variety of media and to have their work assessed carefully. They need places to practice both skepticism and empathy to acquire the habits of mind required of a history student. These abilities are essential to having thoughtful leaders and citizens, and college graduates with value in the workplace and the community – the central promise of a college education.

Scholarly societies and disciplinary organizations can and should develop professional standards that insist on effective practices at colleges and universities. The AHA is betting that professional historians want to be held accountable for what their students should know and be able to do.

Anne Hyde is professor of history at Colorado College and faculty chair of the American Historical Association Tuning Project.


Essay on "Fascism: Journal of Comparative Fascist Studies"

Intellectual Affairs

Fascism is alive and well in the United States, at least as an epithet. The Third Reich provides the one set of historical analogies everybody will recognize. No more damning evidence about the state of American cultural literacy can be given.

Regardless of who is in office, protesters will wave photographs of the president with the nostril-wide mustache inked in. And whenever a city council or state legislature considers a ban on public smoking, the most unlikely people start complaining about the Gestapo. (First they came for the snuff-dippers, and I did not speak out, for I was not a snuff-dipper….) That seems indicative less of ignorance than of a low threshold for frustration. If there is one thing we can all agree on about totalitarianism, it’s the inconvenience.

The tendency has grown worse over the past dozen years. That, like everything else, can probably be blamed on the Internet, though no doubt some of the responsibility belongs to the History Channel, where Poland is invaded at least twice a day. After a while, fatigue sets in. But then you read about something like the Golden Dawn in Greece -- a rapidly growing party using a streamlined swastika as its emblem – and the word “fascist” ceases to be a free-floating signifier of vituperation. It begins to mean something again. But what?

Fascism: Journal of Comparative Fascist Studies published its first semiannual issue in October. While the journal is not particularly focused on recent developments in the streets, they echo in it even so. The tendency I’ve just complained about – the stretching of a concept so thin that it seems to have almost no substance – has its parallel in the scholarship on fascism. And so does its return to a more substantial form.

St. Augustine said he knew what time was until someone asked him to explain it. Then the trouble started. A similarly perplexed feeling comes over someone reading historiographical efforts to get a handle on fascism.

It’s easy enough to start out with definition by ostension – that is, by pointing to the movements and regimes of Mussolini and of Hitler. And all the more so, given that the Italian leader not only coined the term fascism but wrote an encyclopedia entry on it, or at least signed one. But for all the inspiration Hitler and his early supporters took from Mussolini’s rise to power, Nazi doctrine grew out of its own distinct set of German influences. Racism -- and in particular anti-Semitism of a special variety, bolstered by pseudoscience – played a role in Hitler’s worldview strikingly absent from Mussolini’s doctrine.

And that doctrine itself had a paradoxical aspect. On the one hand, it was, so to speak, nationalism on steroids – deeply hostile to internationalism, especially of the Marxist variety. (In the late '10s and early '20s, Germany and Italy alike experienced long revolutionary crises, with left-wing parties making serious bids for power.) At the same time, fascist organizations sprang up all over Europe and in North and South America, with a few also appearing in Asia. Some adherents thought of fascism as a “universal” movement: a new stage of society, of which the Italians, and later the Germans, were setting the example. In 1934, fascist delegates gathered in Switzerland for a world congress, although the effort soon foundered on ideological differences.

So even the fascists themselves couldn’t agree on how to understand their movement. Nor could historians and political scientists studying them after the defeat of the “classical” fascist regimes. A familiar dichotomy between “lumpers” and “splitters” played itself out, with the former emphasizing common elements among the fascist organizations (authoritarianism, nationalism, leader-worship, a tendency to wear uniforms) and interpreting the movement as the product of larger forces (social anomie, economic crisis, resistance to modernization, etc.).

A good précis of the splitters’ response to lumper theorizing appeared in an article by Gilbert Allardyce in The American Historical Review in 1979. Focusing just on Nazism and Italian fascism, he stressed that “one arose in the most advanced industrial nation in Western Europe; the other, in a country still largely underdeveloped. Getting both into any uniform theory is hard enough, but getting both into the same stage of modernization is impossible. Interpretations that make sense in the case of one regime often make no sense in the case of the other.”

At least one historian took the next logical step. Italy was fascist under Mussolini. Fascism involved the dictatorial push of a largely preindustrial society into the age of mechanical reproduction. That wasn’t necessary in Germany. Therefore, Hitler was not a fascist. Likely it would be possible to disprove this syllogism with a Venn diagram or two; but in any event, it feels wrong somehow.

Much of the academic literature on the Italian and German regimes – and just about all of the popular history – goes about its business without getting too bogged down in the “generic fascism” problem. The devil is truly in the details. But the new journal Fascism takes the possibility of a generic concept of the movement as its point of departure, and in ways that seem worth watching.

The field of “comparative fascist studies” as pursued in the journal takes its bearings from Roger Griffin’s understanding of fascism as an ideology defined by a core of “palingenetic ultranationalism” which manifests itself in specific kinds of populist mobilization and charismatic leadership. Griffin, a professor of modern history at Oxford Brookes University, in Britain, first presented this argument in The Nature of Fascism (1991).

Now, before saying another word, I want to point out that calling fascism “palingenetic” is not in any way meant as a slur on the beloved former governor of Alaska, vice presidential candidate, and reality television star. Palingenesis means “regeneration, rebirth, revival, resuscitation,” according to the Oxford English Dictionary. “In philosophy and natural science, formerly applied spec[ifically] to the (supposed) regeneration of living organisms from ashes or putrefying matter, to the preformation theory of Charles Bonnet (1720–93), and to the persistence of the soul (metempsychosis) or (in Schopenhauer) of the will from one generation to another.” Let’s be perfectly clear about that. I don’t want any trouble.

In Griffin’s usage, the term carries overtones of both regeneration-from-putrefaction and a sort of reincarnation. A fascist movement seeks a rebirth of the nation’s soul by overcoming its degeneration. It promises not merely a return to “the good old days” but an extreme, and usually violent, new beginning. The national revival comes through “a ‘populist’ drive towards mobilizing the energies of all those considered authentic members of the national community,” with the charismatic aura of the leader and “the pervasive use of theatrical and ritual elements in politics.” Xenophobia and genocidal racism are typical elements but not, as such, absolutely necessary. The main thing is that there be “groups identified with physical and moral decadence,” whose ejection from the nation would be a step towards its rebirth.

It is not so much a theory of fascism as a decent set of fingerprints. Griffin’s description doesn’t explain the movement’s origin or viability in any given country, but it identifies what the various movements shared. That is significant on more than just typological grounds. The program of “comparative fascist studies” as it emerges from Griffin’s keynote essay in the first issue of Fascism, confirmed by the articles following it, includes research into how organizations in various countries influenced one another in the years between Il Duce’s march on Rome in 1922 and the Führer’s suicide in 1945 – and since then, as well. For while the effort to create a “universal fascist” movement collapsed during the Great Depression, the project itself carries on. After a dozen years and more of incredibly trivializing and dumb references to fascism, the rolling economic crisis may yet give it some bite.

A final note. I picked up Fascism at the table of its publisher, Brill, during a conference last month. But it’s also available online, in full, as an open-access journal. The first issue is now available; the next comes out in April.
 


Survey examines how senior historians view academic careers


At universities, teaching isn't highly valued, but at bachelor's institutions, research is highly valued, survey finds. And that research had better not be digital. Study also finds senior professors dissatisfied with academic leaders and students.

Essay on Death of Eugene Genovese

Intellectual Affairs

An ancient and corny joke of the American left tells of a comrade who was surprised to learn that the German radical theorist Kautsky’s first name was Karl and not, in fact, “Renegade.” He’d seen Lenin’s polemical booklet The Proletarian Revolution and the Renegade Kautsky but only just gotten around to reading it.

Eavesdropping on some young Marxist academics via Facebook in the week following the historian Eugene Genovese’s death on September 26, I’ve come to suspect that there is a pamphlet out there somewhere about the Renegade Genovese. Lots of people have made the trek from the left to the right over the past couple of centuries, of course, but no major American intellectual of comparable substance has done so in recent memory, apart from Genovese. People may throw out a couple of names to challenge this statement, but the operative term here is “substance.” Genovese published landmark studies like Roll, Jordan, Roll: The World the Slaves Made (1974) and – with the late Elizabeth Fox-Genovese, his wife – Fruits of Merchant Capital: Slavery and Bourgeois Property in the Rise and Expansion of Capitalism, not score-settling memoirs and suchlike.

As for the term “renegade,” well… The author of the most influential body of Marxist historiography in the United States from the past half-century turned into one more curmudgeon denouncing “the race, class, gender swindle.” And at a meeting of the Conservative Political Action Conference, no less. The scholar who did path-breaking work on the political culture of the antebellum South – developing a Gramscian analysis of how slaves and masters understood one another, at a time when Gramsci himself was little more than an intriguing rumor within the American left – ended up referring to the events of 1861-65 as “the War of Southern Independence.”

Harsher words might apply, but “renegade” will do.

He is listed as “Genovese, Gene” in the index to the great British historian Eric Hobsbawm’s autobiography Interesting Times: A Twentieth-Century Life (2002). Actually, I now have to change that to “the late, great British historian” Hobsbawm: he died on October 1.

The two of them belonged to an extremely small and now virtually extinct species: the cohort of left-wing intellectuals who pledged their allegiance to the Soviet Union and other so-called “socialist” countries, right up to that system’s very end. How they managed to exhibit such critical intelligence in their scholarship and so little in their politics is an enigma defying rational explanation. But they did: Hobsbawm remained a dues-paying member of the Communist Party of Great Britain until it closed up shop in 1991.

The case of Genovese is a little more complicated. He was expelled from the American CP in 1950, at the age of 20, but remained close to its politics long after that. In the mid-1960s, as a professor of history at Rutgers University, he declared his enthusiasm for a Vietcong victory. It angered Richard Nixon at the time, and I recall it being mentioned with horror by conservatives well into the 1980s. What really took the cake was that he’d become the president of the Organization of American Historians in 1978-79. Joseph McCarthy and J. Edgar Hoover had to be spinning in their graves.

When such a sinner repents, the angels do a dance. With Eric Hobsbawm, they didn’t have much occasion to celebrate. Though he wrote off the Russian Revolution and all that followed in its wake as more or less regrettable when not utterly disastrous, he didn’t treat the movement he’d supported as a God that failed. He could accept the mixture of noble spirits and outright thugs, of democratic impulses and dictatorial consequences, that made up the history he'd played a small part in; he exhibited no need to make either excuses or accusations.

Genovese followed a different course, as shown in the landmark statement of his change in political outlook, an article called “The Question” that appeared in the social-democratic journal Dissent in 1994. The title referred to the challenge of one disillusioned communist to another: “What did you know and when did you know it?” Genovese never got around to answering that question about himself, oddly enough. But he was much less reluctant about accusing more or less everybody who’d ever identified as a leftist or a progressive of systematically avoiding criticism of the Soviets. He kept saying that “we” had condoned this or that atrocity, or were complicit with one bloodbath or another, but in his hands “we” was a very strange pronoun, chiefly meaning “you.”

What made it all even odder was that Genovese mentioned, almost in passing, that he’d clung to his support for Communism “to the bitter end.” If decades of fellow-traveling showed a failure of political judgment, “The Question” was no sign of improvement. His ferocious condemnation seemed to indicate that everyone from really aggressive vegans to Pol Pot belonged to one big network of knowing and premeditated evil. You hear that on talk radio all the time, but never from a winner of the Bancroft Prize for American history. Or almost never.

Recognizing that Genovese’s “open letter to the left [was] intended to provoke,” Dissent’s editors “circulated it to people likely to be provoked” and published their responses, and Genovese’s reply, in later issues. The whole exchange is available in PDF here.

Unfortunately it did not occur to the editors to solicit a response from either Phyllis or Julius Jacobson, the founders of New Politics, a small journal of the anti-Stalinist left, which has somehow managed to stay afloat since their deaths in recent years. (Full disclosure: I’m on its editorial board.) They read “The Question” as soon as it came out. If my memory can be trusted, one or the other of them (possibly both: they finished each other’s sentences) called it “blockheaded.” Coming as it did from septuagenarian Trotskyists, “blockheaded” was a temperate remark.

But Julius, at least, had more to say. He’d served as campus organizer for the Young Socialist League at Brooklyn College in the late 1940s and early ‘50s, when Genovese was there. They crossed paths – how could they not? – and Julius remembered him as a worthy opponent. Genovese could defend the twists and turns in Stalin’s policies with far more skill than most CP members and supporters, whose grasp of their movement’s history and doctrine boiled down to the sentiment that the Soviet Union was, gosh, just swell.  

Julius was not prone to losing debates, but it’s clear that these ideological boxing matches went into overtime. Picturing the young Genovese in battle, I find the expression “more Stalinist than Stalin” comes to mind. But that’s only part of it. He was also -- what’s much rarer, and virtually paradoxical -- an independent Stalinist. He brought intelligent cynicism, rather than muddled faith, to making his arguments. An article by the American historian Christopher Phelps demonstrates that Genovese “knew full well and openly acknowledged the undemocratic nature and barbaric atrocities of the Communist states” but refused to “condemn their crimes unequivocally in his writings” and denounced anyone who did. “It serves no purpose,” Genovese wrote, “to pretend that ‘innocent’ -- personally inoffensive and politically neutral -- people should be spared” from revolutionary violence. (Phelps was a graduate student when he published the commentary in 1994. Today he teaches in the American and Canadian Studies program at the University of Nottingham.)

Genovese wasn’t a political hack; his opinions had the veneer of serious thought, thanks in no small part to the fact that he also became an extremely cogent analyst of the history of American slavery.  When he no longer had a tyranny to support, he “discovered” how complicit others had been, and began warning the world about the incipient totalitarianism of multiculturalism. His studies of the intellectual life of the slaveholding class began to show ever more evident sympathy for them – a point discussed some years ago in “Right Church, Wrong Pew: Eugene Genovese & Southern Conservatism,” an article by Alex Lichtenstein, an associate professor of history at Indiana University, which I highly recommend. Genovese’s scholarship has been influential for generations, and it will survive, but anyone in search of political wisdom or a moral compass should probably look elsewhere.

 


Essay on James Harvey Robinson centennial and 'Doing Recent History'

This year is the centenary of James Harvey Robinson’s book The New History: Essays Illustrating the Modern Historical Outlook, which made a case for teaching and writing about the past as something other than the record of illustrious men gaining power and then doing things with it.

“Our bias for political history,” he wrote, “led us to include a great many trifling details of dynasties and military history which merely confound the reader and take up precious space that should be devoted to certain great issues hitherto neglected.” The new breed of historians, such as the ones Robinson was training at Columbia University, would explore the social and cultural dimensions of earlier eras -- “the ways in which people have thought and acted in the past, their tastes and their achievements in many fields” – as well as what he called “the intricate question of the role of the State in the past.”

One hundred years and several paradigm shifts later, this “new history” is normal history; it’s not obvious why Robinson’s effort was so provocative at the time. You can see how it might have upset turf-protecting experts concerned with, say, whether or not Charles the Bald was actually bald. But it also promised to make connections between contemporary issues and knowledge of the past -- or threatened to make those connections, to put it another way.

Hold that thought for now, though. Jumping from 1912 to the present, let me point out a new collection of papers from the University of Georgia Press called Doing Recent History, edited by Claire Bond Potter and Renee C. Romano. (Potter is professor of history at the New School, Romano an associate professor of history at Oberlin College.)

There’s something puzzlingly James Harvey Robinson-ish about it, even though none of the contributors give the old man a nod. It must be a total coincidence that the editors are publishing the collection just now, amidst all the centennial non-festivities. And some of Robinson’s complaints about his colleagues would sound bizarre in today’s circumstances – especially his frustration at their blinkered sense of what should count as topics and source materials for historical research. “They exhibit but little appreciation of the vast resources upon which they might draw,” he wrote, “and unconsciously follow for the most part, an established routine in their selection of facts.”

As if in reply, the editors of Doing Recent History write: “We have the opportunity to blaze trails that have not been marked in historical literature. We have access to sources that simply do not exist for earlier periods: in addition to living witnesses, we have unruly evidence such as video games and television programming (which has expanded exponentially since the emergence of cable), as well as blogs, wikis, websites, and other virtual spaces.”

No doubt cranky talk-show hosts and unemployed Charles the Bald scholars will take umbrage at Jerry Saucier’s paper “Playing the Past: The Video Game Simulation as Recent American History” – and for what it’s worth, I’m not entirely persuaded that Saucier’s topic pertains to historiography, rather than ethnography. But that could change at some point. In “Do Historians Watch Enough TV? Broadcast News as a Primary Source,” David Greenberg makes the forceful argument that political historians tend to focus on written material to document their work: a real anachronism given TV’s decisive role in public life for most of the period since World War II. He gives the example of a sweeping history of the Civil Rights movement that seemed to draw on every imaginable source of documentation -- but not the network TV news programs that brought the struggle into the nation's living room. (The historian did mention a couple of prime-time specials, but with no details or reason to suppose he'd watched them.) Likewise, it’s entirely possible that historians of early 21st-century warfare will need to know something about video games, which have had their part in recruiting and training troops.

Besides the carefully organized, searchable databases available in libraries, historians have to come to terms with the oceans of digital text created over the past quarter-century or so -- tucked away on countless servers for now, but posing difficult questions about archiving and citation. The contributors take these issues up, along with related problems about intellectual property and the ethical responsibility of the historian when using documents published in semi-private venues online, or deposited in research collections too understaffed to catch possible violations of confidentiality.

In “Opening Archives on the Recent Past: Reconciling the Ethics of Access and the Ethics of Privacy,” Laura Clark Brown and Nancy Kaiser discuss a number of cases of sensitive information about private citizens appearing in material acquired by the Southern Historical Collection of the University of North Carolina at Chapel Hill. For example, there's the author whose papers include torrid correspondence with a (married) novelist who wouldn't want his name showing up in the finding aid. Brown and Kaiser also raise another matter for concern: “With the full-text search capabilities of Google Books and electronic journals, scholarly works no longer have practical obscurity, and individuals could easily find their names and private information cited in a monograph with even a very small press run.”

The standard criticism of James Harvey Robinson’s work among subsequent generations of professional historians is that his “new history” indulges in “presentism” – the sin of interpreting the past according to concerns or values of the historian’s own day. In Robinson’s case, he seems to have been a strong believer in the virtues of scientific progress, in its continuing fight against archaic forms of thought and social organization. With that in mind, it’s easier to understand his insistence that social, cultural, and intellectual history were at least as important as the political and diplomatic sort (and really, more so). Students and the general public were better off learning about “the lucid intervals during which the greater part of human progress has taken place,” rather than memorizing the dates of wars and coronations.

None of the contributors to Doing Recent History are nearly that programmatic. Their main concern is with the challenge of studying events and social changes from the past few decades using the ever more numerous and voluminous sources becoming available. Robinson’s “new history” tried to make the past interesting and relevant to the present. The “recent history” people want to generate the insights and critical skills that become possible when you learn to look at the recent past as something much less familiar, and more puzzling, than it might otherwise appear. I'm struck less by the contrast than the continuity.

Robinson would have loved it. In fact, he even anticipated their whole project. “In its normal state,” he wrote one hundred years ago, “the mind selects automatically, from the almost infinite mass of memories, just those things in our past which make us feel at home in the present. It works so easily and efficiently that we are unconscious of what it is doing for us and of how dependent we are upon it.”

Our memory – personal and cultural alike – “supplies so promptly and so precisely what we need from the past in order to make the present intelligible that we are beguiled into the mistaken notion that the present is self-explanatory and quite able to take care of itself, and that the past is largely dead and irrelevant, except when we have to make a conscious effort to recall some elusive fact.” That passage would have made a good epigraph for Doing Recent History, but it’s too late now.


Appeals court rejects researchers' bid to protect oral history confidentiality


U.S. appeals court bars researchers’ bid to quash subpoena seeking oral history records at Boston College.

Irish historian considers significance of fight over papers at Boston College

Irish historians have watched the legal case relating to the witness statements from participants in the conflict in Northern Ireland held by Boston College with great interest and with no little trepidation.

Regardless of the ultimate outcome of the case, there are real fears that the controversy has already jeopardized the collection and preservation of historical material relating to the conflict in Northern Ireland.

One friend, who was instrumental in helping University College Dublin Archives to secure a significant collection of private papers that includes material relating to the Northern Ireland peace process, remarked recently that it would have been more difficult to convince the donor to preserve his papers and donate them to an archive if the controversy at Boston College had previously been known.

The great difficulty here is that any comprehensive history of the Northern Ireland conflict will be very dependent on statements from the men and women who were directly engaged in the events: republicans, loyalist paramilitaries, police, British army personnel, politicians, public servants, and the ordinary people whose lives were shaped by the conflict. The nature of the conflict in Northern Ireland was such that no existing archive can stand as a sufficient source for the writing of plausible history; the words of the people who lived through (and participated in) the conflict need to be preserved to allow for the creation of a more meaningful historical record.

The Boston College interviews are one of several series of interviews that currently exist, or are now being collected. Oral history is especially important if we are to tell the story of everyday life during these years, and the motivations and reflections of men and women who did not hold positions of leadership.

Irish historians are very conscious of the importance of such testimonies, because a comparable archive exists relating to the 1916 Rising and the Irish war of independence. In the 1940s and early 1950s the Bureau of Military History – funded by the Irish government – collected statements from men and women who participated in these events. Some of those men and women engaged in violence or other acts about which they might not have been willing to speak publicly. The statements were finally released in 2004, 50 years after they were collected, when all the witnesses had died.

Although this delay has been criticized, it shows a respect for the witnesses and indeed for all who were affected by the events narrated in these testimonies. These statements, and the anticipated release shortly of thousands of Military Pension Files, containing further firsthand statements from those involved in the War of Independence, provide a permanent and valuable record of a critical period in the emergence of contemporary Ireland.

These firsthand accounts have transformed the understanding of these years, bringing them to life in a manner that more formal records cannot.

The oral statements of participants in the conflict in Northern Ireland offer a similar potential to provide a rounded account of these years.  This will only happen, however, if those making statements can trust the record-taker, and trust the place where these records are deposited.  

This trust requires firm assurances that the statements will not be released prematurely, or divulged other than under the terms agreed. The witness statements should be collected with the intent of creating a long-term historical record; while there may be an understandable eagerness to gain access to them in order to be first with the story, they are best left undisturbed for a significant period of time. Essentially, they should be collected and protected for posterity – not for the present.

University College Dublin (UCD), in common with other research universities, has a clear code of ethics that applies to all material that relates to identifiable individuals; securing their consent to any use that permits them to be identified is a key requirement.

In addition, researchers and archivists must observe the requirements of the Data Protection Act, which precludes the revealing of personal information relating to matters such as health, family circumstances, or financial records; these regulations are strictly enforced. Many of the private collections deposited in UCD Archives can only be accessed with the permission of the donor.

While testimonies relating to paramilitary activities are obviously of a particularly sensitive nature, there are recognized laws and procedures in place that protect the witness, the archive, the archivist and the researcher – provided that they are observed.

The issue may become more complex when records are transferred from one country to another, if the legal framework relating to data protection and disclosure is different, but again, a robust protocol and clearly-determined governance – agreed before any records are compiled – should reduce these risks.

Oral histories are extremely valuable sources for posterity, and they are becoming of still greater importance in an age when communication increasingly takes the form of telephone conversations, e-mails, texts, tweets and other means; these are obviously less easily preserved than letters or written memorandums.  

Ultimately, there will be lessons to be learned from the specifics of the Boston College case. The overarching ambition must remain unchanged: to ensure that a trusted record of the past can be compiled and preserved for posterity.

Mary E. Daly is professor of modern Irish history at University College Dublin.

Review of Nancy K. Bristow, "American Pandemic: The Lost Worlds of the 1918 Influenza Epidemic"

Intellectual Affairs

It was a classic instance of blaming the messenger: Spanish newspapers carried the earliest reports of a new illness that spread across the globe in the final months of World War I, and so it became known as “Spanish influenza,” although its real point of origin will never be known. It was virulent and highly communicable. A paper appearing in the Centers for Disease Control and Prevention journal Emerging Infectious Diseases a few years ago estimated that 500 million people, almost a third of the world’s population, were stricken with it. By the end of its reign of terror in the final months of 1920, there were 50 million fatalities -- more than three times as many as died from the war itself. These figures may be on the low side.

In her two long essays on illness, Susan Sontag grappled with the strong and longstanding tendency to treat certain diseases as meaningful: the vehicle for metaphors of social or cultural disturbance. “Feelings about evil are projected onto a disease,” she wrote. “And the disease (so enriched with meanings) is projected onto the world." Just so, one would imagine, with a pandemic. Something in a plague always hints at apocalypse.

But the striking thing about Spanish influenza is how little meaning stuck to it. Plenty of sermons must have figured the Spanish flu as one of the Four Horsemen, at the time, but the whole experience was quickly erased from collective memory, at least in the United States. In 1976, the historian Alfred W. Crosby published a monograph called Epidemic and Peace: 1918 that Cambridge University Press later issued as America’s Forgotten Pandemic (1989). Apart from being snappier, the new title underscored the almost total disappearance from anything but the specialist’s sense of history. One person in four in the U.S. suffered from an attack of Spanish flu, and it killed some 675,000 of them. The catastrophe seems never to have interested Hollywood, though, and the only work of fiction by an author who lived through the outbreak, so far as I know, is Katherine Anne Porter’s novella “Pale Horse, Pale Rider.” (Biblical imagery seems just about unavoidable.) (Note: This article was updated from an earlier version to correct the name of the author of Epidemic and Peace: 1918.)

The title of Nancy K. Bristow’s American Pandemic: The Lost Worlds of the 1918 Influenza Epidemic (Oxford University Press) is an echo of Crosby’s America’s Forgotten Pandemic. I don’t want to read too much into the one-word difference, but it does seem that the influenza crisis of almost a century ago has been working its way back into public awareness in recent years. Several more books on the subject have appeared since Gina Kolata’s best-seller Flu: The Story of the Great Influenza Pandemic of 1918 and the Search for the Virus That Caused It came out in 1999. The Public Broadcasting Service has done its part with an excellent documentary as well as an episode of Downton Abbey in which pestilence hits a country house in England during a dinner party.

So “forgotten” is no longer quite the right word for the nightmare. But it remains almost impossible to imagine the ferocity of the pandemic, much less its scale. The contemporary accounts that Bristow draws on retain their horror. Doctors wrote of patients changing from “an apparently well condition to almost prostration within one or two hours,” with raging fevers and severe pain in even the milder cases – and the worst involving a “bloody exudate” coughed up from the “peculiar and intense congestion of the lungs with [a] hemorrhage,” so that it was “simply a struggle for air until they suffocate.”

Morgues were overrun. In poor households, several delirious family members might be crowded into the same bed along with someone who had died. Those who made it to the hospital could lie unattended for days at a time. The authorities were issuing “don’t worry, it’s just another flu”-type pronouncements well into the catastrophic phase of the epidemic. Quarantines and bans on public gatherings were easier to proclaim than to enforce. Having absorbed the relatively new idea that disease was spread by germs, people donned surgical masks to protect themselves – to no avail, since influenza was a virus. The epidemic went through three waves of contagion in as many years, and it wore down whatever patience or civic-mindedness people had when the disaster hit.

A pandemic, by definition, puts everyone at risk. But access to medical help – inadequate as it proved – was far less egalitarian. (As is still the case, of course.) Much of the historical scholarship on disease in recent decades has stressed how the interaction between medical professionals and their clientele tends to reinforce the social hierarchies already in place. Bristow’s work follows this well-established course, combining it with a familiar emphasis on the changes in medicine’s public role in the wake of Progressive Era reforms.

She writes about how poor, immigrant, or Native American sufferers were assumed guilty “of dishonesty and laziness, and of attempting to take advantage of others’ generosity” until proven otherwise, while the African-American population was forced “to continue relying on their own too limited community resources as they sought to provide sufficient care for their sick neighbors.” And while the U.S. Public Health Service had been created in 1912, its capacity to respond to the influenza crisis was limited, given how poorly the disease was understood. Even gathering reliable statistics on it proved almost impossible while the virus was on its rampage.

The most interesting chapter of American Pandemic considers how doctors and nurses responded to the crisis. Although they often worked side by side, their experiences stood in marked contrast.

“Ignorant of the disease’s etiology, uncertain of the best methods of treatment, and unable to ease the suffering of their patients,” Bristow writes, “physicians often expressed a sense of helplessness as individuals and humility as members of a profession.” (You know something is catastrophic when it reduces doctors to humility.)

Belonging to an almost entirely male profession, they “gauged their work against the masculine standards of skill and expertise” – and the inevitable military metaphor of going to battle against the disease became that much more intense given the example of actual soldiers fighting and dying in the trenches. But the influenza virus was stronger. “Like a hideous monster,” one physician wrote, “he went his way, and none could hinder.” Doctors’ letters and diaries from the period reflect a sense of bewilderment and failure.

For a while the authority of the profession itself was undermined. Patent medicines and related quackery proved no more effective in treating or preventing the disease than anything the doctors could offer. But they weren’t any less effective, either.

The nurses could not have responded more differently. Caring for patients was “a terrific test” and “high privilege,” they wrote, “a most horrible  and yet most beautiful experience.” As with doctors, many lost their lives while tending to the sick. But one nurses’ report said that the work was “one of the most immediately satisfactory experiences of our lives” for those who survived it, “and this is true even though we were borne down with the knowledge that, do all we might, the pressing, tragic need for nursing was much greater than could possibly be met.”

And this, too, was a matter of gauging their skill by socially sanctioned gender norms. “Women working as nurses aspired to what they viewed as the uniquely feminine qualities of domesticity, compassion, and selflessness,” writes Bristow. “To measure up to these standards nurses needed only to care for their patients, not cure them, and this they proved able to do.”

A few hours after choosing American Pandemic for this week’s column, I attended a public event at which every third person seemed to be coughing, with a congested wheeze usually audible. Synchronicity is not always your friend. For the past several days I have been reading about influenza, and writing about it, while suffering from one of its milder variants. The experience is not to be recommended.

Two quick points might be worth making before the medication kicks in. Bristow’s final assessment is that the horror and devastation of the pandemic could not be reconciled with the preferred national narrative of progress and redemption, “with its upbeat tenor and its focus on a bright future.” At most, its memory was preserved as part of local history, or through family stories.

The argument is plausible, to a degree, but it overlooks the element of trauma involved – not just the suffering and death during that period, but the feeling of being rendered helpless by an event that’s come out of nowhere.

And what sense does it make to think of the events of 1918-20 as “America’s pandemic,” forgotten or otherwise? Deaths from influenza in the United States during that period represent something like 1.4 percent of the world’s fatalities from the disease. How was it remembered -- or erased from public memory -- elsewhere? Diseases don’t respect borders, and it’s hard to see why the historiography of disease should, either.
 

Historians start effort to define what graduates should be able to do


In first effort of its kind in the U.S., a discipline works to define what graduates of its programs should be able to do -- from associate degree through the Ph.D.

Review of Liette Gidlow, "Obama, Clinton, Palin"

Intellectual Affairs

It's hard to think of three living figures in American politics who generate more passion than the ones named in the title of Obama, Clinton, Palin: Making History in Election 2008, a collection of essays edited by Liette Gidlow and published by the University of Illinois Press. The word “passion” here subsumes both ardor and loathing. I doubt it is intentional, but the photographs on the book’s cover are arrayed such that they seem almost attached to one another, like Siamese triplets perhaps, or some beast with multiple heads in one of the more psychedelic passages of Biblical prophecy. If the 2012 campaign doesn’t give you nightmares, that image still might.

Gidlow, the editor, is an associate professor of history at Wayne State University, and the 11 other contributors are all historians as well. Their essays frame the 2008 campaign as a late episode in the country’s uneven progress toward incorporating anybody other than white men into elected government. Every so often we hear that the United States entered the “post-feminist” or “post-racial” era at some unspecified point in the (presumably) recent past. But reality has a way of asserting itself, and the next thing you know there are people demonstrating against the president with signs that show him as a cannibal with a bone through his nose, or a politician responds to a female heckler by hinting that she should perform a sexual service for him.

Debates about race and gender came up often during the 2008 primaries and the election season that followed, so the book’s emphasis is hardly misplaced. Much discussed at the time was each campaign’s groundbreaking status in the history of presidential contests -- with Palin being the first woman to run on the Republican ticket, while Obama was the first African-American, and Clinton the first woman, to have a serious shot at the Democratic nomination.

That is true, but it is blinkered. If the essays in Obama, Clinton, Palin could be reduced to a single theme, it might be that the history-making campaigns of 2008 were also products of history, or echoes of it, as well. The most interesting chapter in that regard is Tera W. Hunter’s “The Forgotten Legacy of Shirley Chisholm,” which recalls the African-American Congresswoman’s presidential bid in 1972.

Chisholm had no hope of winning, and knew it, but paved the way for Hillary Clinton and Barack Obama. She was, Hunter says, “antiracist, antisexist, pro-choice, pro-labor, antiwar, fiercely independent, and, above all, principled.” But the point of invoking her memory is hardly to treat the Clinton or Obama campaigns as rightful heirs. In Hunter’s reading, the Democratic primaries of 2008 were a travesty of Chisholm’s effort.

“Hillary Clinton never spoke openly, critically, or engagingly about the status of women in our society, the problems of gender discrimination, and what we should do about it,” writes Hunter. As the competition heated up, Clinton “became more forthright in claiming to be the victim of gender discrimination,” while Obama “continued to be reluctant to dwell on issues related to race and racism.” By contrast, Chisholm “challenged the racist attitudes and practices in the women’s movement,” Hunter writes, “as much as she challenged sexism among African-Americans and the broader society.”

A cynic might reply that she could afford to do that precisely because she was not trying to get elected. To win, you pander, and when somebody complains, you try to figure out how to pander to them, too. But running a winning campaign involves neutralizing reservations as much as enlisting allegiance. On that score Obama “faced the challenge of calming white fears,” Hunter writes, “of reassuring the populace that he was not an ‘angry black man’ seeking racial retribution.” (The point is also made in “Barack Obama and the Politics of Anger” by Tiffany Ruby Patterson, who recalls how the candidate navigated the controversy over Rev. Jeremiah Wright’s fire-next-time sermons.)

Susan M. Hartmann’s “Hillary Clinton’s Candidacy in Historical and Global Context” offers one of the book’s analyses of how gender stereotypes and media sexism created obstacles for the candidate – even as she “benefited not only from her husband’s name and popularity, but also from the access he afforded” to sundry political resources. “By contrast,” Hartmann says, “Republican vice presidential candidate Sarah Palin escaped much of the gender hostility that Clinton faced,” largely because of “her strong right-wing credentials, importantly including opposition to much of the feminist agenda.”

Indirectly challenging that claim is Catherine E. Rymph’s “Political Feminism and the Problem of Sarah Palin.” Rymph makes the case for regarding Palin as the legatee of a strain of G.O.P. feminism going back to the 1940s, when “Republicans made up a greater number of women serving in Congress” than did Democrats. Their party platform endorsed the Equal Rights Amendment in 1940 – four years before the Democrats did. (See also the abundant scholarship on the role of women activists on the right, discussed by Kim Phillips-Fein in “Conservatism: A State of the Field,” her thorough survey of recent historiography.)

Clinton and Palin were both “presented in sexualized ways” during their campaigns, Rymph points out: “Clinton was a castrating bitch, while Palin was a ‘MILF’ (presumably more flattering, but equally degrading).” Opponents were relentless in mocking Palin’s hair, clothes, days as a beauty-pageant competitor and the like, which Rymph cites as evidence that “Americans of all stripes can tolerate and even embrace sexism when it is used as a weapon against women with whom they disagree or whom they see as representing the wrong picture of womanhood.”

Thanks to Google, several contributors are able to document the racist and misogynistic rage churning throughout the primaries and campaigns. This is the third or fourth academic press publication I’ve read in the past few months to quote extensively from blog posts, comments sections, Facebook dialogs, and so forth. The effect is sometimes informative or illuminating, but usually it isn’t.  You get used to seeing chunks of semiliterate ranting online, but it’s still mildly disconcerting to find it in cold type, properly cited, with scholarly apparatus.

The poisonous material quoted in “Michelle Obama, the Media Circus, and America’s Racial Obsession” by Mitch Katchum makes it perhaps the most horrifying chapter in the book. It is undoubtedly necessary to the job of showing the double dose of stereotyping (“angry black woman,” “Jezebel,” “baby mama”) that emerged in the 2008 campaign, and set the tone for much that’s followed. But in the future, historians might do well to focus on computerized content analysis of digital chatter, rather than exhibiting samples, because it does not take that long to reach the threshold marked ad nauseam.

I’ve discussed a few of the papers in Obama, Clinton, Palin rather than attempting a comprehensive review. But one general impression bears mentioning, and a look through the index confirms it: there are no entries for Afghanistan, Iraq, Lehman Brothers, terrorism, torture, or the Troubled Asset Relief Program, nor any other major issue at stake in 2008. All are mentioned at some point in the book. But the index-maker can't be faulted, because they are always quite peripheral to the project.

What we get, then, is political history at a considerable remove from questions of governance. It’s certainly possible to argue that combat over race or gender in a presidential campaign may serve as a proxy for debates over social or economic policy. But that argument has to be made. Otherwise it seems as if the only issue in an election is whether the most powerful elected offices in the country should or should not be more demographically representative.

In any case, the 2012 presidential race has been pretty uneventful in the politics-of-difference department -- so far, anyway. I contacted Liette Gidlow, the editor of the book, to ask what she made of the contrast with four years ago.

“I do think that the 2008 campaigns expanded leadership opportunities for African-Americans, women, and others in a lasting way,” she responded by e-mail. “The political contests so far this year would seem to suggest otherwise; though Michele Bachmann and Herman Cain had their moments, ultimately their campaigns failed to win broad support among Republicans. But for the past 40 years, it has been the Democratic party, not the Republican party, that has been the driving force behind diversity in political representation, and with the primary contests limited to the Republicans this year, we shouldn't be surprised to see that the field has been dominated by white men. Which doesn't mean that in future presidential contests the Democrats will offer a slate that ‘looks like America’ or that the Republicans won't. But every time a candidate who departs from our usual expectations succeeds, it expands our ability to imagine, and ultimately to accept, different kinds of people as leaders.”

That seems fair enough, all in all. But it leaves open the question of what difference it makes, if any, after that. It certainly felt like something was changing on election night in 2008, but four years later, I often wonder what it was.
