History

Wrong Things, Rightly Named

Suppose that, 30 or 40 years ago, the news media of the West had gotten hold of a KGB document reviewing its experiences in interrogating those who posed a menace to the peace, progress, and stability of the People’s Democracies. For younger readers, perhaps I should explain that the Soviet Union and its client states liked to call their system by that cheerful term. And yes, they were serious. Self-deception is a powerful force, sometimes.

Suppose the report listed such methods of information-gathering as beatings, suffocation, and mock executions. And suppose, too -- on a lurid note -- that it mentioned using threats to murder or sexually violate members of a prisoner’s family. Now imagine numerous pages of the report were redacted, so that you could only guess what horrors they might chronicle.

With all of that as a given, then... How much debate would there have been over the moral status of these acts? Would someone who insisted that they did not constitute torture get a hearing? Could a serious case be made that it was in the best interests of justice to move forward without dwelling on the past?

If so, would such arguments have been presented in major newspapers, magazines, and broadcasts? Or would they have been heard in out-of-the-way meeting halls, where the only cheer was borrowed from posters of the National Council of American-Soviet Friendship?

This thought experiment comes to mind, of course, in the wake of reading about the report of the CIA’s Office of the Inspector General. The analogy is not perfect by any means. No comparable degree of “openness” (however grossly inappropriate that word seems in this case) existed on the other side of the old Iron Curtain. But let’s not cheer ourselves hoarse over that fact just yet.

Actions that would have been judged without hesitation to be torture if conducted by a member of the East German secret police (or, in the case of waterboarding, by the Khmer Rouge) did not meet the wonderfully scrupulous standards laid out seven years ago by the Department of Justice’s Office of Legal Counsel. If more testimony to the power of self-deception were needed, this would do.

When the CIA made its evaluation of various bloody-minded interrogation practices in 2004, the Bush administration’s response was reportedly frustration that the techniques hadn’t been more effective. The assessment of the Obama administration seems to be that torture has been both unproductive and damaging for “soft power” – a public-relations nightmare. This is progress, of a kind. If somebody decides to give up sociopathic behavior on the grounds that it is proving bad for business, that is only so much reason for relief. But it is preferable to the alternative.

It might be possible to hold ourselves to higher standards than that. But first it would be necessary to face reality. One place to start is Tzvetan Todorov’s little book Torture and the War on Terror, first published in France last year and now available in translation from Seagull Books (distributed by the University of Chicago Press).

Todorov once lived in what was called, at the time, the People’s Republic of Bulgaria. As an émigré in Paris in the 1960s, he wrote The Poetics of Prose and other standard works in structuralist literary criticism – as well as a study of the idiosyncratic Russian theorist Mikhail Bakhtin that, in my opinion, made Bakhtin’s thought seem a lot more systematic than it really was.

Over the past quarter century, Todorov’s concerns have shifted from the structuralist analysis of literary language to a moral inquiry into the historical reality of violence and domination, including books on the Nazi and Stalinist concentration camps.

Torture and the War on Terror is more pamphlet than treatise. Some pages revisit points that ought to be familiar to anyone who has given any thought to the experience of the past eight years. To grasp the essential meaninglessness of a phrase like “war on terror” (you can’t bomb a state of mind) does not require a degree in linguistics or rhetoric. But then, the ability to state the obvious can have its uses.

The document prepared by the Justice Department in August 2002 carefully parsed its definition of torture so that it covered only acts leading to the "severe pain" characteristic of permanent “impairment of bodily function.” Todorov does not hesitate to specify what is going on within that semantic maneuver: “The reasoning of this memorandum – paradoxically so, for a legal document prepared by competent jurists – proceeds from a form of magical thinking insofar as it pretends that we can act on things by changing their names.” That about covers it. The expression “magical thinking” covers a great deal of our public life in those years – a time exemplified by the consistently miraculous heroics of Jack Bauer on “24.”

As both a student of the phenomenon of state violence and a former resident of People’s Bulgaria, Todorov is willing to recognize and name what has been going on this past decade. We need to read the following and remember that it is what goes in the history books:

“In prisons scattered throughout countries outside the United States, the detainees have been regularly raped, hung from hooks, immersed in water, burned, attached to electrodes, deprived of food, water or medicine, attacked by dogs, and beaten until their bones are broken. On military bases or on American territory, they have been subjected to sensory deprivation and to other violent sensory treatments, forced to wear headphones so they cannot hear, hoods so they cannot see, surgical masks to keep them from smelling, and thick gloves that interfere with the sense of touch. They have been subjected to nonstop ‘white noise’ or to the regular alternation of deafening noise and total silence; prevented from sleeping, either by use of bright lights or by being subjected to interrogations that can last twenty hours on end, forty-eight days in a row; and taken from extreme cold to extreme heat and vice versa. None of these methods cause ‘the impairment of bodily function,’ but they are known to cause the rapid destruction of personal identity.”

Given the inefficacy of torture as a way to extract intelligence, its real “value” comes in the form of retribution -- and the feeling of restored mastery this permits.

“Reducing the other to a state of utter powerlessness,” writes Todorov, “gives you a sense of limitless power. This feeling is obtained more from torture than from murder, because, once dead, the other is an inert object that can no longer yield the jubilation that comes from wholly defeating the will of the other. On the other hand, raping a woman in front of her husband, parents, and children or torturing a child in front of his father yields an illusion of omnipotence and a sense of absolute sovereignty. Transgressing human laws in this way makes you feel close to the gods.... Torture leaves an indelible mark not only on the victim but also on the torturer.”

Todorov might have pushed this line of thinking (with its nod to Hegel’s dialectic of the struggle for recognition) a lot further than he does. The “indelible mark” can take various forms, and it is not restricted to those who directly wield the instruments of torture.

The craving for “an illusion of omnipotence and a sense of absolute sovereignty” is something best channeled into forms of entertainment oriented toward wish fulfillment. There it can be aggrandized yet contained. Money and commodities change hands; the consumer gets a catharsis of sorts; civil society muddles along, and everybody wins.

When sectors of the populace come to regard its pursuit in reality as a necessary part of the business of the state, things are on a different and more worrying terrain. A host of strange side effects then follow – including nostalgia for 9/11 itself in some quarters, since the country was so deeply “unified” on 9/12. A scarcely concealed yearning for another terrorist assault makes perfect sense, given that it would presumably justify another sustained effort to assert American omnipotence and sovereignty. (In saying it “makes perfect sense,” I mean, of course, in a perfectly insane way.)

“As a rule,” writes Todorov, “citizens in liberal democracies will condemn without hesitation the violent practices of a state that will tolerate torture, and especially of a state that systematizes its use, as in the case of totalitarian regimes. Now we have discovered that these same democracies can adopt totalitarian attitudes without changing their overall structure. This cancer does not eat away at a single individual; its metastases are found in people who thought they had eradicated it in others and considered themselves immune. That is why we cannot be reassured.”

True enough. But we have a long way to go before reassurance will be desirable, let alone possible.

Scott McLemee (scott.mclemee@insidehighered.com)

The Public Option

Shortly after last week’s column appeared, I headed out to Iowa City to attend -- and, as the occasion required, to pontificate at -- a gathering called Platforms for Public Scholars. Sponsored by the Obermann Center for Advanced Studies at the University of Iowa, it drew somewhere between 100 and 150 participants over three days.

This was the latest round in an ongoing conversation within academe about how to bring work in the humanities into civic life, and vice versa. The discussion goes back almost a decade now, to the emergence of the Imagining America consortium, which fosters collaboration between faculty at research universities and partners in community groups and nonprofit organizations.

That effort often runs up against institutional inertia. You sense this from reading "Scholarship in Public: Knowledge Creation and Tenure Policy in the Engaged University" (the report of the consortium's Tenure Team Initiative, released last year). Clearly there is a long way to go before people in the humanities can undertake collaborative, interdisciplinary, and civic-minded work without fearing that they are taking a risk.

Even so, the presentations delivered in Iowa City reported on a variety of public-scholarship initiatives -- local history projects, digital archives, a festival of lectures and discussions on Victorian literature, and much else besides. Rather than synopsize, let me recommend a running account of the sessions live-blogged by Bridget Draxler, a graduate student in English at the University of Iowa. It is available at the Web site of the Humanities, Arts, Sciences, and Technology Advanced Collaboratory (better known as HASTAC, usually pronounced “haystack”).

Word went around of plans to publish a collection of papers from the gathering. I asked Teresa Mangum, a professor of English at U of I, who organized and directed the event, if that was in the cards. She “built the platform,” as someone put it, and presided over all three days with considerable charm -- intervening in the discussion in ways that were incisive while also tending to foster the collegiality that can be elusive when people come from such different disciplinary and professional backgrounds.

“My goal is to have some kind of ‘artifact’ of the conference,” she told me, “but I'm trying to think more imaginatively what it might be ... possibly a collection of essays with a Web site. We definitely want to produce an online bibliography but maybe trying to use the Zotero exhibition approach there.”

It was a symposium in the strict sense, in that food was involved. Also, beverages. On the final day, a roundtable assessment of the whole event was the last item on the agenda -- only for this discussion to be bumped into the farewell dinner when things ran long.

Unfortunately I was unable to attend, for fear that a persistent hacking cough was turning me into a pandemic vector. Instead, I retired to the hotel to scribble out some thoughts that might have been worth taking up at the roundtable. Here they are -- afterthoughts, a little late for the discussion.

Most people who attended were members of the academic community, whether from Iowa or elsewhere, and most of the sessions took place in university lecture halls. But the first event on the first day was held at the Iowa City Public Library. This was a panel on new ways of discussing books in the age of digital media -- recounted here by Meena Kandasamy, a young Tamil writer and translator whose speech that evening rather stole the show.

Holding the event at the public library opened the proceedings up somewhat beyond the usual professorial demographic. At one point, members of the panel watched as a woman entered with her guide dog, stretched out on the ground at the back of the room, and closed her eyes to listen. At least we hoped she was listening. I think there is an allegory here about the sometimes ambiguous relationship between public scholarship and its audience.

In any case, the venue for this opening session was important. Public libraries were once called “the people’s universities.” The populist impulse has fallen on some scurvy times, but this trope has interesting implications. The public library is an institution that nobody would be able to start now. A place where you can read brand-new books and magazines for free? The intellectual property lawyers would be suing before you finished the thought.

So while musing on collaborative and civic-minded research, it is worth remembering the actually existing public infrastructure that is still around. Strengthening that infrastructure needs to be a priority for public scholarship -- at least as much, arguably, as "the production of knowledge." (This phrase, repeated incessantly in some quarters of the humanities, has long since slipped its original moorings, and owes more to American corporate lingo than to Althusser.)

Institutions can be narcissistic, and one symptom of this is a certain narrowly gauged conception of professionalism, often indistinguishable in demeanor from garden-variety snobbery. Any real progress in consolidating the practice of public scholarship has to involve a strengthening of ties with people in the public sector -- especially librarians and teachers.

It is not that scholars exist over here while something called “the public” is over there -- off in the distance. Rather, people are constituted as a public in particular spaces and activities. The university is one such site, at least sometimes. But it isn’t the only one, and public scholarship needs to have moorings in as many such venues as possible.

The problem being that it is often hard enough to drop an anchor in academe, let alone in the wide Sargasso Sea of civil society. I am not a professor and have no advice to give on that score. But it seems important to pass along the comments of someone attending Platforms for Public Scholars who confided some thoughts to me during some downtime. I will pass them along by permission, but without giving away anything about this person's identity.

During one panel, a couple of tenured professors mentioned being concerned that their civically engaged scholarship might not count for promotion. One even noted that people who had done collaborative work in the humanities tended to discount it as part of a tenure file -- saying, “Well, I did mine without getting credit for it, so why should you?”

At the time, I raised an eyebrow, but didn’t really think much about it. Later, though, someone referred back to the session in tones that suggested chagrin and longstanding doubts about having a career in the humanities.

“These are people who actually are established, who have some power in their institutions," this individual told me. "I don’t have that. I don’t even have a job yet. And I want them to show some courage. If you really have a conviction that collaboration and public engagement are important, then do it without worrying so much. And support it. Make it possible for someone like me to make doing public work part of my scholarship. Otherwise, what are we even talking about?”

Scott McLemee (scott.mclemee@insidehighered.com)

The First of the Year

The First of the Month is a cultural and intellectual publication that is singularly lively, and no less strange. It started out in 1998, in tabloid format, as a “newspaper of the radical imagination” published in Harlem. First has been compared to Partisan Review, the legendary magazine of the New York Intellectuals that began during the Depression. But honestly, that's just lazy. Any time a bunch of smart people start a magazine, somebody ends up comparing it to Partisan Review, especially if it is published in New York; but First took its name from a song by Bone Thugs-n-Harmony, and while I’d like to picture Delmore Schwartz doing a little freestyle rapping over scotch at the White Horse Tavern, it’s a stretch.

Following what has become the contemporary routine, the paper gave birth to a Web site; this then replaced the print edition. An anthology culled from its first decade appeared last year as The First of the Year: 2008, published by Transaction. On first approach, the book looks like a memorial service for the whole project. And an impressive one: the roster of contributors included (to give a very abbreviated and almost random roll-call) Amiri Baraka, Greil Marcus, Lawrence Goodwyn, Grace Lee Boggs, Adolph Reed, Russell Jacoby, Armond White, Kurt Vonnegut, Kate Millett, Richard Hoggart, and Ellen Willis.

I meant to give the volume a plug when it appeared; so much for the good intention. But happily, my initial impression was totally wrong. While continuing to function online (and to have its world headquarters in Harlem, where editorial collective member and impresario Benj DeMott lives), First has reinvented itself as an annual anthology. First of the Year: 2009 has just been published, which seems worth noting here, in this first column of the year.

The viability of any small-scale, relatively unprofitable cultural initiative is a function of two forces. One is the good will of the people directly involved. The other is getting support from a public – or rather, creating one.

In this case, the process is made more difficult by the fact that First is sui generis. Which is putting it politely. My own response upon first encountering it about 10 years ago involved a little cartoon balloon forming over my forehead containing the letters “WTF?” It is not simply that it is hard to know what to expect next; sometimes it is hard to say what it was you just read. In First, political commentary, cultural analysis, and personal essays sit side by side. But at times, all three are going on at once, within the same piece. Kenneth Burke used to refer to such jostlings of the coordinate system as creating "perspective by incongruity." It signals a breakdown of familiar formats -- a scrambling of routine associations. This is stimulating, if perplexing. The confusion is not a bug but a feature.

One familiar description of the journal that I have come to distrust treats First as a bridge between popular culture and the ivory tower. An often-repeated blurb from some years ago calls it "the only leftist publication [one] could imagine being read at both Columbia University and Rikers.”

Good advertising, to be sure. But the better the ad, the more its presumptions need checking. The whole “building a bridge” trope implies that there is a distance to be spanned – a connection between enclaves to be made. (The ideas are over here, the masses over there.) But reading First involves getting oriented to a different geography. Some academics do write for it, but they do not have pride of place among the other contributors, who include poets and musicians and journalists, and people who might best just be called citizens. The implication is not that there is distance to be crossed, but that we're all on common ground, whether we know it, or like it, or not.

In the wake of 9/11, some writers for First (not all of them) rallied to the call for a war, and at least one endorsed George W. Bush during the 2004 campaign. Does that mean that First is actually “the only ‘neoconservative’ publication read in both academe and prisons”? Well, no, but funny you should ask, because it underscores the convenience made possible by pre-gummed ideological labels.

At times they are useful (I tend to think "social-imperialist" is a pretty good label for the idea that "shock and awe" was necessary for historical progress in Iraq) but not always.

The discussion of Obama in the new volume is a case in point. Both Paul Berman (a Clintonian liberal who supported the Iraq War) and Amiri Baraka (who takes his political bearings from Mao Tsetung Thought) concur that the 2008 election was a transformative moment. This is, let's say, an unanticipated convergence. Meanwhile, Charles O’Brien (an editorial collective member who endorsed Bush in ‘04, on more or less populist grounds) treats Obama as a short-circuit in the creation of a vigorous radicalism-from-below needed for social change. “Of the Obama campaign, what endures?” he asks. “The new Pepsi ad.”

It would be wrong to see First as yet another wonk magazine with some cultural stuff in it. Nor is it one of those journals (edited on the bridge, so to speak) in which the latest reality-TV show provides the excuse for yet another tour of Foucault’s panopticon. Politics and culture come together at odd angles in the pages of First -- or rather, each spins out from some vital center that proves hard to pin down. Margin and mainstream are configured differently here.

I tried to get a handle on First's particularity by talking to Benj DeMott, who edited the two anthologies and is now working on the third. We spoke by phone. Taking notes did not seem like a plausible endeavor on my part, because DeMott's mind moves like greased lightning – the ideas and references coming out in arpeggios, rapid-fire and sometimes multitrack.

But one point he made did stick. It concerned holding together a project in which the contributors do not share a party line, and indeed sometimes only just barely agree to disagree. It sounds complicated and precarious. Often, he said, it comes down to sharing a passion for music -- for sensing that both democracy and dancing ought to be in the streets. Politics isn't about policy, it's about movement.

That does not mean celebration is always the order of the day. The indulgence of academic hiphop fans is legendary, but if you want to see what tough-minded cultural analysis looks like, check out the African-American film critic Armond White's reflections on white rapper Eminem in The First of the Year: 2009. The essay can be recommended even if its subject is now shrinking in pop culture’s rearview mirror.

“Rather than a symbol of cultural resistance,” writes White, “he’s the most egregious symbol of our era’s selfish trends. With his bootstrap crap and references to rugged individualism reminiscent of the 80s, he’s a heartless Reagan-baby – but without the old man’s politesse.... His three albums of obstinate rants culminate in the egocentric track ‘Without Me,’ making him the Ayn Rand of rap – a pop hack who refuses to look beyond himself.... Minus righteousness, angry rap is dismissible. Rap is exciting when it voices desire for social redress; the urge toward public and personal justice is what made it progressive. Eminem’s resurrected Great White Hope disempowers hip hop’s cultural movement by debasing it.”

Now, if you can imagine such thoughts ever appearing in an essay by Irving Howe -- let alone Irving Kristol -- then we can go ahead and describe First as inheriting the legacy of the New York Intellectuals.

Otherwise, it may be time to recognize and respect First for what it is in its own right: a journal of demotic intelligence, alive to its own times, with insights and errors appropriate to those times, making it worth the price of perplexity.

Only after talking to Benj DeMott did I read what seems, with hindsight, like the essay that best explains what is going on with the whole project. This is a long tribute -- far more analytical than sentimental -- to his father, the late Benjamin DeMott, a professor of English at Amherst College and a remarkable essayist and social critic.

It is time that someone publish a volume of DeMott senior's selected writings. Meanwhile, his influence on First seems pervasive. The younger DeMott quotes a letter written in his father’s final years -- a piece of advice given to a friend. It offers a challenge to what we might call "the will to sophistication," and its hard clarity is bracing:

"Study humiliation. You have nothing ahead of you but that. You survive not by trusting old friends. Or by hoping for love from a child. You survive by realizing you have nothing whatever the world wants, and that therefore the one course open to you is to start over. Recognize your knowledge and experience are valueless. Realize the only possible role for you on earth is that of a student and a learner. Never think that your opinions – unless founded on hard work in a field that is totally new to you – are of interest to anyone. Treat nobody, friend, co-worker, child, whomever, as someone who knows less than you about any subject whatever. You are an Inferior for life. Whatever is left of it....This is the best that life can offer. And it’s better than it sounds.”

This amounts, finally, to a formulation of a democratic ethos for intellectual life. It bends the stick, hard, against the familiar warp. So, in its own way, does First, and I hope the website and the series of anthologies will continue and prosper as new readers and writers join its public.

Scott McLemee (scott.mclemee@insidehighered.com)

'Economy of Attention'

It is not only people living on islands who count as insular -- etymology notwithstanding. Consider a recent piece by New York Times columnist David Brooks, whose usual shtick might be called “Thorstein Veblen for Dummies.” Using the disaster in Haiti following the earthquake earlier this month as his peg, Brooks diagnosed the country’s poverty and imploding civil society as byproducts of Voodoo – which, with its magical worldview, discourages rational calculation and planning.

Evidently the pundit is growing ambitious; he has graduated to Max Weber for Dummies. The thesis makes perfect sense, as long as you ignore as much economic and political history as possible.

After enslaved people of African descent liberated themselves during the Haitian Revolution of the 1790s (creating an independent state, at enormous cost of life), they were forced to pay reparations to France, which had fought a war to resubjugate the island, but lost. Diplomatic recognition did not come cheap; by one estimate, France demanded the equivalent of $21 billion in today’s currency. Haiti continued to pay it well into the middle of the 20th century. The resources of a poor country were transferred, decade after decade, to a rich country. Was this more rational than a belief in zombies? Would it not tend to foster a belief that the world is governed by capricious forces who must be placated?

The response of sundry blowhards to the news from Haiti is only partly the result of unabashed ignorance, of course. Moral callousness is also a factor. But even among people feeling empathy and a sense of responsibility to help, there is often a blind spot with regard to the Caribbean – an underestimation of its place in the history of Atlantic societies, its role in connecting Europe, Africa, and the Americas. This was one of the points tirelessly emphasized by the late C.L.R. James, the Trinidadian historian and political theorist, whose classic book The Black Jacobins: Toussaint Louverture and the San Domingo Revolution (1938) is being rediscovered now. Perhaps it is not too late to grasp the Caribbean as a crucial part of the process shaping global society.

For some more up-to-date reflections on the region at this moment of crisis, I got in touch with Nicholas Laughlin, editor of The Caribbean Review of Books, who lives near James’s old hometown of Port of Spain, Trinidad. In addition to transforming CRB from a quarterly print magazine to an online journal, he is co-editor of the poetry magazine Town and an administrator of Alice Yard, a small contemporary arts space in Port of Spain.

Laughlin is the editor of Letters from London by C.L.R. James (University Press of New England, 2003) and of the revised, expanded, and re-annotated edition of Letters Between a Father and Son by V.S. Naipaul (Pan Macmillan, 2009). I reviewed the earlier book some years ago, and have had the occasional brief dialogue with him by e-mail in the meantime. Following the events of the past two weeks, we had a much more substantial discussion – one touching on the history and politics of the Caribbean, and how its cultural institutions (academic and otherwise) fit into the “economy of attention” of the 21st century.

A transcript of that exchange follows. Some of Laughlin’s spelling has been Americanized, for I have yielded to the cultural imperialism of WordPerfect.

Q: You know how the disaster in Haiti is being discussed by the mass media here. What can you say about how it is being framed within the Caribbean?

A: The Caribbean is so various, it's hard to generalize. I don't really know how recent events in Haiti are being framed in the Hispano- and Francophone Caribbean. Within the Anglophone Caribbean, responses vary from country to country or island to island. In the Bahamas – just north of Haiti, where there are significant numbers of Haitian immigrants – there's been concern about being swamped by refugees. My colleague Nicolette Bethel – anthropologist, playwright, theatre director, and editor of the online literary journal tongues of the ocean – has criticized the way the earthquake has been reported in the Bahamian press, even as many Bahamians have thrown themselves into organizing relief efforts.

In Trinidad, on the other hand, at the opposite end of the Caribbean, there's been some anger about the way the government has responded, i.e., with what looks to many of us like faint concern. It took our prime minister nearly a full day to make any kind of statement about the earthquake, in the form of off-the-cuff remarks to the press.

Of course many of us in the Caribbean have CNN, the BBC, and the U.S. networks on cable, and read the international papers online. Where there's been local reporting, it's mostly focused on local angles – the Jamaican press gave lots of coverage to the visit their prime minister made to Haiti last week, and here in Trinidad there have been several stories on Trinidadians who happened to be in Haiti during the earthquake.

As with the media anywhere, the media here "like" stories of chaos and mayhem. So there's been ample coverage, via wire service stories, of looting, machete-wielding gangs, street violence, etc., even though there are many people on the ground in Haiti who say that violent incidents have been rare and very localized, and there has been extraordinary cooperation among displaced Haitians – and the whole issue of "looting" needs serious deconstruction.

Q: Yes, it’s quite similar to how the coverage of the disaster in New Orleans unfolded in the American media just after Katrina. Do you notice anything distinctive about the Caribbean discussion of the crisis now?

A: I think there is wider awareness in the Caribbean (than in the U.S., say) of some of the historical circumstances that contributed to the present crisis – crippling and unjust debt, meddling by foreign powers, and so on. I'm pretty sure that the Haitian Revolution and its aftermath, including the "reparations" payments to the French (by now we can all quote the amounts by heart) are on the secondary schools history syllabus in Trinidad.

I've read a few incisive op-ed pieces in the local press – by the historian Hilary Beckles in Barbados, the activist Raffique Shah in Trinidad and Tobago, the literary scholar Carolyn Cooper in Jamaica, and I'm sure I've missed others – that explain Haiti's history of being bullied by wealthier countries with much bigger armies. Certainly among what you might call the Caribbean's intellectual elite, for want of a better term, there's a definite sense of the wider Caribbean's moral debt to Haiti. We've almost all read at least some of The Black Jacobins.

Still, the rhetoric of "failure," the idea that Haiti is "unlucky," has a foothold in the discourse. I got into a sort of argument the other day, on Facebook, with a friend who was riffing off the Senegalese president's offer to "repatriate" Haitians. This friend suggested, no doubt as a kind of deliberately absurd thought experiment, that the "seemingly interminable problem" of Haiti would be solved by permanently evacuating the whole country – resettling all nine or ten million people elsewhere. Even among well-educated and well-meaning people, the idea of Haiti as a "problem" is entrenched.

But another friend who entered the conversation said something that struck and moved me. She said she was appalled by her own attitudes towards Haiti – meaning, I think, that the horrors of the earthquake and its aftermath had forced her to confront her own unconscious prejudices and ignorance. I feel the same, and there seems to be a wider sense here that Caribbean citizens must take some blame for Haiti's troubles in recent decades. We haven't been interested enough, haven't pressured our politicians enough, haven't bothered to try to understand. Many friends and colleagues seem to share an awareness that real recovery for Haiti means meaningful involvement by Caribbean citizens, and a still-unfocused resolve to be a part of that. I hope we stick to our guns.

Q: The possibility of pan-Caribbean citizenship is familiar from C.L.R. James’s writings. It was something he saw as necessary and urgent. But it sounds like there hasn’t been much progress on that front in the two decades since his death. Why is that?

A: Individual Caribbean countries have much in common, of course, but there are real knowledge gaps separating us and very real prejudices behind the facade of solidarity that we generally like to put forward. At the best of times there are strong prejudices against the region's less wealthy nations. In the southern Caribbean, that means Guyana, and Haiti seems to fill that role further north. If some Bahamians are worried about being overrun by Haitians, some Barbadians feel the same way about Guyanese, and Trinidadians have long been suspicious of "small islanders" wanting to settle here.

It's more acute when it comes to Haiti – not only is it a very "poor" and "undeveloped" country, but it has a reputation for violence, HIV, and voodoo. Never mind that violent crime and HIV infection rates are rising everywhere in the region, and every Caribbean territory has one or more versions of a syncretic religion combining elements of belief and practice from West Africa, Christianity, and sometimes other traditions.

Q: As editor of The Caribbean Review of Books, you are in a good position to assess the literary and intellectual traffic within the region, and between the Caribbean and the rest of the Atlantic. Would you say something about this?

A: I often think I'd be in a better position to, as you put it, "assess the literary and intellectual traffic" if I lived not in Port of Spain but in New York or London or Toronto or even Miami. I'm also pretty sure it would be easier to publish a magazine like the CRB in one of those places. It would probably be easier to secure grant funding, and the magazine would be physically closer to a critical mass of potential readers.

One of the hot concepts in Caribbean academic circles these days is the "transnational" Caribbean. It can mean different things. The positive spin is the notion of the Caribbean not as a physical region but as a cultural or social phenomenon – a space, not a place – that includes the major Caribbean populations in metropolitan centers like the ones I listed above. So we can claim Brooklyn and Brixton as "ours," and we quote the late Jamaican poet Louise Bennett about "colonizing in reverse."

On the one hand, this notion of the transnational is simply descriptive. There are millions of Caribbean immigrants and their immediate descendants in the U.S., Canada, the U.K. and elsewhere, and there is certainly a sense in which they continue to be Caribbean, participate in conversations about Caribbean society, contribute to Caribbean economies (through remittances), etc. On the other hand, and especially from the point of view of someone who actually does live here, it sometimes seems like wishful thinking, or at least like an attempt to put a hopeful face on a situation that is not so hopeful. Colonizing in reverse, or brain drain?

The last census in Guyana suggested some shockingly high percentage of Guyanese of my generation with a secondary education now live abroad – over 80 percent. Anecdotally, I can say that about half of my graduating class at secondary school (one of Trinidad's "elite" schools) is now abroad.

This isn't a new phenomenon, of course. Going abroad for education or to expand intellectual possibilities has been part of the standard narrative of Caribbean intellectual life at least as far back as 1932, when James left Trinidad. It's widely held that West Indian literature suddenly sprang into existence in the 1950s when various aspiring writers from different British West Indian territories went to London, discovered common cultural elements, and found an audience for their work via the BBC's Caribbean Voices program and postwar publishers with a taste for exotica from the colonies. (Though that narrative is now disputed by some younger Caribbean lit scholars working on earlier writers and texts.)

There was a moment in the late '60s when it seemed the center of intellectual gravity might shift back to the Caribbean itself, but it didn't take long for post-Independence disillusion to set in. It's very moving but also puzzling for me to read a book like Andrew Salkey's Georgetown Journal (1972), set just at that moment when Independence optimism was beginning to tremble.

Q: I asked about this without thinking about how the cultural history would overlap with your own personal experience. Would you say a little more about that?

A: I came of age in the 1980s, which with adult hindsight I can see was a very pessimistic time for Caribbean people of my parents' generation, but I remember as a schoolchild thinking that people who "went away to live" were specially lucky, even if it was an eventuality I couldn't imagine for myself. Had I gone to university abroad, it's likely I wouldn't have come back to Trinidad, not to live. I still can't decide whether that would have been a better thing.

Having reached my mid-30s, having never lived anywhere else, I'm now fairly certain I'll stay here. But that's something I still think about often – almost every time I travel to the U.S. or Britain, I spend a good chunk of my time trying to imagine an alternative life there. I think that for many Caribbean people of my generation and approximate background – middle class, relatively well-educated – the question of going or staying remains acute.

Sitting here in Diego Martin, west of Port of Spain, it seems to me that in 2010 the literary and intellectual traffic within the Caribbean – and between the region and North America and Europe – is still directed mainly by agents physically located outside the Caribbean itself. Most of our intellectuals and writers are elsewhere. Almost all our books are published elsewhere. There are only two publishers of consequence in the Anglophone Caribbean – both based in Jamaica, both quite small. The main intellectual journal of the Anglophone Caribbean, Small Axe, is based in New York.

Most serious contemporary Caribbean artists either live abroad or depend heavily on financial support from abroad via grants, residencies, etc. Many if not most intellectual or cultural initiatives in the Caribbean similarly depend on financial support from abroad. What's kept the CRB going in the past couple years is a grant from the Prince Claus Fund in the Netherlands. And the audiences for all of these are also now, in the main, in North America and Europe. In the money economy as well as the attention economy, we still depend on investment and, frankly, charity from elsewhere.

Q: What role does academe play in all of this?

A: It's true that indigenous institutions like the University of the West Indies do a great deal to promote conversations between separate territories, and the three campuses are relatively important centers of activity. But the university faculty are vastly outnumbered by Caribbean scholars in the academy abroad who are inevitably better funded and better positioned to insert themselves into essential debates in their disciplines.

It's terribly revealing that the theme of the Caribbean Studies Association's 2009 conference, held in Jamaica, was "Centering the Caribbean in Caribbean Studies." You can read the phrase in more ways than one, but my interpretation is: “bringing the Caribbean back to the center of Caribbean studies.” Well, where else was it?

I don't mean to set up a binary opposition between here and there, local and diaspora, us and them, because of course the reality is far more complex. There is conversation and exchange and movement between all these nodes, and they are often fruitful. But aspects of the situation are depressing. For the better part of five centuries the Caribbean was devoted to producing raw materials to enrich already wealthy countries further north. Now sometimes it feels like we're producing cultural raw materials to be turned into books, films, lectures, etc. by intellectual agents in New York or London or Toronto.

Q: Is there a silver lining to contemporary developments?

A: When I (infrequently) attend academic conferences or meetings in the Caribbean, like a stuck record I implore the assembled scholars to make more strategic use of the web to share their research, to make it more widely available. I remind them that many of us in the Caribbean don't have easy access to research libraries or online journal subscriptions. In the age of WordPress and Blogger, when anyone who can use a word processor can also set up a website, there's no excuse.

One of the interesting and encouraging developments in the Trinidad art scene in the past year or so has been the rapid flourishing of artists' blogs. The writers will follow close behind, I hope. There is no serious engagement with visual art in the press here – no real reviewing, no professional critics, and commercial galleries are generally highly conservative and mercenary. So younger artists are increasingly creating work to share online, and using their blogs and websites to document their practice and comment on the work of their peers. It's early yet, and if there's a conversation going on, it's still happening within a small circle, but the odd international curator has peeked or poked in.

Some of my friends and colleagues in the art scene here have been energized and encouraged by this development. It may fizzle out, or these small individual initiatives may coalesce. In the past year or two, I've been more involved in, and paid more attention to, the Caribbean visual art scene than to the literary scene – partly because that's where the energy seems to be, partly because Caribbean visual images seem to be doing better than Caribbean literary texts in the economy of attention.

Q: That seems like a useful expression – “the economy of attention.” Clearly it is bound up, in all sorts of complicated ways, with economics in the more familiar sense. But it’s also political....

A: At the moment everything going through my head is colored by the fact of Haiti. Who gets to decide what help Haiti needs and how to rebuild? I'm not sure Haitians will. Who gets to decide what contemporary Caribbean literature is? Publishers in New York and London and literary scholars in American, British, and Canadian universities. Those two questions aren't comparable in degree, but are bound together in a common dilemma.

Scott McLemee (scott.mclemee@insidehighered.com)

In the American Grain

Howard Zinn -- whose A People’s History of the United States, first published by Harper & Row in 1980, has sold some two million copies -- died last week at the age of 87. His passing has inspired numerous tributes to his role in bringing a radical, pacifist perspective on American history to a wide audience.

It has also provoked denunciations of Zinn as “un-American,” which seems both predictable and entirely to his credit. One of Zinn’s lessons was that protest is a deeply American inclination. The thought is unbearable in some quarters.

One of the most affectionate tributes came from the sports writer Dave Zirin. As with many other readers, he found that reading Zinn changed his whole sense of why you would even want to study the past. “When I was 17 and picked up a dog-eared copy of Zinn's book,” he writes, “I thought history was about learning that the Magna Carta was signed in 1215. I couldn't tell you what the Magna Carta was, but I knew it was signed in 1215. Howard took this history of great men in powdered wigs and turned it on its pompous head.” Zirin went on to write A People’s History of Sports (New Press, 2008), which is Zinnian down to its cells.

Another noteworthy commentary comes from Christopher Phelps, an intellectual historian now in the American and Canadian studies program at the University of Nottingham. He assesses Zinn as a kind of existentialist whose perspective was shaped by the experience of the civil rights struggle. (He had joined the movement in the 1950s as a young professor at Spelman College, a historically black institution in Atlanta.)

An existentialist sensibility -- the tendency to think in terms of radical commitment, of decision making as a matter of courage in the face of the Absurd -- was common to activists of his generation. That Phelps can hear the lingering accent in Zinn’s later work is evidence of a good ear.

Zinn “challenged national pieties and encouraged critical reflection on received wisdom,” writes Phelps. “He understood that America’s various radicalisms, far from being ‘un-American,’ have propelled the nation toward more humane and democratic arrangements.... He urged others to seek in the past the inspiration to dispel resignation, demoralization, and deference, the foundations of inertia. The past meant nothing, he argued, if severed from present and future.”

I've spent less time reading the fulminations against Zinn, but they seem like backhanded honors. When a historian known for saying good things about the Fascists who won the Spanish Civil War considers it necessary to denounce somebody, that person’s life has been well-spent.

Others have claimed that Zinn did not sufficiently denounce Stalinism and its ilk. The earliest example of the complaint that I know came in a review of People’s History that appeared in The American Scholar in 1980, when that magazine was a cultural suburb of the neoconservative movement. The charge has been recycled since Zinn’s death.

This is thrifty. It is also intellectually dishonest. For what is most offensive about Zinn (to those who find him so) is that he held both the United States and the Soviet Union to the same standard. He even dared to suggest that they were in the grip of a similar dynamic.

“Expansionism,” he wrote in an essay from 1970, “with its accompanying excuses, seems to be a constant characteristic of the nation-state, whether liberal or conservative, socialist or capitalist. I am not trying to argue that the liberal-democratic state is especially culpable, only that it is not less so than other nations. Russian expansionism into Eastern Europe, the Chinese moving into Tibet and battling with India over border territories -- seem as belligerent as the pushings of that earlier revolutionary upstart, the United States.... Socialism and liberalism both have advantages over feudal monarchies in their ability to throw a benign light over vicious actions.”

Given certain cretinizing trends in recent American political discourse, it bears stressing that Zinn here uses “liberalism” and “socialism” as antonyms. A liberal supports individual rights in a market economy. By any rigorous definition, Sarah Palin is a liberal. And so, of course, is Barack Obama, who can only be called a “socialist” by an abuse of language. (But such abuse is an industry now, and I feel like Sisyphus just for complaining about it.)

The most substantial critique of A People’s History remains the review by Michael Kazin that appeared in Dissent in 2004. Kazin’s polemic seems to me too stringent by half. Zinn's book is not offered as the last word on the history of the United States, but as a corrective to dominant trends. It is meant to be part of an education, rather than the totality of it.

But Kazin does make points sometimes acknowledged even by the book’s admirers: “Zinn reduces the past to a Manichean fable and makes no serious attempt to address the biggest question a leftist can ask about U.S. history: why have most Americans accepted the legitimacy of the capitalist republic in which they live?”

That is indeed the elephant in the room. Coercion has certainly been a factor in preserving the established order, but persuasion and consent have usually played the greater part. Any American leftist who came of age after Antonio Gramsci’s work began to be assimilated is bound to consider hegemony a starting point for discussion, rather than an afterthought.

But Zinn was the product of an earlier moment -- one for which the stark question of commitment had priority. A strategic map of the political landscape was less urgent than knowing that you stood at a crossroads. You either joined the civil rights struggle or you didn’t. You were fighting against nuclear proliferation or the Vietnam War, or you were going along with them. It is possible to avoid recognizing such alternatives -- though you do end up making the choice between them, one way or the other.

There were subtler interpretations of American history than Howard Zinn’s. Anyone whose understanding of the past begins and ends with it has mistaken taking a vitamin for consuming a meal. But that does not make it worthless. The appreciation of complexity is a virtue, but there are times when a moment of clarity is worth something, too.

Scott McLemee (scott.mclemee@insidehighered.com)

The Good Reader

I fell in love with him first over the phone. A man older than my father, a man I hadn’t yet met in person, a man about whom I knew little except that he was kind to me, but someone who was different in obvious and profound ways from the people I encountered every day.

In 1984 I had just started a full-time job as an editorial assistant. I was fortunate to be working for an editor who saw it as part of her mission to educate me. Susan would invite me to sit in on her hectoring conversations with authors, where she would tell them, harsh and didactic in both her tone and her language, exactly what was wrong with their thinking. She would explain to me how manuscripts get shaped into books, what the expectations of a reader are and how the author can’t afford to frustrate them. She talked a lot about fairness. She was the first proud Reaganite I ever thought was smart.

A part of my job, after I’d finished typing up correspondence, preparing projects to go into production, trafficking copy and marketing materials, and answering the phone, was to line up readers to report on manuscripts. This was just as Oxford was starting to publish scholarly books, and it was an easy way to build a list. Peer review counted more than an editor’s good judgment and took less time. Susan had only to look quickly at the first few pages and decide whether or not a project was worthy of being “sent out.” She’d give me a list of names, or tell me to call one person (usually an author) and ask for suggestions. This was one of my least favorite parts of the job. You’d have to make a zillion phone calls, and leave hundreds of messages. Sometimes academics were nice, but they were rarely in their offices. They could be mean or pompous, and would sometimes lecture me on the manuscript they hadn’t yet read. I imagine that now, with e-mail, things are a whole lot easier.

Sometimes I’d strike out so many times I would end up with a reader who wouldn’t be a natural extension of the scholars on the original list, which is probably how it happened, because now I can’t believe that Susan would ever have asked me to call him as a reader. He was surprised at being asked to do something for Oxford. But he talked to me in a way that other readers -- busy, name-brand academics -- didn’t.

He wondered what it was like to do my job. (I loved it. Really? he said. You love your work?) Where had I come from? (College -- I’d started working at Oxford one day a week before I’d even graduated. He asked which college, and then didn’t say much.) Where did I live? (Manhattan. He’d grown up, he said, in Brooklyn. His accent, in fact, reminded me of my grandparents who still lived there.) Where had I grown up? (In the boonies of upstate New York.) My parents’ jobs? (My father was a bitter, third-rate academic at a state university.) Future plans? (No plans -- I took this job instead of going for my dream, working on a dude ranch in Wyoming. Too bad, he said, that my dream hadn’t quite worked out. Yet.)

I can’t remember what manuscript I’d sent him, or what he sent back in his reader’s report. I do remember that Susan was surprised to see a report from him and to hear that we’d had a long talk. And I remember that not long after, he called to say that he was going to be in New York City and wanted to come by the office. Come by, I said.

He wasn’t anything like what I expected. He was tall, very tall, and athletic-looking, with a strong face, and salty dark hair. He was handsome in ways that I didn’t think a sixty-year-old man could ever be.

We sat in my office and chatted. He told me about his kids -- there were two, both older than me. Mostly, though, he wanted to know about me.

How much is there for a 22-year-old girl to say about herself? I talked about my work. I talked about how this job was like being in graduate school. I got to learn not only about publishing, but about a whole bunch of academic fields I hadn’t gone deep into in college, where I read mostly English literature, a little bit of criticism, and dabbled in philosophy. Now I was immersed in political science, sociology and, when Susan could sneak it in, history. Oxford was then divided -- in divisive and rancorous ways -- by traditional disciplinary lines and editorial jealousies, and fears of “poaching” buzzed like flies in the hallways.

We editorial assistants had our own sources of strife. Some had to work for bosses who were, if I’m to be honest, bat-shit crazy. Others were chained to their desks, barely allowed to leave the building for lunch. Most didn’t get the kind of author contact I was allowed, because most editors didn’t share their jobs as fully as Susan did. I got to go to important meetings while my friends were typing. I went to lunches at fancy expense-account restaurants as long as I got the Xeroxing done.

I told him about the opportunities being handed to me, the fact that I didn’t even know, most of the time, when I was talking to people who were famous in their fields. I was paid to read, I told him, to learn. It was a great job.

Later, he would confess that he told his students about me -- that I was one of the few people he knew who was truly happy with her work. Now I don’t know whether to believe this; I suspect he said it to make me feel good and that he knew that someday I would realize that things were not so simple.

We began having lunches. I flexed a fledgling expense account to take him out when he came to New York, and I made time to see him on the occasions when I got to travel to Boston. I would always ask what he was working on, acting like a big girl editor. After my first year at OUP I left Susan to work for the American history editor, so this would have made sense, but it didn’t interest him in the least. His work was different from most of Oxford’s list; in many ways, I later realized, we were the mainstream he was reacting against.

Once he called me up to say that he had written a play about Emma Goldman; it was being produced in New York, directed by his son. He gave me the information and I said I would go. He told me to introduce myself to his son. I went to the play -- enraptured by the fact of knowing the playwright -- but was too shy to track down his son, who was tall and handsome like his father.

One day, over lunch in Boston, I grilled him and made him tell me his story. It’s a familiar one, by now: He was the son of Jewish immigrants, and began his career as a rabble-rouser at age 17. At that point I had moved to Brooklyn, and he talked about the Brooklyn Navy Yard, about meeting Roslyn, his wife. He talked about joining the Army to fight on the side of good against evil. “I was a bombardier,” he said. He said it twice, as if he could hardly believe it. We were eating lunch, maybe at Legal Sea Foods in Cambridge, and I knew he was telling me something important. He told me about the box he’d stuffed all his army belongings into -- his medals, his papers -- and that he’d written “Never again” across the top.

He told me about teaching at Spelman College and his work as part of the civil rights movement. He told me about getting fired.

And then he talked about Boston University and about John Silber. He said that he taught the biggest course at the university and that he wasn’t allowed any teaching assistants. The president had offered him some, he said, if he cut down his class size. Way down. He was on the eve of retirement, but said that he wanted to stick around just to irritate Silber.

At that point, I hadn’t read A People’s History of the United States. I knew Howard Zinn only as a professor who had read a manuscript for me and become, unexpectedly, my friend. And then I read him, and fell in love with him in myriad other ways. For his bravery. For his lucidity. And of course, for the generosity and authenticity of his vision. I have met labor historians who have no truck with laborers; defenders of social justice who say racist and sexist and plain old bigoted things after a cocktail or two.

There are many others who can talk about how Howard Zinn changed not only their lives, but the world. I am well aware of my good fortune. When I needed the figure of a good father, someone wise and kind, challenging and encouraging, I did a slipshod job at work and called him to read a manuscript.

Rachel Toor

Rachel Toor teaches creative writing in the MFA program at Eastern Washington University, in Spokane.

Everywhere and Nowhere

We seem to be in the midst of a religious revival. At least that seems true within higher education, and especially within our own field of American history. According to a recent report from the American Historical Association (written about in Inside Higher Ed), religion now tops the list of specializations claimed by historians.

This renaissance bodes well for a discipline that has more or less forgotten about or tended to marginalize religion, especially when examining modern America (typically defined as anything after 1865). Even today, religion is everywhere around us, and religious historians have written about it in compelling and exciting ways, yet mainstream historiography has basically left it behind. In a sense, religion is everywhere in modern American history, but nowhere in modern American historiography.

To illustrate the point, Jon Butler’s 2004 article for the Journal of American History analyzed American history textbooks to see if religion was present in their pages. He found that religion was omnipresent in the telling of early American history (before 1865), but after the Civil War it appeared only episodically, "as a jack-in-the-box," popping up "colorfully" here and there, then disappearing, "momentary, idiosyncratic thrustings up of impulses from a more distant American past or as foils for a more persistent secular history."

Butler is not alone in noticing this shortcoming. Robert Orsi, an eminent historian of American everyday religious practice, recently suggested that historians have failed to grasp what has been going on within the subdiscipline of religious history. And David A. Hollinger, an intellectual historian and the incoming president of the Organization of American Historians, has urged historians to study religion, not for the sake of advocacy, but because of the extensive gap between intellectuals and the rest of the population. His articles have carried such urgent titles as "Jesus Matters in the USA" and "Why Is There So Much Christianity in the United States?"

There are several possible explanations for this everywhere/nowhere disjunction, among them the rise of social history, the disconnect in professed religious beliefs between academics and other Americans, and the widespread recognition of America’s religious pluralism, which has forced our institutions to become increasingly secular.

But we would like to suggest some ways in which religious historians have attempted to fuse their stories into the mainstream narrative. A good example to begin with (because its timing accords perfectly with religion’s historiographical absence) might be Edward Blum’s award-winning book Reforging the White Republic: Race, Religion, and American Nationalism, 1865-1898 (2005). When explaining the demise of the cause of equality that was so prominent in the Civil War, Blum lays blame directly on American religious institutions and their leaders. Blum shows that many if not most of the narratives of reconciliation that emerged in the 1870s embraced Christian images of reunion, a Messianic notion of coming together again and working on the great American project, all at the expense of African Americans. And Northern ministers led the way. The capstone moment of D. W. Griffith’s Birth of a Nation (1915), which presents the triumph of the Klan as the "birth of the nation," shows Jesus’ face hovering over the scene. It was Protestant ministers and activists, Blum shows, who midwifed the end of Reconstruction and the rise of a united, white Christian America, aggressively on the prowl for territorial conquests.

During the progressive era of the early 20th century, even as many American institutions were secularizing, religion marked many aspects of social life. Clifford Putney’s study of recreational and professional sports from 1880 to 1920 put a Muscular Christianity, as he titled his 2001 book, at the center of Victorian manhood. A revitalized and reformed Protestantism, based in no small part on excluding new immigrants, helped to recreate the notion of Victorian manhood. Religion was the key. William J. Baker has updated this story for our own times in Playing with God: Religion and Modern Sport (2007), which affirms Putney’s chronology: muscular Christianity emerged out of a turn-of-the-twentieth-century need to redefine manhood in the Industrial Age.

For the interwar years, Matthew Avery Sutton’s new biography, Aimee Semple McPherson and the Resurrection of Christian America (2007), portrays McPherson’s deep influence in the period (and also, therefore, conservative Protestantism’s deep influence too). Sutton argues that McPherson was among the first in the modern era to unite conservative Christianity, American nationalism, and a political sensibility favoring the fantastic. Although she has been derided as a sexualized simpleton who quickly faded from the scene, Sutton portrays her as the forefather (mother?) of today’s Religious Right. Her style of publicly and personally sensational politics created a model that would be picked up several decades later by the likes of Jimmy Swaggart, Pat Robertson, and Jim and Tammy Faye Bakker. In her day, she was everywhere, but today she hardly appears at all in the mainstream narrative of American history, despite her importance in laying the groundwork for the rise of the Religious Right.

During the post-World War II period, the United States experienced a religious revival of sorts as well, although one that was unusual in American history because it was a revival not just for Protestants, but for Roman Catholics and Jews. Catholics and Jews took advantage of the anti-fascist rhetoric of World War II and the Cold War in order to combat any lingering connections between Protestantism and American nationalism. Instead, they articulated the idea that the state should be neutral in handling religious affairs, whether in the U.S. Census or in the realm of public education. In this way, religion sits at the root of today’s multicultural struggles, where differences are to be recognized and even championed, but never prioritized by the state. Interestingly, these ways of managing pluralism were worked out when religious groups, not racial, ethnic, or gender groups, were the primary provocateurs.

There are many more examples of these acts of incorporation. But despite all this recent work, our general thesis that religion has been everywhere in history but nowhere in historiography has two major exceptions: in historical works on the civil rights movement and the religious right. When it comes to civil rights historiography, religious interpretations have vitally influenced scholarship; indeed, those who downplay the influence of religion tend to be the “heretics,” rather than the other way around. Meanwhile, we now have a small library of books on contemporary figures of the Religious Right, from Jerry Falwell to James Dobson to Phyllis Schlafly.

Noting these two exceptions raises important questions. For example, since these are two groups that have been historically racialized and/or marginalized, does that make it “safer” to incorporate religion more centrally into their intellectual trajectories? And to what degree do they influence the mainstream narrative? In other words, when we move from the mainstream to the margins, does it become safer to introduce religion as a central actor in people’s lives? And if so, will that scholarship focusing on the margins find its way into the mainstream narratives? The almost complete absence of religion from David M. Kennedy’s Freedom from Fear and James Patterson’s Grand Expectations, the two Oxford History of the United States volumes covering the period from 1932 to 1974, provides just cause for such reflection.

Meanwhile, religion continues to influence and shape Americans’ lives. The much-publicized "U.S. Religious Landscape Survey" (2008) from the Pew Forum on Religion and Public Life provides some startling data. First, the survey found that almost 1 out of every 10 Americans is an ex-Catholic. For the past 100 years, Catholics have made up, and still make up, about 25 percent of the population, but that stability in recent years is explicable only by the large numbers of immigrants, mostly Hispanic, who have come to the United States since 1965, when the United States loosened its immigration laws. Second, in terms of population, the United States is still not “Abrahamic” or even “Judeo-Christian,” if it ever was. Jews make up 1.7 percent of the population, while no other non-Christian religion constitutes more than 1 percent. Meanwhile, nearly 80 percent of Americans consider themselves to be some variety of Christian. And finally, the largest growth area in people’s religious identification lies in the category of “uncommitted,” now amounting to about 14 percent of the population, according to the Pew survey. That figure varies vastly by region: "uncommitted" makes up a sizable portion of the Pacific Northwest, but barely registers as a religious alternative in the Deep South.

Other highlights from the Pew survey include the fact that there are more Buddhists than Muslims in America, that there are almost as many atheists as Jews (and more agnostics), that more than a quarter (28 percent) of all Americans have left the grand faith tradition into which they were born, and that nearly half (44 percent) have left the faith of their birth or switched denominations at some point in their lives. The survey thus emphasizes that the structure of faith in America is an amorphous thing, constantly changing, influencing people’s lives in new and dynamic and important ways. And religious historians have been busy tracing religion’s dynamism in modern American history.

If only more historians would care. Perhaps our discipline’s “religious revival” will help make it so.

Kevin M. Schultz and Paul Harvey

Kevin M. Schultz is assistant professor of history at the University of Illinois at Chicago. Paul Harvey is associate professor of history at the University of Colorado at Colorado Springs. This essay is adapted from a forthcoming article in the Journal of the American Academy of Religion.

Andy Warhol, Then and Now

In two weeks, the National Book Critics Circle will vote on this year’s awards, and so, of late, I am reading until my eyes bleed. Well, not literally. At least, not yet. But it is a constant reminder of one's limits -- especially the limits of the brain's plasticity. The ability to absorb new impressions is not limitless.

But one passage in Edmund White’s City Boy: My Life in New York During the 1960s and ‘70s (a finalist in the memoir category, published by Bloomsbury) did leave a trace, and it seems worth passing along. The author is a prominent gay novelist who was a founding member of the New York Institute for the Humanities. One generation’s gossip is the next one’s cultural history, and White has recorded plenty that others might prefer to forget. City Boy will be remembered in particular for its chapter on Susan Sontag. White says that it is unfortunate she did not win the Nobel Prize, because then she would have been nicer to people.

But the lines that have stayed with me appear earlier in the book, as White reflects on the cultural shift underway in New York during the 1960s. The old order of modernist high seriousness was not quite over; the new era of Pop Art and Sontag's "new sensibility" had barely begun.

White stood on the fault line:

"I still idolized difficult modern poets such as Ezra Pound and Wallace Stevens," he writes, "and I listened with uncomprehending seriousness to the music of Schoenberg. Later I would learn to pick and choose my idiosyncratic way through the ranks of canonical writers, composer, artists, and filmmakers, but in my twenties I still had an unquestioning admiration for the Great -- who were Great precisely because they were Great. Only later would I begin to see the selling of high art as just one more form of commercialism. In my twenties if even a tenth reading of Mallarmé failed to yield up its treasures, the fault was mine, not his. If my eyes swooned shut while I read The Sweet Cheat Gone, Proust's pacing was never called into question, just my intelligence and dedication and sensitivity. And I still entertain those sacralizing preconceptions about high art. I still admire what is difficult, though I now recognize it's a 'period' taste and that my generation was the last to give a damn. Though we were atheists, we were, strangely enough, preparing ourselves for God's great Quiz Show; we had to know everything because we were convinced we would be tested on it -- in our next life."

This is a bit overstated. Young writers at a blog like The New Inquiry share something of that " 'period' taste," for example. Here and there, it seems, "sacralizing preconceptions about high art" have survived, despite inhospitable circumstances.

White's comments caught my bloodshot eye because I had been thinking about Arthur C. Danto's short book Andy Warhol, published late last year by Yale University Press. (It is not among the finalists for the NBCC award in criticism, which now looks, to my bloodshot eye, like an unfortunate oversight.)

It was in his article “The Artworld,” published in The Journal of Philosophy in 1964, that Danto singled out for attention the stack of Brillo boxes that Warhol had produced in his studio and displayed in a gallery in New York. Danto maintained that this was a decisive event in aesthetic history: a moment when questions about what constituted a piece of art (mimesis? beauty? uniqueness?) were posed in a new way. Danto, who is now professor emeritus of philosophy at Columbia University, has never backed down from this position. He has subsequently called Warhol “the nearest thing to a philosophical genius the history of art has produced.”

It is easy to imagine Warhol's response to this, assuming he ever saw The Journal of Philosophy: “Wow. That’s really great.”

Danto's assessment must be distinguished from other expressions of enthusiasm for Warhol's work at the time. One critic assumed that Warhol's affectlessness was inspired by a profound appreciation for Brecht’s alienation effect; others saw his paintings as a radical challenge to consumerism and mass uniformity.

This was pretty wide of the mark. The evidence suggests that Warhol’s work was far more celebratory than critical. He painted Campbell’s soup cans because he ate Campbell’s soup. He created giant images based on sensational news photos of car crashes and acts of violence -- but this was not a complaint about cultural rubbernecking. Warhol simply put it into a new context (the art gallery), a place where people would otherwise pretend such things did not exist.

“He represented the world that Americans lived in,” writes Danto in his book, “by holding up a mirror to it, so that they could see themselves in its reflection. It was a world that was largely predictable through its repetitions, one day like another, but that orderliness could be dashed to pieces by crashes and outbreaks that are our nightmares: accidents and unforeseen dangers that make the evening news and then, except for those immediately affected by them, get replaced by other horrors that the newspapers are glad to illustrate with images of torn bodies and shattered lives.... In his own way, Andy did for American society what Norman Rockwell had done.”

It seems like an anomalous take on an artist whose body of work also includes films in which drag queens inject themselves with amphetamines. But I think Danto is on to something. In Warhol, he finds an artistic figure who fused conceptual experimentation with unabashed mimeticism. His work portrays a recognizable world. And Warhol’s sensibility would never think to change or challenge any of it.

Chance favors the prepared mind. While writing this column, I happened to look over a few issues of The Rag, one of the original underground newspapers of the 1960s, published in Austin by students at the University of Texas. (It lasted until 1977.) The second issue, dated October 17, 1966, has a lead article about the struggles of the Sexual Freedom League. The back cover announces that the Thirteenth Floor Elevators had just recorded their first album in Dallas the week before. And inside, there is a discussion of Andy Warhol’s cinema by one Thorne Dreyer, who is identified, on the masthead, not as the Rag’s editor but as its “funnel.”

The article opens with an account of a recent showing of the 35-minute Warhol film “Blow Job” at another university. The titular action is all off-screen. Warhol's camera records only the facial expressions of the recipient. Well before the happy ending, a member of the audience stood up and yelled, “We came to get a blow job and we ended up getting screwed.” (This anecdote seems to have passed into the Warhol lore. I have seen it repeated in various places, though Danto instead mentions the viewers who began singing “He shall never come” to the tune of the civil-rights anthem.)

Dreyer goes on to discuss the recent screening at UT of another Warhol film, which consisted of members of the artist's entourage hanging out and acting silly. The reviewer calls it “mediocrity for mediocrity’s sake.” He then provides an interpretation of Warhol that I copy into the digital record for its interest as an example of the contemporary response to his desacralizing efforts -- and for its utterly un-Danto-esque assessment of the artist's philosophical implications.

“Warhol’s message is nihilism," writes Dreyer. "Man in his social relations, when analyzed in the light of pure objectivity and cold intellectualism, is ridiculous (not absurd). And existence is chaos. But what is this ‘objectivity’? How does one obtain it? By not editing his film and thus creating ‘real time’? By boring the viewer into some sort of ‘realization’? But then, is not ‘objectivity’ just as arbitrary and artificial a category as any other? Warhol suggests there is a void. He fills it with emptiness. At least he is pure. He doesn’t cloud the issue with aesthetics.”

And so the piece ends. I doubt a copy ever reached Warhol. It is not hard to imagine how he would have responded, though: “It gives me something to do.” The line between nihilism and affirmation could be awfully thin when Warhol drew it.

Scott McLemee

The American Jitters

Morris Dickstein's Dancing in the Dark: A Cultural History of the Great Depression, published last year by W.W. Norton, is one of five finalists for the National Book Critics Circle award in criticism. The author, a professor of English and theater at CUNY Graduate Center, has written and edited numerous other works of literary and cultural analysis. His explorations of American literature, films, and music of the "long decade" between 1929 and 1941 seem to be written in an almost classical mode -- as if he were simultaneously channeling the major figures assayed in his Double Agent: The Critic and Society (Oxford, 1992).

My short discussion of Dancing in the Dark recently appeared at the website of the National Book Critics Circle. This was written as part of my duties as a member of the NBCC board, but doing so was no burden; this is a book to inspire enthusiasm. So without further ado, here follows the transcript of an e-mail interview with its author. The winners of the NBCC awards will be announced during a ceremony at the New School University in New York City on Thursday night.

Q: What was it like to find that a project you'd been working on for years suddenly turned out to be all too timely?

A: I actually began working on it in the Reagan era, when it was timely, since Reagan set about upending the New Deal consensus in many ways, especially about the role of government in our lives and about our need, beautifully articulated in FDR's second inaugural address, to take collective responsibility for each other, and especially for the worst off among us. Reagan gave a license to self-seeking that was a throwback to the Harding-Coolidge era, and he also dealt a severe and lasting blow to unions when he broke the air traffic controllers' strike.

The book was also timely because every year, almost every month of the 1980s brought the 50th anniversary of some New Deal program. For me, however, many other projects large and small intervened, and the book seemed to grow less timely by the year. In the go-go years of the 1990s, the '30s seemed like ancient history. But as I was finishing the book in 2008 the economy tanked, and suddenly the Depression was on everyone's lips.

My wonderful editor at Norton, Bob Weil, said "I would have bought this book even without the financial meltdown." I reminded him that he did -- two months before. But I'm sure this timeliness is responsible for much of the attention the book has received, including an amazing number of reviews and pretty healthy sales.

Q: Your book is not a work of social or political history. But I'm not entirely persuaded that your subtitle is quite right to call it a cultural history, either. A cultural historian of the Depression would have to pay a lot more attention to radio, for one thing. It feels very much more like an interconnected set of essays in literary and cinematic criticism, written with a close attention to form, but also with an old-fashioned willingness to assess value. Is that pigeonholing you wrongly?

A: The book is a hybrid of cultural history and criticism, in proportions that are entirely my own. I wanted to explore how the arts illuminate the Depression and how the Depression illuminates the work done in the arts, even work that appears to have little or no reference to it, and has usually been seen as escapist. I also aimed to uncover the patterns that link or contrast many different kinds of artists: Mike Gold and Henry Roth, Walker Evans and Margaret Bourke-White, Steinbeck and Faulkner, Steinbeck and Nathanael West, Busby Berkeley and Leni Riefenstahl.

I grouped them around four broad cultural themes, and I actually considered subtitling the book "Cultural Themes from the Great Depression." You'd agree that that would have been unduly modest, besides being commercially obtuse.

Why should cultural history have to cover everything, instead of offering an abundant variety of case studies, chosen because they were representative but also because they mattered strongly to me as a critic, and hence I might have something fresh to say about them? What usually passes for cultural history tends to be panoramic but superficial. Jacques Barzun, a pioneer cultural historian, once said that his friend and colleague Lionel Trilling often urged him to dig a little deeper, to pause over some of his many examples. He certainly did that in his book on Berlioz. That's where the critic comes in to deepen and thicken the work of the historian.

The British critic F.R. Leavis, often wrongly seen as a New Critic because of his focus on the texture of a writer's language, actually began writing in the Marxist decade of the 1930s as a sociological critic. Look at the early volumes of his magazine Scrutiny. But he also wrote several essays complaining that social historians tended to use literature merely for documentation, whereas the only real way to "use" literature, to bare its social meanings, was to get deeply inside it, as any subtle and patient reader does. That demands attention to form and language as well as surface content. It also involves critical judgment to assess where the work works, where its intentions are effectively realized and where it actually moves us.

This is not a thumbs-up, thumbs-down approach but a sensitivity to where the writer's or artist's imagination comes alive. If you think, as I do, that the arts offer a unique access to the mind and heart of an era, to the way people really thought and felt, then you need to pay a good deal of attention to form and also to exercise your best critical judgment. It requires that indefinable thing called taste as well as some knowledge of the history of the arts. For me the arts, including a large swath of popular culture -- music, movies, photography, theater, and design -- served as a way to lay bare the inner life of the era. I could have subtitled the book "The Inner History of the Great Depression," but that would have been even more controversial, besides making people think it was a book about depression.

Q: Your discussion of Let Us Now Praise Famous Men, the book that James Agee wrote to accompany Walker Evans's photographs of tenant farmers, seems very judicious. I admire this book a lot, but think of it as a defective masterpiece, and you clarify why that seems like a fitting assessment of it. But it was also striking to see your comment on teaching the book: "When I've assigned it to undergraduates, the results have been disastrous." Would you say more about that? Not about Agee's text, necessarily, but about what you mean by the results being disastrous.

A: Since the cliche about the literature and the visual arts of the 30s is that they were folky and naturalistic, or else overtly political, it was important for me to highlight the modernist currents that carried over from the 20s. Such experiments took on a different meaning during the Depression. Agee's writing is Faulknerian, the book is long and digressive, its structure is elusive, and the whole thing can drive you crazy at times. All this proved maddening to undergraduate readers. It took me a while to realize that this was his fault as much as theirs. Much of 30s writing -- Steinbeck, for example -- is more straightforward, since it's rooted in journalism, influenced by Hemingway, and its social criticism is right on the surface.

This is not true of either Walker Evans's pictures, which are far more detached, or Agee's prose, at the other extreme -- so tortured and self-conscious, and so written. Yet it's a genuine 30s book, since it raises serious questions about the motives of social documentary and the ethical demands of reporting on lives lived in poverty and deprivation.

For Agee, paradoxically, this was both guilt-inducing and spiritually exalting. Neither of these emotions comes naturally to readers today, young or old; this was also the case for readers in 1941, when the book first appeared. The worst of the Depression was over, so the book must have seemed like a throwback, despite the timeless appeal of the photographs, which are among the best ever taken by an American. But they lacked the drama and narrative punch that readers had come to expect from photojournalism, thanks to movie newsreels and Life magazine. The book was stillborn till it was republished in 1960.

Q: Were there other experiences in the classroom that deepened your understanding of this period -- or of particular texts, films, etc. -- in ways that were decisive for your book?

A: In my experience, the cultural work of the 1930s teaches very well, especially the films. With the movie industry's rapid adjustment to the technology of sound, that decade saw the consolidation of the classic Hollywood style in relation to mass taste, including strikingly different studio styles, the development of the star system, and the cultivation of genre as the keystone of industrial production. Many of the stars and genres from that period remain hugely attractive today. The luster of Bogart, Cooper, Hepburn, Davis, Fonda, Cagney, Stewart, Stanwyck, Gable, Colbert, Lombard, and many others remains surprisingly undimmed.

Screwball comedies and gangster movies still go over very well with students, partly because they're hard-boiled, cynical, and unsentimental, yet also subtly romanticized. The best social problem dramas, from I Am a Fugitive from a Chain Gang to The Grapes of Wrath, have a strong visceral appeal, besides teaching students essential things about the times. I lean to movies that touch obliquely on the Depression, including Busby Berkeley musicals, Frank Capra comedies, and historical epics about other hard times, such as Gone with the Wind, which I often ask students to compare with The Grapes of Wrath. Repeatedly teaching such movies helped create an agenda for the book by showing me what worked and how it still gripped people, which pointed the way to how it might have gripped many Americans back then.

By and large I avoid concentrating on inferior stuff simply as evidence of the times. I have no patience for it and no feeling for it. To do good work, I need to write about things that turn me on. The lasting quality of certain films, books, songs, and photographs is an indication of the many levels on which they work, and also of how much more they have to tell us. Multi-layered books like Call It Sleep, As I Lay Dying, Miss Lonelyhearts, The Day of the Locust, Tender Is the Night, Their Eyes Were Watching God, and Let Us Now Praise Famous Men tell us more about the period than its transient best-sellers, even though they were not truly appreciated until much later. Some books that were sensationally popular then, such as Native Son and The Grapes of Wrath, along with many of the musical standards, remain just as meaningful today. Other books, among them the Studs Lonigan and U.S.A. trilogies, are simply too cumbersome to teach, besides seeming somewhat dated in their naturalistic styles.

One of my aims in the book was to create a living canon of works from the 1930s, not simply an archaeological dig to expose the buried and forgotten layers of a distant culture. Finally, I wanted to show what art could contribute to a society in the throes of a social and economic crisis, a theme that took on unexpected resonance in the wake of our own financial meltdown.

Q: History doesn't repeat itself but it does rhyme, as Mark Twain is supposed to have said. Have any of the works from the Great Depression seemed to you to "rhyme" somehow with the experience of the past 18 months? I've occasionally thought that the title of Edmund Wilson's book of reportage, "The American Jitters," feels quite fitting now....

A: "Jitters" is exactly the right word. Instead of the kind of dramatic crisis that lasted for years during the Depression, a low-level anxiety hangs over the country, what with talk of a "jobless recovery," a concern about structural shifts in the economy, with little sign of improvement in the housing sector, and a resentful sense that only the banks have truly bounced back, largely at our own expense. Despite the populist backlash that has fed Republican hopes, the current economic fears are an improvement over the grim mood of the first six months of the recession, a pervasive dread that we were sliding inexorably into Depression 2.0.

That mood was closely parallel to the early years of the Great Depression. Both were set off by a banking crisis; both witnessed a plague of foreclosures on homes and farms; both revealed terrific flaws in the regulatory system and required serious federal intervention; above all, in both periods there was a huge crisis of confidence, economically in the collapse of the credit markets, psychologically in the widespread fear for the future, especially our children's future. It now becomes clear that the Obama administration did not or could not take full advantage of those first few months of crisis. Like FDR, the president tried hard to serve as Cheerleader in Chief, with some success. Along with the rescue of the banks, the passage of the stimulus bill was probably the single most important factor in avoiding a slide into a second Depression.

But where the New Deal managed to pursue three R's at once -- relief, regulation, and reform -- the administration has so far made little progress toward instituting new forms of regulation or vitally needed reforms. Some such reforms are in the pipeline but they will be bitterly contested, especially now that the worst of the crisis is assumed to have passed.

As for the response of the arts to the Great Recession, that too is in the pipeline and impossible to predict, though we can be sure that the current downbeat mood, the sense of lowered expectations, will soon be reflected in indie filmmaking, darkly realistic fiction, and popular music at once cheering and keening. By economic indicators the recession may formally have ended, though there has scarcely been a full recovery, but the psychological recession will be with us for a long while.

Scott McLemee

The Tea Party Challenge

When considering the political scene of the moment, it is difficult not to see how historical allegory plays an important role in the public spectacle known as the Tea Party movement. From the name itself, an acronym (Taxed Enough Already) that fuses current concerns to a patriotic historical moment, to the oral and written references by some of its members to Stalin and Hitler, the Tea Party appears to be steeped (sorry) in history. However, one has only to listen to a minute of ranting to know that what we really are talking about is either a deliberate misuse or a sad misunderstanding of history.

Misuse implies two things: first, that the Partiers themselves know that they are attempting to mislead, and second, that the rest of us share an understanding of what accurate history looks like. Would that this were true. Unfortunately, there is little indication that the new revolutionaries possess more than a rudimentary knowledge of American or world history, and there is even less reason to think that the wider public is any different. Such ignorance allows terms like communism, socialism, and fascism to be used interchangeably by riled-up protesters while much of the public, and, not incidentally, the media, nods with a fuzzy understanding of the negative connotations those words are supposed to convey (of course some on the left are just as guilty of too-liberally applying the “fascist” label to any policy of which they do not approve). It also allows the Tea Partiers to believe that their situation – being taxed with representation – somehow warrants use of "Don’t Tread On Me" flags and links their dissatisfaction with a popularly elected president to that of colonists chafing under monarchical rule.

While the specifics of the moment (particularly, it seems, the fact of the Obama presidency) account for some of the radical resentment, the intensity of feeling among the opposition these days seems built upon a total lack of historical perspective. Would someone who really understood the horrors of Stalin’s purges still believe that President Obama sought to emulate the Soviet leader? Or, a drier example, could you speak of a sudden government "takeover" of health care, replete with death panels, if you knew of the long and gradual approach to building the modern American welfare state? The problem, of course, is that many Americans have at best a shaky hold on the relevant historical facts and are therefore credulous when presented with distortions and fabrications. Even after college graduation, too many students lack understanding of key historical developments. And that’s just college students – let’s not forget the majority of Americans who last studied history in their high school years, perhaps in a state like Texas, where Thomas Jefferson was just erased from the past because he is now considered too radical and the word "capitalism" has been replaced by "free enterprise" to help smooth out its rough edges.

It is important to realize that ignorance about history allows falsehoods and distortions to be presented as facts, but it is also significant that Tea Partiers look to history to legitimize their endeavors. In other words, history is still seen as authoritative; the problem is that the authority is being abused. Such abuse can succeed only when the public’s collective historical memory has been allowed to atrophy.

In addition to a vague (at best) recollection of the pertinent facts, Tea Partier warnings of cataclysm are taken seriously because the skill of thinking historically has not been emphasized in high school and college curriculums. Teaching students to understand that things change over time because of particular actions taken or not taken, and that context matters -- what is also referred to as "critical thinking" -- gives them some perspective and helps them take the long view that can illuminate the emptiness of sky-is-falling scare tactics. The politics of our moment, focused solely on what's happening this minute and what it means for the next election (no matter how far off), cry out for skeptical appraisal by an electorate that unfortunately does not know how to think historically.

In recent years, conservative groups like the Intercollegiate Studies Institute and the American Council of Trustees and Alumni have been the loudest critics of the low status of history in colleges in the United States. They are especially upset with the lack of American history requirements at elite universities. But this should not be solely a conservative issue, nor can it be one that professional historians ignore. As the Tea Party movement is demonstrating, there are direct political consequences if the public is unable to perceive when history is used to mislead and confuse people.

Unfortunately, as budgets are being slashed at colleges and universities nationwide, history is seen by many as impractical and unimportant. Courses that focus on “career-building” and “real-world skills” are prioritized while history departments are unable to replace retiring faculty. One reason for this is that the case for history has not been made effectively. As ACTA has reported, none of the top 50 universities requires its students to take U.S. history – and 10 require no history course at all. Some students may take a history course that fulfills a broader core requirement, but many do not. And too often these core courses are deficient in teaching historical practice. Historians, whether just entering the field or preparing to retire, have an obligation as people with special knowledge of history's significance to make the case for a greater commitment to the discipline – to students, campus administrators, legislators, and the public. Indeed, anyone concerned about education who does not want to see our contemporary political discourse sink lower should be actively interested in promoting history education.

This is an uphill battle. There is no easy-to-measure market value for teaching history, no space race to gin up patriotic sentiment, no simplistic explanation to combat the perception that studying the subject offers no reward. Yet as the Tea Party "movement" has made apparent, history continues to float in the air of our political discourse, its authority ripe to be drawn into every imaginable debate. There will always be divergent interpretations of the past and disagreements about what facts to emphasize, and individual schools and teachers will construct their courses as they see fit. But most of all, we must redouble our efforts to foster historical thinking. Teaching students how historians find and use evidence to construct their arguments develops the critical skills necessary for sorting through the various and often outlandish claims available 24 hours a day on cable TV and the Internet. As long as people reference past events while staking out their positions in the present -- and that is unlikely to change -- a functioning democracy demands a citizenry capable of spotting historical fantasy and hyperbolic misapplication of historical precedent.

Erik Christiansen and Jeremy Sullivan

Erik Christiansen teaches history at the University of Rhode Island and at Roger Williams University. Jeremy Sullivan is a Ph.D. candidate in history at the University of Maryland at College Park.
