History

Prophets of Deceit

The recent surge of right-wing fantasy into American public discourse should not be surprising. Claims that Obama is a foreigner, that health-care reform means bureaucratic death squads, that “the country we once knew is being destroyed,” as anguished people at town halls have put it – only on the most superficial level are these beliefs the product of ignorance, irrationality, and intractable boneheadedness.

Let’s face reality. An African-American man without so much as an Anglo-Saxon syllable to his name is now occupying an institution called (not on purely descriptive grounds) the White House. What did you think was going to happen? In the 1770s, George Washington complained that the British had a “systematic plan” to render the Americans “as tame and abject as the blacks we rule over with such arbitrary sway.” (An interesting choice of terms, that.) This is a country in which anxiety goes deep, and all the way back. It is not an afterthought.

Mostly, of course, it stays in check. With enough stress on the system, the craziness tends to flare up, like a cold sore. The “viral” political message involved sounds, in part, something like this:

“What’s wrong? I’ll tell you what is wrong. We have robbed man of his liberty. We have imprisoned him behind the iron bars of bureaucratic persecution. We have taunted the American businessman until he is afraid to sign his name to a pay check for fear he is violating some bureaucratic rule that will call for the surrender of a bond, the appearance before a committee, the persecution before some Washington board, or even imprisonment itself.... In the framework of a democracy the great mass of decent people do not realize what is going on when their interests are betrayed. This is a day to return to the high road, to the main road that leads to the preservation of our democracy, and to the traditions of our republic.”

As it happens, this is not a transcript from Fox News, but taken from the opening pages of Leo Lowenthal and Norbert Guterman’s book Prophets of Deceit: A Study of the Techniques of the American Agitator, first published in 1949 by Harper and Brothers. Plus ça change....

The passage just quoted appears in “The Agitator Speaks” – an introductory segment of the book presenting an archetypal harangue by a Depression-era radio ranter or streetcorner demagogue. Father Coughlin remains the most notorious of the lot -- perhaps the only one with name recognition today. But scores of them were in business during the worst of the crisis, and enough of them kept plying their trade after the war to worry the American Jewish Committee, which sponsored the study.

My first reading of Prophets of Deceit was about 20 years ago. At the time, its interest to me was for the most part historical – as an example of Frankfurt School theory being used for empirical social research. Lowenthal, a German émigré, was the main author. The focus of his other research was the sociology of literature and popular culture. Guterman, identified on the title page as a co-author, was primarily a translator. The preface expresses appreciation to a young assistant named Irving Howe for “much help in preparing the final manuscript.” That may understate his role. Some chapters are suspiciously well written.

In analyzing speeches and writings by the Depression agitators, Lowenthal showed a particular interest in how they operated as rhetoric – how the imagery, figures of speech, and recurrent themes worked together, appealing to the half-articulated desires and frustrations of the demagogue’s followers. Another of the Frankfurters, Theodor Adorno, had produced a similar if more narrowly focused monograph, The Psychological Technique of Martin Luther Thomas' Radio Addresses, published a few years ago by Stanford University Press. And Prophets of Deceit itself was the third in the AJC’s five-volume series “Studies in Prejudice.”

The insights and blind spots of this large-scale effort to analyze “the authoritarian personality” generated controversy that continues all these decades later. But I wasn’t thinking of any of that when Prophets of Deceit came back to mind not long ago.

The catalyst, rather, was my first exposure to the cable talk-show host Glenn Beck. His program, on the de facto Republican party network Fox, has been a locus for much of the pseudopopulist craziness about how the Presidency has been taken over by a totalitarian illegal alien. You will find most of the themes of this form of political thinking cataloged by Lowenthal and associates. (Sixty years ago, the ranting tended very quickly to become anti-Semitic, while now it seems the conspiracy is run by the Kenyans. This change deserves closer study.)

But the striking thing about Beck’s program was not its ideological message but something else: its mode of performance, which was so close to that described in Prophets of Deceit that I had to track down a copy to make sure my memory was not playing tricks. The book was reissued a few years ago in an edition of Lowenthal’s collected writings published by Transaction.

In case you have not seen him in action, Beck “weeps for his country.” Quite literally so: the display of waterworks is the most readily parodied aspect of his performance. He confesses to being terrified for the future, and quakes accordingly. He acts out aggressive scenarios, such as one in which he pretended to be Obama throwing gasoline on an Average American and threatening to set him on fire.

Prophets of Deceit describes Beck’s act perfectly, six decades avant la lettre: “something between a tragic recital and a clownish pantomime.”

The performance is intended, not to provide information or even to persuade, but rather to create a space in which rational discussion can be bypassed entirely. The demagogue, whether of old or new vintage, “does not confront his audience from the outside; he seems rather like someone arising from its midst to express its innermost thoughts. He works, so to speak, from inside the audience, stirring up what lies dormant there.... It is difficult to pin him down to anything and he gives the impression that he is deliberately playacting.... Moving in the twilight zone between the respectable and the forbidden, he is ready to use any device, from jokes to doubletalk to wild extravagances.”

Instead of argument about the relative merits of this or that policy or action, this mode fosters beliefs that are “always facile, simple, and final, like daydreams.” The point is not to analyze or to convince members of the public but to offer “permission to indulge in anticipatory fantasies in which they violently discharge those emotions against alleged enemies.”

A lot has changed since Prophets of Deceit appeared, but not everything. Rereading it now leaves the definite sense that we’ve been here before.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

The Accidental Celebrity

“There are two modes of establishing our reputation: to be praised by honest men, and to be abused by rogues. It is best, however, to secure the former, because it will invariably be accompanied by the latter.”

-- Charles Caleb Colton, Anglican clergyman (1780-1832)

One deleted e-mail marked the beginning of my ordeal. It was finals week, just before Christmas break, when I received a strange message asking me to comment on some kind of online political essay that I had supposedly written. Since I’m not a blogger and make it a point to avoid the many rancorous political forums on the Internet, I immediately dismissed it as spam and hit delete.

But the notes kept coming, increasing in their fervor and frequency, until I could no longer deny it: I was receiving “fan mail.” Some writers called me courageous. Others hailed me as a visionary. A few suggested that I was predestined to play a pivotal role in the apocalyptic events foretold in the Book of Revelation. (Seriously.) Now, over the past 12 years I have published a scholarly book and eight journal articles on various historical topics, but I have to admit that through it all I never even attracted one groupie. So with my curiosity very much piqued, I began an online quest in search of the mysterious article.

I suppose it was inevitable that I was not going to like what I found. There, prominently displayed on a rather extreme Web site, was an essay (information about it can be found here) that likened President Obama to ... Adolf Hitler. Underneath the title was the inscription “by Tim Wood.”

To say I was not pleased would be a colossal understatement. However, even though my parents always told me I was special, a quick Internet search will reveal that I am not, in fact, the world’s only Tim Wood. So I ignored the article -- at least until one of the versions of the essay being forwarded via e-mail mutated into a form which included the rather unambiguous phrase “Professor of History, Southwest Baptist University.” The writer of this message also helpfully appended my office phone number and e-mail address.

Stunned, I struggled to regain my bearings and tried to grasp the full implications of this professional identity theft. Beyond the fact that the comparison is utterly ridiculous (anyone who believes that truly has no understanding of the depths of evil plumbed by the Nazi regime), it was now personal. Who had the right to speak for me like that? How dare they hide behind my name! What if my colleagues -- or my friends and family -- read this and believed it?

But the most pressing question seemed to be what kind of damage control would be necessary in order to prevent this from irreparably damaging my career. And that, in turn, led me to begin reflecting on how scholars will need to safeguard their professional reputations in the 21st century. Although I would never wish this kind of ordeal on anybody, the realist inside me fears that I will not be the last professor to fall victim to digital dishonesty. As academics, we must be aware that our professional reputations are transmitted through the technology of a bygone era, and even then are typically shrouded in secrecy or obscurity. Mentors, colleagues, and administrators exchange sealed and confidential references printed out on university letterhead. Editors, referees, and reviewers validate our scholarly work by allowing us access to or giving us coverage in their publications, but the results of that process all too often lie buried in library stacks and academic databases. In the meantime, the malicious or misinformed denizens of the Web have had time to hit the “forward” button about a million times.

So what lessons have I learned through this ordeal? First of all, be proactive. Once these rumors hit a certain critical mass, ignoring them will not make them go away. Indeed, a situation like this becomes the ultimate test of one’s personal credibility in the workplace. Immediately after I discovered that my specific identity had become attached to that particular article, I treated myself to a tour of the university’s administration building. Everybody from my department chair, to my dean, to the provost, to the directors of human resources, information technology, and university relations heard my side of the story within 48 hours. In my case, I was fortunate enough to have retained the confidence and support of my administration. There is no substitute for goodwill.

Secondly, I tried to remain positive and to find the teaching moment hidden within all of this. I posted an item on the university’s faculty Web page that served both as a public disclaimer and an opportunity to emphasize to students (and anybody else who might read it) why it is that faculty constantly warn against an uncritical acceptance of materials found on the Internet. I reminded my readers that in history, scholars are trained to constantly analyze their sources. Historians must always be aware that the documents they are working with may contain errors, lies, omissions, or distortions, or may even turn out to be wholesale forgeries. To navigate those potential pitfalls, scholars check facts and look for other documents that confirm (or contradict) the information found in those sources. We seek to identify the author and understand his or her motives for writing. We try to understand the larger historical and cultural context surrounding a document. By doing our homework, we are better able to judge when people deserve to be “taken at their word.”

This episode has also taught me a tough lesson in maintaining a professional demeanor, even in the face of outrageous provocations. Although the majority of people who wrote to inquire about the article were gracious, and many even apologized for the mistake, enough of my correspondents were belligerent and rude to make me dread opening my inbox every morning. Even after learning I was not the author, many readers clearly still expected me to lend my professional credibility to the essay, vouching for its accuracy and validating its interpretations. After reading my denial (where I explicitly refused to endorse the article’s contents), many supporters of the piece became abusive, writing back to attack the depth of my patriotism, the sincerity of my religious faith, and the integrity of the academic community in the United States in general.

Critics of the essay were not above lashing out either -- even in the absence of evidence. One disgruntled detractor wrote to inform me that my brand of “voodoo” and “fear-mongering” would soon be vanishing into irrelevancy, heralding the advent of a new Age of Reason. (Hopefully that individual’s definition of reason will eventually grow to include a commitment to basic research and fact-checking and an unwillingness to take forwarded e-mails at face value.) In the meantime, along with the angry rants, there came from the fever swamps of political paranoia long-discredited conspiracy theories, urging me to consider that the course of history was being determined by Jewish bankers, or the Jesuits, or the Illuminati, or even flesh-eating space aliens. Frequently at those junctures, I felt the temptation to fire back with a “spirited” and “colorful” rebuttal. However, I resisted for many reasons: because I am ultimately a firm believer in civility in public debate, because I did not want to embarrass the colleagues and administrators who had stood by me through this, and because arguing with people who have already made up their minds and have come to demonize those who disagree is almost always an exercise in futility.

Moreover, this incident has led me to reconsider my somewhat adversarial relationship with technology. (I’m the guy who still refuses to buy a cell phone.) But one of the greatest difficulties I encountered in all of this was finding a platform from which to launch a rebuttal. Although I did write personal replies to many of the people who wrote me inquiring about the article, it seemed clear that such a strategy alone was like battling a plague of locusts with a flyswatter. Instead, Internet rumors are best refuted by channeling people toward some definitive, universally available, online point-of-reference (a Web address, for instance) that exposes the lie. In my case, the university was kind enough to grant me access to a page on its Web site, and I quickly began disseminating the link to my posting. However, that solution may not be available to everyone who falls victim to this kind of a hoax, and I am beginning to believe this issue is far too important for faculty to leave to others anyway. A year ago, I would have considered the creation of an “official Tim Wood Web site” to be pretentious in the extreme. Today, I’m not so sure. Like it or not, faculty are public figures, and if we do not take the initiative to define ourselves in ways that are accessible and relevant to those outside the academy, we risk being defined by others in ways that suit their agenda, not ours.

Finally, confronting this situation has led me to take a fresh look at the qualities that make a good historian. In 1964 Richard Hofstadter, an influential scholar of American politics, wrote an article for Harper’s Magazine entitled “The Paranoid Style in American Politics.” In the following passage, he describes a paranoia all too familiar in today’s political discourse:

As a member of the avant-garde who is capable of perceiving the conspiracy before it is fully obvious to an as yet unaroused public, the paranoid is a militant leader. He does not see social conflict as something to be mediated and compromised.... Since what is at stake is always a conflict between absolute good and absolute evil, what is necessary is not compromise but the willingness to fight things out to a finish. Since the enemy is thought of as being totally evil and totally unappeasable, he must be totally eliminated -- if not from the world, at least from the theatre of operations to which the paranoid directs his attention.

As author Dick Meyer pointed out in a 2005 CBS News article, this mentality has come to transcend political labels:

The great dynamic is that so many people .... are convinced that a malevolent opponent wants to destroy their very way of life and has the power to do so. Evangelical Christians may believe that gay marriage, abortion rights, promiscuous and violent popular culture, and gun control are all part of a plot to destroy their community of values. Urban, secular liberals may believe that presidential God-talk, anti-abortion legislators and judges, intrusive Homeland Security programs, and imperialist wars are part of a sinister cabal to quash their very way of life.

Historians often find themselves compared to storytellers, and are lauded for their ability to present compelling interpretations of the past and to craft powerful narratives. But perhaps equally important is our role as listeners. In an increasingly divided society, consensus will never be achieved by shouting (or e-mailing) until we drown out all competing voices. Instead, the first steps toward reconciliation come from those who seek to understand all aspects of the question and try to remain mindful of the needs of others.

In any case, my battle continues. Monday I will go to work, try to sort through all the chaos, and do my best to help folks figure out the truth. (Which is probably pretty close to what I did before my identity was stolen, come to think of it....) And I will continue to contemplate the ways in which this experience will change the way I present myself as a professor and a historian. In the meantime, if any of you encounter any online rantings and ravings that claim to be by me, do not necessarily believe them. Things are not always what they seem.

Author: Timothy L. Wood (info@insidehighered.com)

Timothy L. Wood is an assistant professor of history at Southwest Baptist University in Bolivar, Missouri. He is the author of Agents of Wrath, Sowers of Discord: Authority and Dissent in Puritan Massachusetts, 1630-1655 (Routledge).

Wrong Things, Rightly Named

Suppose that, 30 or 40 years ago, the news media of the West had gotten hold of a KGB document reviewing its experiences in interrogating those who posed a menace to the peace, progress, and stability of the People’s Democracies. For younger readers, perhaps I should explain that the Soviet Union and its client states liked to call their system by that cheerful term. And yes, they were serious. Self-deception is a powerful force, sometimes.

Suppose the report listed such methods of information-gathering as beatings, suffocation, and mock executions. And suppose, too -- on a lurid note -- that it mentioned using threats to murder or sexually violate members of a prisoner’s family. Now imagine numerous pages of the report were redacted, so that you could only guess what horrors they might chronicle.

With all of that as a given, then... How much debate would there have been over the moral status of these acts? Would someone who insisted that they did not constitute torture get a hearing? Could a serious case be made that it was in the best interests of justice to move forward without dwelling on the past?

If so, would such arguments have been presented in major newspapers, magazines, and broadcasts? Or would they have been heard in out-of-the-way meeting halls, where the only cheer was borrowed from posters of the National Council of American-Soviet Friendship?

This thought experiment comes to mind, of course, in the wake of reading about the report of the CIA’s Office of the Inspector General. The analogy is not perfect by any means. No comparable degree of “openness” (however grossly inappropriate that word seems in this case) existed on the other side of the old Iron Curtain. But let’s not cheer ourselves hoarse over that fact just yet.

Actions that would have been judged without hesitation to be torture if conducted by a member of the East German secret police (or, in the case of waterboarding, by the Khmer Rouge) did not meet the wonderfully scrupulous standards laid out seven years ago by the Department of Justice’s Office of Legal Counsel. If more testimony to the power of self-deception were needed, this would do.

When the CIA made its evaluation of various bloody-minded interrogation practices in 2004, the Bush administration’s response was reportedly frustration that the techniques hadn’t been more effective. The assessment of the Obama administration seems to be that torture has been both unproductive and damaging for “soft power” – a public-relations nightmare. This is progress, of a kind. If somebody decides to give up sociopathic behavior on the grounds that it is proving bad for business, that is only so much reason for relief. But it is preferable to the alternative.

It might be possible to hold ourselves to higher standards than that. But first it would be necessary to face reality. One place to start is Tzvetan Todorov’s little book Torture and the War on Terror, first published in France last year and now available in translation from Seagull Books (distributed by the University of Chicago Press).

Todorov once lived in what was called, at the time, the People’s Republic of Bulgaria. As an émigré in Paris in the 1960s, he wrote The Poetics of Prose and other standard works in structuralist literary criticism – as well as a study of the idiosyncratic Russian theorist Mikhail Bakhtin that, in my opinion, made Bakhtin’s thought seem a lot more systematic than it really was.

Over the past quarter century, Todorov’s concerns have shifted from the structuralist analysis of literary language to a moral inquiry into the historical reality of violence and domination, including books on the Nazi and Stalinist concentration camps.

Torture and the War on Terror is more pamphlet than treatise. Some pages revisit points that ought to be familiar to anyone who has given any thought to the experience of the past eight years. To grasp the essential meaninglessness of a phrase like “war on terror” (you can’t bomb a state of mind) does not require a degree in linguistics or rhetoric. But then, the ability to state the obvious can have its uses.

The document prepared by the Justice Department in August 2002 carefully parsed its definition of torture so that it covered only acts leading to the "severe pain" characteristic of permanent “impairment of bodily function.” Todorov does not hesitate to specify what is going on within that semantic maneuver: “The reasoning of this memorandum – paradoxically so, for a legal document prepared by competent jurists – proceeds from a form of magical thinking insofar as it pretends that we can act on things by changing their names.” That about covers it. The expression “magical thinking” covers a great deal of our public life in those years – a time exemplified by the consistently miraculous heroics of Jack Bauer on “24.”

As both a student of the phenomenon of state violence and a former resident of People’s Bulgaria, Todorov is willing to recognize and name what has been going on this past decade. We need to read the following and remember that it is what goes in the history books:

“In prisons scattered throughout countries outside the United States, the detainees have been regularly raped, hung from hooks, immersed in water, burned, attached to electrodes, deprived of food, water or medicine, attacked by dogs, and beaten until their bones are broken. On military bases or on American territory, they have been subjected to sensory deprivation and to other violent sensory treatments, forced to wear headphones so they cannot hear, hoods so they cannot see, surgical masks to keep them from smelling, and thick gloves that interfere with the sense of touch. They have been subjected to nonstop ‘white noise’ or to the regular alternation of deafening noise and total silence; prevented from sleeping, either by use of bright lights or by being subjected to interrogations that can last twenty hours on end, forty-eight days in a row; and taken from extreme cold to extreme heat and vice versa. None of these methods cause ‘the impairment of bodily function,’ but they are known to cause the rapid destruction of personal identity.”

Given the inefficacy of torture as a way to extract intelligence, its real “value” comes in the form of retribution -- and the feeling of restored mastery this permits.

“Reducing the other to a state of utter powerlessness,” writes Todorov, “gives you a sense of limitless power. This feeling is obtained more from torture than from murder, because, once dead, the other is an inert object that can no longer yield the jubilation that comes from wholly defeating the will of the other. On the other hand, raping a woman in front of her husband, parents, and children or torturing a child in front of his father yields an illusion of omnipotence and a sense of absolute sovereignty. Transgressing human laws in this way makes you feel close to the gods.... Torture leaves an indelible mark not only on the victim but also on the torturer.”

Todorov might have pushed this line of thinking (with its nod to Hegel’s dialectic of the struggle for recognition) a lot further than he does. The “indelible mark” can take various forms, and it is not restricted to those who directly wield the instruments of torture.

The craving for “an illusion of omnipotence and a sense of absolute sovereignty” is something best channeled into wish fulfillment-oriented forms of entertainment. There it can be aggrandized yet contained. Money and commodities change hands; the consumer gets a catharsis of sorts; civil society muddles along, and everybody wins.

When sectors of the populace come to regard its pursuit in reality as a necessary part of the business of the state, things are on a different and more worrying terrain. A host of strange side effects then follow – including nostalgia for 9/11 itself in some quarters, since the country was so deeply “unified” on 9/12. A scarcely concealed yearning for another terrorist assault makes perfect sense, given that it would presumably justify another sustained effort to assert American omnipotence and sovereignty. (In saying it “makes perfect sense,” I mean, of course, in a perfectly insane way.)

“As a rule,” writes Todorov, “citizens in liberal democracies will condemn without hesitation the violent practices of a state that will tolerate torture, and especially of a state that systematizes its use, as in the case of totalitarian regimes. Now we have discovered that these same democracies can adopt totalitarian attitudes without changing their overall structure. This cancer does not eat away at a single individual; its metastases are found in people who thought they had eradicated it in others and considered themselves immune. That is why we cannot be reassured.”

True enough. But we have a long way to go before reassurance will be desirable, let alone possible.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

The Public Option

Shortly after last week’s column appeared, I headed out to Iowa City to attend -- and, as the occasion required, to pontificate at -- a gathering called Platforms for Public Scholars. Sponsored by the Obermann Center for Advanced Studies at the University of Iowa, it drew somewhere between 100 and 150 participants over three days.

This was the latest round in an ongoing conversation within academe about how to bring work in the humanities into civic life, and vice versa. The discussion goes back almost a decade now, to the emergence of the Imagining America consortium, which fosters collaboration between faculty at research universities and partners in community groups and nonprofit organizations.

That effort often runs up against institutional inertia. You sense this from reading "Scholarship in Public: Knowledge Creation and Tenure Policy in the Engaged University" (the report of the consortium's Tenure Team Initiative, released last year). Clearly there is a long way to go before people in the humanities can undertake collaborative, interdisciplinary, and civic-minded work without fearing that they are taking a risk.

Even so, the presentations delivered in Iowa City reported on a variety of public-scholarship initiatives -- local history projects, digital archives, a festival of lectures and discussions on Victorian literature, and much else besides. Rather than synopsize, let me recommend a running account of the sessions live-blogged by Bridget Draxler, a graduate student in English at the University of Iowa. It is available at the Web site of the Humanities, Arts, Sciences, and Technology Advanced Collaboratory (better known as HASTAC, usually pronounced “haystack”).

Word went around of plans to publish a collection of papers from the gathering. I asked Teresa Mangum, a professor of English at U of I, who organized and directed the event, if that was in the cards. She “built the platform,” as someone put it, and presided over all three days with considerable charm -- intervening in the discussion in ways that were incisive while also tending to foster the collegiality that can be elusive when people come from such different disciplinary and professional backgrounds.

“My goal is to have some kind of ‘artifact’ of the conference,” she told me, “but I'm trying to think more imaginatively what it might be ... possibly a collection of essays with a Web site. We definitely want to produce an online bibliography but maybe trying to use the Zotero exhibition approach there.”

It was a symposium in the strict sense, in that food was involved. Also, beverages. On the final day, a roundtable assessment of the whole event was the last item on the agenda -- only for this discussion to be bumped into the farewell dinner when things ran long.

Unfortunately I was unable to attend, for fear that a persistent hacking cough was turning me into a pandemic vector. Instead, I retired to the hotel to scribble out some thoughts that might have been worth taking up at the roundtable. Here they are -- afterthoughts, a little late for the discussion.

Most people who attended were members of the academic community, whether from Iowa or elsewhere, and most of the sessions took place in university lecture halls. But the first event on the first day was held at the Iowa City Public Library. This was a panel on new ways of discussing books in the age of digital media -- recounted here by Meena Kandasamy, a young Tamil writer and translator whose speech that evening rather stole the show.

Holding the event at the public library opened the proceedings up somewhat beyond the usual professorial demographic. At one point, members of the panel watched as a woman entered with her guide dog, stretched out on the ground at the back of the room, and closed her eyes to listen. At least we hoped she was listening. I think there is an allegory here about the sometimes ambiguous relationship between public scholarship and its audience.

In any case, the venue for this opening session was important. Public libraries were once called “the people’s universities.” The populist impulse has fallen on some scurvy times, but this trope has interesting implications. The public library is an institution that nobody would be able to start now. A place where you can read brand-new books and magazines for free? The intellectual property lawyers would be suing before you finished the thought.

So while musing on collaborative and civic-minded research, it is worth remembering the actually existing public infrastructure that is still around. Strengthening that infrastructure needs to be a priority for public scholarship -- at least as much, arguably, as "the production of knowledge." (This phrase, repeated incessantly in some quarters of the humanities, has long since slipped its original moorings, and owes more to American corporate lingo than to Althusser.)

Institutions can be narcissistic; and one symptom of this is a certain narrowly gauged conception of professionalism, often indistinguishable in demeanor from garden-variety snobbery. Any real progress in consolidating the practice of public scholarship has to involve a strengthening of ties with people in the public sector -- especially librarians and teachers.

It is not that scholars exist over here while something called “the public” is over there -- off in the distance. Rather, people are constituted as a public in particular spaces and activities. The university is one such site, at least sometimes. But it isn’t the only one, and public scholarship needs to have moorings in as many such venues as possible.

The problem being that it is often hard enough to drop an anchor in academe, let alone in the wide Sargasso Sea of civil society. I am not a professor and have no advice to give on that score. But it seems important to pass along the comments of someone attending Platforms for Public Scholars who confided some thoughts to me during some downtime. I will pass them along by permission, but without giving away anything about this person's identity.

During one panel, a couple of tenured professors mentioned being concerned that their civically engaged scholarship might not count for promotion. One even noted that people who had done collaborative work in the humanities tended to discount it as part of a tenure file -- saying, “Well, I did mine without getting credit for it, so why should you?”

At the time, I raised an eyebrow, but didn’t really think much about it. Later, though, someone referred back to the session in tones that suggested chagrin and longstanding doubts about having a career in the humanities.

“These are people who actually are established, who have some power in their institutions," this individual told me. "I don’t have that. I don’t even have a job yet. And I want them to show some courage. If you really have a conviction that collaboration and public engagement are important, then do it without worrying so much. And support it. Make it possible for someone like me to make doing public work part of my scholarship. Otherwise, what are we even talking about?”

Author: Scott McLemee (scott.mclemee@insidehighered.com)

The First of the Year

The First of the Month is a cultural and intellectual publication that is singularly lively, and no less strange. It started out in 1998, in tabloid format, as a “newspaper of the radical imagination” published in Harlem. First has been compared to Partisan Review, the legendary magazine of the New York Intellectuals that began during the Depression. But honestly, that’s just lazy. Any time a bunch of smart people start a magazine, somebody ends up comparing it to Partisan Review, especially if it is published in New York; but First took its name from a song by Bone Thugs-n-Harmony, and while I’d like to picture Delmore Schwartz doing a little freestyle rapping over scotch at the White Horse Tavern, it’s a stretch.

Following what has become the contemporary routine, the paper gave birth to a Web site; this then replaced the print edition. An anthology culled from its first decade appeared last year as The First of the Year: 2008, published by Transaction. On first approach, the book looks like a memorial service for the whole project. And an impressive one: the roster of contributors included (to give a very abbreviated and almost random roll-call) Amiri Baraka, Greil Marcus, Lawrence Goodwyn, Grace Lee Boggs, Adolph Reed, Russell Jacoby, Armond White, Kurt Vonnegut, Kate Millett, Richard Hoggart, and Ellen Willis.

I meant to give the volume a plug when it appeared; so much for the good intention. But happily, my initial impression was totally wrong. While continuing to function online (and to have its world headquarters in Harlem, where editorial collective member and impresario Benj DeMott lives) First has reinvented itself as an annual anthology. First of the Year: 2009 has just been published, which seems worth noting here, in this first column of the year.

The viability of any small-scale, relatively unprofitable cultural initiative is a function of two forces. One is the good will of the people directly involved. The other is getting support from the public – or rather, creating one.

In this case, the process is made more difficult by the fact that First is sui generis. Which is putting it politely. My own response upon first encountering it about 10 years ago involved a little cartoon balloon forming over my forehead containing the letters “WTF?” It is not simply that it is hard to know what to expect next; sometimes it is hard to say what it was you just read. In First, political commentary, cultural analysis, and personal essays sit side-by-side. But at times, all three are going on at once, within the same piece. Kenneth Burke used to refer to such jostlings of the coordinate system as creating “perspective by incongruity.” It signals a breakdown of familiar formats -- a scrambling of routine associations. This is stimulating, if perplexing. The confusion is not a bug but a feature.

One familiar description of the journal that I have come to distrust treats First as a bridge between popular culture and the ivory tower. An often-repeated blurb from some years ago calls it “the only leftist publication [one] could imagine being read at both Columbia University and Rikers.”

Good advertising, to be sure. But the better the ad, the more its presumptions need checking. The whole “building a bridge” trope implies that there is a distance to be spanned – a connection between enclaves to be made. (The ideas are over here, the masses over there.) But reading First involves getting oriented to a different geography. Some academics do write for it, but they do not have pride of place among the other contributors, who include poets and musicians and journalists, and people who might best just be called citizens. The implication is not that there is distance to be crossed, but that we're all on common ground, whether we know it, or like it, or not.

In the wake of 9/11, some writers for First (not all of them) rallied to the call for a war, and at least one endorsed George W. Bush during the 2004 campaign. Does that mean that First is actually “the only ‘neoconservative’ publication read in both academe and prisons”? Well, no, but funny you should ask, because it underscores the convenience made possible by pre-gummed ideological labels.

At times they are useful (I tend to think "social-imperialist" is a pretty good label for the idea that "shock and awe" was necessary for historical progress in Iraq) but not always.

The discussion of Obama in the new volume is a case in point. Both Paul Berman (a Clintonian liberal who supported the Iraq War) and Amiri Baraka (who takes his political bearings from Mao Tsetung Thought) concur that the 2008 election was a transformative moment. This is, let’s say, an unanticipated convergence. Meanwhile, Charles O’Brien (an editorial collective member who endorsed Bush in ‘04, on more or less populist grounds) treats Obama as a short-circuit in the creation of a vigorous radicalism-from-below needed for social change. “Of the Obama campaign, what endures?” he asks. “The new Pepsi ad.”

It would be wrong to see First as yet another wonk magazine with some cultural stuff in it. Nor is it one of those journals (edited on the bridge, so to speak) in which the latest reality-TV show provides the excuse for yet another tour of Foucault’s panopticon. Politics and culture come together at odd angles in the pages of First -- or rather, each spins out from some vital center that proves hard to pin down. Margin and mainstream are configured differently here.

I tried to get a handle on First's particularity by talking to Benj DeMott, who edited the two anthologies and is now working on the third. We spoke by phone. Taking notes did not seem like a plausible endeavor on my part, because DeMott's mind moves like greased lightning – the ideas and references coming out in arpeggios, rapid-fire and sometimes multitrack.

But one point he made did stick. It concerned holding together a project in which the contributors do not share a party line, and indeed sometimes only just barely agree to disagree. It sounds complicated and precarious. Often, he said, it comes down to sharing a passion for music -- for sensing that both democracy and dancing ought to be in the streets. Politics isn’t about policy, it’s about movement.

That does not mean celebration is always the order of the day. The indulgence of academic hiphop fans is legendary, but if you want to see what tough-minded cultural analysis looks like, check out the African-American film critic Armond White's reflections on white rapper Eminem in The First of the Year: 2009. The essay can be recommended even if its subject is now shrinking in pop culture’s rearview mirror.

“Rather than a symbol of cultural resistance,” writes White, “he’s the most egregious symbol of our era’s selfish trends. With his bootstrap crap and references to rugged individualism reminiscent of the 80s, he’s a heartless Reagan-baby – but without the old man’s politesse.... His three albums of obstinate rants culminate in the egocentric track ‘Without Me,’ making him the Ayn Rand of rap – a pop hack who refuses to look beyond himself.... Minus righteousness, angry rap is dismissible. Rap is exciting when it voices desire for social redress; the urge toward public and personal justice is what made it progressive. Eminem’s resurrected Great White Hope disempowers hip hop’s cultural movement by debasing it.”

Now, if you can imagine such thoughts ever appearing in an essay by Irving Howe -- let alone Irving Kristol -- then we can go ahead and describe First as inheriting the legacy of the New York Intellectuals.

Otherwise, it may be time to recognize and respect First for what it is in its own right: a journal of demotic intelligence, alive to its own times, with insights and errors appropriate to those times, making it worth the price of perplexity.

Only after talking to Benj DeMott did I read what seems, with hindsight, like the essay that best explains what is going on with the whole project. This is a long tribute -- far more analytical than sentimental -- to his father, the late Benjamin DeMott, who was a professor of English at Amherst College. He was a remarkable essayist and social critic.

It is time someone published a volume of DeMott senior’s selected writings. Meanwhile, his influence on First seems pervasive. The younger DeMott quotes a letter written in his father’s final years -- a piece of advice given to a friend. It offers a challenge to what we might call “the will to sophistication,” and its hard clarity is bracing:

"Study humiliation. You have nothing ahead of you but that. You survive not by trusting old friends. Or by hoping for love from a child. You survive by realizing you have nothing whatever the world wants, and that therefore the one course open to you is to start over. Recognize your knowledge and experience are valueless. Realize the only possible role for you on earth is that of a student and a learner. Never think that your opinions – unless founded on hard work in a field that is totally new to you – are of interest to anyone. Treat nobody, friend, co-worker, child, whomever, as someone who knows less than you about any subject whatever. You are an Inferior for life. Whatever is left of it....This is the best that life can offer. And it’s better than it sounds.”

This amounts, finally, to a formulation of a democratic ethos for intellectual life. It bends the stick, hard, against the familiar warp. So, in its own way, does First, and I hope the Web site and the series of anthologies will continue and prosper as new readers and writers join its public.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

'Economy of Attention'

It is not only people living on islands who count as insular -- etymology notwithstanding. Consider a recent piece by New York Times columnist David Brooks, whose usual shtick might be called “Thorstein Veblen for Dummies.” Using the disaster in Haiti following the earthquake earlier this month as his peg, Brooks diagnosed the country’s poverty and imploding civil society as byproducts of Voodoo – which, with its magical worldview, discourages rational calculation and planning.

Evidently the pundit is growing ambitious; he has graduated to Max Weber for Dummies. The thesis makes perfect sense, as long as you ignore as much economic and political history as possible.

After enslaved people of African descent liberated themselves during the Haitian Revolution of the 1790s (creating an independent state, at enormous cost of life), they were forced to pay reparations to France, which had fought a war to resubjugate the island, but lost. The price of diplomatic recognition was not cheap; by one estimate, France demanded the equivalent of $21 billion in today’s currency. Haiti continued to pay it well into the middle of the 20th century. The resources of a poor country were transferred, decade after decade, to a rich country. Was this more rational than a belief in zombies? Would it not tend to foster a belief that the world is governed by capricious forces who must be placated?

The response of sundry blowhards to the news from Haiti is only partly the result of unabashed ignorance, of course. Moral callousness is also a factor. But even among people feeling empathy and a sense of responsibility to help there is often a blind spot with regard to the Caribbean – an underestimation of its place in the history of Atlantic societies, its role in connecting Europe, Africa, and the Americas. This was one of the points tirelessly emphasized by the late C.L.R. James, the Trinidadian historian and political theorist, whose classic book The Black Jacobins: Toussaint Louverture and the San Domingo Revolution (1938) is being rediscovered now. Perhaps it is not too late to grasp the Caribbean as a crucial part of the process shaping global society.

For some more up-to-date reflections on the region at this moment of crisis, I got in touch with Nicholas Laughlin, editor of The Caribbean Review of Books, who lives near James’s old hometown of Port of Spain, Trinidad. In addition to transforming CRB from a quarterly print magazine to an online journal, he is co-editor of the poetry magazine Town and an administrator of Alice Yard, a small contemporary arts space in Port of Spain.

Laughlin is the editor of Letters from London by C.L.R. James (University Press of New England, 2003) and of the revised, expanded, and re-annotated edition of Letters Between a Father and Son by V.S. Naipaul (Pan Macmillan, 2009). I reviewed the earlier book some years ago, and have had the occasional brief dialogue with him by e-mail in the meantime. Following the events of the past two weeks, we had a much more substantial discussion – one touching on the history and politics of the Caribbean, and how its cultural institutions (academic and otherwise) fit into the “economy of attention” of the 21st century.

A transcript of that exchange follows. Some of Laughlin’s spelling has been Americanized, for I have yielded to the cultural imperialism of WordPerfect.

Q: You know how the disaster in Haiti is being discussed by the mass media here. What can you say about how it is being framed within the Caribbean?

A: The Caribbean is so various, it's hard to generalize. I don't really know how recent events in Haiti are being framed in the Hispano- and Francophone Caribbean. Within the Anglophone Caribbean, responses vary from country to country or island to island. In the Bahamas – just north of Haiti, where there are significant numbers of Haitian immigrants – there's been concern about being swamped by refugees. My colleague Nicolette Bethel – anthropologist, playwright, theatre director, and editor of the online literary journal tongues of the ocean – has criticized the way the earthquake has been reported in the Bahamian press, even as many Bahamians have thrown themselves into organizing relief efforts.

In Trinidad, on the other hand, at the opposite end of the Caribbean, there's been some anger about the way the government has responded – i.e., with what looks to many of us like faint concern. It took our prime minister nearly a full day to make any kind of statement about the earthquake, in the form of off-the-cuff remarks to the press.

Of course many of us in the Caribbean have CNN, the BBC, and the U.S. networks on cable, and read the international papers online. Where there's been local reporting, it's mostly focused on local angles – the Jamaican press gave lots of coverage to the visit their prime minister made to Haiti last week, and here in Trinidad there have been several stories on Trinidadians who happened to be in Haiti during the earthquake.

As with the media anywhere, the media here "like" stories of chaos and mayhem. So there's been ample coverage, via wire service stories, of looting, machete-wielding gangs, street violence, etc., even though there are many people on the ground in Haiti who say that violent incidents have been rare and very localized, and there has been extraordinary cooperation among displaced Haitians – and the whole issue of "looting" needs serious deconstruction.

Q: Yes, it’s quite similar to how the coverage of the disaster in New Orleans unfolded in the American media just after Katrina. Do you notice anything distinctive about the Caribbean discussion of the crisis now?

A: I think there is wider awareness in the Caribbean (than in the U.S., say) of some of the historical circumstances that contributed to the present crisis – crippling and unjust debt, meddling by foreign powers, and so on. I'm pretty sure that the Haitian Revolution and its aftermath, including the "reparations" payments to the French (by now we can all quote the amounts by heart) are on the secondary schools history syllabus in Trinidad.

I've read a few incisive op-ed pieces in the local press – by the historian Hilary Beckles in Barbados, the activist Raffique Shah in Trinidad and Tobago, the literary scholar Carolyn Cooper in Jamaica, and I'm sure I've missed others – that explain Haiti's history of being bullied by wealthier countries with much bigger armies. Certainly among what you might call the Caribbean's intellectual elite, for want of a better term, there's a definite sense of the wider Caribbean's moral debt to Haiti. We've almost all read at least some of The Black Jacobins.

Still, the rhetoric of "failure," the idea that Haiti is "unlucky," has a foothold in the discourse. I got into a sort of argument the other day, on Facebook, with a friend who was riffing off the Senegalese president's offer to "repatriate" Haitians. This friend suggested, no doubt as a kind of deliberately absurd thought experiment, that the "seemingly interminable problem" of Haiti would be solved by permanently evacuating the whole country – resettling all nine or ten million people elsewhere. Even among well-educated and well-meaning people, the idea of Haiti as a "problem" is entrenched.

But another friend who entered the conversation said something that struck and moved me. She said she was appalled by her own attitudes towards Haiti – meaning, I think, that the horrors of the earthquake and its aftermath had forced her to confront her own unconscious prejudices and ignorance. I feel the same, and there seems to be a wider sense here that Caribbean citizens must take some blame for Haiti's troubles in recent decades. We haven't been interested enough, haven't pressured our politicians enough, haven't bothered to try to understand. Many friends and colleagues seem to share an awareness that real recovery for Haiti means meaningful involvement by Caribbean citizens, and a still-unfocused resolve to be a part of that. I hope we stick to our guns.

Q: The possibility of pan-Caribbean citizenship is familiar from C.L.R. James’s writings. It was something he saw as necessary and urgent. But it sounds like there hasn’t been much progress on that front in the two decades since his death. Why is that?

A: Individual Caribbean countries have much in common, of course, but there are real knowledge gaps separating us and very real prejudices behind the facade of solidarity that we generally like to put forward. At the best of times there are strong prejudices against the region's less wealthy nations. In the southern Caribbean, that means Guyana, and Haiti seems to fill that role further north. If some Bahamians are worried about being overrun by Haitians, some Barbadians feel the same way about Guyanese, and Trinidadians have long been suspicious of "small islanders" wanting to settle here.

It's more acute when it comes to Haiti – not only is it a very "poor" and "undeveloped" country, but it has a reputation for violence, HIV, and voodoo. Never mind that violent crime and HIV infection rates are rising everywhere in the region, and every Caribbean territory has one or more versions of a syncretic religion combining elements of belief and practice from West Africa, Christianity, and sometimes other traditions.

Q: As editor of The Caribbean Review of Books, you are in a good position to assess the literary and intellectual traffic within the region, and between the Caribbean and the rest of the Atlantic. Would you say something about this?

A: I often think I'd be in a better position to, as you put it, "assess the literary and intellectual traffic" if I lived not in Port of Spain but in New York or London or Toronto or even Miami. I'm also pretty sure it would be easier to publish a magazine like the CRB in one of those places. It would probably be easier to secure grant funding, and the magazine would be physically closer to a critical mass of potential readers.

One of the hot concepts in Caribbean academic circles these days is the "transnational" Caribbean. It can mean different things. The positive spin is the notion of the Caribbean not as a physical region but as a cultural or social phenomenon – a space, not a place – that includes the major Caribbean populations in metropolitan centers like the ones I listed above. So we can claim Brooklyn and Brixton as "ours," and we quote the late Jamaican poet Louise Bennett about "colonizing in reverse."

On the one hand, this notion of the transnational is simply descriptive. There are millions of Caribbean immigrants and their immediate descendants in the U.S., Canada, the U.K. and elsewhere, and there is certainly a sense in which they continue to be Caribbean, participate in conversations about Caribbean society, contribute to Caribbean economies (through remittances), etc. On the other hand, and especially from the point of view of someone who actually does live here, it sometimes seems like wishful thinking, or at least like an attempt to put a hopeful face on a situation that is not so hopeful. Colonizing in reverse, or brain drain?

The last census in Guyana suggested that a shockingly high percentage of Guyanese of my generation with a secondary education now live abroad – over 80 percent. Anecdotally, I can say that about half of my graduating class at secondary school (one of Trinidad's "elite" schools) is now abroad.

This isn't a new phenomenon, of course. Going abroad for education or to expand intellectual possibilities has been part of the standard narrative of Caribbean intellectual life at least as far back as 1932, when James left Trinidad. It's widely held that West Indian literature suddenly sprang into existence in the 1950s when various aspiring writers from different British West Indian territories went to London, discovered common cultural elements, and found an audience for their work via the BBC's Caribbean Voices program and postwar publishers with a taste for exotica from the colonies. (Though that narrative is now disputed by some younger Caribbean lit scholars working on earlier writers and texts.)

There was a moment in the late '60s when it seemed the center of intellectual gravity might shift back to the Caribbean itself, but it didn't take long for post-Independence disillusion to set in. It's very moving but also puzzling for me to read a book like Andrew Salkey's Georgetown Journal (1972), set just at that moment when Independence optimism was beginning to tremble.

Q: I asked about this without thinking about how the cultural history would overlap with your own personal experience. Would you say a little more about that?

A: I came of age in the 1980s, which with adult hindsight I can see was a very pessimistic time for Caribbean people of my parents' generation, but I remember as a schoolchild thinking that people who "went away to live" were specially lucky, even if it was an eventuality I couldn't imagine for myself. Had I gone to university abroad, it's likely I wouldn't have come back to Trinidad, not to live. I still can't decide whether that would have been a better thing.

Having reached my mid-30s, having never lived anywhere else, I'm now fairly certain I'll stay here. But that's something I still think about often – almost every time I travel to the U.S. or Britain, I spend a good chunk of my time trying to imagine an alternative life there. I think that for many Caribbean people of my generation and approximate background – middle class, relatively well-educated – the question of going or staying remains acute.

Sitting here in Diego Martin, west of Port of Spain, it seems to me that in 2010 the literary and intellectual traffic within the Caribbean – and between the region and North America and Europe – is still directed mainly by agents physically located outside the Caribbean itself. Most of our intellectuals and writers are elsewhere. Almost all our books are published elsewhere. There are only two publishers of consequence in the Anglophone Caribbean – both based in Jamaica, both quite small. The main intellectual journal of the Anglophone Caribbean, Small Axe, is based in New York.

Most serious contemporary Caribbean artists either live abroad or depend heavily on financial support from abroad via grants, residencies, etc. Many if not most intellectual or cultural initiatives in the Caribbean similarly depend on financial support from abroad. What's kept the CRB going in the past couple years is a grant from the Prince Claus Fund in the Netherlands. And the audiences for all of these are also now, in the main, in North America and Europe. In the money economy as well as the attention economy, we still depend on investment and, frankly, charity from elsewhere.

Q: What role does academe play in all of this?

A: It's true that indigenous institutions like the University of the West Indies do a great deal to promote conversations between separate territories, and the three campuses are relatively important centers of activity. But the university faculty are vastly outnumbered by Caribbean scholars in the academy abroad who are inevitably better funded and better positioned to insert themselves into essential debates in their disciplines.

It's terribly revealing that the theme of the Caribbean Studies Association's 2009 conference, held in Jamaica, was "Centering the Caribbean in Caribbean Studies." You can read the phrase in more ways than one, but my interpretation is: “bringing the Caribbean back to the center of Caribbean studies.” Well, where else was it?

I don't mean to set up a binary opposition between here and there, local and diaspora, us and them, because of course the reality is far more complex. There is conversation and exchange and movement among all these nodes, and these exchanges are often fruitful. But aspects of the situation are depressing. For the better part of five centuries the Caribbean was devoted to producing raw materials to enrich already wealthy countries further north. Now it sometimes feels like we're producing cultural raw materials to be turned into books, films, lectures, etc. by intellectual agents in New York or London or Toronto.

Q: Is there a silver lining to contemporary developments?

A: When I (infrequently) attend academic conferences or meetings in the Caribbean, like a stuck record I implore the assembled scholars to make more strategic use of the web to share their research, to make it more widely available. I remind them that many of us in the Caribbean don't have easy access to research libraries or online journal subscriptions. In the age of WordPress and Blogger, when anyone who can use a word processor can also set up a website, there's no excuse.

One of the interesting and encouraging developments in the Trinidad art scene in the past year or so has been the rapid flourishing of artists' blogs. The writers will follow close behind, I hope. There is no serious engagement with visual art in the press here – no real reviewing, no professional critics, and commercial galleries are generally highly conservative and mercenary. So younger artists are increasingly creating work to share online, and using their blogs and websites to document their practice and comment on the work of their peers. It's early yet, and if there's a conversation going on, it's still happening within a small circle, but the odd international curator has peeked or poked in.

Some of my friends and colleagues in the art scene here have been energized and encouraged by this development. It may fizzle out, or these small individual initiatives may coalesce. In the past year or two, I've been more involved in, and paid more attention to, the Caribbean visual art scene than to the literary scene – partly because that's where the energy seems to be, partly because Caribbean visual images seem to be doing better than Caribbean literary texts in the economy of attention.

Q: That seems like a useful expression – “the economy of attention.” Clearly it is bound up, in all sorts of complicated ways, with economics in the more familiar sense. But it’s also political....

A: At the moment everything going through my head is colored by the fact of Haiti. Who gets to decide what help Haiti needs and how to rebuild? I'm not sure Haitians will. Who gets to decide what contemporary Caribbean literature is? Publishers in New York and London and literary scholars in American, British, and Canadian universities. Those two questions aren't comparable in degree, but are bound together in a common dilemma.

Scott McLemee (scott.mclemee@insidehighered.com)

In the American Grain

Howard Zinn -- whose A People’s History of the United States, first published by Harper & Row in 1980, has sold some two million copies -- died last week at the age of 87. His passing has inspired numerous tributes to his role in bringing a radical, pacifist perspective on American history to a wide audience.

It has also provoked denunciations of Zinn as “un-American,” which seems both predictable and entirely to his credit. One of Zinn’s lessons was that protest is a deeply American inclination. The thought is unbearable in some quarters.

One of the most affectionate tributes came from the sports writer Dave Zirin. As with many other readers, he found that reading Zinn changed his whole sense of why you would even want to study the past. “When I was 17 and picked up a dog-eared copy of Zinn's book,” he writes, “I thought history was about learning that the Magna Carta was signed in 1215. I couldn't tell you what the Magna Carta was, but I knew it was signed in 1215. Howard took this history of great men in powdered wigs and turned it on its pompous head.” Zirin went on to write A People’s History of Sports (New Press, 2008), which is Zinnian down to its cells.

Another noteworthy commentary comes from Christopher Phelps, an intellectual historian now in the American and Canadian studies program at the University of Nottingham. He assesses Zinn as a kind of existentialist whose perspective was shaped by the experience of the civil rights struggle. (Zinn had joined the movement in the 1950s as a young professor at Spelman College, a historically black institution in Atlanta.)

An existentialist sensibility -- the tendency to think in terms of radical commitment, of decision making as a matter of courage in the face of the Absurd -- was common to activists of his generation. That Phelps can hear the lingering accent in Zinn’s later work is evidence of a good ear.

Zinn “challenged national pieties and encouraged critical reflection on received wisdom,” writes Phelps. “He understood that America’s various radicalisms, far from being ‘un-American,’ have propelled the nation toward more humane and democratic arrangements.... He urged others to seek in the past the inspiration to dispel resignation, demoralization, and deference, the foundations of inertia. The past meant nothing, he argued, if severed from present and future.”

I've spent less time reading the fulminations against Zinn, but they seem like backhanded honors. When a historian known for saying good things about the Fascists who won the Spanish Civil War considers it necessary to denounce somebody, that person’s life has been well-spent.

Others have claimed that Zinn did not sufficiently denounce Stalinism and its ilk. The earliest example of the complaint that I know came in a review of People’s History that appeared in The American Scholar in 1980, when that magazine was a cultural suburb of the neoconservative movement. The charge has been recycled since Zinn’s death.

This is thrifty. It is also intellectually dishonest. For what is most offensive about Zinn (to those who find him so) is that he held both the United States and the Soviet Union to the same standard. He even dared to suggest that they were in the grip of a similar dynamic.

“Expansionism,” he wrote in an essay from 1970, “with its accompanying excuses, seems to be a constant characteristic of the nation-state, whether liberal or conservative, socialist or capitalist. I am not trying to argue that the liberal-democratic state is especially culpable, only that it is not less so than other nations. Russian expansionism into Eastern Europe, the Chinese moving into Tibet and battling with India over border territories -- seem as belligerent as the pushings of that earlier revolutionary upstart, the United States.... Socialism and liberalism both have advantages over feudal monarchies in their ability to throw a benign light over vicious actions.”

Given certain cretinizing trends in recent American political discourse, it bears stressing that Zinn here uses “liberalism” and “socialism” as antonyms. A liberal supports individual rights in a market economy. By any rigorous definition, Sarah Palin is a liberal. And so, of course, is Barack Obama, who can only be called a “socialist” by an abuse of language. (But such abuse is an industry now, and I feel like Sisyphus just for complaining about it.)

The most substantial critique of A People’s History remains the review by Michael Kazin that appeared in Dissent in 2004. Kazin’s polemic seems to me too stringent by half. Zinn's book is not offered as the last word on the history of the United States, but as a corrective to dominant trends. It is meant to be part of an education, rather than the totality of it.

But Kazin does make points sometimes acknowledged even by the book’s admirers: “Zinn reduces the past to a Manichean fable and makes no serious attempt to address the biggest question a leftist can ask about U.S. history: why have most Americans accepted the legitimacy of the capitalist republic in which they live?”

That is indeed the elephant in the room. Coercion has certainly been a factor in preserving the established order, but persuasion and consent have usually played the greater part. Any American leftist who came of age after Antonio Gramsci’s work began to be assimilated is bound to consider hegemony a starting point for discussion, rather than an afterthought.

But Zinn was the product of an earlier moment -- one for which the stark question of commitment had priority. A strategic map of the political landscape was less urgent than knowing that you stood at a crossroads. You either joined the civil rights struggle or you didn’t. You were fighting against nuclear proliferation or the Vietnam War, or you were going along with them. It is possible to avoid recognizing such alternatives -- though you do end up making the choice between them, one way or the other.

There were subtler interpretations of American history than Howard Zinn’s. Anyone whose understanding of the past begins and ends with it has mistaken taking a vitamin for consuming a meal. But that does not make it worthless. The appreciation of complexity is a virtue, but there are times when a moment of clarity is worth something, too.

Scott McLemee (scott.mclemee@insidehighered.com)

The Good Reader

I fell in love with him first over the phone. A man older than my father, a man I hadn’t yet met in person, a man about whom I knew little except that he was kind to me, but someone who was different in obvious and profound ways from the people I encountered every day.

In 1984 I had just started a full-time job as an editorial assistant. I was fortunate to be working for an editor who saw it as part of her mission to educate me. Susan would invite me to sit in on her hectoring conversations with authors, where she would tell them, harsh and didactic in both her tone and her language, exactly what was wrong with their thinking. She would explain to me how manuscripts get shaped into books, what the expectations of a reader are and how the author can’t afford to frustrate them. She talked a lot about fairness. She was the first proud Reaganite I ever thought was smart.

A part of my job, after I’d finished typing up correspondence, preparing projects to go into production, trafficking copy and marketing materials, and answering the phone, was to line up readers to report on manuscripts. This was just as Oxford was starting to publish scholarly books, and it was an easy way to build a list. Peer review counted more than an editor’s good judgment and took less time. Susan had only to look quickly at the first few pages and decide whether or not a project was worthy of being “sent out.” She’d give me a list of names, or tell me to call one person (usually an author) and ask for suggestions. This was one of my least favorite parts of the job. You’d have to make a zillion phone calls, and leave hundreds of messages. Sometimes academics were nice, but they were rarely in their offices. They could be mean or pompous, and would sometimes lecture me on the manuscript they hadn’t yet read. I imagine that now, with e-mail, things are a whole lot easier.

Sometimes I’d strike out so many times that I’d end up with a reader who wasn’t a natural extension of the scholars on the original list. That’s probably how it happened, because now I can’t believe that Susan would ever have asked me to call him as a reader. He was surprised at being asked to do something for Oxford. But he talked to me in a way that other readers -- busy, name-brand academics -- didn’t.

He wondered what it was like to do my job. (I loved it. Really? he said. You love your work?) Where had I come from? (College -- I’d started working at Oxford one day a week before I’d even graduated. He asked which college, and then didn’t say much.) Where did I live? (Manhattan. He’d grown up, he said, in Brooklyn. His accent, in fact, reminded me of my grandparents who still lived there.) Where had I grown up? (In the boonies of upstate New York.) My parents’ jobs? (My father was a bitter, third-rate academic at a state university.) Future plans? (No plans -- I took this job instead of going for my dream, working on a dude ranch in Wyoming. Too bad, he said, that my dream hadn’t quite worked out. Yet.)

I can’t remember what manuscript I’d sent him, or what he sent back in his reader’s report. I do remember that Susan was surprised to see a report from him and to hear that we’d had a long talk. And I remember that not long after, he called to say that he was going to be in New York City and wanted to come by the office. Come by, I said.

He wasn’t anything like what I expected. He was tall, very tall, and athletic-looking, with a strong face, and salty dark hair. He was handsome in ways that I didn’t think a sixty-year-old man could ever be.

We sat in my office and chatted. He told me about his kids -- there were two, both older than me. Mostly, though, he wanted to know about me.

How much is there for a 22-year-old girl to say about herself? I talked about my work. I talked about how this job was like being in graduate school. I got to learn not only about publishing, but about a whole bunch of academic fields I hadn’t gone deep into in college, where I read mostly English literature, a little bit of criticism, and dabbled in philosophy. Now I was immersed in political science, sociology and, when Susan could sneak it in, history. Oxford was then divided -- in divisive and rancorous ways -- by traditional disciplinary lines and editorial jealousies, and fears of “poaching” buzzed like flies in the hallways.

We editorial assistants had our own sources of strife. Some had to work for bosses who were, if I’m to be honest, bat-shit crazy. Others were chained to their desks, barely allowed to leave the building for lunch. Most didn’t get the kind of author contact I was allowed, because most editors didn’t share their jobs as fully as Susan did. I got to go to important meetings while my friends were typing. I went to lunches at fancy expense-account restaurants as long as I got the Xeroxing done.

I told him about the opportunities being handed to me, the fact that I didn’t even know, most of the time, when I was talking to people who were famous in their fields. I was paid to read, I told him, to learn. It was a great job.

Later, he would confess that he told his students about me -- that I was one of the few people he knew who was truly happy with her work. Now I don’t know whether to believe this; I suspect he said it to make me feel good and that he knew that someday I would realize that things were not so simple.

We began having lunches. I flexed a fledgling expense account to take him out when he came to New York, and I made time to see him on the occasions when I got to travel to Boston. I would always ask what he was working on, acting like a big girl editor. After my first year at OUP I left Susan to work for the American history editor, so this would have made sense, but it didn’t interest him in the least. His work was different from most of Oxford’s list; in many ways, I later realized, we were the mainstream he was reacting against.

Once he called me up to say that he had written a play about Emma Goldman; it was being produced in New York, directed by his son. He gave me the information and I said I would go. He told me to introduce myself to his son. I went to the play -- enraptured by the fact of knowing the playwright -- but was too shy to track down his son, who was tall and handsome like his father.

One day, over lunch in Boston, I grilled him and made him tell me his story. It’s a familiar one, by now: He was the son of Jewish immigrants, and began his career as a rabble-rouser at age 17. At that point I had moved to Brooklyn, and he talked about the Brooklyn Navy Yard, about meeting Roslyn, his wife. He talked about joining the Army to fight on the side of good against evil. “I was a bombardier,” he said. He said it twice, as if he could hardly believe it. We were eating lunch, maybe at Legal Sea Foods in Cambridge, and I knew he was telling me something important. He told me about the box he’d stuffed all his army belongings into -- his medals, his papers -- and that he’d written “Never again” across the top.

He told me about teaching at Spelman College and his work as part of the civil rights movement. He told me about getting fired.

And then he talked about Boston University and about John Silber. He said that he taught the biggest course at the university and that he wasn’t allowed any teaching assistants. The president had offered him some, he said, if he cut down his class size. Way down. He was on the eve of retirement, but said that he wanted to stick around just to irritate Silber.

At that point, I hadn’t read A People’s History of the United States. I knew Howard Zinn only as a professor who had read a manuscript for me and become, unexpectedly, my friend. And then I read him, and fell in love with him in myriad other ways. For his bravery. For his lucidity. And of course, for the generosity and authenticity of his vision. I have met labor historians who have no truck with laborers; defenders of social justice who say racist and sexist and plain old bigoted things after a cocktail or two.

There are many others who can talk about how Howard Zinn changed not only their lives, but the world. I am well aware of my good fortune. When I needed the figure of a good father, someone wise and kind, challenging and encouraging, I did a slipshod job at work and called him to read a manuscript.

Rachel Toor (info@insidehighered.com)

Rachel Toor teaches creative writing in the MFA program at Eastern Washington University, in Spokane.

Everywhere and Nowhere

We seem to be in the midst of a religious revival. At least that seems true within higher education, and especially within our own field of American history. According to a recent report from the American Historical Association (written about in Inside Higher Ed), religion now tops the list of interests that historians claim as their specialty.

This renaissance bodes well for a discipline that has more or less forgotten about religion, or tended to marginalize it, especially in examining modern America (typically defined as anything after 1865). Even to this day, religion is everywhere around us, and religious historians have written about it in compelling and exciting ways, but mainstream historiography has largely left it behind. In a sense, religion is everywhere in modern American history, but nowhere in modern American historiography.

Jon Butler illustrated the point in a 2004 article for the Journal of American History that analyzed American history textbooks to see whether religion was present in their pages. He found that religion was omnipresent in the telling of early American history (before 1865), but after the Civil War it appeared only episodically, "as a jack-in-the-box," popping up "colorfully" here and there, then disappearing, "momentary, idiosyncratic thrustings up of impulses from a more distant American past or as foils for a more persistent secular history."

Butler is not alone in noticing this shortcoming. Robert Orsi, an eminent historian of American everyday religious practice, recently suggested that historians have failed to grasp what has been going on within the subdiscipline of religious history. And David A. Hollinger, an intellectual historian and the incoming president of the Organization of American Historians, has urged historians to study religion, not for the sake of advocacy, but because of the extensive gap between intellectuals and the rest of the population. His articles have carried such urgent titles as "Jesus Matters in the USA" and "Why is there so much Christianity in the United States?"

There are several possible explanations for this everywhere/nowhere disjunction: the rise of social history, the disconnect in professed religious beliefs between academics and other Americans, the fact that the widespread recognition of America’s religious pluralism has forced our institutions to become increasingly secular, and more.

But we would like to suggest some ways in which religious historians have attempted to fuse their stories into the mainstream narrative. A good example to begin with (because its timing accords perfectly with religion’s historiographical absence) is Edward Blum’s award-winning book Reforging the White Republic: Race, Religion, and American Nationalism, 1865-1898 (2005). When explaining the demise of the cause of equality that was so prominent in the Civil War, Blum lays blame directly on American religious institutions and their leaders. He shows that many if not most of the narratives of reconciliation that emerged in the 1870s embraced Christian images of reunion, a Messianic notion of coming together again and working on the great American project, all at the expense of African Americans. And Northern ministers led the way. The capstone moment of D. W. Griffith’s Birth of a Nation (1915), which presents the triumph of the Klan as the "birth of the nation," shows Jesus’ face hovering over the scene. It was Protestant ministers and activists, Blum shows, who midwifed the end of Reconstruction and the rise of a united, white Christian America, aggressively on the prowl for territorial conquests.

During the progressive era of the early 20th century, even as many American institutions were secularizing, religion marked many aspects of social life. Clifford Putney’s study of recreational and professional sports from 1880 to 1920 put Muscular Christianity, as he titled his 2001 book, at the center of Victorian manhood. A revitalized and reformed Protestantism, based in no small part on distinguishing itself from new immigrants, helped to re-create the notion of Victorian manhood. Religion was the key. William J. Baker has updated this story for our own times in Playing with God: Religion and Modern Sport (2007), which affirms Putney’s chronology: muscular Christianity emerged out of a turn-of-the-twentieth-century need to redefine manhood in the Industrial Age.

For the interwar years, Matthew Avery Sutton’s new biography, Aimee Semple McPherson and the Resurrection of Christian America (2007), portrays McPherson’s deep influence in the period (and also, therefore, conservative Protestantism’s deep influence). Sutton argues that McPherson was among the first in the modern era to unite conservative Christianity, American nationalism, and a political sensibility favoring the fantastic. Although she has been derided as a sexualized simpleton who quickly faded from the scene, Sutton portrays her as the forefather (foremother?) of today’s Religious Right. Her style of publicly and personally sensational politics created a model that would be picked up several decades later by the likes of Jimmy Swaggart, Pat Robertson, and Jim and Tammy Faye Bakker. In her day she was everywhere, but today she hardly appears at all in the mainstream narrative of American history, despite her importance in laying the groundwork for the rise of the Religious Right.

During the post-World War II period, the United States experienced a religious revival of sorts as well, although one that was unusual in American history because it was a revival not just for Protestants but for Roman Catholics and Jews. Catholics and Jews took advantage of the anti-fascist rhetoric of World War II and the Cold War to combat any lingering connections between Protestantism and American nationalism. Instead, they articulated the idea that the state should be neutral in handling religious affairs, whether in the U.S. Census or in the realm of public education. In this way, religion sits at the root of today’s multicultural struggles, where differences are to be recognized and even championed, but never prioritized by the state. Interestingly, these ways of managing pluralism were worked out when religious groups, not racial, ethnic, or gender groups, were the primary provocateurs.

There are many more examples of these acts of incorporation. But despite all this recent work, our general thesis that religion has been everywhere in history but nowhere in historiography has two major exceptions: in historical works on the civil rights movement and the religious right. When it comes to civil rights historiography, religious interpretations have vitally influenced scholarship; indeed, those who downplay the influence of religion tend to be the “heretics,” rather than the other way around. Meanwhile, we now have a small library of books on contemporary figures of the Religious Right, from Jerry Falwell to James Dobson to Phyllis Schlafly.

Noting these two exceptions raises important questions. For example, since these are two groups that have been historically racialized and/or marginalized, does that make it “safer” to incorporate religion more centrally into their intellectual trajectories? And to what degree do they influence the mainstream narrative? In other words, when we move from the mainstream to the margins, does it become safer to introduce religion as a central actor in people’s lives? And if so, will that scholarship focusing on the margins find its way into the mainstream narratives? The almost complete absence of religion from David M. Kennedy’s Freedom from Fear and James Patterson’s Grand Expectations, the two Oxford History of the United States volumes covering the period from 1932 to 1974, provides just cause for such reflection.

Meanwhile, religion continues to influence and shape Americans’ lives. The much-publicized "U.S. Religious Landscape Survey" (2008) from the Pew Forum on Religion and Public Life provides some startling data. First, the survey found that almost 1 out of every 10 Americans is an ex-Catholic. For the past 100 years Catholics have made up about 25 percent of the population, and they still do, but that stability in recent years is explicable only by the large numbers of immigrants, mostly Hispanic, who have come to the United States since 1965, when the country loosened its immigration laws. Second, by population, the United States is still not "Abrahamic" or even "Judeo-Christian," if it ever was. Jews make up 1.7 percent of the population, while no other non-Christian religion constitutes more than 1 percent. Meanwhile, nearly 80 percent of Americans consider themselves to be some variety of Christian. And finally, the largest growth area in religious identification is the category of the "uncommitted," now about 14 percent of the population, according to the Pew survey. That figure varies vastly by region: the "uncommitted" make up a sizable portion of the Pacific Northwest but barely register as a religious alternative in the Deep South.

Other highlights from the Pew survey include the fact that there are more Buddhists than Muslims in America, that there are almost as many atheists as Jews (and more agnostics), and that more than a quarter (28 percent) of all Americans have left the grand faith tradition into which they were born, while nearly half (44 percent) have left the faith of their birth or switched denominations at some point in their lives. The survey thus emphasizes that the structure of faith in America is an amorphous thing, constantly changing, influencing people’s lives in new and dynamic and important ways. And religious historians have been busy tracing religion’s dynamism in modern American history.

If only more historians would care. Perhaps our discipline’s “religious revival” will help make it so.

Kevin M. Schultz and Paul Harvey (info@insidehighered.com)

Kevin M. Schultz is assistant professor of history at the University of Illinois at Chicago. Paul Harvey is associate professor of history at the University of Colorado at Colorado Springs. This essay is adapted from a forthcoming article in the Journal of the American Academy of Religion.

Andy Warhol, Then and Now

In two weeks, the National Book Critics Circle will vote on this year’s awards, and so, of late, I am reading until my eyes bleed. Well, not literally. At least, not yet. But it is a constant reminder of one's limits -- especially of the brain's plasticity. The ability to absorb new impressions is not limitless.

But one passage in Edmund White’s City Boy: My Life in New York During the 1960s and ‘70s (a finalist in the memoir category, published by Bloomsbury) did leave a trace, and it seems worth passing along. The author is a prominent gay novelist who was a founding member of the New York Institute for the Humanities. One generation’s gossip is the next one’s cultural history, and White has recorded plenty that others might prefer to forget. City Boy will be remembered in particular for its chapter on Susan Sontag. White says that it is unfortunate she did not win the Nobel Prize, because then she would have been nicer to people.

But the lines that have stayed with me appear earlier in the book, as White reflects on the cultural shift underway in New York during the 1960s. The old order of modernist high seriousness was not quite over; the new era of Pop Art and Sontag's "new sensibility" had barely begun.

White stood on the fault line:

"I still idolized difficult modern poets such as Ezra Pound and Wallace Stevens," he writes, "and I listened with uncomprehending seriousness to the music of Schoenberg. Later I would learn to pick and choose my idiosyncratic way through the ranks of canonical writers, composer, artists, and filmmakers, but in my twenties I still had an unquestioning admiration for the Great -- who were Great precisely because they were Great. Only later would I begin to see the selling of high art as just one more form of commercialism. In my twenties if even a tenth reading of Mallarmé failed to yield up its treasures, the fault was mine, not his. If my eyes swooned shut while I read The Sweet Cheat Gone, Proust's pacing was never called into question, just my intelligence and dedication and sensitivity. And I still entertain those sacralizing preconceptions about high art. I still admire what is difficult, though I now recognize it's a 'period' taste and that my generation was the last to give a damn. Though we were atheists, we were, strangely enough, preparing ourselves for God's great Quiz Show; we had to know everything because we were convinced we would be tested on it -- in our next life."

This is a bit overstated. Young writers at a blog like The New Inquiry share something of that " 'period' taste," for example. Here and there, it seems, "sacralizing preconceptions about high art" have survived, despite inhospitable circumstances.

White's comments caught my bloodshot eye because I had been thinking about Arthur C. Danto's short book Andy Warhol, published late last year by Yale University Press. (It is not among the finalists for the NBCC award in criticism, which now looks, to my bloodshot eye, like an unfortunate oversight.)

It was in his article “The Artworld,” published in The Journal of Philosophy in 1964, that Danto singled out for attention the stack of Brillo boxes that Warhol had produced in his studio and displayed in a gallery in New York. Danto maintained that this was a decisive event in aesthetic history: a moment when questions about what constituted a piece of art (mimesis? beauty? uniqueness?) were posed in a new way. Danto, who is now professor emeritus of philosophy at Columbia University, has never backed down from this position. He has subsequently called Warhol “the nearest thing to a philosophical genius the history of art has produced.”

It is easy to imagine Warhol's response to this, assuming he ever saw The Journal of Philosophy: “Wow. That’s really great.”

Danto's assessment must be distinguished from other expressions of enthusiasm for Warhol's work at the time. One critic assumed that Warhol's affectlessness was inspired by a profound appreciation for Brecht’s alienation effect; others saw his paintings as a radical challenge to consumerism and mass uniformity.

This was pretty wide of the mark. The evidence suggests that Warhol’s work was far more celebratory than critical. He painted Campbell’s soup cans because he ate Campbell’s soup. He created giant images based on sensational news photos of car crashes and acts of violence -- but this was not a complaint about cultural rubbernecking. Warhol just put it into a new context (the art gallery) where people would otherwise pretend it did not exist.

“He represented the world that Americans lived in,” writes Danto in his book, “by holding up a mirror to it, so that they could see themselves in its reflection. It was a world that was largely predictable through its repetitions, one day like another, but that orderliness could be dashed to pieces by crashes and outbreaks that are our nightmares: accidents and unforeseen dangers that make the evening news and then, except for those immediately affected by them, get replaced by other horrors that the newspapers are glad to illustrate with images of torn bodies and shattered lives.... In his own way, Andy did for American society what Norman Rockwell had done.”

It seems like an anomalous take on an artist whose body of work also includes films in which drag queens inject themselves with amphetamines. But I think Danto is on to something. In Warhol, he finds an artistic figure who fused conceptual experimentation with unabashed mimeticism. His work portrays a recognizable world. And Warhol’s sensibility would never think to change or challenge any of it.

Chance favors the prepared mind. While writing this column, I happened to look over a few issues of The Rag, one of the original underground newspapers of the 1960s, published in Austin by students at the University of Texas. (It lasted until 1977.) The second issue, dated October 17, 1966, has a lead article about the struggles of the Sexual Freedom League. The back cover announces that the Thirteenth Floor Elevators had just recorded their first album in Dallas the week before. And inside, there is a discussion of Andy Warhol’s cinema by one Thorne Dreyer, who is identified, on the masthead, not as the Rag’s editor but as its “funnel.”

The article opens with an account of a recent showing of Warhol’s 35-minute film “Blow Job” at another university. The titular action is all off-screen. Warhol’s camera records only the facial expressions of the recipient. Well before the happy ending, a member of the audience stood up and yelled, “We came to get a blow job and we ended up getting screwed.” (This anecdote seems to have passed into the Warhol lore. I have seen it repeated in various places, though Danto instead mentions the viewers who began singing “He shall never come” to the tune of the civil-rights anthem.)

Dreyer goes on to discuss the recent screening at UT of another Warhol film, which consisted of members of the artist's entourage hanging out and acting silly. The reviewer calls it “mediocrity for mediocrity’s sake.” He then provides an interpretation of Warhol that I copy into the digital record for its interest as an example of the contemporary response to his desacralizing efforts -- and for its utterly un-Danto-esque assessment of the artist's philosophical implications.

“Warhol’s message is nihilism," writes Dreyer. "Man in his social relations, when analyzed in the light of pure objectivity and cold intellectualism, is ridiculous (not absurd). And existence is chaos. But what is this ‘objectivity’? How does one obtain it? By not editing his film and thus creating ‘real time’? By boring the viewer into some sort of ‘realization’? But then, is not ‘objectivity’ just as arbitrary and artificial a category as any other? Warhol suggests there is a void. He fills it with emptiness. At least he is pure. He doesn’t cloud the issue with aesthetics.”

And so the piece ends. I doubt a copy ever reached Warhol. It is not hard to imagine how he would have responded, though: “It gives me something to do.” The line between nihilism and affirmation could be awfully thin when Warhol drew it.

Scott McLemee (scott.mclemee@insidehighered.com)
