
Turning a Page

Ideas have seldom been the currency of American politics. (Most of the time, currency is the currency of American politics.) But this seems like a moment in history when new thinking is a matter of some urgency.

Over the past few days, I've been conducting an utterly unscientific survey of academics, editors, and public intellectuals to find out how -- if given a chance -- they might try to influence the incoming occupant of the White House. The question was posed by e-mail as follows:

"Imagine you are invited to a sit-down with the president-elect and given the chance to suggest some recommended reading between now and the inauguration.Since we're trying to keep this fantasy of empowerment at least slightly plausible, I'd ask you to limit yourself to one book. (He will be busy.) Something not yet available in English is fine; we will assume a crack team of translators is standing by. Journal articles, historical documents, and dissertations also acceptable.

"What would you propose? Why? Is there a special urgency to recommending it to the attention of the next Chief Executive at this very moment? Remember, this is a chance to shape the course of history. Use your awesome power wisely...."

I tried to cast a wide net for potential respondents -- wider than my own political sympathies, in any case. Not all who were invited chose to participate. But everyone who did respond is included here. The suggestions were far-ranging, and the president-elect would no doubt benefit from time spent reading any of the nominated titles. (To make tracking things down easier on his staff, I have added the occasional clarifying note in brackets.)

In reality, of course, it's a long shot that the new president will take any of this advice. But the exercise is serious, even so -- for it is a matter of opening a wider discussion of what books and ideas should be brought to bear on public life at this pivotal instant. An election is a political process; but so, sometimes, is thinking.

Eric Rauchway is a professor of history at the University of California at Davis and author of The Great Depression and the New Deal: A Very Short Introduction, recently published by Oxford University Press.

If they were asking me, I'd suppose they were familiar with my own modest works, so I'd try to point out a perhaps neglected or forgotten classic.

Suppose it's John McCain, who has often expressed admiration for Theodore Roosevelt. I'd humbly suggest President-elect McCain revisit the chapters in George Mowry's classic Era of Theodore Roosevelt dealing with Roosevelt's first full term of office (1905-1909), when he worked hard with Congress to craft landmark legislation regulating business, affording protection to consumers, and providing for workers' compensation.

Suppose, conversely, it's Barack Obama, who would be the first northern Democrat elected since the party sloughed off the South in the Civil Rights era (i.e., since John Kennedy) and who would, like the greatest northern Democrat and perhaps the greatest president of all, Franklin Roosevelt, take office in a time of profound crisis. I would humbly remind him of Isaiah Berlin's classic essay on Roosevelt, in which he describes how much could be accomplished by a deft politician, sensitive even to minute ebbs and flows in political opinion, who, while not lacking vision or integrity, nevertheless understood -- as Berlin wrote -- "what to do and when to do it."

[The essay on Roosevelt can be found in the Berlin omnibus collection The Proper Study of Mankind: An Anthology of Essays, published ten years ago by Farrar, Straus and Giroux. Or here, while the link lasts.-SM]

Elvin Lim is an assistant professor of government at Wesleyan University and author of The Anti-Intellectual Presidency: The Decline of Presidential Rhetoric from George Washington to George W. Bush, published by Oxford University Press and discussed recently in this column.

The president-elect should read Preparing to be President: The Memos of Richard E. Neustadt (AEI Press, 2000), edited by Charles O. Jones. Richard Neustadt was a scholar-practitioner who advised Presidents Truman, Kennedy, Johnson, and Clinton and was, until his death in 2003, the dean of presidential studies. Most of the memos in this volume were written for president-elect John Kennedy, when the country was, as it is now, ready for change.

At the end of every election, "everywhere there is a sense of a page turning ... and with it, irresistibly, there comes the sense, 'they' couldn't, wouldn't, didn't, but 'we' will," Neustadt wrote years ago, reminding presidents-elect that it is difficult but imperative that they put the brake on a campaign while also starting the engine of a new administration. Campaigning and governing are two different things.

Buoyed by their recent victory, first-term presidents have often over-reached and under-performed, quickly turning hope into despair. If there is one common thread to Neustadt's memos, it is the reminder that there is no time for hubris or celebration. The entire superstructure of the executive branch -- the political appointees who direct the permanent civil service -- is about to be lopped off, and the first and most critical task of the president-elect is to surround himself with competent men and women he can work with and learn from.

In less than three months, the president-elect will no longer have the luxury of merely making promises on the campaign trail. Now he must get to work.

Jenny Attiyah is host and producer of Thoughtcast, an interview program devoted to writers and academics, and available via podcast.

We don't have to agree with everything we read in this country. Reading is not unpatriotic. So may I suggest that the future commander-in-chief actually read the speeches by Osama bin Laden? At a minimum, he can read between the lines. As Sun Tzu said, "know thine enemy." But we know so little about bin Laden. We don't even know where he lives. Supposedly, he "hates our freedoms" -- but he would argue that what he hates is the freedom we take with our power.

After these videos were released, it usually took some effort to dig out a transcription. In the end, I had to go to Al Jazeera for a translation. What I remember most clearly is grainy video of the guy, holding his index finger aloft, but with the volume silenced, so our talking TV heads could impart their wisdom in peace. Let's hope the next president is willing to turn off the mute button on our enemy. Ignorance is no longer an excuse.

[Verso Press made this much easier three years ago with the collection Messages to the World: The Statements of Osama Bin Laden, which provides as much OBL as anyone should have to read.-SM]

Daniel Drezner is a professor of international relations at Tufts University. He also blogs.

I'd probably advise the president to read the uber-source for international relations, Thucydides' History of the Peloponnesian War. Too many people only read portions like the Melian Dialogue, which leads to a badly distorted view of world politics (the dialogue represents the high-water mark of Athenian power -- it all goes downhill after that). The entire text demonstrates the complex and tragic features of international politics, the folly of populism, the occasional necessity of forceful action, the temptations and dangers of empire, and, most importantly, the ways in which external wars can transform domestic politics in unhealthy ways.

Chris Matthew Sciabarra is a visiting scholar at New York University and a founding co-editor of The Journal of Ayn Rand Studies.

Given my own views of the corporatist state-generated roots of the financial crisis, I'd probably recommend The Theory of Money and Credit by Ludwig von Mises, so that he could get a quick education on how the credit policies of a central bank set the boom-bust cycle into motion. Perhaps this might shake the new President into a truly new course for US political economy.

Irving Louis Horowitz is professor emeritus of sociology and political science at Rutgers University and editorial director of Transaction Publishers.

While I seriously and categorically doubt that any one book will shape the course of history -- and still less do I feel touched by a sense of "awesome power," much less preternatural wisdom -- I will recommend a book that the next president of the United States would, or better should, avail himself of: On Thermonuclear War by Herman Kahn. First released by Princeton University Press in the dark days of the Cold War in 1960, and reissued by Transaction Publishers in 2007, it is a painful reminder that peace in our time is heavily dependent on the technology of war in our time. The howls of dismissal that greeted the book at first blush have been replaced by a sober appreciation that the global threats to our Earth are very much a man-made product.

Kahn's book can serve as a guide to the stages of diplomatic failure and the consequent turn to military action at maximum levels. Kahn does not presume pure rationality as a deterrent to war, and in light of the nuclear devices in the hands of dangerous nation-states such as Iran and North Korea, where notions of life and death may give way to Götterdämmerung and a preference for destruction and self-immolation, such a presumption of rational behavior may prove dangerous and even delusionary. The unenviable task of the next president will be to avoid taking the world to the proverbial brink -- and to make sure others do not dare take the fatal step to do likewise. Oddly, for all of its dire scenarios, Kahn's classic is a curiously optimistic reading, rooted in realistic policy options. It deserves to be on the shelf of the next head of the American nation.

Dick Howard is a professor of history at the State University of New York at Stony Brook and editor of the Columbia University Press series Columbia Studies in Political Thought/Political History.

I'd have him read Polanyi's The Great Transformation. Why? It's short, clearly argued, and makes a simple but fundamental point: capitalism is not the natural way that people relate to one another (including in their "economic" relations). It is the result of several political decisions that create the framework within which it can emerge. The next president will have to recognize that he too will make political decisions with economic consequences (and should not deceive himself into thinking that his decisions are simply a reaction to economic "necessities").

To be noted as well: Polanyi, a former banker in Austria, was writing in the wake of the Great Depression, whose causes he was trying to understand. It was the inability of "economics" to understand what had happened to the world economy that led Polanyi to his pathbreaking and brilliant study.

A hubristic final note: I of course recommend this only because my own study of the history of political thought from the Greeks to the American and French revolutions, titled The Primacy of Politics, will not yet be on the market.

[Primacy will be published by Columbia University Press in late '09.-SM]

James Marcus is the book-review editor for The Columbia Journalism Review and has translated several books from Italian.

It's not often that the POTUS asks me what to read next, and at first I thought I should rise to the occasion with something suitably canonical. I considered Democracy in America, The Federalist Papers, maybe even The Education of Henry Adams (although I'd allow the leader of the free world to skip the virgin-and-dynamo stuff at the end). Then I decided that it made more sense to submit a narrow-gauge production: a book that grappled with public issues through the prism of personal experience, not unlike Barack Obama's Dreams from My Father or John McCain's Faith of My Fathers. If, like the two titles I just mentioned, it included a dash of Oedipal ambivalence, so much the better.

What I came up with was Tobias Wolff's In Pharaoh's Army: Memories of the Lost War. As the next president ponders the best way to extract the United States from its Iraqi quagmire, a memoir of Vietnam seems like a useful reality check. The author, a self-confessed screw-up, spent part of his enlistment in the Mekong Delta, advising a Vietnamese artillery battalion. There are very few heroics in his book, and no argumentation about the wisdom of being there in the first place. What we do get is the endless confusion of fighting a popular insurgency. And an insistence that even the survivors of such a conflict are permanently marked: "It's the close call you have to keep escaping from, the unending doubt that you have a right to your own life. It's the corruption suffered by everyone who lives on, that henceforth they must wonder at the reason, and probe its justice."

Over the next four years, the president will almost certainly order U.S. troops into battle. In its modest, personal, anti-rhetorical manner, this book reminds us of the price to be paid.

Claire Potter is a professor of history and American studies at Wesleyan University, and is also known as Tenured Radical. She contributes to the history blog Cliopatria.

My contribution to President Obama's reading list is Nancy Cott's Public Vows: A History of Marriage and the Nation (2000). While the history of marriage has been augmented considerably since this book appeared -- coming to include important volumes on the history of interracial marriage, the demand for gay marriage, and the fraught relationship between Christianity and marriage -- all other scholars have relied, more or less, on Cott's argument that marriage is first and foremost a contract with the state.

It's not primarily a contract with another person -- although it is that; it is not a contract with your local community -- regardless of its approval or disapproval; and it is in no way a contract with any religious hierarchy -- although it can be critical to the terms of inclusion in a religious community.

Marriage, President Obama, is about citizenship. You, along with nearly everyone who hedges his bets on gay marriage, reiterate that the most important fact about marriage is that it is between "one man and one woman." But that's not true. In the United States, as Cott shows, marriage has been primarily about the qualifications of a man "to be a participating member of a state."

While over time political authorities in the United States have allowed marriage to "bear the impress of the Christian religion," if marriage is a public institution at all, its function is to mirror the political community and to be the arm of the state that functions to "shape the gender order." In other words, Mr. President, the history of marriage is a political history, not a religious one; and it is a history of inclusion or exclusion from political power.

George Scialabba is the author of What Are Intellectuals Good For?, a collection of essays forthcoming from Pressed Wafer in March 2009. He was profiled in this column two years ago.

Dear Citizen Obama (I'm afraid the overly deferential "Mr. President" encourages the aggrandizement of the Executive Branch):

More than thirty years ago, your predecessor Jimmy Carter described America's tax system as "a national disgrace." Since then, it's gotten much, much worse. It is now so complex and irrational that only two groups of Americans understand it: tax lawyers and readers of David Cay Johnston, Pulitzer-Prize-winning New York Times reporter and author of Perfectly Legal: The Covert Campaign to Rig Our Tax System to Benefit the Super-Rich -- and Cheat Everybody Else. The abuses and evasions detailed in Perfectly Legal (and its companion volume, Free Lunch: How the Wealthiest Americans Enrich Themselves at Government Expense -- and Stick You with the Bill) may raise your blood pressure dramatically. You should read them, but only under a doctor's supervision.

Continued tax avoidance at current staggering levels by the wealthy is your mortal enemy. Unless the tax code is drastically reformed -- and effectively enforced -- there will simply not be enough money to accomplish your goals. It will take courage, persistence, and all your celebrated rhetorical skills to vanquish this dragon in your path. But unless you do, your hopes will be thwarted and your administration will be no more than a ripple on the surface of American history.

Good luck and Godspeed.

James Mustich is editor of The Barnes & Noble Review.

Since I have more than once in the past few months mourned the unkind timing of Norman Mailer's death this year -- What might the author of one of our finest war novels have made of the trials of Senator McCain on the campaign trail? How would the instigatory commentator on so much of our nation's cultural, political, and existential foment make sense of the long and disciplined loneliness of Senator Obama? And, last but by no means least, how would an imagination precocious and peculiar enough to have set a novel called Why Are We in Vietnam? in Alaska have illuminated the passage of Sarah Palin through the national psyche? -- I'd recommend to the new chief executive Mailer's piece on the 1960 Democratic convention, "Superman Comes to the Supermarket."

Coming out of the exhaustions of electoral combat, I might even give him a pass and ask him only to read the first paragraph -- forgive me, Norman -- if he promised to spend some time thinking about it:

"For once let us try to think about a political convention without losing ourselves in housing projects of fact and issue. Politics has its virtues, all too many of them -- it would not rank with baseball as a topic of conversation if it did not satisfy a great many things -- but one can suspect that its secret appeal is close to nicotine. Smoking cigarettes insulates one from one's life, one does not feel as much, often happily so, and politics quarantines one from history; most of the people who nourish themselves in the political life are in the game not to make history but to be diverted from the history that is being made."

Jodi Dean is a professor of political science at Hobart and William Smith Colleges and author of Democracy and Other Neoliberal Fantasies, forthcoming from Duke University Press.

I would recommend that President Obama read Our American King by David Lozell Martin. First, Obama is already familiar with Marxist, feminist, structuralist, and post-colonial theory from his days as a student at Harvard. So there is already some coverage here. Second, Obama has lots of advisors providing lots of advice on policy matters. Anything added here would end up just another item in the mix. Third, the new President faces so many enormous challenges that it is highly unlikely he'll have much time to devote to pondering a complex text, no matter how important. So I recommend a novel published last year, bedside reading that will provide the new President with food for thought. It captures, I think, the fears of many of us for the future of democracy in a time of extreme inequality, the sense that our country is leaning heavily on the wrong side of a precipice.

Our American King depicts what remains of the United States after a great economic calamity: the top .1 percent of Americans have appropriated all the wealth and goods for themselves and left the rest of the country to fend for itself. As the super-rich live in heavily defended enclaves, the suburbs and cities descend into violence, starvation, and death. Social order collapses. The President and Vice President who oversaw the calamity, who presided over the great transfer of wealth from the many to the few, are hung upside down and backwards on the White House gates. The central drama of the novel involves the man who comes to power next. He is set up as a king, a uniter, the great hope of the people. Through him, they begin to work together, to imagine again the possibility of collective responsibility. The new king's authority draws from the people's fear and desperate longing for hope, a fear and a longing that, as Martin makes clear, may not always lead to the best outcomes.

My hope is that President Obama will read this book and recognize that people's longing for a leader, the One, is powerful but precisely because of that power should be redirected toward common projects, toward faith in each other and belief in equality, toward a renewed conviction that the conditions of the least well off--not the best--tell us who we are.

Richard Byrne is the editor of UMBC Magazine. His play Burn Your Bookes premiered last year in Prague. He blogs at Balkans via Bohemia.

As a playwright, I want the next president to read a play. Plays are perfect fodder for the chief executive-to-be: they are short, can be digested in one sitting, and offer the advantage of distilling larger currents of thought into character, dialogue and action. And such an opportunity should not be wasted on agit-prop (Bertolt Brecht, Clifford Odets) or classics that should already have been imbibed by the civilized soul. (So let’s shelve Henry V and Major Barbara for now.) The play should talk to the president about the human cost of tough times, the dignities and foibles of ordinary citizens, and the dire alternatives to forceful and human courses of action.

For such times, the German playwright Odon von Horvath is just the ticket. Before his tragic death on the cusp of World War II, Horvath offered a window on the brutalities of economic collapse and the roots of fascism in desperation and human folly. But which Horvath to select? Tales from the Vienna Woods is Horvath’s masterpiece, but I’d worry that its deep subtleties and epic canvas of pre-war Austria would confound a reader pressed for time. So I’d opt instead for Horvath’s tiny jewel of human desolation: Faith, Hope and Charity.

In a mere 52 pages, the play follows Elisabeth, an ordinary young woman down on her luck, as she is hounded to death by close encounters with unfeeling bureaucracy and casual cruelty. It is a succinct and powerful play with a simple lesson: if our political institutions are not suffused with the moral values of the play’s title, they can be perverted into engines of personal annihilation. It is a message the new president should consider as sweeping changes in government and its powers are proposed and enacted.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

The Forgotten Virtue of Gratitude

It was a typical 1970s weekday evening. The sky was growing dark and I, an elementary school student, was sitting at the kitchen table of a modest North Jersey Cape Cod putting the finishing touches on the day's homework. The back door opened -- a telltale sign that my father was home from work. As he did every day, Dad stopped in the laundry room to take off his muddied work boots. As usual, he was tired. He could have been covered with any number of substances, from dirt to paint to dried spackle. His hands were rough and gnarled. I kissed him hello, he went to the bathroom to "wash up," and my family sat down to eat dinner.

I always knew how hard my father worked each day in his job as a general contractor. When I got older I spent summers working with him. I learned the virtues of this kind of working class life, but I also experienced the drudgery that came with laying concrete footings or loading a dumpster with refuse. I worked enough with my father to know that I did not want to do this for the rest of my life. Though he never told me so, I am sure that Dad probably didn't want that for me, either.

I eventually became only the second person in my extended family to receive a college degree. I went on to earn a Ph.D. (a "post-hole digger" to my relatives) in history and settled into an academic life. As I enter my post-tenure years, I am grateful for what I learned from my upbringing and for the academic vocation I now pursue. My gratitude inevitably stems from my life story. The lives that my parents and brothers (one is a general contractor and the other is a plumber) lead are daily reminders of my roots.

It is not easy being a college professor from a working-class family. Over the years I have had to explain the geographic mobility that comes with an academic life. I have had to invent creative ways to make my research understandable to aunts and uncles. My parents read my scholarly articles, but rarely finish them. My father is amazed that some semesters I go into the office only three days a week. As I write this I am coming off of my first sabbatical from teaching. My family never quite fathomed what I possibly did with so much time off. (My father made sense of it all by offering to help me remodel my home office, for which I am thankful!) “You have the life,” my brother tells me. How can I disagree with him?

Gratitude is a virtue that is hard to find in the modern academy, even at Thanksgiving time. In my field of American history, Thanksgiving provides an opportunity to set the record straight, usually in op-ed pieces, about what really happened in autumn 1621. (I know because I have done it myself!) Granted, as public intellectuals we do have a responsibility to debunk the popular myths that often pass for history, but I wonder why we can't also use the holiday, as contrived and invented and nostalgic and misunderstood as it is, to stop and be grateful for the academic lives we get to lead.

Thanksgiving is as good a time as any to do this. We get a Thursday off from work to take a few moments to reflect on our lives. And since so many academics despise the shopping orgy known as "Black Friday," the day following Thanksgiving presents a wonderful opportunity not only to reject consumer self-gratification, but also to practice a virtue that requires us to forget ourselves.

I am not sure why we are such an unthankful bunch. When we stop and think about it, we enjoy a very good life. I can reference the usual perks of the job -- summer vacation, the freedom to make one's own schedule, a relatively small amount of teaching (even those with the dreaded 4-4 load are in the classroom less than the typical high school teacher). Though we complain about students, we often fail to remember that our teaching, when we do it well, makes a contribution to society that usually extends far beyond the dozens of people who have read our recent monograph. And speaking of scholarship, academics get paid to spend a good portion of their time devoted to the world of ideas. No gnarled hands here.

Inside Higher Ed recently reported that seventy-eight percent of all American professors express “overall job satisfaction.” Yet we remain cranky. As Immanuel Kant put it, “ingratitude is the essence of vileness.” I cannot tell you how many times I have wandered into a colleague’s office to whine about all the work my college expects of me.

Most college and university professors live in a constant state of discontentment, looking for the fast track to a better job and making excuses as to why they have not landed one yet. Academia can be a cutthroat and shallow place to spend one’s life. We are too often judged by what is written on our conference name badges. We say things about people behind their backs that we would never say to their faces. We become masters of self-promotion. To exhibit gratefulness in this kind of a world is countercultural.

The practice of gratitude may not change our professional guilds, but it will certainly relieve us of our narcissism long enough to realize that all of us are dependent people. Our scholarship rests upon the work of those scholars that we hope to expand upon or dismantle. Our careers are made by the generosity of article and book referees, grant reviewers, search committees, and tenure committees. We can all name teachers and mentors who took the time to encourage us, offer advice, and write us letters. Gratitude may even do wonders for our mental health. Studies have shown that grateful people are usually less stressed, anxious, and depressed.

This Thanksgiving take some time to express gratitude. In a recent study, the Harvard University sociologist Neil Gross concluded that more college and university professors believe in God than most academics ever realized. If this is true, then for some of us gratitude might come in the form of a prayer. For others it may be a handwritten note of appreciation to a senior scholar whom we normally contact only when we need a letter of recommendation. Or, as the semester closes, it might be a kind word to a student whose academic performance and earnest pursuit of the subject at hand have enriched our classroom or our intellectual life. Or perhaps a word of thanks to the secretary or assistant who makes our academic life a whole lot easier.

As the German theologian and Christian martyr Dietrich Bonhoeffer explained, “gratitude changes the pangs of memory into a tranquil joy.”

Author: John Fea (newsroom@insidehighered.com)

John Fea teaches American history at Messiah College, in Grantham, Pa. He is the author of The Way of Improvement Leads Home: Philip Vickers Fithian and the Rural Enlightenment in America (University of Pennsylvania Press, 2008).

Confess!

This past weekend, a comic playing Bill Clinton on Saturday Night Live told the world’s leaders not to pull anything on Hillary when she becomes Secretary of State. It's not even worth trying, he indicated, because she’ll see right through you. But he offered some reassuring advice on how to finesse things, if necessary. “The only words you’re gonna need when Hillary shows up: ‘I ... am ... sorry.’ It don’t work all the time, but it’s a good place to start.”

A friend recounted this skit to me when he saw the galleys of Susan Wise Bauer’s new book The Art of the Public Grovel: Sexual Sin and Public Confession in America (Princeton University Press). Its cover shows the former president in a posture of contrition: hands in front of his face, as if to pray; his eyes both wide and averted. But Bauer’s point is that effective public groveling requires a lot more than just assuming the position, let alone saying “I am sorry.”

There is (so her argument goes) a specific pattern for how a public figure must behave in order to save his hide when caught in a scandal. It is not sufficient to apologize for the pain, or offense to public sensibility, that one has caused. Still less will it do to list the motivating or extenuating circumstances of one’s actions. Full-scale confession is required, which involves recognizing and admitting the grievous nature of one’s deeds, accepting responsibility, and making a plea for forgiveness and asking for support (divine or communal, though preferably both).

The process corresponds to a general pattern that Bauer traces back to the Puritan conversion narratives of the 17th century. Confession started out as a way to deal with Calvinist anxieties over the precarious nature of any given believer’s status in the grand scheme of predestination. Revealing to fellow believers an awareness of the wickedness in one’s own life was, at very least, evidence of a profound change in heart, possibly signaling the work of God’s grace.

Secularized via pop psychology and mass media, public confession now serves a different function. In the 20th century, it became “a ceremonial laying down of power,” writes Bauer, “made so that followers can pick that power up and hand it back. American democratic expectations have woven themselves into the practice of public confession, converting it from a vertical act between God and a sinner into a primarily horizontal act, one intended to re-balance the relationship between leaders and their followers. We both idolize and hate our leaders; we need and resent them; we want to submit, but only once we are reassured that the person to whom we submit is no better than we are. Beyond the demand that leaders publicly confess their sins is our fear that we will be overwhelmed by their power.”

Leaders who follow the pattern may recover from embarrassing revelations about their behavior. The major examples Bauer considers are Jimmy Swaggart (with his hobby of photographing prostitutes in hotel rooms) and Bill Clinton (intern, humidor, etc.). Because they understood and accepted the protocol for a "ceremonial laying down of power" through confession, they were absolved and returned to their positions of authority.

By contrast, public figures who neglect the proper mode of groveling will suffer a loss of support. Thus Edward Kennedy's evasive account of what happened at Chappaquiddick cost him a shot at the presidency. The empire of televangelist Jim Bakker collapsed when he claimed that he was entrapped into extramarital canoodling. And Bernard Cardinal Law, the archbishop overseeing the Catholic community in Boston, declined to accept personal responsibility for assigning known pedophile priests to positions where they had access to children. Cardinal Law did eventually grovel a bit -- more or less along the lines Bauer suggests -- but only after first blaming the scandal on the Boston Globe, his own predecessors, and earlier church policy. The pope accepted his resignation six years ago.

It’s one thing to suspect that a set of deep continuities exist between evangelical religion, group psychotherapy, and “performances of self” in an age of mass media. Many of us found ourselves positing this quite often during the late 1990s, usually while yelling at the TV news.

But it’s a much tougher prospect to establish that such continuities really exist – or that they add up to an ethos that is accepted by something called “the American public” (a diverse and argumentative conglomeration, if ever there were one). At the very least, it seems necessary to look at how scandals unfold in nations shaped by a different religious matrix. Bauer doesn’t make such comparisons, unfortunately. And her case studies of American scandals don’t always clinch the argument nearly so well as it may appear.

The discussions of Jim Bakker and Bill Clinton form a center of gravity for the whole book. The chapters on them are of almost equal length. (This may testify less to the historical significance of Jim Bakker’s troubles than to their very considerable entertainment value.) And in keeping with Bauer’s analysis, the men’s responses to embarrassment form a neat contrast in approaches to the demand for confession.

Having been exposed for using church funds to pay blackmail to cover up an affair with a church secretary, Bakker has always presented himself as more sinned against than sinning – the victim of a wicked conspiracy by jealous rivals. In other words, he never performed the sort of confession prescribed by the cultural norms that Bauer identifies. He never handed over his power through suitable groveling, and so his followers punished him.

"Refusing to confess, unable to show his one-ness with his followers," she writes, "Bakker remains unable to return to ministry." Which is inaccurate, actually. He has been televangelizing for the past five years, albeit on a less grandiose scale than was once his wont. Bakker's inability to reclaim his earlier power may have something to do with his failure to follow the rules for confessing his sins and begging forgiveness. But he still owes the IRS several million dollars, which would be something of a distraction.

Bakker’s claims to have been lured into immorality and disgrace are self-serving, of course. Yet Bauer’s account makes clear that his competitors in the broadcast-holiness business wasted little time in turning on him – the better to shore up their own reputational capital and customer base, perhaps. The critical reader may suspect that Bakker’s eclipse had more to do with economics than with the reverend's failures of rhetorical efficacy.

Former president Clinton, by contrast, is rhetorical efficacy incarnate. Bauer’s chapter on l’affaire Lewinsky attributes his survival to having met the demand for confession.

Of course, he did not exactly make haste to do so. Bauer includes a set of appendices reprinting pertinent statements by the various figures she discusses. The section on Clinton is the longest of any of them. More than a third of the material consists of deceptive statements and lawyerly evasions. But the tireless investigative pornographers of the Starr Commission eventually cornered the president and left him with no choice. "In Bill Clinton's America," writes Bauer, "the intersection of Protestant practice, therapeutic technique, and talk-show ethos was fully complete. In order to survive, he had to confess."

He pulled out all the stops – quoting from the Bible on having a “broken spirit,” as well as a Yom Kippur liturgy on the need to turn “from callousness to sensitivity, from pettiness to purpose” (and so forth). It worked. “Against all odds,” writes Bauer, “his confessions managed to convince a significant segment of the American public that he was neither a predator nor an evildoer, and that he was fighting the good fight against evil. Most amazingly, this white, male lawyer, this Rhodes Scholar, who held the highest elected office in the land, persuaded his followers that he was just like the country’s poorest and most oppressed.”

That is one way to understand how things unfolded ten years ago. According to Bauer's schema, Clinton underwent a "ceremonial laying down of power," only to have it handed back with interest. No doubt that description accounts for some people's experience of the events. But plenty of others found the whole thing to be sordid, cynical, and cheesy as hell -- with the confession as less a process that strengthened social bonds than a moment of relief, when it seemed like the soap opera might end.

So it did, eventually. But there will always be another one, perhaps involving some politician we've never heard of before. That is why The Art of the Public Grovel ought to be kept in stock at Trover’s, the bookshop on Capitol Hill, from now on. While not entirely persuasive in its overall analysis, it might still have non-scholarly applications.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Rediscovering Hubert Harrison

The most exciting and eagerly awaited title in this season’s haul from the scholarly presses is Jeffrey B. Perry’s study Hubert Harrison: The Voice of Harlem Radicalism, 1883-1918, just published by Columbia University Press. Well, eagerly awaited by me, anyway.... The world at large has not exactly been clamoring for a gigantic biography of Hubert Harrison -- whose name, until quite recently, was little known even to specialists in African-American political and intellectual history. But that started to change over the past few years, thanks to Perry’s decades of research and advocacy.

The two volumes of essays collected by Harrison during his lifetime have been out of print since the 1920s. A major step forward in his rediscovery came in 2001, when Wesleyan University Press published A Hubert Harrison Reader, edited by Perry, who also prepared a thorough entry on him for Wikipedia. (This can't have hurt: Where a Google search once turned up a dozen or so pages mentioning Harrison, it now yields thousands.)

Last month, Perry sat down with me for an interview, excerpts from which are available here as an Inside Higher Ed podcast. The night before, he had spoken at a Washington, D.C., bookstore; to judge by the warmth of that talk’s reception it seems fair to say that a wider public is ready to rediscover Harrison now. Besides traveling around giving talks to promote the book, Perry is also busy preparing a digital archive of Harrison’s work, to be made available soon by Columbia University.


A familiar account of African-American culture during the first two decades of the 20th century frames it as a conflict between Booker T. Washington (champion of patient economic self-improvement within the existing framework of a racist society) and W.E.B. Du Bois (strategist of an active struggle for civil rights under the leadership of the black community’s “talented tenth”). The life and work of Hubert Harrison does not just complicate this picture; he breaks right through its frame.

A tireless organizer for the Socialist Party at the height of its influence in the years before World War I, he took the idea of solidarity among the oppressed a lot more seriously than did his white comrades. (That is putting it mildly: One prominent member of the party wrote a pamphlet called "Nigger Equality," of which the title was not the vilest part.) He later became active with Marcus Garvey's black nationalist movement, in spite of reservations about it. A prolific critic and essayist, he was also a memorable public speaker and a fierce debater. He lectured for New York City's Board of Education and seems to have contributed to most of the major newspapers and magazines of his day.

But following his death from appendicitis in 1927, at the age of 44, this public intellectual and activist was almost completely forgotten. One index of this might be Ahmed Shawki's useful historical survey in Black Liberation and Socialism (Haymarket, 2006), which makes no mention of Harrison. For that matter, in the course of many years spent researching the life and work of C.L.R. James (a figure bearing a number of similarities to Harrison), I never came across any reference by James to his remarkable predecessor. My own appreciation of Harrison's significance came only when Christopher Phelps, an associate professor of history at Ohio State University, published a review-essay on Perry's Reader in the journal Science & Society.

Reading that piece, it seemed natural to suppose that Perry was a young African-American professor, somewhere. And one in a rather enviable position. After all, it’s one thing to carve out a professional niche -- and something altogether more awesome to rediscover a lost continent.

As luck would have it, I ran into a guy handing out fliers for Hubert Harrison: The Voice of Harlem Radicalism at a conference at Columbia University last month. He was a retired postal worker (white like me) and prone to considerable animation as he talked about the book, which, it turned out, he had written.

I say “it turned out” because Perry is strikingly unproprietary about his book. He displayed very little ego regarding it. Starting to say something about the thoroughness of his research on this or that topic, he would catch himself, seem embarrassed at the presumption, then insist that younger scholars were bound to discover more than he had. (Having gone over his footnotes, I want to wish them luck with that.)

After a while, this began to seem less like shyness than a matter of absolute concentration on Harrison himself. But I wanted to find out how it had come to pass that Perry discovered Harrison – let alone persuaded Columbia University Press to publish a two-volume biography. (The second part, covering the final decade of Harrison’s life, is now in progress.)

It’s neither a short nor a simple tale. Perry graduated from Princeton in 1968 and attended the Harvard Graduate School of Education for a year or so -- making straight A’s, he says, “until I had an opportunity to travel by land through the Americas and took it. I went to Argentina and back.” In 1974, he took a job at the New Jersey International Bulk Mail Center and joined the postal workers’ union. He retired in June 2007.

Clearly his years in postal work had their share of both drama (including a major strike in 1978) and danger (some of Tony Soprano’s friends were union leaders). He edited a couple of mail handlers’ newspapers, and received an M.A. in labor studies from Rutgers University. And while doing graduate work in history at Columbia University in the late 1970s -- initially with an eye to writing a dissertation on how socialist and communist groups had understood “the Negro question,” as the old expression put it -- Perry made a discovery that now looks like destiny.

“In the course of my research,” he told me, “I came across microfilm copies of Hubert Harrison’s two published books The Negro and the Nation and When Africa Awakes: The “Inside Story” of the Stirrings and Strivings of the New Negro in the Western World at the Schomburg Center for Research in Black Culture of the New York Public Library in Harlem. I was immediately arrested by the clarity of Harrison’s thought and the perceptiveness of his analysis. I knew that I had encountered a writer of great importance, and, within a short while, I decided to change my dissertation topic to a biography of Harrison.”

Digging through the available sources, Perry was several hundred pages into his project when, in 1983, mutual contacts put him in touch with Harrison’s daughter, Aida Harrison Richardson, and son, William Harrison. The family “had preserved the remains of Hubert Harrison’s once vast collection of papers and books in a series of Harlem apartments. After several meetings and discussions of their father’s work, they very generously (before William’s death in 1984) granted me access to some of their father’s materials, which were in a room in William’s Harlem apartment.”

Perry worked to preserve and inventory the material, much of it in fragile condition, and he helped the family to place the collection with the Rare Book and Manuscript Library of Columbia University. “I then worked with the Columbia staff,” he said, “to develop a finding aid.”

In 1986, his dissertation was accepted at Columbia University. By that point, Perry felt reasonably confident that he had examined all of Harrison’s papers. To celebrate the completion of his Ph.D. work, he took Harrison’s daughter -- then 75 years old -- out to dinner in New York. At the end of the meal, he says, “she reached into her bag and handed me his diary.... For a biographer, research efforts don’t get much better than this. After going through the extraordinarily insightful and revealing diary, many new avenues of research were opened and I was fully convinced that I had a two-volume biography on my hands.”

As he completed his dissertation, Perry was an elected labor official at a 4,000-worker postal facility while also editing and writing for The Mail Handler's Voice, a newspaper challenging the mobbed-up union leadership. His articles appeared under pseudonyms. Otherwise it might have been, so to speak, a publish-and-perish situation. (Here, his activism echoed his scholarship: Harrison had edited a paper called The Voice.)

“I did not feel under the pressures often faced in the academic community to publish as a step related to employment and tenure,” Perry recalls. In the early 1990s, he submitted a manuscript to a university press that, all things considered, should probably go nameless. “It was volume 1 of my proposed two-volume Harrison biography and it received extraordinarily positive reviews. I was asked to rework the manuscript, to make it shorter, and to turn it into one volume if possible.”

While making revisions, Perry found still more Harrison material, then re-submitted the manuscript -- insisting, once more, that it would be the first of two volumes. “Again, the reviewers’ comments were extremely favorable,” he says, “and again no decision on publication was ever made." This kind of back-and-forth continued for more than a decade.

"The Harrison biography was in limbo.... Essentially, I think that the publisher was confronted with the question of whether or not it wanted to go with a two-volume biography of an unknown subject by an unknown author. It was undoubtedly a daunting proposition for them.” Perry eventually asked to be released from his contract.

Along the way, however, his dissertation from the mid-1980s came to the attention of Winston James, now a professor of history at the University of California at Irvine, who wrote about Harrison in his book Holding Aloft the Banner of Ethiopia: Caribbean Radicalism in Early Twentieth Century America (Verso, 1998). James introduced Perry to Peter Dimock, an editor at Columbia University Press -- which is how the first volume of this Harrison biography came to be published in its present form.

In an e-mail note, Perry describes “the hunger to write about and discuss Hubert Harrison that I have encountered, especially in some younger Black historians.” He mentions the example of Ousmane Power-Greene, an assistant professor of history at Clark University, who eagerly discusses Harrison with his colleagues. “Power-Greene suggests that Harrison is already beginning to enter ‘the canon’ in an important way," says Perry. "And since Harrison touches so many areas -- politics, history, the arts, science, religion and so on -- he will continue to attract increased attention.”

The forthcoming digital edition of Harrison’s collected works (running to some seven hundred articles) will certainly help with that. Perry also notes that the younger Harrison-minded scholars he has been in touch with “often benefited from non-university mentors ... and are attracted to intellectuals like Harrison who are visible in the community and haven’t received the attention they, or their work, merit.”

In that regard, Perry is being a bit autobiographical: his own mentor was the late Theodore W. Allen, another working-class historian and author of the two-volume study The Invention of the White Race. He now has the responsibility of handling Allen's posthumous papers, including some book manuscripts that sound more or less ready for publication. While readers may look forward to the second volume of the Harrison biography, we probably shouldn't start holding our breath just yet.

Unless, of course, some far-sighted cohort of graduate students is ready to help the man out by serving internships with him. I think hanging around Jeff Perry for a while would be an education in itself.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

The Fire Last Time

In the wake of Martin Luther King Jr.'s assassination on April 4, 1968, a wave of riots erupted throughout the United States -- leading to the occupation of Baltimore, Chicago, and the District of Columbia by federal troops and the mobilization of the National Guard in a dozen more cities. The violence lasted for a week. Clay Risen gives some numbers in the opening pages of A Nation on Fire: America in the Wake of the King Assassination, just published by John Wiley & Sons: "39 people were dead, more than 2,600 were injured, and 21,000 had been arrested. The damages were estimated at $65 million--about $385 million today."

But when Risen, the managing editor of the journal Democracy: A Journal of Ideas, told people he was working on a book about the '68 riots, he says they often assumed he meant the police melee at the Democratic national convention in Chicago a few months later. The post-assassination upheaval -- which engulfed more than 100 cities -- gets a brief nod in accounts of the Sixties, of course; yet the details remain vague, as if our historical memory had somehow erased most of them. Drawing on contemporary accounts, government reports, interviews, and archival sources, Risen presents a narrative history of some of the events of that catastrophic week.

"A race war did in fact come to America that day," he writes, "but it turned out to be a cold war, not a hot one. When the smoke cleared and the sirens ran down, an invisible wall went up between urban and suburban America, every bit as real as the one in Berlin.... In the worlds of legal theorist Jonathan Simon, in the 40-year wake of the riots, 'Americans have built a new civil and political order structured around the problem of violent crime.'" But evidence suggests that the riots were hardly the work of street thugs. "There was no 'typical' rioter," notes Risen, "but the statistically average profile was better educated and more likely to be employed than most people in the riot area....Such results underscore an alternative theory of ghetto rioting: that it was at least as much an expression of protopolitical anger as it was of opportunism and common criminality."

With its emphasis on the political logic of racial backlash, A Nation on Fire shares themes with Rick Perlstein's Nixonland -- but it also seems strangely contemporary at a time when fresh surges of "protopolitical anger" are in the air, worldwide. (Street demonstrations have just toppled the government in Iceland, for example.) Risen agreed to answer a series of questions about his book by e-mail. A transcript of our exchange follows.

Q: The upheavals of early April 1968 seem to be a kind of blind spot in our historical memory -- something not entirely obliterated perhaps, but underrepresented, at any rate. How do you understand this? Is it the forgetfulness of trauma? Or do the events themselves somehow resist being incorporated into accepted narratives about the period?

A: Both, and there's a political element, too. It's hard to imagine today, but King was a very controversial figure at the time of his death -- not only was he still opposed by millions of racist and bigoted whites, but his turn toward antiwar and prolabor activism in 1967 and 1968 distanced him from many racial moderates and liberals as well. And, sadly, the civil rights movement had to work hard to distance itself from the annual riots of the mid- to late 1960s, which many Americans felt were a direct consequence of black activism. Not surprisingly, then, the urge over the last 40 years has been to memorialize Martin Luther King's life, which almost by necessity excludes a discussion of the rather ironic violence that followed his death.

And so while I do think there was an element of historical amnesia resulting from national trauma -- after all, very little is known, at least popularly, about other major riots of the 1960s -- I think it is also the result of conscious choices in the crafting of civil rights history. I believe those choices, to be clear, were the right ones at the time. But of course there are unintended consequences of such selective history writing. Because few things are ever, as you say, "entirely obliterated," we still have a vague knowledge that massive violence occurred, but a skewed understanding of what actually happened.

Q: What is the most common misunderstanding about the aftermath of MLK's assassination? And what consequences follow from getting that right?

A: There are so many. But I would say one of the most common is the role of black radicals in starting (or not starting) the riots. This is not unique to April 1968 -- in the wake of almost every riot during the 1960s, efforts were made to uncover plots and hunt down the radicals who supposedly formed them. Congress even held several rounds of hearings on the question, despite repeated reports by post-riot commissions finding no link between radicals and the riots at all.

Stokely Carmichael looms large here. In April 1968 he was out of SNCC and dabbling with the Black Panthers; at the same time, he was drawing closer to King and, having settled in Washington, was putting together the Black United Front, a militant but ecumenical group that drew equally from fringe groups and church leaders. At the same time, he was being hounded by the FBI and Congress for supposedly instigating many of the decade's worst riots (he was, almost literally, the black bogeyman for many whites, the catchall explanation for the complexities of urban violence).

Naturally, when violence broke out in Washington the night of King's murder, all eyes turned to Carmichael as the cause, even though -- as I demonstrate in the book -- he was if anything trying to keep the peace in the city's streets. It's hard to know what he was really thinking -- whether he actually opposed violence as a response, or whether he merely opposed unstructured mob violence -- but the point is the same. Nevertheless, faulty and sensationalist reports immediately pegged him as the cause, and even today I come across accounts that place the blame largely, if not wholly, on Carmichael.

This sounds like a historical debaters' point, but understanding the role of black radicals in the 1960s is critical to getting a complete picture of the civil rights movement, race relations, intra-racial politics, and of the political backlash that followed. I'm not sure such a measured understanding was possible then, but I believe it is now, 40 years later.

Q: You quote predictions of impending urban guerilla warfare appearing in mainstream American publications months before King was assassinated. You also show that the government took the possibility of race war very seriously -- drawing up contingency plans that sound nearly as detailed as any preparation for a Soviet attack. Was the subsequent tendency to blame the riots on radicals just a matter of the authorities seeing what their anxieties told them to expect? At what point did that expectation of guerilla warfare fade away? Or did it?

A: There were really two currents in the military's response to domestic riots in the 1960s. On the one hand was what you might call the conservative wing, which saw any domestic deployment as a drain on military resources at the height of the Vietnam War, a risk to civil liberties, and a potential PR disaster. On the other hand, there was an activist (for lack of a better word) wing that saw domestic radicalism -- steeped as it often was in Marxist rhetoric -- as an obvious extension of international Communism. (There were others, in neither wing, who gladly accepted any additional duties because they meant more resources and more political clout.)

Given the propensity among the media, the public, and politicians to blame rioting on organized radicals, it was easy for this activist wing to argue that the riots were ultimately part of a communist plot to subvert the American social order. These were the officials who pushed most aggressively for domestic surveillance efforts, as well as for massive allocations of resources for domestic deployments.

I don't want to push this element too far -- relatively few military officials fully believed that the riots were part of a communist plot. But conversely, very few could dismiss the idea completely. And so, on the principle of better safe than sorry, even those who had significant reservations remained silent, at least until urban mass violence began to recede after 1968.

As I point out in the book, this is one area where the story of the riots and the government's response bears heavily on today's obsession with homeland security. In both instances there is a "better safe than sorry" mentality and a willingness, even an eagerness, to put civil liberties on the scales against (often imagined) threats to domestic security. Sadly, the lessons of the 1960s with regard to domestic surveillance and military operations in the homeland were largely forgotten, and we are only now relearning them.

Q: The common white perception of the riots -- and a major factor shaping the backlash -- was the belief that they were the work of criminals, hooligans, and riffraff. Would you talk a bit about what you learned about the people who took to the streets, and about how the process of disinhibition played itself out?

A: Yes -- the white response was incoherent in that regard. It assumed that the rioters were at once instigated by radicals and self-interested riffraff. It was a difficult issue for liberals to address. The riots were clearly political in nature -- or, in sociology-speak, protopolitical -- in that they were a mass response to unacceptable conditions in the inner city, usually instigated by a particular event, like an overly aggressive arrest, as in Watts. At the same time, they weren't fully political in that they weren't goal-oriented; they were an expression of rage, but little else, and there was certainly no organization.

An entire cottage industry of "riotology" -- a mix of criminal justice, political science, and sociology -- emerged in the 1960s to explain precisely what was happening in the inner city (partly imported from Europe, where mass violence had long been a focus of study). The consensus view, and one I agree with, is that many inner cities in the mid- to late 1960s were never wholly peaceful, and that riots were simply spikes in an always-on-edge, boiling ghetto rage. Low-level violence against local merchants and symbols of authority was common, and short-lived mass violence was frequent throughout the country, even if it went below the general public's radar.

What was necessary to set off a full-fledged riot, though, was a precise mixture of factors -- a large crowd, a small police presence, and a series of precipitating events. (In Washington a few days before King died, there had been a near-riot at a drug store after the manager accused a black teen of stealing.) At that point, all that was needed was a spark -- what sociologists call a "Schelling incident," in which a symbolic violation of the law goes unanswered, demonstrating that, suddenly, there was everything to gain (loot, catharsis) with little risk of punishment for mass violence.

Q: The riots appear to have been catalytic in creating a deep shift in American politics -- the collapse of the New Deal coalition, the emergence of "silent majority" conservatism. Is that a fair assessment?

A: I'm not sure I'd go so far as to say that the riots alone caused the deep shift in American politics, though they were certainly an important part of the story. What I argue in the book is that riots, particularly the widespread riots of April 1968, helped make manifest a deep and widely felt sentiment among America's increasingly suburban white middle class -- namely, that liberalism had stopped working for them and had instead started working for economic and racial groups inimical to their newly established social order.

In simpler words, whites had managed to escape the city and establish a politically moderate, racially exclusive utopia; from there, they could look back on the cities and, seeing violence amidst such extensive and expensive government support, decide that liberalism had failed the country. The extremes of inner-city life -- the crime, the radicalism, the riots -- helped unify and "desublimate" anti-urban conservatism along racial lines, which in turn set the stage for the anti-civil rights backlashes of the 1970s, from suburban secessions to anti-busing campaigns.

This long-wave backlash fed the candidacies of every president from Richard Nixon to George W. Bush, including Jimmy Carter and Bill Clinton, both of whom expertly harnessed anti-big-government sentiment to liberal, or at least Democratic, ends. The question surrounding Barack Obama's liberal ascendancy is whether it is fueled by anything more than a visceral reaction against Bushism -- whether it does, in fact, signal an end to 40 years of anti-urban, anti-liberal politics.

Q: A wave of food riots erupted in several countries last year. Not long ago Robert Wade of the London School of Economics suggested that more international civil unrest is likely in the months ahead, starting perhaps this spring: "It will be caused by the rise of general awareness throughout Europe, America and Asia that hundreds of millions of people in rich and poor countries are experiencing rapidly falling consumption standards; that the crisis is getting worse not better; and that it has escaped the control of public authorities, national and international." The outpouring of misery in response to a political assassination is one thing; hunger and economic uncertainty are another. But there does seem to be something in common: "the awareness ... that the crisis is getting worse not better." If so, what's the lesson here?

A: I would caution against drawing too many parallels between the riots of then and the riots of today -- context is very important, and I simply don't know enough about today's food riots to comment. That said, on a certain level, a commodity riot is a commodity riot. People feel they are being deprived of a good they rightfully deserve, at least at a fair price. In 1960s-era inner cities, residents knew they were being charged extra for the same goods as white suburbanites; in today's developing countries, people can see the price of foodstuffs going through the roof.

As you say, the critical issue is the trend line. In the 1960s, inner-city residents expected, with good reason, that life should be getting better, and yet it was only getting worse for them. In developing countries today, several years of growth in the standard of living suddenly gave way to spiking food prices and commodity shortages. In both cases, the critical factor is a sense of loss of control after society has seemingly promised more of it. Is it any surprise that some people turn to violence in order to regain that control?

Author: Scott McLemee (scott.mclemee@insidehighered.com)

War in the Heavens and Here Below

The world, it is said, is made up of two kinds of people: those who divide the world into two kinds of people, and those who don’t. The joke is too old to be funny, yet it has a point, even so. The impulse to dichotomize is not universal – but close enough. Some of us tend to think that the ability to distinguish shades of gray is a mark of progress. But the digital alternative of black and white tends to reassert itself from time to time. Maybe our brains are wired for binary oppositions, after all?

The fascination of Michel Tardieu’s book Manichaeism, just published in an English translation by the University of Illinois Press, comes from watching the emergence and consolidation of the most emphatic possible variant of this tendency – from seeing it take a particular shape in a specific place and time.

The arrival of the prophet Mani (born in Persia in the year 216) falls almost exactly between the lives of Jesus and Mohammad. The religion he founded has died off. Until libraries of Manichean scriptures were unearthed over the past century or so, most of what we knew about the faith came via Christian and Muslim polemicists. But Mani's vision is another matter. Manichaeism regards the world as a battlefield occupied by the forces of light and darkness, good and evil, with combat headed fast toward a final reckoning. This outlook is alive and well along the border between Pakistan and Afghanistan -- not to mention certain holdouts in Thinktankistan, a province of Washington, D.C.

No such topical points are scored by Tardieu, who lectures on the religious syncretisms of late antiquity at the Collège de France. His book first appeared in 1981 -- then in a revised edition in 1997 -- as part of the “Que Sais-Je?” series of popular guides to scholarship. (Its format is somewhat reminiscent of Oxford University Press’s Very Short Introductions.) The approach here is, for the most part, strictly positivist – or, to put it another way, a bit dry, though that soon proves an advantage. For the history and doctrines of Manichaeism are more than imaginative enough in themselves. If the prophet Mani had not existed, I suspect Jorge Luis Borges would have needed to invent him.

His father Patteg, it seems, was a regular worshipper attending a house of idols in a city in what is now Iran. Or rather, he was one, until he heard a voice that commanded him to abstain from meat, wine, and sex. This went on for three days and made a big impression, as well it might. Patteg abandoned his pagan ways and joined a sect that combined elements of Christianity with its own rather more stringent gloss on Jewish dietary laws. Meat of any kind was forbidden, for example, and the faithful would baptize vegetables before eating them.

Mani was presumably conceived before Patteg's ascetic commandments took full effect. He grew up in the faith, but had his own set of visions when he was 12 years old -- the same age Jesus was when his parents found him arguing fine points of scripture with the elders at the temple. Tradition also has it that the prophet's mother was named Maryam. (You can see where this kind of thing would annoy Christian heresiologists.)

In any case, when Mani proclaimed his own revelations in his early 20s, he challenged the idea that blessing your food while washing it made it pure. What came out of your backside was the same as if you had eaten something the law proclaimed unclean. As he continued to preach and draw followers, Mani made it clear that he recognized and respected the authority of three other prophets – Jesus, Zoroaster, and the Buddha. His own role was to complete their work. He would synthesize what they had revealed, and fulfill what they had left undone. Mani was “the seal of the prophets.”

It would be a mistake to think this amounted to some New Age, come-as-you-are brand of spirituality. Nor did his satiric jibes at food baptism mean that followers should eat just whatever they wanted. The revelations of Mani supplanted previous doctrines, and imposed a severe discipline on believers. The struggle for purity involved a lot more than washing your vegetables.

The demands on the Manichean faithful make the life of a Puritan seem like that of a libertine. Bathing was forbidden, for example, since it would be (1) an act of violence against the water and (2) a source of sensual pleasure. The clergy had to take vows of extreme poverty. Its members were supposed to eat only certain vegetables, and not many of them. But even that was forbidden during the periods of fasting, which were regular and frequent.

At an annual festival, lay believers presented a banquet of really good fruit to "the elect." By that point, the religious leaders were famished, but sufficiently pure for the task of harvesting the “particles of light” contained in their food. The particles had been scattered throughout the universe during the struggles between two eternal principles known as the Father of Greatness and the King of Darkness -- the forces of good and evil.

Mani explained that there had already been two cosmic battles between them. The conflict had generated a number of lesser gods and devils. Some of the demons had created Adam and Eve -- with Eve being particularly susceptible to their influence. Procreation was a diabolical means for further scattering the “particles of light” in the world. Funny how often these cosmic dualisms have a streak of misogyny in them, somewhere.

But happily Adam was approached by one of the three versions of Jesus. (Seriously, don’t ask.) And so mankind now has a role to play in the third war between Light and Darkness -- the final apocalyptic showdown between good and evil. The role of the Manichean religion was to help bolster the divine forces. Augustine of Hippo, who converted to Christianity after a period as one of the Manichean laity, is quite sarcastic about this in his Confessions: “To those who were called ‘elect’ and ‘holy,’ we brought foods, out of which, in the workshop of their stomachs, they were to make us angels and gods, by whom we might be liberated.”

Plenty here for outsiders to ridicule, then. But the conviction that they were troops in a cosmic battle gave believers a certain esprit de corps that was hard to break. The faith also had a streak of self-conscious universalism that encouraged proselytizing. Mani himself went to India and converted some Buddhists to his revelation. As late as the 13th century, Marco Polo encountered Manicheans in China. And severe asceticism can exercise a fascination even on people who reject the doctrines behind it. Christianity and Islam did not so much wipe out Mani’s faith as, so to speak, absorb certain particles lodged within it.

In any case, Mani himself was clearly some kind of genius. Jesus and the Buddha left it to disciples to record their teachings. By contrast, Mani composed his own scriptures and even perfected an alphabet to make it a better medium for recording speech. He illustrated his complex history of warfare among superhuman forces with paintings that were remembered long after they were lost. “In the culture of Islamic Iran,” writes Tardieu, Mani’s name “has come to symbolize beauty of the most refined kind.” (Although Tardieu does not venture this point, something about Mani's visions, with their bizarrely intricate mythology, calls to mind Blake's prophetic books. The fact that both were lovingly illustrated suggests the parallel is not simply in the eye of the beholder.)

Mani took care to elaborate the rituals and organizational structure of his religion, instead of leaving this for later generations to suss out for themselves. It seems almost as if he’d read Max Weber on the routinization of charisma and put it into practice. He also tried to establish his faith as a new state religion by talking it up to various monarchs. The effort did not pay off. Indeed, it led to Mani’s execution at the age of sixty, from which he had the misfortune not to be resurrected.

One other circumstance may have been decisive in Manichaeism ending up as an also-ran among the world religions. Treating procreation as an instrument of the Evil One tends to be bad for a creed's long-term viability. Tardieu is much too sober a scholar to speculate, but I feel pretty sure it was a factor.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Encouraging Political Incorrectness and Civility

Editor's Note: This spring Vanderbilt University Press is releasing American Conservatism: Thinking It, Teaching It, by Paul Lyons, who died in January. In the book, Lyons features writings from a teaching log he kept for a course on conservatism that he taught at Richard Stockton College of New Jersey. The material from the log appears below in italics, and his additional commentary is in regular text.

February 15

Most of academic life is a blessing; sometimes I’m amazed that I get paid for doing this, doing what I love. When class discussion turns to work, I always ask my students if they or people they know would stay at their jobs if they won the New Jersey lottery big time. Almost all say that they’d quit. This is a useful marker for defining alienation, doing what is alien to you. And, of course, it is paralleled by students staying in school for reasons that are alien to their desires. Similarly, all academics hate wasted time with self-important administrators, having to deal with petty and occasionally vicious colleagues (the academy is more vicious than high finance precisely because so little is at stake), paperwork and more paperwork. For most of us it is a relief to walk back into the classroom.

In this particular classroom, I found myself offering a brief biography of William F. Buckley Jr. I was well prepared, having reread John Judis’s definitive study. So I walked them through his family life, his early “bad boy” years at and after Yale, his most influential books, his role in the founding of National Review. Then, with maybe 10 minutes remaining, I read to them passages I had marked in Judis’s biography that pointed to Buckley’s worst moments of narrow-mindedness, comments he made in the 1950s and early 1960s about civil rights in America and independence movements in Africa. The statements, sometimes flippant in that Buckley “squire of the manor” style, were at best patronizing, at worst deeply racist, particularly one statement in which he suggests that Africans will be ready for self-determination when they stop eating one another. I wanted my class to come to grips with the burden conservatives carried in that period, being on the wrong side of history, still holding onto a kind of British arrogance about “wogs” — Colonel Blimp, if you will. But one of my most conservative students, Dick, jumped in with support for Buckley’s worst comment, responding with a smirk, with a knowing look about “those people,” those Africans.

If there is such a thing as a teaching moment, this was it. I stopped him and asked the class if it would be different if there were African American students in this class. They quickly saw my point, but one responded, “They’d beat the shit out of Dick.” I countered by suggesting that it shouldn’t be the obligation of black students to call Dick on his statement, but the obligation of whites to do so. There were a few quizzical looks as I explained the unfairness of obliging blacks — or Jews or women or gays or Catholics — to defend themselves from inappropriate assault.

I was thinking on my feet, mostly trying to figure out how to chastise Dick without putting him too much on the spot, how to signal what’s OK and not OK in my classroom without stifling legitimate commentary, how to, in effect, be politically correct without being stuffy, hypocritical, humorless, or unwilling to engage on controversial issues. I have examined some of the literature that addresses the plight of so many African nations — the kleptocracies, the genocides, the ethnic wars, the waste of resources. I have rooted for the best of African leaders, anticipated that the resource-rich nations of Nigeria, South Africa, and Congo would have to be the linchpins of development. And I have thought a great deal about the reasons why East Asia, and now all of Asia, is achieving rapid economic growth — with all the caveats about inequalities, environmental dangers, corruption, dictatorship — while Africa stagnates. Sometimes I think that it must be that Asian cultures and Asian imperial history, especially in China and India, sustained an identity that now provides the cultural capital for an Asian version of the work ethic. Africa seemingly has struggled more with the very creation of nation-states. When I consider Latin America and then the Islamic Middle East, I am more confused, in my relative ignorance of their respective histories.

I am sometimes taken aback by what we do not teach our students. Aside from the above-noted gaps in what we can reduce to the “great books,” there are other appalling shortfalls, at least in many public institutions: the shortage of courses in what are probably the most salient developments of our times, the reemergence of China and India as players on the world stage and the increasing importance of Asia, where almost two-thirds of the world’s population lives; and the minimal attention paid to world religions — my students are not only unable to demonstrate any accurate knowledge of Buddhism, Hinduism, or Islam, but they are also remarkably ignorant about their own religious backgrounds. Few can tell me what a Christian is, at least if I ask for comment on Catholics, Protestants, and Orthodox. Fewer can distinguish Presbyterians from Episcopalians or define evangelicals or fundamentalists, not to speak of Pentecostals. More heartening is that most of my students are motivated to learn about organized religions; our K–12 schools, afraid of offending almost anyone, do not teach them about the history of the very Judeo-Christian tradition they abstractly celebrate.

But I do know that leftists and well-meaning liberals too often respond to questions of African horror with the same old saw — it’s colonial and neocolonial factors. True, but not enough to explain why Taiwan and South Korea and China have moved forward. And it just plays into conservative stereotypes that the Left always blames the West and the United States and never holds people of color, here or elsewhere, accountable. It is the macro version of what I will simplify as the attacks on Daniel Patrick Moynihan’s study of the African American family. So I tried to make sure that in chastising Dick and indicating acceptable boundaries of discourse, I was simultaneously, and as strongly, modeling that raising questions about African nations is legitimate. How could I not, given my own point of view? Whether I was successful remains to be seen. But it was, I think, a useful beginning of a discussion I assumed we would engage in when we got to George Wallace and the white backlash of the 1960s. I am debating whether to post a question on this issue on Web Board this week or to wait until we have more meat and potatoes on the plate, such that we can do more than discuss without context or information. But I must admit that I left class pumped with the anticipation of that set of discussions and, hopefully I’m right, with some confidence that we started it well.

I don’t think we as academics and teachers do a very good job teaching about race and racism. Some of it seems to be liberal guilt. Mostly it rests on a lack of confidence that one can present complicated situations and nuanced realities without risking being misinterpreted by colleagues and students.

Several years ago, at a panel on racism, I suggested that we begin by seeing if we could agree on four axioms, the first being that there is more racism in America than most white people are willing to admit. No controversy there. The second was that there has been considerable progress over the past 40 years based on the civil rights revolution of the 1960s. More curious looks, but no hostility. Then the third axiom: that there were some African Americans who see racism when it doesn’t exist. At that point, the room became more agitated, with some furrowed brows and raised eyebrows. The fourth axiom brought down the house: that given the above three axioms, it was presently more difficult to assess allegations of racism. Indeed, I added, there were now so many divergent voices within the African American community — a partial measure of the successes noted above — that no one could any longer claim to represent “the black voice.”

The panelist following me denounced my position, arguing that racism was as bad or worse than 40 years ago, merely more hidden. Then the panel opened for questions from the audience. A black female undergraduate asked me how I would respond if she believed that I had said something racist in class and she came to complain to me. I told her that I would take her allegation very seriously, consider whether I thought it was valid, and give her my most honest response. She was dissatisfied, indeed offended by my response, as were many on the panel and in the audience. The student asked me why I wouldn’t accept the validity of her allegation. I told her that I thought it would be harmful to her or any other student to allow an automatic acceptance of any allegation, that it risked corrupting her or anyone else in that it would allow for false charges to go unchallenged. I ended by suggesting that true respect included disagreement. I added that if not satisfied, a student always had the remedy of taking the allegation to my superiors.

The room erupted with anger at me, with one white colleague screaming at me that I was patronizing the student. I was disappointed and depressed by this display of what seemed to me to be wrong-headed, racially retrograde, and demagogic. I need to add that I was not angry at the student who raised the issue; she seemed honest and forthcoming, even in disagreement.

Most interesting is that over the next weeks several of my African American students asked me what had happened — there obviously had been a buzz in the hallways. This led to some fruitful conversation about how one determines the existence of racism. I also received several notes from white colleagues expressing admiration for what I had said but confessing that they were too cowardly to do the same. This depressed me even more than the hostile responses. Had we come to this — faculty, even tenured ones, afraid to speak their minds in fear of being charged with racism? Indeed, we had. One junior faculty member told me that he never goes near certain hot-button issues like affirmative action or underclass behavior because of his fear that it might put his job at risk.

As teachers we struggle with students who hold back from authentically discussing issues of prejudice, who go silent or simply echo agreement. It is hard work to achieve honest discussions; all students enter with bruises. One must establish a trusting environment for such discussions to be fruitful. Trust does not exist at the beginning of a class. I tell students that the handshake is an apt metaphor for our relations — I hold your hand, you hold mine — we trust one another but I also prevent you from hitting me in case that is your hidden desire. We trust and mistrust simultaneously. And then we can begin to have an honest dialogue.

I begin with a modest sense of how much influence I have with my students, especially when it comes to changes in their essential behavior regarding issues of social justice. Teachers are fortunate if we increase at the margin the number of those who are willing to stand up for others. But human behavior being what it is, we remain burdened with the knowledge of how difficult it is to educate individuals to identify with all of the “others,” to construct a global identity focused on human rights. Sigmund Freud, given the trauma of World War I, asserted not only that reason and enlightenment were fragile, but also that there was something in the existence of human intelligence which never allowed the darkness to be all-engulfing, and that this inextinguishable light of humane thought had a surprising persistence. Our goal as educators is to widen that ray of light, to assist a few more ordinary men and women to resist extraordinary evil and to stretch toward extraordinary good.

My own view is that the optimal way to help students respond to moral challenges is to help them understand the contradictory strands of heroism and knavery, the victimized and the victimizing, of many of our peoples. And we as educators need to understand and communicate the contextual nature of human behavior, its range and subtleties, and the contradictory ways that humans respond to moral challenges. As such, we teach humility before the wonder — the heroism, the cowardice, the insensitivities, the villainies — of our own natures, our own histories.

This might be called the double helix of all peoples, the intertwining of their burdens and their inspirations, their hidden shames and forgotten accomplishments, the recognition of which makes it more likely that they will be able to recognize the same complexity in others.

All of this has to begin with the obvious: that I am a white guy teaching about race and racism. No matter how you slice it, it makes a difference. It does help that I was born and bred in Newark and have some “cred” with my city kids (keep in mind that many of my African American students are middle class and suburban). I work very hard to break down the obvious stereotypes, including those blacks have of non-blacks. I want all of my students to recognize that each of us is simultaneously a member of an ethnic/racial/religious group, a human being, and a very distinct and unique individual. When we address social class and poverty, I want my students to understand the need to disaggregate poverty, to note three kinds of the poor: the temporary poor, the working poor, and the underclass poor. The first two groups share all of the values and behaviors of Americans, for example, the work ethic. They suffer from short-term crises, such as a husband and father splitting and not providing sufficient support, a worker facing a health problem without insurance, or people suffering from poor educations that limit their income potential to close to minimum wage, holding jobs with no benefits.

It’s only the latter category, sometimes linked to a “culture of poverty,” certainly no more than one-fourth of the poor, who exhibit the self-destructive behaviors — substance abuse, bad work habits, impulse-control problems, criminal activities, abuse of women and children — that fall outside of societal norms. Most of my students of color have no difficulty in affirming that such behaviors exist; indeed, they often go farther than I am willing to go in ascribing such behaviors to the black poor. I rely a great deal on the work of William Julius Wilson, the extraordinary black sociologist, in teaching about the links between class and race, between behavior and opportunity, and, especially, the need to address the most painful and least flattering aspects of black street life honestly and directly.

I tell all of my students to go beyond the snapshot to the motion picture. That guy drinking from a bottle in a paper bag in front of a bar — how did he get that way? I bring in the start of the motion picture, the differential chances of success already there in birthing rooms. How is it that I can stand in front of a room full of newborns and, based on race and social class, tell with a high degree of accuracy which babies will graduate from college, who will have a decent middle-class life, and who will end up in prison or dead before age 30? That is criminal to me. No baby chooses the well-being of its parents. But the odds are set very early. Now odds are not determinants; people beat the odds. But I remain angry and want my students to share that rage at the inherent injustices that await so many of our poor children.

Many of my African American — and increasingly, Latino — students are quite inspirational. Many, not most or all, come from difficult environments. Many have surmounted extraordinary barriers — broken families, crime-infested neighborhoods, drug experiences, lousy schools, early pregnancies and child-rearing, physical and sexual abuse — to make it to college. I hope that my pride in them, which includes pushing them to excel, prodding them to resist racial and often gender stereotypes, comes through in the classroom. I want that young woman who was offended by my comments at the panel discussion to hang in there, continue challenging me, but I also want more time to try to persuade her that there is respect in disagreement, that she will be best served by being taken seriously.

Author: Paul Lyons

The late Paul Lyons taught American history and social policy at Richard Stockton College of New Jersey. This essay is an excerpt from American Conservatism: Thinking It, Teaching It, and appears here with permission of the publisher, Vanderbilt University Press.

The Monster at Our Door

Laid low with illness -- while work piles up, undone and unrelenting -- you think, “I really couldn’t have picked a worse time to get sick.”

It’s a common enough expression to pass without anyone ever drawing out the implied question: Just when would you schedule your symptoms? Probably not during a vacation....

It’s not like there is ever a good occasion. But arguably the past few days have been the worst time ever to get the flu. Catching up with a friend by phone on Saturday, I learned that he had just spent several days in gastrointestinal hell. The question came up -- half in jest, half in dread -- of whether he’d contracted the swine variety.

Asking this was tempting fate. Within 24 hours, I started coughing and aching and in general feeling, as someone put it on "Deadwood," “pounded flatter than hammered shit.” This is not a good state of mind in which to pay attention to the news. It is not reassuring to know that swine flu symptoms are far more severe than those of the garden-variety bug. You try to imagine your condition getting exponentially worse, and affecting everyone around you -- and everyone around them....

So no, you really couldn’t pick a worse time to get sick than right now. On the other hand, this is a pretty fitting moment for healthy readers to track down The Monster at Our Door: The Global Threat of Avian Flu, by Mike Davis, a professor of history at the University of California at Irvine. It was published four years ago by The New Press, in the wake of Severe Acute Respiratory Syndrome (SARS), which spread to dozens of countries from China in late ’02 and early ’03.

The disease now threatening to become a pandemic is different. For one thing, it is less virulent -- so far, anyway. And its proximate source was pigs rather than birds.

But Davis’s account of “antigenic drift” -- the mechanism by which flu viruses constantly reshuffle their composition -- applies just as well to the latest developments. A leap across the species barrier results from an incessant and aleatory process of absorbing genetic material from host organisms and reconfiguring it to avoid the host’s defense systems. The current outbreak involves a stew of avian, porcine, and human strains. “Contemporary influenza,” writes Davis, “like a postmodern novel, has no single narrative, but rather disparate storylines racing one another to dictate a bloody conclusion.”

Until about a dozen years ago, the flu virus circulating among pigs “exhibited extraordinary genetic stability,” writes Davis. But in 1997, some hogs on a “megafarm” in North Carolina came down with a form of human flu. The virus began rejiggering itself with genetic material from avian forms of the flu, then spread very rapidly across the whole continent.

Vaccines were created for breeding sows, but that has not kept new strains of the virus from emerging. “What seems to be happening instead,” wrote Davis a few years ago, “is that influenza vaccinations -- like the notorious antibiotics given to steers -- are probably selecting for resistant new viral types. In the absence of any official surveillance system for swine flu, a dangerous reassortant could emerge with little warning.” An expert on infectious diseases quoted by CNN recently noted that avian influenza never quite made the leap to being readily transmitted between human beings: "Swine flu is already a man-to-man disease, which makes it much more difficult to manage, and swine flu appears much more infectious than SARS."

There is more to that plot, however, than perverse viral creativity. Davis shows how extreme poverty and the need for protein in the Third World combine to form an ideal incubator for a global pandemic. In underdeveloped countries, there is a growing market for chicken and pork. The size of flocks and herds grows to meet the demand -- while malnutrition and slum conditions leave people more susceptible to infection.

Writing halfway through the Bush administration, Davis stressed that the public-health infrastructure had been collapsing even as money poured into preparations to deal with the bioterrorism capabilities of Iraq’s nonexistent weapons of mass destruction. The ability to cope with a pandemic was compromised: “Except for those lucky few -- mainly doctors and soldiers -- who might receive prophylactic treatment with Tamiflu, the Bush administration had left most Americans as vulnerable to the onslaught of a new flu pandemic as their grandparents or great-grandparents had been in 1918.”

The World Health Organization began stockpiling Tamiflu in 2006, with half of its reserve of five million doses now stored in the United States, according to a recent New York Times article. The report stressed that swine flu is driving up the value of the manufacturer’s stocks -- in case you wondered where the next bubble would be.

But don't expect to see comparable growth in the development of vaccines. As Davis wrote four years ago, “Worldwide sales for all vaccines produced less revenue than Pfizer’s income from a single anticholesterol medication. ... The giants prefer to invest in marketing rather than research, in rebranded old products rather than new ones, and in treatment rather than prevention; in fact, they currently spend 27 percent of their revenue on marketing and only 11 percent on research.”

The spread of SARS was contained six years ago -- a good thing, of course, but also a boon to the spirit of public complacency, which seems as tireless as the flu virus in finding ways to reassert itself.

And to be candid, I am not immune. A friend urged me to read The Monster at Our Door not long after it appeared. It sat on the shelf until a few days ago.

Now the book seems less topical than prophetic -- particularly when Davis draws out the social consequences of his argument about the threat of worldwide pandemics. If the market can’t be trusted to develop vaccines and affordable medications, he writes, “then governments and non-profits should take responsibility for their manufacture and distribution. The survival of the poor must at all times be accounted a higher priority than the profits of Big Pharma. Likewise, the creation of a truly global public-health infrastructure has become a project of literally life-or-death urgency for the rich countries as well as the poor.”

There is an alternative to this scenario, of course. The word "disaster" barely covers it.

MORE: Mike Davis discusses the swine flu outbreak in an article for The Guardian. He also appeared recently on the radio program Beneath the Surface, hosted by Suzi Weissman, professor of politics at St. Mary's College of California, available as a podcast here.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Toward a 21st Century Renaissance -- in My Day

I.

Given this chilly climate for administrators -- salary freeze, hiring freeze -- I turn for relief to that dusty ghost town in my mind’s geography, the one labeled Intellect. This turn has been further encouraged by the publication in recent months of an article on the influence, or lack thereof, of a book I wrote 20 years ago on the relations between American and British writers in the 19th century, titled Atlantic Double-Cross. The book tried to explain why the writers of each country hated each other’s guts and how this animosity informs the great literary works of the period. In it, I argued for a new subdiscipline of comparative literature that would take up the Anglo-American relationship. The book pretty much flopped, in my view, and so I was delighted to read even a measured discussion of its effect on my discipline — delighted, that is, until I arrived at a paragraph beginning, “In Weisbuch’s day. ...”

At first I was tempted to call the gifted, clearly youthful Columbia professor who wrote this sentence and say, “Listen, it may be late afternoon; it may even be early evening. But it is still my day.”

More to the point, the phrase made me realize that I am pretty old, and that made me think — I guess I am supposed to speak like a codger now and say instead, “that got me to thinking...” — about the changes in academe in my lifetime. I thought about the move of psychology, for instance, away from the humanities, through the social sciences, over to the sciences, a journey by which Freud went from being a point of reference to a butt of ridicule.

I considered the tendency of economics to forgo fundamental questions in favor of refining an accepted model. I noted as well a decline in the influence of the humanities, whence most university presidents arose in the 1930s, say, and an ascendancy of the sciences, and in particular genetic science, the field from which an increasing number of our institutional leaders now emerge.

But going through these admittedly contentious thoughts, I saw something more substantial, which was that my thinking was taking place via the disciplines — and to that I added the realization that my poor book of so long ago had presented itself as an attempt to create a subdiscipline. I have just recently reread Douglas Bennett’s very perceptive quick history of the liberal arts in the 20th century, where he notes that the organization of colleges by these disciplines, which we now take so much for granted, was in fact a fast and dramatic occurrence between about 1890 and 1910. Today, it seems, we really care more about them than we do about the whole self or whatever the liberal arts ideal is.

So I got angry at the disciplines, and there is reason for that. It gets difficult to understand, especially at the graduate level, why a doctorate in literature and a doctorate in physics exist on the same campus when it seems they might as well be pursued on different planets. During a year when I served as graduate dean at the University of Michigan, I attended a physics lecture and was seated next to the chair of the comparative literature program. As the scientist went on, my neighbor whispered to me incredulously, “This guy thinks the world is real.” That takes C.P. Snow’s two-worlds problem to a new and desperate place.

Or again, I invited the scientist from NYU who had successfully submitted an article of post-structuralist nonsense to a journal of literary theory, and had his hoax accepted, to speak at Michigan, with a panel of three scientists and three humanists responding. As the large crowd left the room, the conversations were remarkable. The scientists in the audience, to a person, found the critique of the pretension of literary theory wholly persuasive. The humanists, to a person, felt that their colleagues had successfully parried the attack, no question about it, by reminding the physical and life scientists that their language could be pretty thick to an outsider too and that the very history of science could be seen as the overturning of accepted truths later revealed as unintended hoaxes.

And so, enraged at the disciplines, I tried to imagine what it would be like to have a university, a world, a mind that did not rely on the disciplines — and failed.

And my next move is to say, perhaps this is fine. If general education is tantamount to a mild benevolence toward humanity, involvement in a discipline is like falling passionately in love with a particular person. We need both. It is okay to be captured by an ecstatic interest. But we also know the danger of early love. In the words of Gordon MacRae or somebody, “I say that falling in love is wonderful.” And indeed it is arguable, at least, that we do not induct students into a love of the life of the mind by abstractions but by finding the single discipline that fixes their fascination.

Even so, we want that fascination to be versatile, to be capable, that is, of moving from one arena of thought to another, or at least of understanding why someone else would care passionately about something else. Every summer, I spend a week on an island in Lake Winnipesaukee. This is very odd for me, as my relation to nature is such that a friend once asked if I had suffered a traumatic experience in a forest or a park. I prefer my nature in iambic pentameters, and this family island, without electricity or plumbing, I have dubbed The Island Without Toilets. Still, it is restful, and each year we campers read and discuss a book or essay. One year it was Bill McKibben’s book The Age of Missing Information. In this tome, McKibben contrasts a day spent hiking to a modest mountaintop with a day spent watching a full 24 hours of each channel of a cable television system in Virginia. (The fact that there were only 90 channels in 1992 tells us that we are losing more information all the time.) The book is somewhat eco-snobby, but McKibben’s main contrast is really not between the natural world and its vastly inferior electronic similitude or replacement but between deep knowledge and sound bites.

He illustrates deep knowledge by an Adirondack farmer’s conversation concerning each and every variety of apple. There is so much to know, it turns out, about apples; indeed, there is so much to know about everything. As I wrote a few years ago, “Life may appear a deserted street. But we open one manhole cover to find a complex world of piano-tuning, another to discover a world of baseball, still others to discover intricate worlds of gemology and wine, physical laws and lyric poetry, of logic and even of television.” And I asked, “Do our schools and colleges and universities reliably thrill their charges with this sense of plenitude?”

They do not. And while I cannot even imagine a world without the disciplines — which are really the academic organization of each of these microcosms of wonder — I can imagine them contributing to an overall world flaming with interest. Falling in love is great and irreplaceable, but how about reimagining the campus as Big Love, Mormon polygamy for all sexes, or at least as a commune, where each of us is mated to a discipline but lives in close proximity with family-like others on a daily basis?

That is, I believe, what we are, however awkwardly, attempting by having the disciplines inhabit the same campus. However much general education has been swamped by disciplinary insistence, a remnant remains. Even academics tend to tell other people where they went to college, not so much what they majored in. We probably already possess the right mechanism for a 21st century renaissance. It just needs some adjustments.

I want to suggest two such adjustments. One concerns the relation of the arts and sciences to the world; the other readjusts the arts and sciences in relation to themselves and to professional education.

II.

When I was at the University of Michigan several years ago, something shocking took place. The sciences faculty, en masse, threatened to leave the college of liberal arts. “How could the sciences leave the arts and sciences any more than Jerry could leave Ben and Jerry’s?” I asked someone who had been present at these secession meetings. “The same way another Jerry could leave Dean Martin,” he replied. Somehow, to Michigan’s credit, the rebellion was quelled, but to me it is suggestive of the weakness of the liberal arts ideal at many of our institutions.

There are many signs of its frailty, beginning with the oft-cited statistic that more students at four-year colleges now major in leisure studies than in mathematics and the sciences. It is difficult to find a middle or high school where anyone speaks of the liberal arts, and, much as I have been worrying about the disciplines, they seem, aside from scattered efforts, to have been missing in action from much of the last 40 years of discussion of school reform. In speaking about the arts and sciences in relation to the world, I want to suggest, though, that the lording of the disciplines over general education and the absence of the excitement of the disciplines in the schools have everything to do with each other.

This near paradox can be illustrated best if I stay within my own neighborhood of the humanities for this aspect of the argument. Last month, filled with nostalgia, I agreed to serve on a panel for the National Humanities Alliance, which advocates to Congress for funding for these impoverished disciplines. My job was to provide one version of the standard speech on the public efficacy of English and history, religion and philosophy, and so on. I decided to fulfill this assignment rapidly and then to ask why, if we believed in the public efficacy of the humanities, we utterly ignored it in our mentoring of graduate students in these disciplines.

My argument for the humanities is exactly the same as my argument for the arts and sciences generally. As a young person, I never expected a major battle of my lifetime to be the renewal of dogmatic fundamentalism in opposition to free thinking. I find myself again and again referring to an episode of the television program "The West Wing" that aired shortly after 9/11. The president’s youthful assistant is speaking to a group of visiting schoolchildren and he says, “Do you really want to know how to fight terrorists? Do you know what they are really afraid of? Believe in more than one idea.”

This is not always as simple as the Taliban versus Shakespeare. There are subtle discouragements within our own society to the freedom to doubt and the freedom to change one’s mind. And there are elements within each of us that tend toward dogmatism and against the embracing of difference and a will to tolerate complexity. The campus, ideally, is a battleground for this freedom.

Against the many who would tyrannize over thought, we need to fight actively for our kind of education, which is far deeper than the usual political loyalties and divisions. God and George Washington are counting on us. And so are all those kids in East LA scarred by violence and poverty. In a nation of inequality and a world of sorrows, damn us if we neglect to advocate effectively for the only education that lifts up people.

Having said that, I asked why, paraphrasing Emerson, we do not turn our rituals and our rhetoric into reality. Over the last 40 years, the professoriate in the humanities has been a mostly silent witness to an atrocity, a huge waste of human resources. According to Maresi Nerad in the "Ten Years After" study, in a class of 20 English Ph.D.'s at whatever prestigious institution, three or four will end up with tenure-track positions at selective colleges or research universities. And yet this degree program, and all others in the humanities, pretend that all 20 are preparing for such a life. It’s a Ponzi scheme.

When I led the Woodrow Wilson Foundation, we began a Humanities at Work program, one aspect of which was to give little $2,000 scholarships to doctoral students who had found summer work beyond the academy. A cultural anthropologist at Texas worked at a school for delinquent girls who had been abused as children. She employed dance, folktales, autobiographical writings and a whole range of activities related to her expertise to improve these girls’ self-images. A history student at U.Va. created a freedom summer school for fifth graders in Mississippi, teaching them African American history. Meanwhile, we secured thirty positions at corporations and non-profits for doctoral graduates.

Our point was not to become an employment agency but to suggest that every sector of society, from government to K-12 to business, could benefit hugely by the transferable talents of people who think with complexity, write and speak with clarity, and teach with verve and expertise. We wanted such graduates to comprehend the full range of their own possibilities. Should they then decide to enter academia, at least they would perceive this as a free choice. And in the meantime, the liberal arts would populate every social sector as never before. I do not mean it ironically when I look to the liberal arts takeover of the world.

For that to take place at any level of education, I think, we need to marry intellectual hedonism to the responsibility of the intellectual. If we want our professoriate and our students to apply their learning -- and I do -- if we want them not simply to critique society but to constitute it, we must first acknowledge the simple joy of learning as a prime realistic moment. My dear friend Steve Kunkel is a leading investigator of the AIDS virus at Michigan. He is a fine fellow and I am certain that he would wish to reduce human suffering. But when I call Steve at 7 in the morning at his lab, because I know he will be there already, he is there less out of humanitarian zeal than because he is crazy about science, the rattiest of lab rats. Just so, when I unpack a poem’s meaning, I experience a compulsive enjoyment. This is half of the truth, and it leads someone like Stanley Fish to scorn the other half by writing a book with the title Save the World on Your Own Time.

I think we can devote some school time to saving the world without prescribing or proscribing the terms of its salvation. Louis Menand, surely no philistine, argues that we need to get over our fear of learning that may brush shoulders with the practical and more generously empower our students. Granted, and granted enthusiastically, academic enclosure, the distancing of a quiet mind from the harsh noise of immediacy, is a great joy, even a necessity in the growth of an individual. But when it becomes the end rather than the instrument, we approach social disaster. We must travel back and forth between the academic grove and the city of social urgencies.

This is to say, and beyond the humanities, that a certain precious isolation — is it a fear? — has kept the fruit of the disciplines within the academy, away even from our near neighbors in the schools. The absence of the disciplines from the public life and the bloating of the disciplines to squeeze out the liberal arts ideal in the colleges are part and parcel of the same phenomenon. It is not that the world rejected the liberal arts but that the liberal arts rejected the world.

In a brilliant article, Douglas Bennett provides a brief history of the 20th-century college in which he notes an increasingly exclusionary notion of the arts and sciences. And this seems to me part and parcel of the same dubious ethic that so distrusts the messiness of the social world. As I read that we arts and science adepts kept purifying ourselves — education is too messy, throw it out, along with the study of law, along with business, along with anything material (again, “That guy thinks the world is real”) — I am reminded of Walt Whitman’s critique of Matthew Arnold, whom he termed “one of the dudes of Western literature.” To Arnold, Whitman says, “the dirt is so dirty. But everything comes out of the dirt, everything; everything comes out of the people, the people as you find them and leave them: not university people, not F.F.V. people: people, people, just people.”

The liberal arts became pure and they became puerile. Having greatly expanded the old curriculum by addition and subdivision, they spent the rest of the century apologizing by limiting themselves. They expelled fascinating areas of human endeavor that then came to constitute professional education, and professional education proceeded to eat the libbies’ lunch.

Who or what can teach us to do what Menand urges — to empower not only our students but our academic disciplines? The answer, plain as can be, is the sciences. Is it any wonder, given the exclusionary bent of the liberal arts, that scientists, whose subject and whose instruments of investigation are often frankly material, might consider secession, especially when social influence, which is also to say funding, was getting thrown away along with whole areas of crucial consequence?

And by the same token, it is the sciences that can teach the humanities in particular how to reconnect. Indeed, a few moments ago, I was calling for the humanities equivalent of tech transfer; and that is half of my hope for a 21st century renaissance.

III.

By a renaissance in our time -- in Weisbuch’s day -- I do not mean the recovery of classical learning and its inclusion in a Christian worldview that marked the original. I want to invoke instead the extreme interdisciplinarity of that time, when the arts and sciences came so spectacularly into, if not unity, vital relationship, and when learning and worldliness ceased their contradiction. Here is what I mean. I do not in fact live on the campus of Drew University, but in a town 15 miles away, Montclair, New Jersey. Aside from the filming of some scenes featuring AJ Soprano down the street at our high school, the neighborhood was all too quiet when we moved in, with neighbors at most stiffly waving to one another from a distance. Then Tom and Janet and their three moppets moved in, along with Tom’s insane white limousine, the backyard hockey rink, the Halloween scare show, the whole circus. As Tom started offering the middle-school neighborhood kids “rock-star drop-offs” to school in his limo, everything changed. Some of our houses have large front porches, and neighbors began to congregate on summer evenings. Soon, whenever we lit the barby, a few families would turn up with their own dogs and steaks and ask if they could join in. There are about ten families now that assist each other in myriad ways, that laugh together and, when necessary, provide solace and support.

The university can become a porch society in relation to the disciplines. Indeed, for the last 40 years we have been experiencing a loosening of the boundaries, as the prefix “bio” gets attached to the other sciences; as environmental studies unites the life sciences, theology, the physical sciences, public policy, even literary criticism; as Henry Louis Gates, as historian, employs genetic research to revise and complicate the notion of racial heritage. And then there is the huge potential of democratizing knowledge and recombining it through the burst of modern technology, one of whose names, significantly, is the Web.

You cannot intend a zeitgeist, but you can capitalize upon one, and this is one. A few simple administrative helps occur to me as examples. We can invite more non-academics to join with us in our thinking about the curriculum. We can require our doctoral students to take some time learning a discipline truly distant from their own, rather than requiring the weak cognate or two, and we can take just a few hours to give them a sense of the educational landscape of their country. We can start meeting not with our own kind all the time but across institutional genres, and we can especially cross the divide into public education, not so much by teaching teachers how to teach as by sharing the rich ongoing controversies and discoveries of the living disciplines.

Less grandly, within our own institutions, we can pay a bonus to the most distinguished faculty member in each department who will teach the introductory course and a bigger bonus to those who will teach across disciplines, with the size of the bonus depending upon the perceived distance between the disciplines. We can stop attempting to formulate distribution requirements or core curricula via committees of 200, which is frankly hopeless in terms of conveying the excitement of the liberal arts, and instead let groups of five or ten do their inspired thing, spreading successes. We can create any number of rituals that encourage a porch society. As one new faculty member told me at a Woodrow Wilson conference years ago, “My graduate education prepared me to know one thing, to be, say, the world’s greatest expert on roller coasters. But now in my teaching position, I have to run the whole damn amusement park and I know nothing about the other rides, much less health and safety issues, employment practices, you name it.”

We might name this zeitgeist the whole damn amusement park, but I would suggest a naming in the form of a personification: Barack Obama. When I am fundraising, I often chant something of a mantra, and I ask you to forgive its sloganeering. The new knowledge breaks barriers. The new learning takes it to the streets. The new century is global. And the new America is multi-everything. There you go and here he is. Our fresh new president is indeed international, multi-racial, multi-religious, multi-ethnic, a liberal-arts major and law school grad who became a community organizer and breaks barriers with an ease that seems supernal. He was not required; like the courses we choose freely, he was elected.

Barack Obama was born on an island, and at the start of this essay I mentioned the site of my summer challenge, the Island Without Toilets. Our disciplines are islands. Our campuses are islands. And islands are wonderful and in fact essential as retreats for recuperation. But in the pastoral poems of an earlier Renaissance, the over-busy poet rediscovers his soul in a leafy seclusion but then returns, renewed and renewing, to the city. It is time for us to leave our islands. We are equipped.


Robert Weisbuch is president of Drew University. This essay is adapted from a talk he gave at the 2009 annual meeting of the American Educational Research Association.

Fifty Years After Stonewall

When the police conducted a routine raid on the Stonewall Inn, a bar in Greenwich Village, during the early hours of June 28, 1969, the drag queens did not go quietly. In grief at the death of Judy Garland one week earlier, and just plain tired of being harassed, they fought back -- hurling bricks, trashing cop cars, and in general proving that it is a really bad idea to mess with anybody brave enough to cross-dress in public.

Before you knew it, the Black Panther Party was extending solidarity to the Gay Liberation Front. And now, four decades later, an African-American president is being criticized -- even by some straight Republicans -- for his administration’s inadequate commitment to marriage rights for same-sex couples. Social change often moves in ways that are stranger than anyone can predict.

Today the abbreviation LGBT (covering lesbians, gays, bisexuals, and transgender people) is commonplace. Things only become esoteric when people start adding Q (questioning) and I (intersex). And the scholarship keeps deepening. Six years ago, after publishing a brief survey of historical research on gay and lesbian life, I felt reasonably well-informed (at least for a rather unadventurous heteroetcetera). But having just read Sherry Wolf's new book Sexuality and Socialism: History, Politics, and Theory of LGBT Liberation (Haymarket), I am still trying to process the information that there were sex-change operations in Soviet Russia during the 1920s. (The practice was abolished, of course, once Stalinism charted its straight and narrow path to misery.) Who knew? Who, indeed, could even have imagined?

Well, not me, anyway. But the approaching anniversary of Stonewall seemed like a good occasion to consider what the future of LGBT scholarship might bring. I wrote to some well-informed sources to ask:

“By the 50th anniversary of Stonewall, what do you think (or hope) might have changed in scholarship on LGBT issues? Please construe this as broadly as you wish. Is there an incipient trend now that will come to fruition over the next few years? Do you see the exhaustion of some topic, or approach, or set of familiar questions? Or is it a matter of a change in the degree of institutional acceptance or normalization of research?”

The responses were few, alas -- but substantial and provocative. Here, then, is a partial glimpse at what may yet be on the agenda for LGBT studies.

Claire Potter is a professor of history at Wesleyan University. In 2008, she received the Audre Lorde Prize for “Queer Hoover: Sex, Lies, and Political History,” an article appearing in Journal of the History of Sexuality.

One of the changes already underway in GLBTQ studies is, ironically, the destabilization of the liberation narrative that begins with Stonewall in 1969 and ends with the right to equal protection in Romer v. Evans (1996). Part of what we know from the great burst of energy that constitutes the field is that the Stonewall Riot we celebrate as the beginning of the liberation movement is not such a watershed, nor is the affirmation of equal protection the end of the story.

For example, I begin the second half of my queer history survey with Susan Stryker’s “Screaming Queens: The Riot at Compton’s Cafeteria,” documenting a similar San Francisco rebellion in 1966, three years before Stonewall; I end with Senator Larry Craig being arrested in a Minneapolis men’s room. GLBTQ liberation is unfinished, and it becomes more complex as research emerges that takes us beyond Stonewall. But I would also add a caveat: Where are the transnational and comparative histories that are on the cutting edge in other fields, like ethnic studies, cultural studies, anthropology, and women’s studies?

Just as significant, in my view, is that the greatest social stigma and official discrimination (not to mention inattention, both in queer courses and in the mainstream curriculum) are still aimed at the group we celebrate when we celebrate Stonewall: transgendered and transsexual people. This is an area where we need a lot of growth.

What I would like for transgender studies in 10 years is what is happening already in gay and lesbian history: placing the emergence of identities and the emergence of liberation struggles in a longer history that goes beyond the North American 20th century. Often senior scholars view trans history as “impossible” to write, a past without an archive other than interviews with the living. However, people said that about gay and lesbian history, African‑American women’s history and other new fields, and it always turned out not to be true.

Furthermore, I would argue that trans studies has a tenuous and often politically situational relationship to the GLB and Q of the field, and that needs to be addressed because the critical issues that are specific to trans studies are not being taken seriously in most curricula that claim to actually teach the field.

The final thing I would like to see by 2019 is graduate students writing dissertations in GLBTQ studies being honestly considered for regular old history jobs, rather than jobs in the history of gender: these young people are writing in legal history, urban history, the history of science, political history, medical history and whatnot -- and they are often only considered seriously for jobs in gender or women’s studies.

What pushes a field ahead is when young people can do important research, not be professionally stigmatized for it and know they can make a living as scholars.

Doug Ireland is a veteran political reporter covering both sides of the Atlantic. He is currently the U.S. correspondent and columnist for the French political-investigative weekly Bakchich, and international affairs editor for Gay City News, New York City's largest LGBT weekly newspaper.

Sad to say, much of what comes out of university gay studies programs these days is altogether too precious and artificial, written in an academic jargon that is indigestible to most LGBT people. Reclaiming our own history is still not getting enough attention from these programs (witness Larry Kramer's long and ultimately failed fight to have the Larry Kramer Initiative he and his brother endowed at Yale become more history-oriented and relevant).

The OutHistory website -- founded by the superb, pioneering gay historian and scholar Jonathan Ned Katz -- desperately needs more institutional financial support to continue and expand its important work of creating the world's largest online archive of LGBT historical materials. OutHistory's innovative program to simultaneously co-publish historian John D'Emilio's work on Chicago LGBT history in that city's gay newspaper, the Windy City Times -- a program it also hopes to expand -- should be a model for the way gay studies programs can become more relevant to the majority of queers outside the hothouse of academe and to the communities by which our universities are surrounded.

We need to know where we've been to know where we should be going, yet there is still a paucity of attention paid to the history and evolution of the modern gay movement -- to the death of gay liberation, with all its glorious rambunctiousness and radical emphasis on difference, and its replacement by what Jeffrey Escoffier has called the assimilationist "gay citizenship movement," which is staid, narrow-gauge in its fund-raising-driven focus (on gay marriage and the like), and inaccurate in the homogenized, white, nuclear-family-imitative portrait of who we are that the wealthiest entities in the institutionalized gay movement present and foster.

One of my greatest criticisms of today's institutionalized gay movement is its isolationism. Our largest national organizations shun the concept of international solidarity with queers being oppressed in other countries, claiming their "mission" is only a domestic one. This is in sharp contrast to European LGBT organizations, for which international solidarity is a universal duty and a priority.

Gay studies programs should be encouraging more scholarship on the 86 countries that, in 2009, still have penal laws against homosexuality on the books, and helping to give voice to the same-sexers and gender dissidents in those hostile environments who have difficulty publishing in their own countries, or where gay scholarship is banned altogether.

To cite just two examples: the ongoing organized murder campaign of "sexual cleansing" in Iraq, carried out by fundamentalist Shiite death squads with the collusion of the U.S.-allied government, is killing Iraqi queers every day, and the horrors of the Islamic Republic of Iran's violent reign of terror against Iranian LGBTs are driving an ever-increasing number of them to flee their homeland. Gay scholars have a role to play in helping these people reclaim their history and culture.

Why is it that the most sensitive, rigorous, and complete account of the sophisticated ways in which homosexuality has been woven into Persian culture for over 1,500 years has just been published by a non-gay historian, Janet Afary (Sexual Politics in Modern Iran, Cambridge University Press)? In the hands of Iranian queers, this book will become a weapon of liberation against the theocratic regime's campaign to erase that history and keep it from the Iranian people. University presses need to publish more work by queer writers from LGBT-oppressing countries (as MIT Press and Semiotext(e) have just done with Moroccan writer Abdellah Taia's fine autobiographical novel Salvation Army).

In many countries, homophobia and homophobic laws are part of the legacy of colonialism, and were imported from the West. But where is the gay scholar who has developed a serious critique of and rebuttal to the homophobic conspiracy theories of Columbia University's Joseph Massad, who has invented a "Gay International" he accuses of being a tool of Western imperialism (Massad provides a theoretical framework utilized by infantile leftist defenders of Teheran's theocratic regime for attacks on those, including Iranians, who expose the ayatollahs' inhumane persecutions of queers and sexual dissidents)?

One small, concrete and simple but powerful gesture of international solidarity would be for gay studies programs to sponsor book donation drives to make gay history and culture available to those many queers in oppressed countries who thirst for it as they construct their own identities and struggle for sexual freedom. I can tell you from my own reporting as a journalist that making such knowledge available can save lives.

Let's hope that it won't take 10 years to have less artificial, picky intellectual onanism of the obscure theoretical variety and more gay scholarship that's accessible and relevant to people's lived lives and struggles, in other countries as well as our own.

Marcia M. Gallo is an assistant professor of history at the University of Nevada, Las Vegas. In 2006, she won the Lambda Literary Award for her book Different Daughters: A History of the Daughters of Bilitis and the Rise of the Lesbian Rights Movement (Carroll & Graf).

In considering what I might wish to have changed by the 50th anniversary of Stonewall, a 25-year-old quote from Audre Lorde came to mind: “Somewhere on the edge of consciousness, there is what I call a ‘mythical norm,’ which each one of us knows ‘that is not me.’ In [A]merica this norm is usually defined as white, thin, male, young, heterosexual, Christian, and financially secure. It is within this mythical norm that the trappings of power lie within this society. Those of us who stand outside that power often identify one way in which we are different, and we assume that to be the primary cause of all oppression, forgetting other distortions around difference, some of which we ourselves may be practicing.”

By the time 2019 rolls around, we will need to have plumbed the depths of the “mythical norm” and revealed the “distortions around difference” that still separate the L from the G and the B, as well as the T, not to mention the Q and the I. In the next decade, I would hope that we deepen our understanding of, and mount effective challenges to, the seductiveness of normative values; question the conflation of equal rights with social justice; and celebrate the significance of queer inclusiveness, innovation, and radicalism.

Specifically, our scholarship must:

(1) acknowledge and analyze the continuing marginalization -- and strategies for resistance -- of many queer people, especially those who are poor, homeless, or currently or formerly incarcerated;

(2) restore the “L” -- meaning, give credence and visibility to the power of women’s experiences and leadership, still sorely lacking in our consciousness and in our publications;

(3) refocus on the importance of activism -- especially at local and regional levels, beyond the coasts -- to our research and writing.

Christopher Phelps, currently an associate professor of history at Ohio State University, will join the School of American and Canadian Studies at the University of Nottingham later this year as associate professor. In 2007 his paper “A Neglected Document on Socialism and Sex” appeared in Journal of the History of Sexuality.

I'd like to suggest that the interpretive problem of homosexuality and capitalism still cries out for exploration. John D'Emilio, David Halperin, and others have demonstrated that although same-sex desire extends back to the ancients, homosexuality is a modern phenomenon. As a sexual orientation or identity, homosexuality arose only with individual wage labor and the separation of household and work characteristic of capitalism.

A mystery remains, though: how did the very same mode of production that created the conditions for this new consciousness also produce intense compulsions for sexual repression? Why, if capitalism gave rise to homosexuality, are the ardent defenders of capitalism, whether McCarthyist or on our contemporary Republican right, so often obsessed with attacking same-sex desire? How does capitalism generate both the conditions for homosexuality and the impulse to suppress it?

This relates closely to the modalities by which homosexuality and homophobia are to be found in the same minds, from J. Edgar Hoover and Roy Cohn in the 1950s down to the Ted Haggards and Larry Craigs of the present day. I believe this goes beyond self-hatred. It speaks to a cultural ambivalence, one still present today. We live in a moment when capitalism is experiencing its deepest crisis in fifty years, even as the movement for gay acceptance seems to be advancing, if haltingly. The recent state approvals of gay marriage, for example, contrast markedly with Nazi Germany, where the economic crisis of the 1930s led to the scapegoating of gays, who were forced to wear pink triangles. How to explain this contrast? In what ways is capitalism liberatory, in what ways constrictive?

Conversely, we need a lot more conceptual thinking about homosexuality and the left, by which I mean specifically the strand of the left that opposes capitalism. Many of the signal breakthroughs in what is now called the gay civil rights movement were the result of thinkers and doers who came out of the anti-capitalist left, most famously the Mattachine Society and the Gay Liberation Front. This is also true of many lesser-known examples, such as the Marine Cooks and Stewards, a left-led union of the 1930s and 1940s that Allan Bérubé was researching before his death. (To topic-seeking graduate students out there, by the way, Bérubé's project deserves a new champion, and we badly lack a definitive study of the GLF.)

To make such breakthroughs, however, gay leftists often had to break with the parties and movements that taught them so much and enabled them to recognize their own oppression. The founders of the Mattachine were men forced out of the Communist Party, which saw homosexuality as reactionary decadence. The libertarian left, both anarchist and socialist, broke free of the impulse for respectability, but such libertarian and egalitarian radicals were on the margins of the styles of left-liberalism and Stalinism prevalent on the left at midcentury.

So this deepens the mystery, because it means that while capitalism generated homosexuality, it often takes radicals opposed to capitalism to push sexual liberation forward -- and yet sometimes they must do so against the instincts of the dominant left. We would really benefit from a deeper theoretical excavation of this set of problems.

Scott McLemee
scott.mclemee@insidehighered.com
