History

War in the Heavens and Here Below

The world, it is said, is made up of two kinds of people: those who divide the world into two kinds of people, and those who don’t. The joke is too old to be funny, yet it has a point, even so. The impulse to dichotomize is not universal – but close enough. Some of us tend to think that the ability to distinguish shades of gray is a mark of progress. But the digital alternative of black and white tends to reassert itself from time to time. Maybe our brains are wired for binary oppositions, after all?

The fascination of Michel Tardieu’s book Manichaeism, just published in an English translation by the University of Illinois Press, comes from watching the emergence and consolidation of the most emphatic possible variant of this tendency – from seeing it take a particular shape in a specific place and time.

The arrival of the prophet Mani (born in Persia in the year 216) falls almost exactly between the lives of Jesus and Mohammad. The religion he founded has died off. Until libraries of Manichean scriptures were unearthed over the past century or so, most of what we knew about the faith came via Christian and Muslim polemicists. But Mani's vision is another matter. Manichaeism regards the world as a battlefield occupied by the forces of light and darkness, good and evil, with combat headed fast towards a final reckoning. This outlook is alive and well along the border between Pakistan and Afghanistan -- not to mention certain holdouts in Thinktankistan, a province of Washington, D.C.

No such topical points are scored by Tardieu, who lectures on the religious syncretisms of late antiquity at the Collège de France. His book first appeared in 1981 -- then in a revised edition in 1997 -- as part of the “Que Sais-Je?” series of popular guides to scholarship. (Its format is somewhat reminiscent of Oxford University Press’s Very Short Introductions.) The approach here is, for the most part, strictly positivist – or, to put it another way, a bit dry, though that soon proves an advantage. For the history and doctrines of Manichaeism are more than imaginative enough in themselves. If the prophet Mani had not existed, I suspect Jorge Luis Borges would have needed to invent him.

His father Patteg, it seems, was a regular worshipper attending a house of idols in a city in what is now Iran. Or rather, he was one, until he heard a voice that commanded him to abstain from meat, wine, and sex. This went on for three days and made a big impression, as well it might. Patteg abandoned his pagan ways and joined a sect that combined elements of Christianity with its own rather more stringent gloss on Jewish dietary laws. Meat of any kind was forbidden, for example, and the faithful would baptize vegetables before eating them.

Mani was presumably conceived before Patteg's ascetic commandments took full effect. He grew up in the faith, but had his own set of visions when he was 12 years old -- the same age Jesus was when his parents found him arguing fine points of scripture with the elders at the temple. Tradition also has it that the prophet's mother was named Maryam. (You can see where this kind of thing would annoy Christian heresiologists.)

In any case, when Mani proclaimed his own revelations in his early 20s, he challenged the idea that blessing your food while washing it made it pure. What came out of your backside was the same as if you had eaten something the law proclaimed unclean. As he continued to preach and draw followers, Mani made it clear that he recognized and respected the authority of three other prophets – Jesus, Zoroaster, and the Buddha. His own role was to complete their work. He would synthesize what they had revealed, and fulfill what they had left undone. Mani was “the seal of the prophets.”

It would be a mistake to think this amounted to some New Age, come-as-you-are brand of spirituality. Nor did his satiric jibes at food baptism mean that followers should eat just whatever they wanted. The revelations of Mani supplanted previous doctrines, and imposed a severe discipline on believers. The struggle for purity involved a lot more than washing your vegetables.

The demands on the Manichean faithful make the life of a Puritan seem like that of a libertine. Bathing was forbidden, for example, since it would be (1) an act of violence against the water and (2) a source of sensual pleasure. The clergy had to take vows of extreme poverty. Its members were supposed to eat only certain vegetables, and not many of them. But even that was forbidden during the periods of fasting, which were regular and frequent.

At an annual festival, lay believers presented a banquet of really good fruit to "the elect." By that point, the religious leaders were famished, but sufficiently pure for the task of harvesting the “particles of light” contained in their food. The particles had been scattered throughout the universe during the struggles between two eternal principles known as the Father of Greatness and the King of Darkness -- the forces of good and evil.

Mani explained that there had already been two cosmic battles between them. The conflict had generated a number of lesser gods and devils. Some of the demons had created Adam and Eve -- with Eve being particularly susceptible to their influence. Procreation was a diabolical means for further scattering the “particles of light” in the world. Funny how often these cosmic dualisms have a streak of misogyny in them, somewhere.

But happily Adam was approached by one of the three versions of Jesus. (Seriously, don’t ask.) And so mankind now has a role to play in the third war between Light and Darkness -- the final apocalyptic showdown between good and evil. The role of the Manichean religion was to help bolster the divine forces. Augustine of Hippo, who converted to Christianity after a period as one of the Manichean laity, is quite sarcastic about this in his Confessions: “To those who were called ‘elect’ and ‘holy,’ we brought foods, out of which, in the workshop of their stomachs, they were to make us angels and gods, by whom we might be liberated.”

Plenty here for outsiders to ridicule, then. But the conviction that they were troops in a cosmic battle gave believers a certain esprit de corps that was hard to break. The faith also had a streak of self-conscious universalism that encouraged proselytizing. Mani himself went to India and converted some Buddhists to his revelation. As late as the 13th century, Marco Polo encountered Manicheans in China. And severe asceticism can exercise a fascination even on people who reject the doctrines behind it. Christianity and Islam did not so much wipe out Mani’s faith as, so to speak, absorb certain particles lodged within it.

In any case, Mani himself was clearly some kind of genius. Jesus and the Buddha left it to disciples to record their teachings. By contrast, Mani composed his own scriptures and even perfected an alphabet to make it a better medium for recording speech. He illustrated his complex history of warfare among superhuman forces with paintings that were remembered long after they were lost. “In the culture of Islamic Iran,” writes Tardieu, Mani’s name “has come to symbolize beauty of the most refined kind.” (Although Tardieu does not venture this point, something about Mani's visions, with their bizarrely intricate mythology, calls to mind Blake's prophetic books. The fact that both were lovingly illustrated suggests the parallel is not simply in the eye of the beholder.)

Mani took care to elaborate the rituals and organizational structure of his religion, instead of leaving this for later generations to suss out for themselves. It seems almost as if he’d read Max Weber on the routinization of charisma and put it into practice. He also tried to establish his faith as a new state religion by talking it up to various monarchs. The effort did not pay off. Indeed, it led to Mani’s execution at the age of sixty, from which he had the misfortune not to be resurrected.

One other circumstance may have been decisive in Manichaeism ending up as an also-ran among the world religions. Treating procreation as an instrument of the Evil One tends to be bad for a creed's long-term viability. Tardieu is much too sober a scholar to speculate, but I feel pretty sure it was a factor.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Encouraging Political Incorrectness and Civility

Editor's Note: Vanderbilt University Press is this spring releasing American Conservatism: Thinking It, Teaching It, by Paul Lyons, who died in January. In the book, Lyons features writings from a teaching log he kept from a course on conservatism that he taught at Richard Stockton College of New Jersey. The material from the log appears below in italics, and his additional commentary is in regular text.

February 15

Most of academic life is a blessing; sometimes I’m amazed that I get paid for doing this, doing what I love. When class discussion turns to work, I always ask my students if they or people they know would stay at their jobs if they won the New Jersey lottery big time. Almost all say that they’d quit. This is a useful marker for defining alienation, doing what is alien to you. And, of course, it is paralleled by students staying in school for reasons that are alien to their desires. Similarly, all academics hate wasted time with self-important administrators, having to deal with petty and occasionally vicious colleagues (the academy is more vicious than high finance precisely because so little is at stake), paperwork and more paperwork. For most of us it is a relief to walk back into the classroom.

In this particular classroom, I found myself offering a brief biography of William F. Buckley Jr. I was well prepared, having reread John Judis’s definitive study. So I walked them through his family life, his early “bad boy” years at and after Yale, his most influential books, his role in the founding of National Review. Then, with maybe 10 minutes remaining, I read to them marked-out sections in Judis’s biography that pointed to Buckley’s worst moments of narrow-mindedness, comments he made in the 1950s and early 1960s about civil rights in America and independence movements in Africa. The statements, sometimes flippant in that Buckley “squire of the manor” style, were at best patronizing, at worst, deeply racist, particularly one statement in which he suggests that Africans will be ready for self-determination when they stop eating one another. I wanted my class to come to grips with the burden conservatives carried in that period, being on the wrong side of history, still holding onto a kind of British arrogance about “wogs” — Colonel Blimp if you will. But one of my most conservative students, Dick, jumped in with support for Buckley’s worst comment, responding with a smirk, with a knowing look about “those people,” those Africans.

If there is such a thing as a teaching moment, this was it. I stopped him and asked the class if it would be different if there were African American students in this class. They quickly saw my point, but one responded, “They’d beat the shit out of Dick.” I countered by suggesting that it shouldn’t be the obligation of black students to call Dick on his statement, but the obligation of whites to do so. There were a few quizzical looks as I explained the unfairness of expecting blacks — or Jews or women or gays or Catholics — to be obliged to defend themselves from inappropriate assault.

I was thinking on my feet, mostly trying to figure out how to chastise Dick without putting him too much on the spot, how to signal what’s OK and not OK in my classroom without stifling legitimate commentary, how to, in effect, be politically correct without being stuffy, hypocritical, humorless, unwilling to engage on controversial issues. I have examined some of the literature that addresses the plight of so many African nations — the kleptocracies, the genocides, the ethnic wars, the waste of resources. I have rooted for the best of African leaders, anticipated that the resource-rich nations of Nigeria, South Africa, and Congo would have to be the linchpins of development. And I have thought a great deal about the reason why East Asia and now all of Asia is moving forward to rapid economic growth — with all the caveats about inequalities, environmental dangers, corruption, dictatorship — and Africa stagnates. Sometimes I think that it must be that Asian cultures, Asian imperial history, especially in China and India, sustained an identity that now provided the cultural capital for an Asian version of the work ethic. Africa seemingly has struggled more with the very creation of nation-states. When I consider Latin America and then the Islamic Middle East, I am more confused, in my relative ignorance of their respective histories.

I am sometimes taken aback by what we do not teach our students. Aside from the above-noted gaps in what we can reduce to the “great books,” there are other appalling shortfalls, at least in many public institutions: the shortage of courses in what are probably the most salient developments of our times, the reemergence of China and India as players on the world stage, the increasing importance of Asia where almost two-thirds of the world’s population lives; the minimal attention paid to world religions — my students are not only unable to demonstrate any accurate knowledge of Buddhism, Hinduism, or Islam, but they are also remarkably ignorant about their own religious backgrounds. Few can tell me what a Christian is, at least if I ask for comment on Catholics, Protestants, and Orthodox. Fewer can distinguish Presbyterians from Episcopalians, nor can they define evangelicals or fundamentalists, not to speak of Pentecostals. More heartening is that most of my students are motivated to learn about organized religions; our K–12 schools, afraid of offending almost anyone, do not teach them about the history of the very Judeo-Christian tradition they abstractly celebrate.

But I do know that leftists and well-meaning liberals too often respond to questions of African horror with the same old saw — its colonial and neocolonial factors. True but not enough to explain why Taiwan and South Korea and China have moved forward. And it just plays into conservative stereotypes that the Left always blames the West and the United States and never holds people of color, here or elsewhere, accountable. It is the macro version of what I will simplify as the attacks on Daniel Patrick Moynihan’s study of the African American family. So I tried to make sure that in chastising Dick and indicating acceptable boundaries of discourse, I was simultaneously, and as strongly, modeling that raising questions about African nations is legitimate. How could I not, given my own point of view? Whether I was successful remains to be seen. Time will tell. But it was, I think, a useful beginning of a discussion I assumed we would engage when we got to George Wallace and the white backlash of the 1960s. I am debating whether to post a question on this issue on Web Board this week or to wait until we have more meat and potatoes on the plate such that we can do more than discuss without context or information. But I must admit that I left class pumped with the anticipation of that set of discussions and, hopefully I’m right, with some confidence that we started it well.

I don’t think we as academics and teachers do a very good job teaching about race and racism. Some seems to be liberal guilt. Mostly it rests on the lack of confidence that one can present complicated situations, nuanced realities without risking being misinterpreted by colleagues and students.

Several years ago at a panel on racism I suggested that we begin by seeing if we could agree on four axioms, the first being that there is more racism in America than most white people were willing to admit. No controversies there. The second was that there has been considerable progress over the past 40 years based on the civil rights revolution of the 1960s. More curious looks but no hostility. Then the third axiom, that there were some African Americans who see racism when it doesn’t exist. At that point, the room became more agitated with some furrowed brows and raised eyebrows. The fourth axiom brought down the house: that given the above three axioms, it was presently more difficult to assess allegations of racism. Indeed, I added, there were now so many divergent voices within the African American community — a partial measure of the successes above noted — that no one could any more claim to represent “the black voice.”

The panelist following me denounced my position, arguing that racism was as bad or worse than 40 years ago, merely more hidden. Then the panel opened for questions from the audience. A black female undergraduate asked me how I would respond if she believed that I had said something racist in class and she came to complain to me. I told her that I would take her allegation very seriously, consider whether I thought it was valid, and give her my most honest response. She was dissatisfied, indeed offended by my response, as were many on the panel and in the audience. The student asked me why I wouldn’t accept the validity of her allegation. I told her that I thought it would be harmful to her or any other student to allow an automatic acceptance of any allegation, that it risked corrupting her or anyone else in that it would allow for false charges to go unchallenged. I ended by suggesting that true respect included disagreement. I added that if not satisfied, a student always had the remedy of taking the allegation to my superiors.

The room erupted with anger at me, with one white colleague screaming at me that I was patronizing the student. I was disappointed and depressed by this display of what seemed to me to be wrong-headed, racially retrograde, and demagogic. I need to add that I was not angry at the student who raised the issue; she seemed honest and forthcoming, even in disagreement.

Most interesting is that over the next weeks several of my African American students asked me what had happened — there obviously had been a buzz in the hallways. This led to some fruitful conversation about how one determines the existence of racism. I also received several notes from white colleagues expressing admiration for what I had said but confessing that they were too cowardly to do the same. This depressed me even more than the hostile responses. Had we come to this — faculty, even tenured ones, afraid to speak their minds in fear of being charged with racism? Indeed, we had. One junior faculty member told me that he never goes near certain hot-button issues like affirmative action or underclass behavior because of his fear that it might put his job at risk.

As teachers we struggle with students who hold back from authentically discussing issues of prejudice, who go silent or simply echo agreement. It is hard work to achieve honest discussions; all students enter with bruises. One must establish a trusting environment for such discussions to be fruitful. Trust does not exist at the beginning of a class. I tell students that the handshake is an apt metaphor for our relations — I hold your hand, you hold mine — we trust one another but I also prevent you from hitting me in case that is your hidden desire. We trust and mistrust simultaneously. And then we can begin to have an honest dialogue.

I begin with a modest sense of how much influence I have with my students, especially regarding changes in their essential behavior regarding issues of social justice. Teachers are fortunate if we increase at the margin those who are willing to stand up for others. But human behavior being what it is, we remain burdened with the knowledge of how difficult it is to educate individuals to identify with all of the “others,” to construct a global identity focused on human rights. Sigmund Freud, given the trauma of World War I, asserted not only that reason and enlightenment were fragile, but also that there was something in the existence of human intelligence which never allowed the darkness to be all-engulfing, and that this inextinguishable light of humane thought had a surprising persistence. Our goal as educators is to widen that ray of light, to assist a few more ordinary men and women to resist the extraordinarily evil and to stretch toward the extraordinarily good.

My own view is that the optimal way to help students respond to moral challenges is to help them understand the contradictory strands of heroism and knavery, the victimized and the victimizing, of many of our peoples. And we as educators need to understand and communicate the contextual nature of human behavior, its range and subtleties, and the contradictory ways that humans respond to moral challenges. As such, we teach humility before the wonder — the heroism, the cowardice, the insensitivities, the villainies — of our own natures, our own histories.

This might be called the double helix of all peoples, the intertwining of their burdens and their inspirations, their hidden shames and forgotten accomplishments, the recognition of which makes it more likely that they will be able to recognize the same complexity in others.

All of this has to begin with the obvious: that I am a white guy teaching about race and racism. No matter how you slice it, it makes a difference. It does help that I was born and bred in Newark and have some “cred” with my city kids (keep in mind that many of my African American students are middle class and suburban). I work very hard to break down the obvious stereotypes, including those blacks have of non-blacks. I want all of my students to recognize that each of us is simultaneously a member of an ethnic/racial/religious group, a human being, and a very distinct and unique individual. When we address social class and poverty, I want my students to understand the need to disaggregate poverty, to note three kinds of the poor: the temporary poor, the working poor, and the underclass poor. The first two groups share all of the values and behaviors of Americans, for example, the work ethic. They suffer from short-term crises, such as a husband and father splitting and not providing sufficient support, a worker facing a health problem without insurance, or people suffering from poor educations that limit their income potential to close to minimum wage, holding jobs with no benefits.

It’s only the latter category, sometimes linked to a “culture of poverty,” certainly no more than one-fourth of those poor, who exhibit the self-destructive behaviors — substance abuse, bad work habits, impulse control problems, criminal activities, abuse of women and children — that fall outside of societal norms. Most of my students of color have no difficulty in affirming that such behaviors exist; indeed, they often go farther than I am willing to go in ascribing such behaviors to the black poor. I rely a great deal on the work of William Julius Wilson, the extraordinary black sociologist, in teaching about the links between class and race, between behavior and opportunity and, especially, the need to address the most painful and least flattering aspects of black street life honestly and directly.

I tell all of my students to go beyond the snapshot to the motion picture. That guy drinking from a bottle in a paper bag in front of a bar — how did he get that way? I bring in the start of the motion picture, the differential chances of success already there in birthing rooms. How is it that I can stand in front of that room full of newborns and, based on race and social class, tell with a high degree of accuracy which babies will graduate from college, who will have a decent middle-class life, and who will end up in prison or dead before age 30? That is criminal to me. No baby determines the well-being of its parents. But the odds are set very early. Now odds are not determinants; people beat the odds. But I remain angry and want my students to share that rage at the inherent injustices that await so many of our poor children.

Many of my African American — and increasingly, Latino — students are quite inspirational. Many, not most or all, come from difficult environments. Many have surmounted extraordinary barriers — broken families, crime-infested neighborhoods, drug experiences, lousy schools, early pregnancies and child-rearing, physical and sexual abuse — to make it to college. I hope that my pride in them, which includes pushing them to excel, prodding them to resist racial and often gender stereotypes, comes through in the classroom. I want that young woman who was offended by my comments at the panel discussion to hang in there, continue challenging me, but I also want more time to try to persuade her that there is respect in disagreement, that she will be best served by being taken seriously.

Author/s: 
Paul Lyons
Author's email: 
info@insidehighered.com

The late Paul Lyons taught American history and social policy at Richard Stockton College of New Jersey. This essay is an excerpt from American Conservatism: Thinking It, Teaching It, and appears here with permission of the publisher, Vanderbilt University Press.

The Monster at Our Door

Laid low with illness -- while work piles up, undone and unrelenting -- you think, “I really couldn’t have picked a worse time to get sick.”

It’s a common enough expression to pass without anyone ever having to draw out the implied question: Just when would you schedule your symptoms? Probably not during a vacation...

It’s not like there is ever a good occasion. But arguably the past few days have been the worst time ever to get a flu. Catching up with a friend by phone on Saturday, I learned that he had just spent several days in gastrointestinal hell. The question came up -- half in jest, half in dread -- of whether he’d contracted the swine variety.

Asking this was tempting fate. Within 24 hours, I started coughing and aching and in general feeling, as someone put it on "Deadwood," “pounded flatter than hammered shit.” This is not a good state of mind in which to pay attention to the news. It is not reassuring to know that the swine flu symptoms are far more severe than those of the garden-variety bug. You try to imagine your condition getting exponentially worse, and affecting everyone around you -- and everyone around them...

So no, you really couldn’t pick a worse time to get sick than right now. On the other hand, this is a pretty fitting moment for healthy readers to track down The Monster at Our Door: The Global Threat of Avian Flu, by Mike Davis, a professor of history at the University of California at Irvine. It was published four years ago by The New Press, in the wake of Severe Acute Respiratory Syndrome (SARS), which spread to dozens of countries from China in late ‘02 and early ‘03.

The disease now threatening to become a pandemic is different. For one thing, it is less virulent -- so far, anyway. And its proximate source was pigs rather than birds.

But Davis’s account of “antigenic drift” -- the mechanism by which flu viruses constantly reshuffle their composition -- applies just as well to the latest developments. A leap across the species barrier results from an incessant and aleatory process of absorbing genetic material from host organisms and reconfiguring it to avoid the host’s defense systems. The current outbreak involves a stew of avian, porcine, and human strands. “Contemporary influenza,” writes Davis, “like a postmodern novel, has no single narrative, but rather disparate storylines racing one another to dictate a bloody conclusion.”

Until about a dozen years ago, the flu virus circulating among pigs “exhibited extraordinary genetic stability,” writes Davis. But in 1997, some hogs on a “megafarm” in North Carolina came down with a form of human flu. It began rejiggering itself with genetic material from avian forms of the flu, then spread very rapidly across the whole continent.

Vaccines were created for breeding sows, but that has not kept new strains of the virus from emerging. “What seems to be happening instead,” wrote Davis a few years ago, “is that influenza vaccinations -- like the notorious antibiotics given to steers -- are probably selecting for resistant new viral types. In the absence of any official surveillance system for swine flu, a dangerous reassortant could emerge with little warning.” An expert on infectious diseases quoted by CNN recently noted that avian influenza never quite made the leap to being readily transmitted between human beings: "Swine flu is already a man-to-man disease, which makes it much more difficult to manage, and swine flu appears much more infectious than SARS."

There is more to that plot, however, than perverse viral creativity. Davis shows how extreme poverty and the need for protein in the Third World combine to form an ideal incubator for a global pandemic. In underdeveloped countries, there is a growing market for chicken and pork. The size of flocks and herds grows to meet the demand -- while malnutrition and slum conditions leave people more susceptible to infection.

Writing halfway through the Bush administration, Davis stressed that the public-health infrastructure had been collapsing even as money poured into preparations to deal with the bioterrorism capabilities of Iraq’s nonexistent weapons of mass destruction. The ability to cope with a pandemic was compromised: “Except for those lucky few -- mainly doctors and soldiers -- who might receive prophylactic treatment with Tamiflu, the Bush administration had left most Americans as vulnerable to the onslaught of a new flu pandemic as their grandparents or great-grandparents had been in 1918.”

The World Health Organization began stockpiling Tamiflu in 2006, with half of its reserve of five million doses now stored in the United States, according to a recent New York Times article. The report stressed that swine flu is driving up the value of the manufacturer’s stocks -- in case you wondered where the next bubble would be.

But don't expect to see comparable growth in the development of vaccines. As Davis wrote four years ago, “Worldwide sales for all vaccines produced less revenue than Pfizer’s income from a single anticholesterol medication. ... The giants prefer to invest in marketing rather than research, in rebranded old products rather than new ones, and in treatment rather than prevention; in fact, they currently spend 27 percent of their revenue on marketing and only 11 percent on research.”

The spread of SARS was contained six years ago -- a good thing, of course, but also a boon to the spirit of public complacency, which seems as tireless as the flu virus in finding ways to reassert itself.

And to be candid, I am not immune. A friend urged me to read The Monster at Our Door not long after it appeared. It sat on the shelf until a few days ago.

Now the book seems less topical than prophetic -- particularly when Davis draws out the social consequences of his argument about the threat of worldwide pandemics. If the market can’t be trusted to develop vaccines and affordable medications, he writes, “then governments and non-profits should take responsibility for their manufacture and distribution. The survival of the poor must at all times be accounted a higher priority than the profits of Big Pharma. Likewise, the creation of a truly global public-health infrastructure has become a project of literally life-or-death urgency for the rich countries as well as the poor.”

There is an alternative to this scenario, of course. The word "disaster" barely covers it.

MORE: Mike Davis discusses the swine flu outbreak in an article for The Guardian. He also appeared recently on the radio program Beneath the Surface, hosted by Suzi Weissman, professor of politics at St. Mary's College of California, available as a podcast here.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Toward a 21st Century Renaissance -- in My Day

I.

Given this chilly climate for administrators -- salary freeze, hiring freeze -- I turn for relief to that dusty ghost-town in my mind’s geography, the one labeled Intellect. This turn has been further encouraged by the publication in recent months of an article on the influence, or lack thereof, of a book I wrote 20 years ago on the relations between American and British writers in the 19th century, titled Atlantic Double-Cross. This book tried to explain why the writers of each country hated each other’s guts and how this animosity informs the great literary works of the period. In it, I argued for a new subdiscipline of comparative literature that would take up the Anglo-American relationship. The book pretty much flopped, in my view, and so I was delighted to read even a measured discussion of the book’s effect on my discipline — delighted, that is, until I arrived at a paragraph beginning, “In Weisbuch’s day. ...”

At first I was tempted to call the gifted, clearly youthful Columbia professor who wrote this sentence to say, “Listen, it may be late afternoon; it may even be early evening. But it is still my day.”

More to the point, the phrase made me realize that I am pretty old, and that made me think — I guess I am supposed to speak like a codger now and say instead, “that got me to thinking...” — about the changes in academe in my lifetime. I thought about the move of psychology, for instance, away from the humanities through the social sciences over to the sciences, a journey by which Freud was moved from being a point of reference to a butt of ridicule.

I considered the tendency for economics to forgo fundamental questions in favor of refining an accepted model. I noted as well a decline in the influence of the humanities, whence most university presidents arose in the 1930s, say, and an ascendancy of the sciences, in particular genetic science, the field from which an increasing number of our institutional leaders now emerge.

But going through these admittedly contentious thoughts, I saw something more substantial, which was that my thinking was taking place via the disciplines — and to that I added the realization that my poor book of so long ago had announced itself as an attempt to create a subdiscipline. I have just recently reread Douglas Bennett’s very perceptive quick history of the liberal arts in the 20th century, in which he notes that the organization of colleges by disciplines, which we now take so much for granted, was in fact a rapid and dramatic development, occurring between about 1890 and 1910. Today, it seems, we really care more about the disciplines than we do about the whole self, or whatever the liberal arts ideal is.

So I got angry at the disciplines, and there is reason for that. It gets difficult to understand, especially at the graduate level, why a doctorate in literature and a doctorate in physics exist on the same campus when it seems they might as well be pursued on different planets. During a year when I served as graduate dean at the University of Michigan, I attended a physics lecture and was seated next to the chair of the comparative literature program. As the scientist went on, my neighbor whispered to me incredulously, “This guy thinks the world is real.” That takes C.P. Snow’s two-cultures problem to a new and desperate place.

Or again, I invited the scientist from NYU who had gotten a hoax article of post-structuralist nonsense accepted by a journal of literary theory to speak at Michigan, with a panel of three scientists and three humanists responding. As the large crowd left the room, the conversations were remarkable. The scientists in the audience to a person found the critique of the pretensions of literary theory wholly persuasive. The humanists to a person felt that their colleagues had successfully parried the attack, no question about it, reminding the physical and life scientists that their language could be pretty thick to an outsider too, and that the very history of science could be seen as the overturning of accepted truths later revealed as unintended hoaxes.

And so, enraged at the disciplines, I tried to imagine what it would be like to have a university, a world, a mind that did not rely on the disciplines — and failed.

And my next move is to say, perhaps this is fine. If general education is tantamount to a mild benevolence toward humanity, involvement in a discipline is like falling passionately in love with a particular person. We need both. It is okay to be captured by an ecstatic interest. But we also know the danger of early love. In the words of Gordon MacRae or somebody, “I say that falling in love is wonderful.” And indeed it is arguable at least that we do not induct students into a love of the life of the mind by abstractions but by finding the single discipline that fixes their fascination.

Even so, we want that fascination to be versatile, to be capable, that is, of moving from one arena of thought to another, or at least of understanding why someone else would care passionately about something else. Every summer, I spend a week on an island in Lake Winnipesaukee. This is very odd for me, as my relation to nature is such that a friend once asked if I had suffered a traumatic experience in a forest or a park. I prefer my nature in iambic pentameters, and this family island, without electricity or plumbing, I have dubbed The Island Without Toilets. Still, it is restful, and each year we campers read and discuss a book or essay. One year it was Bill McKibben’s book, The Age of Missing Information. In this tome, McKibben contrasts a day spent hiking to a modest mountaintop with a day spent watching a full 24 hours of each channel of a cable television system in Virginia. (The fact that there were only 90 channels in 1992 tells us that we are losing more information all the time.) The book is somewhat eco-snobby, but McKibben’s main contrast is really not between the natural world and its vastly inferior electronic similitude or replacement but between deep knowledge and sound bites.

He illustrates deep knowledge by an Adirondack farmer’s conversation concerning each and all the species of apple. There is so much to know, it turns out, about apples; indeed, there is so much to know about everything. As I wrote a few years ago, “Life may appear a deserted street. But we open one manhole cover to find a complex world of piano-tuning, another to discover a world of baseball, still others to discover intricate worlds of gemology and wine, physical laws and lyric poetry, of logic and even of television.” And I asked, “Do our schools and colleges and universities reliably thrill their charges with this sense of plenitude?”

They do not. And while I cannot even imagine a world without the disciplines — which are really the academic organization of each of these microcosms of wonder — I can imagine them contributing to an overall world flaming with interest. Falling in love is great and irreplaceable, but how about reimagining the campus as Big Love, Mormon polygamy for all sexes, or at least as a commune, where each of us is mated to a discipline but lives in close proximity with family-like others on a daily basis?

That is, I believe, what we are, however awkwardly, attempting by having the disciplines inhabit the same campus. However much general education has been swamped by disciplinary insistence, a remnant remains. Even academics tend to tell other people where they went to college, not so much what they majored in. We probably already possess the right mechanism for a 21st century renaissance. It just needs some adjustments.

I want to suggest two such adjustments. One concerns the relation of the arts and sciences to the world; the other readjusts the arts and sciences in relation to themselves and to professional education.

II.

When I was at the University of Michigan several years ago, something shocking took place. The sciences faculty, en masse, threatened to leave the college of liberal arts. “How could the sciences leave the arts and sciences any more than Jerry could leave Ben and Jerry’s?” I asked someone who had been present at these secession meetings. “The same way another Jerry could leave Martin and Lewis,” he replied. Somehow, to Michigan’s credit, the rebellion was quelled, but to me it is suggestive of the weakness of the liberal arts ideal at many of our institutions.

There are many signs of its frailty, beginning with the oft-cited statistic that more students at four-year colleges now major in leisure studies than in mathematics and the sciences. It is difficult to find a middle or high school where anyone speaks of the liberal arts, and, much as I have been worrying about the disciplines, they seem, aside from scattered efforts, to have been missing in action from much of the last forty years of discussion of school reform. In speaking about the arts and sciences in relation to the world, I want to suggest, though, that the lording of the disciplines over general education and the absence of the excitement of the disciplines in the schools have everything to do with each other.

This near paradox can be illustrated best if I stay within my own neighborhood of the humanities for this aspect of the argument. Last month, filled with nostalgia, I agreed to serve on a panel for the National Humanities Alliance, which advocates to Congress for funding for these impoverished disciplines. My job was to deliver one version of the standard speech on the public efficacy of English and history, religion and philosophy, and so on. I decided to fulfill this assignment rapidly and then to ask why, if we believed in the public efficacy of the humanities, we utterly ignored it in our mentoring of graduate students in these disciplines.

My argument for the humanities is exactly the same as my argument for the arts and sciences generally. As a young person, I never expected a major battle of my lifetime to be the renewal of dogmatic fundamentalism in opposition to free thinking. I find myself again and again referring to an episode of the television program "The West Wing" that aired shortly after 9/11. The president’s youthful assistant is speaking to a group of visiting school children and he says, “Do you really want to know how to fight terrorists? Do you know what they are really afraid of? Believe in more than one idea.”

This is not always as simple as the Taliban versus Shakespeare. There are subtle discouragements within our own society to the freedom to doubt and the freedom to change one’s mind. And there are elements within each of us that tend toward dogmatism and against the embracing of difference and a will to tolerate complexity. The campus, ideally, is a battleground for this freedom.

Against the many who would tyrannize over thought, we need to fight actively for our kind of education, which is far deeper than the usual political loyalties and divisions. God and George Washington are counting on us. And so are all those kids in East LA scarred by violence and poverty. In a nation of inequality and a world of sorrows, damn us if we neglect to advocate effectively for the only education that lifts up people.

Having said that, I asked why, paraphrasing Emerson, we do not turn our rituals and our rhetoric into reality. Over the last 40 years, the professoriate in the humanities has been a mostly silent witness to an atrocity, a huge waste of human resources. According to Maresi Nerad in the "Ten Years After" study, in a class of 20 English Ph.D.'s at whatever prestigious institution, three or four will end up with tenure-track positions at selective colleges or research universities. And yet this degree program, and all others in the humanities, pretend that all 20 are preparing for such a life. It’s a Ponzi scheme.

When I led the Woodrow Wilson Foundation, we began a Humanities at Work program, one aspect of which was to give little $2,000 scholarships to doctoral students who had found summer work beyond the academy. A cultural anthropologist at Texas worked at a school for delinquent girls who had been abused as children. She employed dance, folktales, autobiographical writings and a whole range of activities related to her expertise to improve these girls’ self-images. A history student at U.Va. created a freedom summer school for fifth graders in Mississippi, teaching them African American history. Meanwhile, we secured thirty positions at corporations and non-profits for doctoral graduates.

Our point was not to become an employment agency but to suggest that every sector of society, from government to K-12 to business, could benefit hugely from the transferable talents of people who think with complexity, write and speak with clarity, and teach with verve and expertise. We wanted such graduates to comprehend the full range of their own possibilities. Should they then decide to enter academia, at least they would perceive this as a free choice. And in the meantime, the liberal arts would populate every social sector as never before. I do not mean it ironically when I look forward to a liberal arts takeover of the world.

For that to take place at any level of education, I think, we need to marry intellectual hedonism to the responsibility of the intellectual. If we want our professoriate and our students to apply their learning -- and I do -- if we want them not simply to critique society but to constitute it, we must first acknowledge the simple joy of learning as a prime realistic moment. My dear friend Steve Kunkel is a leading investigator at Michigan of the AIDS virus. He is a fine fellow and I am certain that he would wish to reduce human suffering. But when I call Steve at 7 in the morning at his lab, because I know he will be there already, he is there less out of a humanitarian zeal than because he is crazy about science, the rattiest of lab rats. Just so, when I unpack a poem’s meaning, I experience a compulsive enjoyment. This is half of the truth, and it leads someone like Stanley Fish to scorn the other half by writing a book with the title Save the World on Your Own Time.

I think we can devote some school time to saving the world without prescribing or proscribing the terms of its salvation. Louis Menand, surely no philistine, argues that we need to get over our fear of learning that may brush shoulders with the practical and more generously empower our students. Granted, and granted enthusiastically, academic enclosure, the distancing of a quiet mind from the harsh noise of immediacy, is a great joy, even a necessity in the growth of an individual. But when it becomes the end rather than the instrument, we approach social disaster. We must travel back and forth between the academic grove and the city of social urgencies.

This is to say, and beyond the humanities, that a certain precious isolation — is it a fear? — has kept the fruit of the disciplines within the academy, away even from our near neighbors in the schools. The absence of the disciplines from the public life and the bloating of the disciplines to squeeze out the liberal arts ideal in the colleges are part and parcel of the same phenomenon. It is not that the world rejected the liberal arts but that the liberal arts rejected the world.

In a brilliant article, Douglas Bennett provides a brief history of the 20th-century college in which he notes an increasingly exclusionary notion of the arts and sciences. And this seems to me part and parcel of the same dubious ethic that so distrusts the messiness of the social world. As I read that we arts and science adepts kept purifying ourselves — education is too messy, throw it out, along with the study of law, along with business, along with anything material (again, “That guy thinks the world is real”) — I am reminded of Walt Whitman’s critique of Matthew Arnold, whom he termed “one of the dudes of Western literature.” To Arnold, Whitman says, “the dirt is so dirty. But everything comes out of the dirt, everything; everything comes out of the people, the people as you find them and leave them: not university people, not F.F.V. people: people, people, just people.”

The liberal arts became pure and they became puerile. Having greatly expanded the old curriculum by addition and subdivision, they spent the rest of the century apologizing by limiting themselves. They expelled fascinating areas of human endeavor that then came to constitute professional education, and professional education proceeded to eat the libbies’ lunch.

Who or what can teach us to do what Menand urges, empower not only our students but our academic disciplines? The answer, plain as can be, is the sciences. Is it any wonder, given the exclusionary bent of the liberal arts, that scientists, whose subject and whose instruments of investigation are often frankly material, might consider secession, especially when social influence, which is also to say funding, was getting thrown away along with whole areas of crucial consequence?

And by the same token, it is the sciences that can teach the humanities in particular how to reconnect. Indeed, a few moments ago, I was calling for the humanities equivalent of tech transfer; and that is half of my hope for a 21st century renaissance.

III.

By a renaissance in our time — in Weisbuch’s day -- I do not mean the recovery of classical learning and its inclusion in a Christian worldview that marked the original. I want to invoke instead the extreme interdisciplinarity of that time when the arts and sciences came so spectacularly into, if not unity, vital relationship, and when learning and worldliness ceased their contradiction. Here is what I mean. I do not in fact live on the campus of Drew University, but in a town 15 miles away, Montclair, New Jersey. Aside from the filming of some scenes featuring AJ Soprano down the street at our high school, the neighborhood was all too quiet when we moved in, with neighbors at most stiffly waving to one another from a distance. Then Tom and Janet and their three moppets moved in, along with Tom’s insane white limousine, the backyard hockey rink, the Halloween scare show, the whole circus. As Tom started offering the middle-school neighborhood kids “rock-star drop-offs” to school in his limo, everything changed. Some of our houses have large front porches, and neighbors began to congregate on summer evenings. Soon, whenever we lit the barbie a few families would turn up with their own dogs and steaks and ask if they could join in. There are about ten families now that assist each other in myriad ways, that laugh together and, when necessary, provide solace and support.

The university can become a porch society in relation to the disciplines. Indeed, for the last 40 years we have been experiencing a loosening of the boundaries, as the prefix “bio” gets attached to the other sciences; as environmental studies unites the life sciences, theology, the physical sciences, public policy, even literary criticism; as Henry Louis Gates, as historian, employs genetic research to revise and complicate the notion of racial heritage. And then there is the huge potential of democratizing knowledge and recombining it through the burst of modern technology, one of whose names, significantly, is the Web.

You cannot intend a zeitgeist, but you can capitalize upon one, and this is one. A few simple administrative helps occur to me as examples. We can invite more non-academics to join with us in our thinking about the curriculum. We can require our doctoral students to spend some time learning a discipline truly distant from their own, rather than requiring the weak cognate course or two, and we can take just a few hours to give them a sense of the educational landscape of their country. We can start meeting not with our own kind all the time but across institutional genres, and we can especially cross the divide into public education, not so much by teaching teachers how to teach as by sharing the rich ongoing controversies and discoveries of the living disciplines.

Less grandly, within our own institutions, we can pay a bonus to the most distinguished faculty member in each department who will teach the introductory course and a bigger bonus to those who will teach across disciplines, with the size of the bonus depending upon the perceived distance between the disciplines. We can stop attempting to formulate distribution requirements or core curricula via committees of 200, which is frankly hopeless in terms of conveying the excitement of the liberal arts, and instead let groups of five or ten do their inspired thing, spreading successes. We can create any number of rituals that encourage a porch society. As one new faculty member told me at a Woodrow Wilson conference years ago, “My graduate education prepared me to know one thing, to be, say, the world’s greatest expert on roller coasters. But now in my teaching position, I have to run the whole damn amusement park and I know nothing about the other rides, much less health and safety issues, employment practices, you name it.”

We might name this zeitgeist the whole damn amusement park, but I would suggest a naming in the form of a personification: Barack Obama. When I am fundraising, I often chant something of a mantra, and I ask you to forgive its sloganeering. The new knowledge breaks barriers. The new learning takes it to the streets. The new century is global. And the new America is multi-everything. There you go and here he is. Our fresh new president is indeed international, multi-racial, multi-religious, multi-ethnic, a liberal-arts major and law school grad who became a community organizer and breaks barriers with an ease that seems supernal. He was not required; like the courses we choose freely, he was elected.

Barack Obama was born on an island, and at the start of this essay I mentioned the site of my summer challenge, the Island Without Toilets. Our disciplines are islands. Our campuses are islands. And islands are wonderful and in fact essential as retreats for recuperation. But in the pastoral poems of an earlier Renaissance, the over-busy poet rediscovers his soul in a leafy seclusion but then returns, renewed and renewing, to the city. It is time for us to leave our islands. We are equipped.

Author/s: 
Robert Weisbuch
Author's email: 
info@insidehighered.com

Robert Weisbuch is president of Drew University. This essay is adapted from a talk he gave at the 2009 annual meeting of the American Educational Research Association.

Fifty Years After Stonewall

When the police conducted a routine raid on the Stonewall Inn, a bar in Greenwich Village, during the early hours of June 28, 1969, the drag queens did not go quietly. In grief at the death of Judy Garland one week earlier, and just plain tired of being harassed, they fought back -- hurling bricks, trashing cop cars, and in general proving that it is a really bad idea to mess with anybody brave enough to cross-dress in public.

Before you knew it, the Black Panther Party was extending solidarity to the Gay Liberation Front. And now, four decades later, an African-American president is being criticized -- even by some straight Republicans -- for his administration’s inadequate commitment to marriage rights for same-sex couples. Social change often moves in ways that are stranger than anyone can predict.

Today the abbreviation LGBT (covering lesbians, gays, bisexuals, and transgender people) is commonplace. Things only become esoteric when people start adding Q (questioning) and I (intersex). And the scholarship keeps deepening. Six years ago, after publishing a brief survey of historical research on gay and lesbian life, I felt reasonably well-informed (at least for a rather unadventurous heteroetcetera). But having just read Sherry Wolf's new book Sexuality and Socialism: History, Politics, and Theory of LGBT Liberation (Haymarket), I am trying to process the information that there were sex-change operations in Soviet Russia during the 1920s. (The practice was abolished, of course, once Stalinism charted its straight and narrow path to misery.) Who knew? Who, indeed, could even have imagined?

Well, not me, anyway. But the approaching anniversary of Stonewall seemed like a good occasion to consider what the future of LGBT scholarship might bring. I wrote to some well-informed sources to ask:

“By the 50th anniversary of Stonewall, what do you think (or hope) might have changed in scholarship on LGBT issues? Please construe this as broadly as you wish. Is there an incipient trend now that will come to fruition over the next few years? Do you see the exhaustion of some topic, or approach, or set of familiar questions? Or is it a matter of a change in the degree of institutional acceptance or normalization of research?”

The responses were few, alas -- but substantial and provocative. Here, then, is a partial glimpse at what may yet be on the agenda for LGBT studies.

Claire Potter is a professor of history at Wesleyan University. In 2008, she received the Audre Lorde Prize for “Queer Hoover: Sex, Lies, and Political History,” an article appearing in the Journal of the History of Sexuality.

One of the changes already underway in GLBTQ studies is, ironically, destabilizing the liberation narrative that begins with Stonewall in 1969 and ends with the right to equal protection in Romer v. Evans (1996). Part of what we know from the great burst of energy that constitutes the field is that the Stonewall Riot we celebrate as the beginning of the liberation movement is not such a watershed, nor is the affirmation of equal protection the end of the story.

For example, I begin the second half of my queer history survey with Susan Stryker’s “Screaming Queens: The Riot at Compton’s Cafeteria,” documenting a similar San Francisco rebellion in 1966, three years prior to Stonewall; I end with Senator Larry Craig being arrested in a Minneapolis men’s room. GLBTQ liberation is unfinished and becoming more complex as the research emerges that takes us on beyond Stonewall. But I would also add a caveat: Where are the transnational and comparative histories that are on the cutting edge in other fields, like ethnic studies, cultural studies, anthropology and women’s studies?

Just as significant, in my view, is that the greatest social stigma and official discrimination (not to mention inattention in queer courses and integration into the mainstream curriculum) is still aimed at the group we celebrate when we celebrate Stonewall, transgendered and transsexual people. This is an area where we need a lot of growth.

What I would like for transgender studies in 10 years is what is happening already in gay and lesbian history: placing the emergence of identities and the emergence of liberation struggles in a longer history that goes beyond the North American 20th century. Often senior scholars view trans history as “impossible” to write, a past without an archive other than interviews with the living. However, people said that about gay and lesbian history, African‑American women’s history and other new fields, and it always turned out not to be true.

Furthermore, I would argue that trans studies has a tenuous and often politically situational relationship to the GLB and Q of the field, and that needs to be addressed because the critical issues that are specific to trans studies are not being taken seriously in most curricula that claim to actually teach the field.

The final thing I would like to see by 2019 is graduate students writing dissertations in GLBTQ studies being honestly considered for regular old history jobs, rather than jobs in the history of gender: these young people are writing in legal history, urban history, the history of science, political history, medical history and whatnot -- and they are often only considered seriously for jobs in gender or women’s studies.

What pushes a field ahead is when young people can do important research, not be professionally stigmatized for it and know they can make a living as scholars.

Doug Ireland is a veteran political reporter covering both sides of the Atlantic. He is currently the U.S. correspondent and columnist for the French political‑investigative weekly Bakchich, and international affairs editor for Gay City News, New York City's largest LGBT weekly newspaper.

Sad to say, much of what comes out of university gay studies programs these days is altogether too precious, artificial and written in an academic jargon that is indigestible to most LGBT people. Reclaiming our own history is still not getting enough attention from these programs (witness Larry Kramer's long and ultimately failed fight to have the Larry Kramer Initiative he and his brother endowed at Yale become more history‑oriented and relevant).

The OutHistory website -- founded by superb, pioneering gay historian and scholar Jonathan Ned Katz -- desperately needs more institutional financial support to continue and expand its important work of creating the world's largest online historical archive of LGBT historical materials. OutHistory's innovative program to cooperatively and simultaneously co‑publish historian John D'Emilio's work on Chicago LGBT history in that city's gay newspaper, the Windy City Times -- a program which it also hopes to expand -- should be a model for the way gay studies programs can become more relevant to the majority of queers outside the hothouse of academe and to the communities by which our universities are surrounded.

We need to know where we've been to know where we should be going, yet there is still a paucity of attention paid to the history and evolution of the modern gay movement, to the death of gay liberation, with all its glorious rambunctiousness and radical emphasis on difference, and its replacement by what Jeffrey Escoffier has called the assimilationist "gay citizenship movement," which is staid, narrow‑gauge in its fund raising‑driven focus (on gay marriage and the like), and inaccurate in the homogenized, white, nuclear‑family‑imitative portrait of who we are that the wealthiest entities in the institutionalized gay movement present and foster.

One of my greatest criticisms of today's institutionalized gay movement is its isolationism. Our largest national organizations shun the concept of international solidarity with queers being oppressed in other countries, claiming their "mission" is only a domestic one. This is in sharp contrast to European LGBT organizations, where the duty of international solidarity is universal and a priority.

Gay studies programs should be encouraging more scholarship on the 86 countries which, in 2009, still have penal laws against homosexuality on the books, and helping to give voice to the same‑sexers and gender dissidents in those hostile environments who have difficulty publishing in their own countries or where gay scholarship is banned altogether.

To cite just two examples, the ongoing organized murder campaign of "sexual cleansing" in Iraq being carried out by fundamentalist Shiite death squads with the collusion of the U.S.‑allied government is killing Iraqi queers every day, and the horrors of the Islamic Republic of Iran's violent reign of terror against Iranian LGBTs is driving an ever‑increasing number of them to flee their homeland ‑‑ gay scholars have a role to play in helping these people reclaim their history and culture.

Why is it that the most sensitive, rigorous, and complete account of the way in which homosexuality has been extensively woven into Persian culture in sophisticated ways for over 1500 years has just been published by a non‑gay historian, Janet Afary (Sexual Politics in Modern Iran, Cambridge University Press)? In the hands of Iranian queers, this book will become a weapon of liberation against the theocratic regime's campaign to erase that history and keep it from the Iranian people. University presses need to publish more work by queer writers from LGBT‑oppressing countries (as MIT and Semiotexte have just done with Moroccan writer Abdellah Taia's fine autobiographical novel Salvation Army).

In many countries, homophobia and homophobic laws are part of the legacy of colonialism, and were imported from the West. But where is the gay scholar who has developed a serious critique of and rebuttal to the homophobic conspiracy theories of Columbia University's Joseph Massad, who has invented a "Gay International" he accuses of being a tool of Western imperialism (Massad provides a theoretical framework utilized by infantile leftist defenders of Teheran's theocratic regime for attacks on those, including Iranians, who expose the ayatollahs' inhumane persecutions of queers and sexual dissidents)?

One small, concrete and simple but powerful gesture of international solidarity would be for gay studies programs to sponsor book donation drives to make gay history and culture available to those many queers in oppressed countries who thirst for it as they construct their own identities and struggle for sexual freedom. I can tell you from my own reporting as a journalist that making such knowledge available can save lives.

Let's hope that it won't take 10 years to have less picky intellectual onanism of the obscure theoretical variety and more gay scholarship that's accessible and relevant to people's lives and struggles, in other countries as well as our own.

Marcia M. Gallo is an assistant professor of history at the University of Nevada, Las Vegas. In 2006, she won the Lambda Literary Award for her book Different Daughters: A History of the Daughters of Bilitis and the Rise of the Lesbian Rights Movement (Carroll & Graf).

In considering what I might wish to have changed by the 50th anniversary of Stonewall, a 25-year-old quote from Audre Lorde came to mind: “Somewhere on the edge of consciousness, there is what I call a ‘mythical norm,’ which each one of us knows ‘that is not me.’ In [A]merica this norm is usually defined as white, thin, male, young, heterosexual, Christian, and financially secure. It is within this mythical norm that the trappings of power lie within this society. Those of us who stand outside that power often identify one way in which we are different, and we assume that to be the primary cause of all oppression, forgetting other distortions around difference, some of which we ourselves may be practicing.”

By the time 2019 rolls around, we will need to have plumbed the depths of the “mythical norm” and revealed the “distortions around difference” that still separate the L from the G and the B, as well as the T, not to mention the Q and the I. In the next decade, I would hope that we deepen our understanding of, and mount effective challenges to, the seductiveness of normative values; question the conflation of equal rights with social justice; and celebrate the significance of queer inclusiveness, innovation, and radicalism.

Specifically, our scholarship must:

(1) acknowledge and analyze the continuing marginalization -- and strategies for resistance -- of many queer people, especially those who are poor, homeless, or currently or formerly incarcerated;

(2) restore the “L” -- meaning, give credence and visibility to the power of women’s experiences and leadership, still sorely lacking in our consciousness and in our publications;

(3) refocus on the importance of activism -- especially at local and regional levels, beyond the coasts -- to our research and writing.

Christopher Phelps, currently an associate professor of history at Ohio State University, will join the School of American and Canadian Studies at the University of Nottingham later this year as associate professor. In 2007 his paper “A Neglected Document on Socialism and Sex” appeared in the Journal of the History of Sexuality.

I'd like to suggest that the interpretive problem of homosexuality and capitalism still cries out for exploration. John D'Emilio, David Halperin, and others have demonstrated that although same-sex desire extends back to the ancients, homosexuality is a modern phenomenon. As a sexual orientation or identity, homosexuality arose only with individual wage labor and the separation of household and work characteristic of capitalism.

A mystery remains, though: how did the very same mode of production that created the conditions for this new consciousness also produce intense compulsions for sexual repression? Why, if capitalism gave rise to homosexuality, are the ardent defenders of capitalism, whether McCarthyist or on our contemporary Republican right, so often obsessed with attacking same-sex desire? How does capitalism generate both the conditions for homosexuality and the impulse to suppress it?

This relates closely to the modalities by which homosexuality and homophobia are to be found in the same minds, from J. Edgar Hoover and Roy Cohn in the 1950s down to the Ted Haggards and Larry Craigs of the present day. I believe this goes beyond self‑hatred. It speaks to a cultural ambivalence, one still present today. We live in a moment when capitalism is experiencing its deepest crisis in fifty years, even as the movement for gay acceptance seems to be advancing, if haltingly. The recent state approvals of gay marriage, for example, contrast markedly with Nazi Germany, where the economic crisis of the 1930s led to the scapegoating of gays forced to wear pink triangles. How to explain this contrast? In what ways is capitalism liberatory, in what ways constrictive?

Conversely, we need a lot more conceptual thinking about homosexuality and the left, by which I mean specifically the strand of the left that opposes capitalism. Many of the signal breakthroughs in what is now called the gay civil rights movement were the result of thinkers and doers who came out of the anti‑capitalist left, most famously the Mattachine Society and the Gay Liberation Front. This is also true of many lesser‑known examples, such as the Marine Cooks and Stewards, a left‑led union of the 1930s and 1940s that Allan Bérubé was researching before his death. (To topic‑seeking graduate students out there, by the way, Bérubé's project deserves a new champion, and we badly lack a definitive study of the GLF.)

To make such breakthroughs, however, gay leftists often had to break with the parties and movements that taught them so much and enabled them to recognize their own oppression. The founders of the Mattachine were men forced out of the Communist Party, which saw homosexuality as reactionary decadence. The libertarian left, both anarchist and socialist, broke free of the impulse for respectability, but such libertarian and egalitarian radicals were on the margins of the styles of left‑liberalism and Stalinism prevalent on the left at midcentury.

So this deepens the mystery, because it means that while capitalism generated homosexuality, it often takes radicals opposed to capitalism to push sexual liberation forward‑‑and yet sometimes they must do so against the instincts of the dominant left. We would really benefit from a deeper theoretical excavation of this set of problems.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

You Say You Want a Reference Book About Revolution?

Thanks to an edition now available online from the University of Michigan Library, you can easily look up the word "Revolt" in the great Encyclopedia that Diderot and d'Alembert compiled in the 18th century as part of their challenge to the pre-Enlightenment order of things. A revolt is an "uprising of the people against the sovereign" resulting from (here the entry borrows from Fénelon) "the despair of mistreated people" and "the severity, the loftiness of kings."

That certainly counts as fair warning -- and indeed, the Encyclopedia then shifts into wonkish mode, advising any monarch who happened to be reading that he could best control his subjects "by making himself likable to them... by punishing the guilty, and by relieving the unhappy." Plus he should remember to fund education. Won't someone please, please think of the children? While Louis XVI was by no means a total dullard, it seems this advice was wasted on him. (See also "Regicide.")

Scores of other occasions when "the despair of mistreated people" collided with severe, lofty, and unlikable authority are covered in The International Encyclopedia of Revolution and Protest, 1500 to the Present, just published by Wiley-Blackwell. With seven volumes of text, plus one of index, it covers upheavals on every continent except Antarctica, which tends to be pretty quiet. A digital edition is also available.

The work of an international team, the Encyclopedia is edited by Immanuel Ness, a professor of political science at Brooklyn College of the City University of New York. I have been reading around in it (as is my wont) for the past few weeks, when not following the latest tweets of resistance from within Iran, and wanted to ask Ness a few questions about his project. He responded to them by email. A transcript of the exchange follows.

Q: The title of this reference work raises a question. Protests do sometimes lead to revolution, of course, but none that I've ever attended has. Although both activities involve departures from (and opposition to) the routines of a given society, revolution and protest seem to be rather distinct processes. Why bring them together like this?

A: Revolution and social transformation are the ultimate goals of protests that have arisen from collective grievances when leaders of states and societies are unable or unwilling to come to terms with abject inequality or injustice. Undeniably, not all protests lead to revolutionary change, and most protesters will not live to see the results of their actions. But when successful, they are the culmination of waves of social grievances against authoritarianism, social and economic inequality, and injustice, frequently expressed over decades and even centuries.

In the project, we document when protests lead to revolution as well as demonstrations that are manifestations of systemic injustices, even if a revolution did not result immediately thereafter. Thus, the Bolshevik Revolution consolidated the mass peasant and worker movements that peaked in the early 20th century. By contrast, in the Philippines, the powerful peasant protest movements have failed to lead to a transformation of society. While the goal of creating a democratic and equitable society remains unfulfilled there, mass protests persist against injustice.

Despotic systems of rule can stave off resistance, but the history of the last 500 years demonstrates that revolutionary change is an ineluctable process.

Q: OK, but that assumes revolution and protest are means to the ends of justice and equality. I'm not sure the entries in the Encyclopedia all confirm that notion. There is one on fascism, for example: a movement that regarded itself as revolutionary, as the sociologist Michael Mann has emphasized. And come to think of it, the "Tea Party" events in the United States earlier this year were protests, of a sort -- but for the most part they were just media stunts. How do you square this messy reality with what sounds like a base-line conception of revolution and protest as the midwives of progress?

A: In developing the work, we debated whether to include fascism and totalitarianism as social movements, and decided they were necessary to maintain a definitive unbiased understanding of the history of protests and revolutions. In many instances, demagogues across the political spectrum have used populist rhetoric and forces to defend violence and repression.

We were cognizant of the manipulative use of revolutionary rhetoric and symbols by repressive leaders to maintain and achieve power. But these entries also examine the popular discontent and resistance to injustice and oppression. For example, throughout Europe, we focused on the proliferation of partisan opposition to Francoism, Nazism, Fascism, and Stalinism. Similarly in the Global South, we documented popular opposition that emerged in response to dominant religious, ethnic, class, and oligarchic rulers that have relied on violence to repress the powerless.

As the sociologist James Scott showed in his work on guerrilla movements, armed resistance often redounds against the most powerless, who are caught reluctantly in the crosshairs of conflicts, and we documented such cases. But, in researching modern history, while we may disparage the motivations of some reactionary movements that were cynically manipulated by their leaders, the vast majority of social protest was engaged in by ordinary people seeking justice, equality, and social inclusion.

Q: Your project is ambitious; it seems to cover the whole world. Is this a matter of some editorial orientation towards the new global or transnational history, or was it simply a matter of the various movements and uprisings seeming to be interconnected and to influence each other (as the cross-references tend to show)? And why does the period it covers start in 1500?

A: In crafting the project, from the outset we were mindful of utilizing an approach rooted in world history, which seeks a broader examination of human civilization rather than the geographically parochial and theoretically circumscribed western civilization that I consider fairly indifferent to the majority of people who live throughout the world. Geographically, the project is framed from the perspective of world history, which appreciates the dominant processes of empire, migration, capitalism, environmental change, and political and social movements.

Using a world history frame, we found that many political movements are interconnected as arcs of resistance that appear through the processes of imperial expansion and resistance in various spheres of influence. Latin American resistance to Spanish colonial rule, and then to the post-colonial order, can be viewed through arcs of resistance against European dominance, slavery, and racism, and then indigenous struggles for civil and equal rights -- arcs that run through the last 500 years but emerge more decisively at particular historical moments. For instance, the numerous essays on indigenous movements reveal that resistance throughout the Americas is reaching a new apogee in the contemporary era.

The decision to begin with protests and revolutions in 1500 recognizes the important historical and social science research that identifies the beginning of the modern era with the dramatic expansion of European imperialism, emergent capitalism, and slavery as major forces throughout the world. The temporal organization owes much to the path-breaking historical work of Fernand Braudel and the Annales School, and of Immanuel Wallerstein and subsequent World Systems theorists.

Q: Any revolution is an interpretive minefield. Even nomenclature provokes arguments. (You can't refer to the Khmer Rouge in Cambodia or the Shining Path in Peru without somebody calling you a running dog lackey of the imperialist bourgeoisie for using those terms, since the respective organizations preferred to be called something else.) How did you contend with the need for balance and objectivity -- given that in some cases the very possibility of them is in dispute? Your introductory note for the Encyclopedia says that each article was examined by two members of the editorial board, and that more than half of the submitted pieces were rejected on various grounds. Did that mean you had to leave certain subjects out?

A: Realizing balance and objectivity in each entry was one of the greatest challenges in editing the work. In part this involved seeking to include editors with erudition in their respective fields who had a range of perspectives on the history of protest and revolution. While contributors were enthusiastic about this work, writers with similar perspectives did not necessarily agree with all the interpretations and conclusions. It reminds me of the Italian adage on the divisions on the left: “amici nemici parenti serpenti” (friends can be enemies but families are like a nest of vipers). Of course, the editors engaged in a respectful exchange of views, but people had different interpretations of the events and organizations. The encyclopedia includes arguments with a variance of opinion, but through the referee process, I ensured that the historical facts were correct, even if people reached different conclusions.

The history of the Cold War demonstrates that the US and Soviet Union supported various movements for the purpose of maintaining influence, even if those movements engaged in horrible acts of genocidal violence and brutality. We document each of these cases candidly even if the facts are jarring to one’s political affinities. The US supported the Khmer Rouge in Cambodia even after the party killed some 2 million civilians and was deposed through armed intervention by Soviet-supported communist Vietnam. Even if the narrative histories are disturbing to Maoist supporters of the Khmer Rouge and other groups, it is crucial that we document the horrific unfolding of events.

Still, in the case of Cambodia, while it is easy to blame the Khmer Rouge for all the violence, history demonstrates that for more than 100 years European and then U.S. colonialists bear responsibility for destroying a culture and society. So I think it is crucial to understand the imperialist antecedents that set the stage for militant movements such as the Khmer Rouge and the Shining Path.

Through peer review we selected the most erudite essays submitted on similar topics. Our goal was to have each essay provide an entry point into a historical field of enquiry by supplying extensive references. Even in an eight-volume work, our objective was to achieve historical significance while remaining comprehensive. We are updating the work next year to include any essays that are worthy.

Q: The situation in Iran has taken a dramatic turn over the past month. Is this a new stage of the revolution that began in 1978-79? A repudiation of it? Something provoked by the CIA? Part of a larger wave of protests stimulated (directly or indirectly) by the global economy? A predictable consequence of so much of the population being young and full of rising expectations?

A: Well, as a rule, we avoided entries on events of the last five to ten years, since the jury is still out and it is impossible at this point to gain more than a general sense of the social forces on the ground. Thus, while some recent events are included, the passage of time is essential to understanding the forces at play. As such, we excluded some of the “color revolutions,” as it is too soon to discern the various groups engaged in the contestation for power. I have noticed that some in the West have already dubbed the Iranian protests the “Green Velvet Revolution,” almost as if they were part of a branding process.

It appears that some sort of democracy is in play today, irrespective of the forces manipulating the protests for their personal or factional benefit. In the Encyclopedia one can learn that the democracy movement in Iran is not a recent phenomenon but endures from the decisive electoral victory of Mohammad Mossadegh in 1951, which represented a repudiation of British interference in Iran, a theme in the unfolding of events today. But the CIA-supported Shah Mohammad Reza Pahlavi’s 1953 military putsch went on to annihilate all democratic opposition. With all democratic forces crushed by the Shah, the Shiite Islamic clerics gained currency in the wake of the Iranian Revolution of 1979, just as Napoleon consolidated power after the French Revolution. No less, the popular will for democracy, equality, and popular control remains a significant force in Iran, as it did in nineteenth-century France. I think that while foreign meddling may have occurred, last month’s elections also reveal that the vision of a democratic and egalitarian society remains unvanquished.


Midsummer Miscellany

The fall books have already started piling up. There are titles I’ve asked the publishers to send, and the ones volunteered by eager publicists, and the ceaseless influx of small books of poetry, which fill me with guilt for watching “Law & Order” reruns instead of reading them. (But verily, man cannot live by print alone.)

Before the new publishing season begins and they are lost in the flood, let me take a quick look here at a few recent titles – books I have found absorbing and rewarding, but not had a chance to discuss in this column. The list is miscellaneous, and the tip of an iceberg. I doubt they have much in common. But each title is a reminder of the fine and irreplaceable work that university presses do with no fanfare, and seldom much recognition. And let’s not even talk about profit.

The arrest of Henry Louis Gates Jr. (a.k.a. Gates-gate) has generated great heat but not much light. Various media loudmouths have been outdoing themselves in portraying the Harvard professor as some kind of wild-eyed radical. This is, of course, a matter of ignorance, robust and unashamed. Gates is in reality the most anodyne of centrists. But at least the furious fantasies he has provoked should put to rest for good any notion that the United States has lately turned into a “postracial” society.

It seems like a good moment to recommend Pulitzer Prize-winner Steven Hahn’s new book The Political Worlds of Slavery and Freedom, a compact but challenging volume published by Harvard University Press. The author is a professor of history at the University of Pennsylvania. His three chapters -- each a miniature monograph -- are based on a series of lectures at Harvard, given at Gates’s invitation.

Hahn looks at the complex way the African-American struggle for emancipation took shape both under slavery and in the wake of its abolition. This process involved the creation of institutions for self-governance, as seen in “the efforts of newly freed people to reconstitute their kinship groups, to form squads and other family-based work groups, to pool community resources, and, of course, to acquire land.”

These weren’t just social movements. They contained, argues Hahn, a political element. Hahn considers whether the activity of black Southerners during the Civil War amounted to a variety of slave revolt, and he sketches aspects of the political life of Marcus Garvey’s pan-Africanist group in the United States in the early part of the 20th century. Only the most small-minded conception of American life would assume that these are matters of interest only to black readers. In a healthy culture, this little book would be a best-seller.

A few months ago, an editor asked me to review Adina Hoffman’s biography of Taha Muhammad Ali, My Happiness Bears No Relation to Happiness: A Poet’s Life in the Palestinian Century, published by Yale University Press. To tell the truth, my heart did not initially leap at the opportunity. For I had never read any of his poetry, and rather feared that it might be full of slogans -- that, indeed, the poet’s own life might be one long slogan.

This proves that I am an idiot. A couple of sessions with his selected works revealed Ali to be a wry, ambivalent, and often understated lyricist. (In translation, at least, he seems a little bit like Edgar Lee Masters.) The figure who emerges from Hoffman’s biography is that of a quiet shopkeeper in Nazareth who carefully studied Arabic literary tradition, and also absorbed the influence of the Palestinian nationalist “resistance literature” – then created his own distinctive style: one stripped-down and unrhetorical, but sensitive as a burn.

One of the remarkable things about this biography, as indicated in my review, is that it evokes not only the political and historical context of Ali’s work, but also how his poetry took shape. Its quietness and simplicity are hard-won.

At the other extreme from Ali, perhaps, is Walt Whitman, whose poetic voice is booming, and whose persona always seems a couple of sizes too large for the North American continent. A couple of years back, Duke University Press reprinted his one and only novel: a cautionary tale of the perils of strong drink called Franklin Evans, or The Inebriate. I have somehow never gotten around to reading it, and probably never will. But it is impressive to think that Whitman grew to his familiar cosmic dimensions while stone cold sober.

His poetry certainly intoxicated the readers portrayed in Michael Robertson’s Worshipping Walt: The Whitman Disciples, published by Princeton University Press. The noun in its subtitle is no exaggeration. The readers portrayed here found in Whitman’s work something akin to a new scripture -- nearly as much as followers of Joseph Smith or Mary Baker Eddy did the Book of Mormon or Science and Health.

You can still find R.M. Bucke’s Cosmic Consciousness (1901) -- where Whitman is identified as “the best, most perfect example the world has so far had of the Cosmic Sense” -- in New Age shops. Other disciples took his “chants democratic” as hymns for a worldwide socialist commonwealth. And his invocations of manly “adhesiveness” were understood by a few readers to be a call for what later generations would term gay liberation. Whitman insisted that his homophile readers had misunderstood him, and that when not writing poetry he had been busy fathering illegitimate children all over these United States. The biographers will continue to hash that one out -- though it’s clear that his literary persona, at least, is ready to couple with anything that moves, regardless of gender.

Whitman’s work gave some of his Victorian readers a vision of the world extending far beyond the horizon of the familiar and the acceptable. No surprise that they revered him as a prophet. Robertson, a professor of English at the College of New Jersey, tells the story of his steadily expanding circle of enthusiasts (which at one point aspired to become a global movement) with due appreciation for how profound the literary experience can be, when the right book falls into the right person’s hands.

Of course there are times when reading is nothing but a guilty pleasure. So to go from the sublime to the sleazy, I have to recommend Jack Vitek’s The Godfather of Tabloid: Generoso Pope Jr. and the National Enquirer, published by the University Press of Kentucky late last year.

Not that the book itself is sleazy. The Enquirer may specialize in celebrity gossip, horrific crimes, UFO abductions, and Elvis Presley's posthumous itinerary. But that's not to say that the author -- an associate professor of English and journalism at Edgewood College, in Madison, Wisc. -- is anything but serious and measured in his approach. Vitek tackles his subject with all due awareness of its lingering cultural relevance. Pope modeled himself on newspaper tycoons such as William Randolph Hearst and Joseph Pulitzer.

The publisher also happens to have had family “connections” (as the preferred expression has it) with what its members do not call the Mafia. He also spent about a year working for the Central Intelligence Agency. This makes it especially interesting to consider the mission statement Pope released when he bought a local tabloid called the New York Enquirer in 1953. “In an age darkened by imperialist tyranny and war,” it said, “the New York Enquirer will fight for the rights of man, the rights of the individual, and will champion human decency and dignity, freedom and peace.”

Any biography moving between lofty rhetoric and very low company is bound to be pretty absorbing. The Enquirer, after it went national, reached a peak circulation of 6.7 million copies per issue in the late 1970s, with Pope playing an aggressive role in crafting its distinctive strain of populist sensationalism.

In a footnote, Vitek points out that Fredric Jameson’s analysis of postmodernism somehow overlooked Pope’s role as formative influence within what Jameson calls "the degraded landscape of schlock and kitsch." Quite right -- and it is good to have this oversight finally corrected.


Prophets of Deceit

The recent surge of right-wing fantasy into American public discourse should not be surprising. Claims that Obama is a foreigner, that health-care reform means bureaucratic death squads, that “the country we once knew is being destroyed,” as anguished people at town halls have put it – only on the most superficial level are these beliefs the product of ignorance, irrationality, and intractable boneheadedness.

Let’s face reality. An African-American man without so much as an Anglo-Saxon syllable to his name is now occupying an institution called (not on purely descriptive grounds) the White House. What did you think was going to happen? In 1774, George Washington complained that the British had a “systematic plan” to render the Americans “as tame and abject as the blacks we rule over with such arbitrary sway.” (An interesting choice of terms, that.) This is a country in which anxiety goes deep, and all the way back. It is not an afterthought.

Mostly, of course, it stays in check. With enough stress on the system, the craziness tends to flare up, like a cold sore. The “viral” political message involved sounds, in part, something like this:

“What’s wrong? I’ll tell you what is wrong. We have robbed man of his liberty. We have imprisoned him behind the iron bars of bureaucratic persecution. We have taunted the American businessman until he is afraid to sign his name to a pay check for fear he is violating some bureaucratic rule that will call for the surrender of a bond, the appearance before a committee, the persecution before some Washington board, or even imprisonment itself.... In the framework of a democracy the great mass of decent people do not realize what is going on when their interests are betrayed. This is a day to return to the high road, to the main road that leads to the preservation of our democracy, and to the traditions of our republic.”

As it happens, this is not a transcript from Fox News, but taken from the opening pages of Leo Lowenthal and Norbert Guterman’s book Prophets of Deceit: A Study of the Techniques of the American Agitator, first published in 1949 by Harper and Brothers. Plus ça change....

The passage just quoted appears in “The Agitator Speaks” -- an introductory segment of the book presenting an archetypal harangue by a Depression-era radio ranter or streetcorner demagogue. Father Coughlin remains the most notorious of the lot -- perhaps the only one with name recognition today. But scores of them were in business during the worst of the crisis, and enough of them kept plying their trade after the war to worry the American Jewish Committee, which sponsored the study.

My first reading of Prophets of Deceit was about 20 years ago. At the time, its interest to me was for the most part historical -- as an example of Frankfurt School theory being used for empirical social research. Lowenthal, a German émigré, was the main author. The focus of his other research was the sociology of literature and popular culture. Guterman, identified on the title page as a co-author, was primarily a translator. The preface expresses appreciation to a young assistant named Irving Howe for “much help in preparing the final manuscript.” That may understate his role. Some chapters are suspiciously well written.

In analyzing speeches and writings by the Depression agitators, Lowenthal showed a particular interest in how they operated as rhetoric -- how the imagery, figures of speech, and recurrent themes worked together, appealing to the half-articulated desires and frustrations of the demagogue’s followers. Another of the Frankfurters, Theodor Adorno, had produced a similar if more narrowly focused monograph, The Psychological Technique of Martin Luther Thomas' Radio Addresses, published a few years ago by Stanford University Press. And Prophets of Deceit itself was the third in the AJC’s five-volume series “Studies in Prejudice.”

The insights and blind spots of this large-scale effort to analyze “the authoritarian personality” generated controversy that continues all these decades later. But I wasn’t thinking of any of that when Prophets of Deceit came back to mind not long ago.

The catalyst, rather, was my first exposure to the cable talk-show host Glenn Beck. His program, on the de facto Republican party network Fox, has been a locus for much of the pseudopopulist craziness about how the Presidency has been taken over by a totalitarian illegal alien. You will find most of the themes of this form of political thinking cataloged by Lowenthal and associates. (Sixty years ago, the ranting tended very quickly to become anti-Semitic, while now it seems the conspiracy is run by the Kenyans. This change deserves closer study.)

But the striking thing about Beck’s program was not its ideological message but something else: its mode of performance, which was so close to that described in Prophets of Deceit that I had to track down a copy to make sure my memory was not playing tricks. The book was reissued a few years ago in an edition of Lowenthal’s collected writings published by Transaction.

In case you have not seen him in action, Beck “weeps for his country.” Quite literally so: the display of waterworks is the most readily parodied aspect of his performance. He confesses to being terrified for the future, and quakes accordingly. He acts out aggressive scenarios, such as one in which he pretended to be Obama throwing gasoline on an Average American and threatening to set him on fire.

Prophets of Deceit describes Beck’s act perfectly, six decades avant la lettre: “something between a tragic recital and a clownish pantomime.”

The performance is intended, not to provide information or even to persuade, but rather to create a space in which rational discussion can be bypassed entirely. The demagogue, whether of old or new vintage, “does not confront his audience from the outside; he seems rather like someone arising from its midst to express its innermost thoughts. He works, so to speak, from inside the audience, stirring up what lies dormant there.... It is difficult to pin him down to anything and he gives the impression that he is deliberately playacting.... Moving in the twilight zone between the respectable and the forbidden, he is ready to use any device, from jokes to doubletalk to wild extravagances.”

Instead of argument about the relative merits of this or that policy or action, this mode fosters beliefs that are “always facile, simple, and final, like daydreams.” The point is not to analyze or to convince members of the public but to offer “permission to indulge in anticipatory fantasies in which they violently discharge those emotions against alleged enemies.”

A lot has changed since Prophets of Deceit appeared, but not everything. Rereading it now leaves the definite sense that we’ve been here before.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

The Accidental Celebrity

“There are two modes of establishing our reputation: to be praised by honest men, and to be abused by rogues. It is best, however, to secure the former, because it will invariably be accompanied by the latter.”

-- Charles Caleb Colton, Anglican clergyman (1780-1832)

One deleted e-mail marked the beginning of my ordeal. It was finals week, just before Christmas break, when I received a strange message asking me to comment on some kind of online political essay that I had supposedly written. Since I’m not a blogger and make it a point to avoid the many rancorous political forums on the Internet, I immediately dismissed it as spam and hit delete.

But the notes kept coming, increasing in their fervor and frequency, until I could no longer deny it: I was receiving “fan mail.” Some writers called me courageous. Others hailed me as a visionary. A few suggested that I was predestined to play a pivotal role in the apocalyptic events foretold in the Book of Revelation. (Seriously.) Now, over the past 12 years I have published a scholarly book and eight journal articles on various historical topics, but I have to admit that through it all I never even attracted one groupie. So with my curiosity very much piqued, I began an online quest in search of the mysterious article.

I suppose it was inevitable that I was not going to like what I found. There, prominently displayed on a rather extreme Web site, was an essay (information about it can be found here) that likened President Obama to ... Adolf Hitler. Underneath the title was the inscription “by Tim Wood.”

To say I was not pleased would be a colossal understatement. However, even though my parents always told me I was special, a quick Internet search will reveal that I am not, in fact, the world’s only Tim Wood. So I ignored the article -- at least until one of the versions of the essay being forwarded via e-mail mutated into a form which included the rather unambiguous phrase “Professor of History, Southwest Baptist University.” The writer of this message also helpfully appended my office phone number and e-mail address.

Stunned, I struggled to regain my bearings and tried to grasp the full implications of this professional identity theft. Beyond the fact that the comparison is utterly ridiculous (anyone who believes that truly has no understanding of the depths of evil plumbed by the Nazi regime), it was now personal. Who had the right to speak for me like that? How dare they hide behind my name! What if my colleagues -- or my friends and family -- read this and believed it?

But the most pressing question seemed to be what kind of damage control would be necessary in order to prevent this from irreparably damaging my career. And that, in turn, led me to begin reflecting on how scholars will need to safeguard their professional reputations in the 21st century. Although I would never wish this kind of ordeal on anybody, the realist inside me fears that I will not be the last professor to fall victim to digital dishonesty. As academics, we must be aware that our professional reputations are transmitted through the technology of a bygone era, and even then are typically shrouded in secrecy or obscurity. Mentors, colleagues, and administrators exchange sealed and confidential references printed out on university letterhead. Editors, referees, and reviewers validate our scholarly work by allowing us access to or giving us coverage in their publications, but the results of that process all too often lie buried in library stacks and academic databases. In the meantime, the malicious or misinformed denizens of the Web have had time to hit the “forward” button about a million times.

So what lessons have I learned through this ordeal? First of all, be proactive. Once these rumors hit a certain critical mass, ignoring them will not make them go away. Indeed, a situation like this becomes the ultimate test of one’s personal credibility in the workplace. Immediately after I discovered that my specific identity had become attached to that particular article, I treated myself to a tour of the university’s administration building. Everybody from my department chair, to my dean, to the provost, to the directors of human resources, information technology, and university relations heard my side of the story within 48 hours. In my case, I was fortunate enough to have retained the confidence and support of my administration. There is no substitute for goodwill.

Secondly, I tried to remain positive and to find the teaching moment hidden within all of this. I posted an item on the university’s faculty Web page that served both as a public disclaimer and an opportunity to emphasize to students (and anybody else who might read it) why it is that faculty constantly warn against an uncritical acceptance of materials found on the Internet. I reminded my readers that in history, scholars are trained to constantly analyze their sources. Historians must always be aware that the documents they are working with may contain errors, lies, omissions, distortions, or may even turn out to be wholesale forgeries. To navigate those potential pitfalls, scholars check facts and look for other documents that confirm (or contradict) the information found in our sources. We seek to identify the author and understand his or her motives for writing. We try to understand the larger historical and cultural context surrounding a document. By doing our homework, we are better able to judge when people deserve to be “taken at their word.”

This episode has also taught me a tough lesson in maintaining a professional demeanor, even in the face of outrageous provocations. Although the majority of people who wrote to inquire about the article were gracious, and many even apologized for the mistake, enough of my correspondents were belligerent and rude to make me dread opening my inbox every morning. Even after learning I was not the author, many readers clearly still expected me to lend my professional credibility to the essay, vouching for its accuracy and validating its interpretations. After reading my denial (where I explicitly refused to endorse the article’s contents), many supporters of the piece became abusive, writing back to attack the depth of my patriotism, the sincerity of my religious faith, and the integrity of the academic community in the United States in general.

Critics of the essay were not above lashing out either -- even in the absence of evidence. One disgruntled detractor wrote to inform me that my brand of “voodoo” and “fear-mongering” would soon be vanishing into irrelevancy, heralding the advent of a new Age of Reason. (Hopefully that individual’s definition of reason will eventually grow to include a commitment to basic research and fact-checking and an unwillingness to take forwarded e-mails at face value.) In the meantime, along with the angry rants, there came from the fever swamps of political paranoia long-discredited conspiracy theories, urging me to consider that the course of history was being determined by Jewish bankers, or the Jesuits, or the Illuminati, or even flesh-eating space aliens. Frequently at those junctures, I felt the temptation to fire back with a “spirited” and “colorful” rebuttal. However, I resisted for many reasons: because I am ultimately a firm believer in civility in public debate, because I did not want to embarrass the colleagues and administrators who had stood by me through this, and because arguing with people who have already made up their minds and have come to demonize those who disagree is almost always an exercise in futility.

Moreover, this incident has led me to reconsider my somewhat adversarial relationship with technology. (I’m the guy who still refuses to buy a cell phone.) But one of the greatest difficulties I encountered in all of this was finding a platform from which to launch a rebuttal. Although I did write personal replies to many of the people who wrote me inquiring about the article, it seemed clear that such a strategy alone was like battling a plague of locusts with a flyswatter. Instead, Internet rumors are best refuted by channeling people toward some definitive, universally available, online point-of-reference (a Web address, for instance) that exposes the lie. In my case, the university was kind enough to grant me access to a page on its Web site, and I quickly began disseminating the link to my posting. However, that solution may not be available to everyone who falls victim to this kind of a hoax, and I am beginning to believe this issue is far too important for faculty to leave to others anyway. A year ago, I would have considered the creation of an “official Tim Wood Web site” to be pretentious in the extreme. Today, I’m not so sure. Like it or not, faculty are public figures, and if we do not take the initiative to define ourselves in ways that are accessible and relevant to those outside the academy, we risk being defined by others in ways that suit their agenda, not ours.

Finally, confronting this situation has led me to take a fresh look at the qualities that make a good historian. In 1964 Richard Hofstadter, an influential scholar of American politics, wrote an article for Harper’s Magazine entitled “The Paranoid Style in American Politics.” In one passage, he describes a paranoia all too familiar in today’s political discourse:

As a member of the avant-garde who is capable of perceiving the conspiracy before it is fully obvious to an as yet unaroused public, the paranoid is a militant leader. He does not see social conflict as something to be mediated and compromised.... Since what is at stake is always a conflict between absolute good and absolute evil, what is necessary is not compromise but the willingness to fight things out to a finish. Since the enemy is thought of as being totally evil and totally unappeasable, he must be totally eliminated -- if not from the world, at least from the theatre of operations to which the paranoid directs his attention.

As author Dick Meyer pointed out in a 2005 CBS News article, this mentality has come to transcend political labels:

The great dynamic is that so many people .... are convinced that a malevolent opponent wants to destroy their very way of life and has the power to do so. Evangelical Christians may believe that gay marriage, abortion rights, promiscuous and violent popular culture, and gun control are all part of a plot to destroy their community of values. Urban, secular liberals may believe that presidential God-talk, anti-abortion legislators and judges, intrusive Homeland Security programs, and imperialist wars are part of a sinister cabal to quash their very way of life.

Historians often find themselves compared to storytellers, and are lauded for their ability to present compelling interpretations of the past and to craft powerful narratives. But perhaps equally as important is our role as listeners. In an increasingly divided society, consensus will never be achieved by shouting (or e-mailing) until we drown out all competing voices. Instead, the first steps toward reconciliation come by those who seek to understand all aspects of the question and try to remain mindful of the needs of others.

In any case, my battle continues. Monday I will go to work, try to sort through all the chaos, and do my best to help folks figure out the truth. (Which is probably pretty close to what I did before my identity was stolen, come to think of it....) And I will continue to contemplate the ways in which this experience will change the way I present myself as a professor and a historian. In the meantime, if any of you encounter any online rantings and ravings that claim to be by me, do not necessarily believe them. Things are not always what they seem.

Author/s: 
Timothy L. Wood
Author's email: 
info@insidehighered.com

Timothy L. Wood is an assistant professor of history at Southwest Baptist University in Bolivar, Missouri. He is the author of Agents of Wrath, Sowers of Discord: Authority and Dissent in Puritan Massachusetts, 1630-1655 (Routledge).

Wrong Things, Rightly Named

Suppose that, 30 or 40 years ago, the news media of the West had gotten hold of a KGB document reviewing its experiences in interrogating those who posed a menace to the peace, progress, and stability of the People’s Democracies. For younger readers, perhaps I should explain that the Soviet Union and its client states liked to call their system by that cheerful term. And yes, they were serious. Self-deception is a powerful force, sometimes.

Suppose the report listed such methods of information-gathering as beatings, suffocation, and mock executions. And suppose, too -- on a lurid note -- that it mentioned using threats to murder or sexually violate members of a prisoner’s family. Now imagine numerous pages of the report were redacted, so that you could only guess what horrors they might chronicle.

With all of that as a given, then... How much debate would there have been over the moral status of these acts? Would someone who insisted that they did not constitute torture get a hearing? Could a serious case be made that it was in the best interests of justice to move forward without dwelling on the past?

If so, would such arguments have been presented in major newspapers, magazines, and broadcasts? Or would they have been heard in out-of-the-way meeting halls, where the only cheer was borrowed from posters of the National Council of American-Soviet Friendship?

This thought experiment comes to mind, of course, in the wake of reading about the report of the CIA’s Office of the Inspector General. The analogy is not perfect by any means. No comparable degree of “openness” (however grossly inappropriate that word seems in this case) existed on the other side of the old Iron Curtain. But let’s not cheer ourselves hoarse over that fact just yet.

Actions that would have been judged without hesitation to be torture if conducted by a member of the East German secret police (or, in the case of waterboarding, by the Khmer Rouge) did not meet the wonderfully scrupulous standards laid out seven years ago by the Department of Justice’s Office of Legal Counsel. If more testimony to the power of self-deception were needed, this would do.

When the CIA made its evaluation of various bloody-minded interrogation practices in 2004, the Bush administration’s response was reportedly frustration that the techniques hadn’t been more effective. The assessment of the Obama administration seems to be that torture has been both unproductive and damaging for “soft power” – a public-relations nightmare. This is progress, of a kind. If somebody decides to give up sociopathic behavior on the grounds it is proving bad for business, that is only just so much reason for relief. But it is preferable to the alternative.

It might be possible to hold ourselves to higher standards than that. But first it would be necessary to face reality. One place to start is Tzvetan Todorov’s little book Torture and the War on Terror, first published in France last year and now available in translation from Seagull Books (distributed by the University of Chicago Press).

Todorov once lived in what was called, at the time, the People’s Republic of Bulgaria. As an émigré in Paris in the 1960s, he wrote The Poetics of Prose and other standard works in structuralist literary criticism – as well as a study of the idiosyncratic Russian theorist Mikhail Bakhtin that, in my opinion, made Bakhtin’s thought seem a lot more systematic than it really was.

Over the past quarter century, Todorov’s concerns have shifted from the structuralist analysis of literary language to a moral inquiry into the historical reality of violence and domination, including books on the Nazi and Stalinist concentration camps.

Torture and the War on Terror is more pamphlet than treatise. Some pages revisit points that ought to be familiar to anyone who has given any thought to the experience of the past eight years. To grasp the essential meaninglessness of a phrase like “war on terror” (you can’t bomb a state of mind) does not require a degree in linguistics or rhetoric. But then, the ability to state the obvious can have its uses.

The document prepared by the Justice Department in August 2002 carefully parsed its definition of torture so that it covered only acts leading to the "severe pain" characteristic of permanent “impairment of bodily function.” Todorov does not hesitate to specify what is going on within that semantic maneuver: “The reasoning of this memorandum – paradoxically so, for a legal document prepared by competent jurists – proceeds from a form of magical thinking insofar as it pretends that we can act on things by changing their names.” That about covers it. The expression “magical thinking” covers a great deal of our public life in those years – a time exemplified by the consistently miraculous heroics of Jack Bauer on “24.”

As both a student of the phenomenon of state violence and a former resident of People’s Bulgaria, Todorov is willing to recognize and name what has been going on this past decade. We need to read the following and remember that it is what goes in the history books:

“In prisons scattered throughout countries outside the United States, the detainees have been regularly raped, hung from hooks, immersed in water, burned, attached to electrodes, deprived of food, water or medicine, attacked by dogs, and beaten until their bones are broken. On military bases or on American territory, they have been subjected to sensory deprivation and to other violent sensory treatments, forced to wear headphones so they cannot hear, hoods so they cannot see, surgical masks to keep them from smelling, and thick gloves that interfere with the sense of touch. They have been subjected to nonstop ‘white noise’ or to the regular alternation of deafening noise and total silence; prevented from sleeping, either by use of bright lights or by being subjected to interrogations that can last twenty hours on end, forty-eight days in a row; and taken from extreme cold to extreme heat and vice versa. None of these methods cause ‘the impairment of bodily function,’ but they are known to cause the rapid destruction of personal identity.”

Given the inefficacy of torture as a way to extract intelligence, its real “value” comes in the form of retribution -- and the feeling of restored mastery this permits.

“Reducing the other to a state of utter powerlessness,” writes Todorov, “gives you a sense of limitless power. This feeling is obtained more from torture than from murder, because, once dead, the other is an inert object that can no longer yield the jubilation that comes from wholly defeating the will of the other. On the other hand, raping a woman in front of her husband, parents, and children or torturing a child in front of his father yields an illusion of omnipotence and a sense of absolute sovereignty. Transgressing human laws in this way makes you feel close to the gods.... Torture leaves an indelible mark not only on the victim but also on the torturer.”

Todorov might have pushed this line of thinking (with its nod to Hegel’s dialectic of the struggle for recognition) a lot further than he does. The “indelible mark” can take various forms, and it is not restricted to those who directly wield the instruments of torture.

The craving for “an illusion of omnipotence and a sense of absolute sovereignty” is something best channeled into wish fulfillment-oriented forms of entertainment. There it can be aggrandized yet contained. Money and commodities change hands; the consumer gets a catharsis of sorts; civil society muddles along, and everybody wins.

When sectors of the populace come to regard its pursuit in reality as a necessary part of the business of the state, things are on a different and more worrying terrain. A host of strange side effects then follow – including nostalgia for 9/11 itself in some quarters, since the country was so deeply “unified” on 9/12. A scarcely concealed yearning for another terrorist assault makes perfect sense, given that it would presumably justify another sustained effort to assert American omnipotence and sovereignty. (In saying it “makes perfect sense,” I mean, of course, in a perfectly insane way.)

“As a rule,” writes Todorov, “citizens in liberal democracies will condemn without hesitation the violent practices of a state that will tolerate torture, and especially of a state that systematizes its use, as in the case of totalitarian regimes. Now we have discovered that these same democracies can adopt totalitarian attitudes without changing their overall structure. This cancer does not eat away at a single individual; its metastases are found in people who thought they had eradicated it in others and considered themselves immune. That is why we cannot be reassured.”

True enough. But we have a long way to go before reassurance will be desirable, let alone possible.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com
