Jerome Karabel's The Chosen is the big meta-academic book of the season -- a scholarly epic reconstructing "the hidden history of admission and exclusion at Harvard, Yale, and Princeton," as the subtitle puts it. Karabel, a professor of sociology at the University of California at Berkeley, has fished documents out of the archives with a muckraking zeal worthy of an investigative journalist. And his book, published this month by Houghton Mifflin, is written in far brisker narrative prose than you might expect from somebody working in either sociology or education. That's not meant as a dis to those worthy fields. But in either, the emphasis on calibrating one's method does tend to make storytelling an afterthought.
For Karabel really does have a story to tell. The Chosen shows how the gentlemanly anti-Semitism of the early 20th century precipitated a deep shift in how the country's three most prestigious universities went about the self-appointed task of selecting and grooming an elite.
It is (every aspect of it, really) a touchy subject. The very title of the book is a kind of sucker punch. It alludes, of course, to Jehovah's selection of the Jews as the Chosen People; it is also a term sometimes used, with a sarcastic tone, as an ethnic slur. But Karabel turns it back against the WASP establishment itself -- in ways too subtle, and certainly too well-researched, to be considered merely polemical. (I'm going to highlight some of the more rancor-inspiring implications below, but that is due only to my lack of Professor Karabel's good manners.)
The element of exposé pretty much guarantees the book a readership among people fascinated or wounded by the American status system. Which is potentially, of course, a very large readership indeed. But The Chosen is also interesting as an example of sociology being done in an almost classical vein. It is a study of what, almost a century ago, Vilfredo Pareto called "the circulation of elites" -- the process through which "the governing elite is always in a state of slow and continuous transformation ... never being today what it was yesterday."
In broad outline, the story goes something like this. Once upon a time, there were three old and distinguished universities on the east coast of the United States. The Big Three were each somewhat distinctive in character, but also prone to keeping an eye on one another's doings.
Harvard was the school with the most distinguished scholars on its faculty -- and it was also the scene of President Charles Eliot's daring experiment in letting undergraduates pick most of their courses as "electives." There were plenty of the "stupid young sons of the rich" on campus (as one member of the Board of Overseers put it in 1904), but the student body was also relatively diverse. At the other extreme, Princeton was the country club that F. Scott Fitzgerald later described in This Side of Paradise. (When asked how many students there were on campus, a Princeton administrator famously replied, "About 10 percent.")
Finally, there was Yale, which had crafted its institutional identity as an alternative to the regional provincialism of Harvard, or Princeton's warm bath of snobbery. It was "the one place where money makes no difference ... where you stand for what you are," in the words of Stover at Yale, the then-beloved college novel about a clean-cut and charismatic Yalie named Dink Stover.
But by World War One, something was menacing these idyllic institutions: Namely, immigration in general and "the Hebrew invasion" in particular. A meeting of New England deans in the spring of 1918 took this on directly. A large and growing percentage of incoming students were the bright and driven children of Eastern European Jewish immigrants. This was particularly true at Harvard, where almost a fifth of the freshman class that year was Jewish. A few years later, the figure would reach 13 percent at Yale -- and even at Princeton, the number of Jewish students had doubled its prewar level.
At the same time, the national discussion over immigration was being shaped by three prominent advocates of "scientific" racism who worried about the decline of America's Nordic stock. They were Madison Grant (Yale 1887), Henry Fairfield Osborn (Princeton 1877), and Lothrop Stoddard (Harvard 1905).
There was, in short, an air of crisis at the Big Three. Even the less robustly bigoted administrators worried about (as one Harvard official put it) "the disinclination, whether justified or not, on the part of non-Jewish students to be thrown into contact with so large a proportion of Jewish undergraduates."
Such, then, was the catalyst for the emergence, at each university, of an intricate and slightly preposterous set of formulae governing the admissions process. Academic performance (the strong point of the Jewish applicants) would be a factor -- but one strictly subordinated to a systematic effort to weigh "character."
That was an elusive quality, of course. But administrators knew when they saw it. Karabel describes the "typology" that Harvard used to make an initial characterization of applicants. The code system included the Boondocker ("unsophisticated rural background"), the Taconic ("culturally depressed background," "low income"), and the Krunch ("main strength is athletic," "prospective varsity athlete"). One student at Yale was selected over an applicant with a stronger record and higher exam scores because, as an administrator put it, "we just thought he was more of a guy."
Now, there is a case to be made for a certain degree of flexibility in admissions criteria. If anything, given our reflex-like tendency to see diversity as an intrinsic good, it seems counterintuitive to suggest otherwise. There might be some benefit to the devil's-advocate exercise of trying to imagine the case for strictly academic standards.
But Karabel's meticulous and exhaustive record of how the admissions process changed is not presented as an argument for that sort of meritocracy. For one thing, it never prevailed in the first place.
A certain gentlemanly disdain for mere study was always part of the Big Three ethos. Nor had there ever been any risk that the dim sons of wealthy alumni would go without the benefits of a prestigious education.
What the convoluted new admissions algorithms did, rather, was permit the institutions to exercise a greater -- but also a more deftly concealed -- authority over the composition of the student body.
"The cornerstones of the new system were discretion and opacity," writes Karabel; "discretion so that gatekeepers would be free to do what they wished and opacity so that how they used their discretion would not be subject to public scrutiny.... Once this capacity to adapt was established, a new admissions regime was in place that was governed by what might be called the 'iron law of admissions': a university will retain a particular admissions policy only so long as it produces outcomes that correspond to perceived institutional interests."
That arrangement allowed for adaptation to social change -- not just by restricting applicants of one minority status in the 1920s, but by incorporating underrepresented students of other backgrounds later. But Karabel's analysis suggests that this had less to do with administrators being "forward-looking and driven by high ideals" than it might appear.
"The Big Three," he writes, "were more often deeply conservative and surprisingly insecure about their status in the higher education pecking order.... Change, when it did come, almost always derived from one of two sources: the continuation of existing policies was believed to pose a threat either to vital institutional interests (above all, maintaining their competitive positions) or to the preservation of the social order of which they were an integral -- and privileged -- part."
Late in the book, Karabel quotes a blistering comment by the American Marxist economist Paul Sweezy (Exeter '27, Harvard '31, Harvard Ph.D. '37) who denounced C. Wright Mills for failing to grasp "the role of the preparatory schools and colleges as recruiters for the ruling class, sucking upwards the ablest elements of the lower classes." Universities such as the Big Three thus performed a double service to the order by "infusing new brains into the ruling class and weakening the potential leadership of the working class."
Undoubtedly so, once upon a time -- but today, perhaps, not so much. The neglect of their duties by the Big Three bourgeoisie is pretty clear from the statistics.
"By 2000," writes Karabel, "the cost of a year at Harvard, Yale, and Princeton had reached the staggering sum of more than $35,000 -- an amount that well under 10 percent of American families could afford.... Yet at all three institutions, a majority of students were able to pay their expenses without financial assistance -- compelling testimony that, more than thirty years after the introduction of need-blind admissions, the Big Three continued to draw most of their students from the most affluent members of society." The number of students at the Big Three coming from families in the bottom half of the national income distribution averages out to about 10 percent.
All of which is (as the revolutionary orators used to say) no accident. It is in keeping with Karabel's analysis that the Big Three make only as many adjustments to their admissions criteria as they must to keep the status quo on track. Last year, in a speech at the American Council on Education, Harvard's president, Larry Summers, called for preferences for the economically disadvantaged. But in the absence of any strong political or social movement from below -- an active, noisy menace to business as usual -- it's hard to imagine an institutionalized preference for admitting students from working families into the Big Three. (This would have to include vigorous and fairly expensive campaigns of recruitment and retention.)
As Walter Benn Michaels writes in the latest issue of N+1 magazine, any discussion of class and elite education now is an exercise in the limits of the neoliberal imagination. (His essay was excerpted last weekend in the Ideas section of The Boston Globe.)
"Where the old liberalism was interested in mitigating the inequalities produced by the free market," writes Michaels, "neoliberalism -- with its complete faith in the beneficence of the free market -- is interested instead in justifying them. And our schools have a crucial role to play in this. They have become our primary mechanism for convincing ourselves that poor people deserve their poverty, or, to put the point the other way around, they have become our primary mechanism for convincing rich people that we deserve our wealth."
How does this work? Well, it's no secret that going to the Big Three pays off. If, in theory, the door is open to anyone smart and energetic, then everything is fair, right? That's equality of opportunity. And if students at the Big Three then turn out to be drawn mainly from families earning more than $100,000 per year....
Well, life is unfair. But the system isn't.
"But the justification will only work," writes Michaels, if "there really are significant class differences at Harvard. If there really aren't -- if it's your wealth (or your family's wealth) that makes it possible for you to go to an elite school in the first place -- then, of course, the real source of your success is not the fact that you went to an elite school but the fact that your parents were rich enough to give you the kind of preparation that got you admitted to the elite school. The function of the (very few) poor people at Harvard is to reassure the (very many) rich people at Harvard that you can't just buy your way into Harvard."
"Whoever cannot give to himself an adequate account of the past three thousand years," said Goethe, "remains in darkness, without history, living from day to day." That is an expression of a bedrock principle of liberal humanism, European-style. It takes the existence of the educated individual as its basic unit of reference -- its gold standard. But it also judges the quality of that existence by how much the individual has spent in acquiring a sense of the past. That expenditure also means, in effect, going into debt: You’ll never repay everything you owe to previous generations.
That outlook is, when you get right down to it, pretty un-American. It goes against the ideal of unencumbered self-creation that Emerson taught us -- in which we are supposed to throw off the burdens of the past, living always in the vital present. Fortunately, this is not hard to do. The first step is not to learn much history to begin with. (We are good at this.)
Even so, there may be an audience for E. H. Gombrich’s A Little History of the World, now available from Yale University Press, 70 years after it was first written. Imagine Goethe giving up the role of sage long enough to become a children’s author and you will have a reasonably good idea of the book’s content. It goes from prehistory up to the end of the (then-recent) Great War, with particular attention to ancient Greece, the Roman Empire, and the emergence of Judaism, Buddhism, Christianity, and Islam.
As for the style ... well, that is something even more remarkable. The tone is wry, at times, without ever being jokey -- a kind of light seriousness that is very respectful of its young audience. Each chapter is perfectly calibrated to suit the attention span and cognitive powers of a 10-year-old, without ever giving off a trace of condescension.
The effect, even for an adult reader, is incredibly charming -- and, indeed, instructive, at least for anyone with the occasional gap in that interior timeline. (Quick now: Who were the Hohenzollerns? And no, a vague sense that they were German doesn’t count.)
In his later and better-known role as art historian, Gombrich commanded a really humbling degree of erudition, but always with a certain generosity towards his audience. That combination is very much in evidence throughout his first book -- one written in what must have been very trying circumstances.
It was Vienna in 1935. Gombrich was 26 and had recently finished his dissertation. (Writing one "was considered very important," he told a presumably incredulous audience at Rutgers University in 1987, "yet it didn’t take more than a little over a year to write.") His immediate job prospects ranged from the nonexistent to the merely terrible. Besides, he was Jewish, and the writing was on the wall, usually in the form of a swastika.
He managed to find part-time employment with a publishing company. He was asked to evaluate a book on world history for children in English, to see if it might be worth translating. He recommended against it, but offered instead to write one himself, directly in German. It took him about six weeks, writing a chapter a day. The volume did quite well when it appeared in 1936, though the Nazis eventually stopped publication on the grounds of its "pacifism."
By then, he was in London, working at the Warburg Institute (a major art-history collection, where Gombrich in time became director) and aiding the war effort by translating German radio broadcasts into English. Before leaving Vienna, he had agreed to write another book, this one for adolescents, on the history of art. That project grew into a rather more ambitious work, The Story of Art (1950) -- long the standard overview of European art history, from which generations of museum tour-guides have cribbed.
He wrote it -- along with his more monographic works on iconography and on the psychology of perception -- in English. When his Little History was reprinted in Germany in the mid-1980s, he wrote an afterword for it; but he turned down offers to have it translated into English, preferring to do that himself, and to make some necessary revisions. It is not clear from the edition now available from Yale just how far Gombrich got with that effort at the time of his death in 2001. (The title page gives the translator as Caroline Mustill.) But he did add a postscript called "The Small Part of the History of the World Which I Have Lived Through" -- summing up the 20th century from World War I through the end of the Cold War, and trying to put as optimistic a spin on that record as possible.
The preface by Leonie Gombrich, his granddaughter, quotes some introductory remarks he prepared for the Turkish edition. His Little History, he wrote, "is not, and never was, intended to replace any textbooks of history that may serve a very different purpose at school. I would like my readers to relax, and to follow the story without having to take any notes or to memorize names and dates. In fact, I promise that I shall not examine them on what they have read."
But the book has a strong and serious pedagogical intent, even so. And it comes very directly from Goethe, whose work Gombrich read incessantly as a boy. Upon receiving the Goethe Prize in 1994, Gombrich said that it was the author’s life and writing that taught him "the consoling message ... of a universal citizenship that transcends the confines of nationhood." That seems very much the point of the Little History, which tries to squeeze all of global history into just under three hundred easily read pages -- and I strongly suspect it was just that cosmopolitanism that the Nazi censors really loathed.
Of course, there are gaps and oversights. One that is really troublesome is how the entire history of the Atlantic slave trade is reduced to the dimensions of a brief reference to the Civil War in the United States. This has the effect of making it seem like a distant and cruel episode in the New World, rather than what it really was: a vast and centuries-long process that enriched parts of Europe, depopulated parts of Africa, and anticipated every aspect of totalitarianism possible before the rise of industrialization and mass communications.
Not that Gombrich leaves the history of colonial atrocity entirely out of the picture, especially in recounting the conquest of the Americas: "This chapter in the history of mankind is so appalling and shameful to us Europeans that I would rather not say anything more about it."
In many ways, then, the book is at least as interesting as a specimen of a lost sensibility as it is in its own right, as a first introduction to history. Gombrich later spoke of how much he had been the product of that almost religious veneration of culture that prevailed among the European middle class of the 19th and early 20th centuries.
"I make no great claims for the universality of that tradition," he said during a lecture at Liverpool University in 1981. "Compared to the knowable, its map of knowledge was arbitrary and schematic in the extreme. As is true of all cultures, certain landmarks were supposed to be indispensable for orientation while whole stretches of land remained terra incognita, of relevance only to specialists.... But what I am trying to say is that at least there was a map."
Rick Perlstein, a friend from the days of Lingua Franca, is now working on a book about Richard Nixon. Last year, he published a series of in-depth articles about the Republican Party and the American conservative movement. (Those are not quite the same thing, though that distinction only becomes salient from time to time.) In short, Perlstein has had occasion to think about honesty and dissimulation -- and about the broad, swampy territory in between, where politicians finesse the difference. As do artists and used-car salesmen....
It’s the job of historians to map that territory. But philosophers wander there, too. “What is truth?” as Nietzsche once asked. “A mobile army of metaphors, metonymies, anthropomorphisms. Truths are illusions of which one has forgotten that they are illusions.” Kind of a Cheneyo-Rumsfeldian ring to that thought. It comes from an essay called “On Truth and Lie in an Extra-Moral Sense,” which does, too, come to think of it.
So anyway, about a week ago, Rick pointed out a recent discussion of how the Bush Administration is dealing with critics who accuse it of fudging the intelligence that suggested Saddam Hussein had weapons of mass destruction. The link went to a comment by Joshua Micah Marshall, who is a liberal Democrat of the more temperate sort, not prone to hyperventilation.
“Garden variety lying is knowing it’s Y and saying it’s X,” he wrote, giving Lyndon Johnson on the Gulf of Tonkin as an example. The present executive branch, he continued, shows “a much deeper indifference to factual information in itself.”
Rick posed an interesting question: “Isn't Josh Marshall here describing as the Administration's methodology exactly what that Princeton philosophy prof defines as ‘bullshit’?” That prof being, of course, Harry Frankfurt, whose short and best-selling treatise On Bullshit will probably cover everyone’s Christmas bonus at Princeton University Press this year.
In February, The New York Times beat us by a day or so with its article on the book, which daintily avoided giving its title. But "Intellectual Affairs" first took a close look, not just at Frankfurt’s text -- noting that it remained essentially unchanged since its original publication as a scholarly paper in the 1980s -- but at the philosophical critique of it presented in G.A. Cohen’s essay “Deeper into Bullshit.”
Since then, the call for papers for another volume of meditations on the theme of bull has appeared. Truly, we are living in a golden age.
The gist of Frankfurt’s argument, as you may recall, is that pitching BS is a very different form of activity from merely telling a lie. And Marshall’s comments do somewhat echo the philosopher’s point. Frankfurt would agree that “garden variety lying” is saying one thing when you know another to be true. The liar operates within a domain that acknowledges the difference between accuracy and untruth. The bullshitter, in Frankfurt’s analysis, does not. In a sense, then, the other feature of Marshall’s statement would seem to fit. Bullshit involves something like “indifference to factual information in itself.”
So does it follow, then, that in characterizing the Bush team’s state of mind three years ago, during the run-up to the war, we must choose among the options of incompetence, dishonesty, and bullshit? Please understand that I frame it in such terms not from any political motive, but purely in the interest of conceptual rigor.
That said.... It seems to me that this range of terms is inadequate. One may agree that Bush et al. are profoundly indifferent to verifiable truth without concluding that the Frankfurt category necessarily applies.
Per G. A. Cohen’s analysis in “Deeper into Bullshit,” we must stress that Frankfurt’s model rests on a particular understanding of the consciousness of the liar. The mind of the bullshitter is defined by contrast to this state. For the liar, (1) the contrast between truth and untruth is clearly discerned, and (2) that difference would be grasped by the person to whom the liar speaks. But the liar’s intentionality also includes (3) some specific and lucidly grasped advantage over the listener made possible by the act of lying.
By contrast, the bullshitter is vague on (1) and radically unconcerned with (2). There is more work to be done on the elements of relationship and efficacy indicated by (3). We lack a carefully argued account of bullshit’s effect on the bullshitee.
There is, however, another possible state of consciousness not adequately described by Frankfurt’s paper. What might be called “the true believer” is someone possessing an intense concern with truth.
But it is a Higher Truth, which the listener may not (indeed, probably cannot) grasp. The true believer is speaking a truth that somehow exceeds the understanding of the person hearing it.
During the Moscow Trials of the late 1930s, Stalin’s prosecutor lodged numerous charges against the accused that were, by normal standards, absurd. In many cases, the “evidence” could be shown to be false. But so much the worse for the facts, at least from the vantage point of the true believer. If you’ve ever known someone who got involved in EST or a multi-level marketing business, the same general principle applies. In each case, it is not quite accurate to say that the true believers are lying. Nor are they bullshitting, in the strictest sense, for they maintain a certain fidelity to the Higher Truth.
Similarly, it did not matter three years ago whether or not any evidence existed to link Saddam and Osama. To anyone possessing the Higher Truth, it was obvious that Iraq must be a training ground for Al Qaeda. And guess what? It is now. So why argue about it?
On a less world-historical scale, I see something interesting and apropos in Academe, the magazine of the American Association of University Professors. In the latest issue, David Horowitz makes clear that he is not a liar just because he told a national television audience something that he knew was not true.
(This item was brought to my attention by a friend who teaches in a state undergoing one of Horowitz’s ideological rectification campaigns. My guess is that he’d rather not be thanked by name.)
Here’s the story so far: In February, while the Ward Churchill debate was heating up, Horowitz appeared on Bill O’Reilly’s program. It came up that Horowitz, like Churchill, had been invited to lecture at Hamilton College at some point. But he was not, he said, “a speaker paid by and invited by the faculty.”
As we all know, university faculties are hotbeds of left-wing extremism. (Especially the business schools and engineering departments. And reports of how hotel-management students are forced to read speeches by Pol Pot are positively blood-curdling.) Anyway, whenever Horowitz appears on campus, it’s because some plucky youngsters invite him. He was at Hamilton because he had been asked by “the conservative kids.”
That came as a surprise to Maurice Isserman, a left-of-center historian who teaches at Hamilton College. When I saw him at a conference a few years ago, he seemed to have a little gray in his hair, and his last book, The Other American: The Life of Michael Harrington, was a biography of the founder of the Democratic Socialists of America. No doubt he’s been called all sorts of things over the years, but “conservative kid” is not one of them. And when Horowitz spoke at Hamilton a few years ago, it was as a guest lecturer in Isserman’s class on the 1960s.
As Isserman put it in the September/October issue of Academe: “Contrary to the impression he gave on The O’Reilly Factor, Horowitz was, in fact, an official guest of Hamilton College in fall 2002, invited by a faculty member, introduced at his talk by the dean of the faculty, and generously compensated for his time.”
I will leave to you the pleasure and edification of watching Horowitz explain himself in the latest issue of Academe. But in short, he could not tell the truth because that would have been a lie, so he had to say something untrue in order to speak a Higher Truth.
My apologies for the pretzel-like twistiness of that paraphrase. It is all so much clearer in the original Newspeak: Thoughtcrime is doubleplusungood.
Some months ago I started asking friends, colleagues from my teaching days, researchers in higher education, faculty members of various ages and ranks, deans, provosts and presidents, and focus groups of students: “What’s the status of the Big Questions on your campus?” Quite deliberately I avoided defining “Big Questions,” but I gave as examples such questions as “Who am I? Where do I come from? What am I going to do with my life? What are my values? Is there such a thing as evil? What does it mean to be human? How can I understand suffering and death? What obligations do I have to other people? What does it mean to be a citizen in a democracy? What makes work, or a life, meaningful and satisfying?” In other words, I wanted to know what was happening to questions of meaning and value that traditionally have been close to the heart of a liberal education.
Some of what I found puzzled me. People pointed out quite properly that some Big Questions were alive and well in academia today. These included some questions about the origin of the universe, the emergence of life, the nature of consciousness, and others that have been raised by the scientific breakthroughs of the past few decades.
In the humanities and related social sciences the situation was rather different. Some friends reminded me that not all big questions were in eclipse. Over the past generation, faculty members have paid great attention to questions of racial, ethnic, gender, and sexual identity. Curricular structures, professional patterns, and the like continue to be transformed by this set of questions. Professors, as well as students, care about these questions and, as a result, write, teach, and learn about them with passion.
But there was wide agreement that other big questions, the ones about meaning, value, moral and civic responsibility, were in eclipse. To be sure, some individual faculty members addressed them, and when they did, students responded powerfully. In fact, in a recent Teagle-sponsored meeting on a related topic, participants kept using words such as “hungry,” “thirsty,” and “parched” to describe students’ eagerness to find ways in the curriculum, or outside it, to address these questions. But the old curricular structures that put these questions front and center have over the years often faded or been dismantled, including core curricula, great books programs, surveys “from Plato to NATO,” and general education requirements of various sorts. Only rarely have new structures emerged to replace them.
I am puzzled why. To be sure, these Big Questions are hot potatoes. Sensitivities are high. And faculty members always have the excuse that they have other, more pressing things to do. Over two years ago, in an article entitled “Aim Low,” Stanley Fish attacked some of the gurus of higher education (notably, Ernest Boyer) and their insistence that college education should “go beyond the developing of intellectual and technical skills and … mastery of a scholarly domain. It should include the competence to act in the world and the judgment to do so wisely” (Chronicle of Higher Education, May 16, 2003). Fish hasn’t been the only one to point out that calls to “fashion” moral and civic-minded citizens, or to “go beyond” academic competency, assume that students now routinely achieve such mastery of intellectual and scholarly skills. We all know that’s far from the case.
Minimalist approaches -- ones that limit teaching to what another friend calls “sectoral knowledge” -- are alluring. But if you are committed to a liberal education, it’s hard just to aim low and leave it at that. The fact that American university students need to develop basic competencies provides an excuse, not a reason, for avoiding the Big Questions. Students also need to be challenged, provoked, and helped to explore the issues they will inevitably face as citizens and as individuals. Why have we been so reluctant to develop the structures, in the curriculum or beyond it, that provide students with the intellectual tools they need to grapple thoughtfully over the course of a lifetime with these questions?
I see four possible reasons:
1. Faculty members are scared away by the straw man Stanley Fish and others have set up. Despite accusations of liberal bias and “brainwashing,” no faculty member I know wants to “mold,” “fashion,” or “proselytize” students. But that’s not what exploring the Big Questions is all about. Along with all the paraphernalia college students bring with them these days are Big Questions, often poorly formulated and approached with no clue that anyone in the history of humankind has ever had anything useful to say about any of them. There’s no need to answer those questions for students, or to try to fashion them into noble people or virtuous citizens for the republic. There is, however, every reason to help students develop the vocabularies, the metaphors, the exempla, the historical perspective, the patterns of analysis and argument that let them, over time, answer those questions for themselves.
2. A second possible reason is that faculty are put off by the feeling that they are not “experts” in these matters. In a culture that quite properly values professional expertise, forays beyond one’s field of competence are understandably suspect. But one does not have to be a moral philosopher to raise the Big Questions and show some of the ways smart people in the past have struggled with them. I won’t pontificate about other fields, but in my own field -- classics and ancient history -- the Big Questions come bubbling up between the floorboards of any text I have ever taught. I don’t have to be a specialist in philosophy or political science to see that Thucydides has something to say about power and morality, or the Odyssey about being a father and a husband. A classicist’s job, as I see it, is to challenge students to think about what’s implicit in a text, help them make it explicit, and use that understanding to think with.
3. Or is it that engaging with these “Big Questions,” or anything resembling them, is the third rail of a professional career? Senior colleagues don’t encourage it; professional journals don’t publish it; deans don’t reward it; and a half-dozen disgruntled students might sink your tenure case with their teaching evaluations. You learn early on in an academic career not to touch the third rail. If this is right, do we need to rewire the whole reward system of academia?
4. Or, is a former student of mine, now teaching at a fine women’s college, correct when she says that on her campus “It tends to be that … those who talk about morality and the big questions come from such an entrenched far right position … that the rest of us … run for cover.”
Some of the above? All of the above? None of the above? You tell me, but let’s not shrug our shoulders and walk away from the topic until we’ve dealt with one more issue: What happens if, for whatever reason, faculty members run for the hills when the Big Questions, including the ones about morality and civic responsibility, arise? Is this not to lose focus on what matters most in an education intended to last for a lifetime? In running away, do we not then leave the field to ideologues and others we cannot trust, and create a vacuum that may be filled by proselytizers, propagandists, or the unspoken but powerful manipulations of consumer culture? Does this not sever one of the roots that has over the centuries kept liberal education alive and flourishing? But, most serious of all, will we at each Commencement say farewell to another class of students knowing that for all they have learned, they are ill equipped to lead an examined life? And if we do, can we claim to be surprised and without responsibility if a few decades later these same graduates abuse the positions of power and trust in our corporate and civic life to which they have ascended?
W. Robert Connor
W. Robert Connor is president of the Teagle Foundation, which is dedicated to strengthening liberal education. More on the foundation's “Big Questions” project may be found on its Web site. This essay is based on remarks Connor recently made at a meeting of the Middle Atlantic Chapters of Phi Beta Kappa, at the University of Pennsylvania.
I just finished grading a hefty stack of final examinations for my introductory-level U.S. history survey course. The results were baleful.
On one section of the exam, for example, I supplied some identification terms -- events and personages covered in class -- asking students to provide a definition, date, and significance for each term. In response to “Scopes Monkey Trial,” one student wrote the following:
"The scopes monkey trial was a case in the supreme court that debated teaching evolution in the schools. It happened in 1925. Mr. Scope a teacher in a school wanted to teach about God and did not want to teach about evolution. The ACLU brought in lawyers to help with the case of Mr. Scopes. In the end Mr. Scopes side did not have the people's opinion. Evolution won. It is significant because now you have to teach evolution in school, you can't teach about God."
This answer might be considered a nearly perfect piece of evidence against intelligent design of the universe, since it gets just about everything (apart from the date) wrong: punctuation, spelling, grammar, and historical fact.
For those needing a refresher, Tennessee high school biology teacher John T. Scopes assigned a textbook informed by evolutionary theory, a subject prohibited by the state legislature. The court ruled against Scopes, who had, obviously, broken the law. But the defense won in the court of public opinion, especially after the ACLU’s lawyer, Clarence Darrow, tore apart William Jennings Bryan, the former Democratic presidential candidate, witness for the prosecution, and Biblical fundamentalist. The press dubbed it the "Scopes Monkey Trial" (inaccurately, since the theory of human evolution centered upon apes) and pilloried Bryan. As Will Rogers put it, "I see you can't say that man descended from the ape. At least that's the law in Tennessee. But do they have a law to keep a man from making a jackass of himself?"
An outside observer might ascribe my student’s mistakes to the political culture of this Midwestern city, where barely a day goes by without a letter to the editor in the local paper from some self-appointed foot soldier of the religious right.
That, however, wouldn’t explain another student who thought the 1898 war between the United States and Spain, fought heavily in Cuba, was about communism (not introduced into Cuba until after the 1959 revolution). Nor would it explain a third student who thought that the Scopes verdict condoned Jim Crow racial segregation.
A minority of students performed admirably, receiving grades in the range of A, while hewing, of course, to varied interpretations. Their success proved the exam was based upon reasonable expectations. However, the median exam grade was a C -- the lowest I’ve yet recorded, and fairly devastating for a generation of students who typically aspire to a B.
I was wondering what to make of this dispiriting but solitary data set when I read about the Education Department study released late last week that shows that the average literacy of college-educated Americans declined precipitously between 1992 and 2003. Just 25 percent of college graduates scored high enough on the tests to be deemed “proficient” in literacy.
By this measure, literacy does not denote the mere ability to read and write, but comprehension, analysis, assessment, and reflection. While “proficiency” in such attributes ranks above “basic” or “intermediate,” it hardly denotes rocket science. It simply measures such tasks as comparing the viewpoints in two contrasting editorials.
The error-ridden response I received about the Scopes Monkey Trial speaks less to the ideological clash of science and faith than to a rather more elemental matter. As students in the 1960s used to say, the issue is not the issue. The issue is the declining ability to learn. The problem we face, in all but the most privileged institutions, is a pronounced and increasing deficiency of student readiness, knowledge, and capacity.
Neither right nor left has yet come to terms with the crisis of literacy and its impact on higher education. The higher education program of liberals revolves around access and diversity, laudable aims that do not speak to intellectual standards. Conservatives, for their part, are prone to wild fantasies about totalitarian leftist domination of the campuses. They cannot imagine a failure even more troubling than indoctrination -- the inability of students to assimilate information at all, whether delivered from a perspective of the left or the right.
It would be facile to blame the universities for the literacy crisis, since it pervades our entire culture at every level. The Education Department’s statistics found a 10-year decline in the ability to read and analyze prose in high school, college, and graduate students alike.
However, the crisis affects the university profoundly, and not only at open-enrollment institutions like the regional campus on which I teach. Under economic pressure from declining government funding, and faced with market competition from low-bar institutions, many universities have increasingly felt compelled to take on students whose preparation, despite their possession of a high school diploma, is wholly inadequate. This shores up tuition revenue, but the core project of the higher learning is increasingly threatened by the ubiquity of semi-literacy.
How can human thought, sustained for generations through the culture of the book, be preserved in the epoch of television and the computer? How can a university system dedicated to the public trust and now badly eroded by market forces carry out its civic and intellectual mission without compromising its integrity?
These questions cry out for answer if we are to stem a tide of semi-literacy that imports nothing less than the erosion of the American mind.
Christopher Phelps is associate professor of history at Ohio State University at Mansfield.
If my recent experiences are any indication, we professors face a daunting challenge: The polarized American political environment has conditioned our students to see life in monochrome. The Right tells them to view all as either black or white, while the Left insists that everything is a shade of gray.
We’ve long struggled with the either/or student, the one who writes a history essay in which events are stripped of nuance and presented as the working out of God’s preordained plan; or the sociology student who wants to explain poverty in terms borrowed from 19th century Social Darwinism. These students -- assuming they’re not acting out some ideological group’s agenda -- can be helped along simply by designing lessons that require them to argue opposing points of view.
Yet despite all the hoopla about the resurgence of conservatism, I encounter more students whose blinders are postmodern than traditional. That is to say, many of them don’t see the value of holding a steadfast position on much of anything, nor do they exhibit much understanding of those who do. They live in worlds of constant parsing and exceptions. Let me illustrate with two examples.
In history classes dealing with the Gilded Age I routinely assign Edward Bellamy’s utopian novel Looking Backward. In brief, protagonist Julian West employs a hypnotist for his insomnia and retires to an underground chamber. His Boston home burns in 1887, and West is not discovered until 2000, when he is revived by Dr. Leete. He awakens to a cooperative socialist utopia. West’s comments on his time say much about late 19th century social conflict, and Leete’s description of utopian Boston makes for interesting class discussion. I know that some students will complain about the novel’s didactic tone, others will argue that Bellamy’s utopia is too homogeneous, and a few will assert that Bellamy’s explanation of how utopia emerged is contrived. What I had not foreseen is how many students find the very notion of a utopia so far-fetched that they can’t move beyond incredulity to consider other themes.
When I paraphrase Oscar Wilde that a map of the world that doesn’t include Utopia isn’t worth glancing at, some students simply don’t get it. "Utopia is impossible” is the most common remark I hear. “Perhaps so,” I challenge, “but is an impossible quest the same as a worthless quest?” That sparks some debate, but the room lights up when I ask students to explain why a utopia is impossible. Their reasons are rooted more in contemporary frustration than historical failure. Multiculturalism is often cited. “The world is too diverse to ever get people to agree” is one rejoinder I often receive.
It’s disturbing enough to contemplate that a social construct designed to promote global understanding can be twisted to justify existing social division, but far more unsettling is what often comes next. When I ask students if they could envision dystopia, the floodgates open. No problems on that score! In fact, they draw upon popular culture to chronicle various forms of it: Escape From New York, Blade Runner, Planet of the Apes…. “Could any of these happen?” I timidly ask. “Oh sure, these could happen easily,” I’m told.
My second jolt came in a different form, an interdisciplinary course I teach in which students read Tim O’Brien’s elegantly written Vietnam War novel The Things They Carried. O’Brien violates old novelistic standards; his book is both fictional and autobiographical, with the lines between the two left deliberately blurred. My students adored the book and looked at me as if they had just seen a Model T Ford when I mentioned that a few critics felt the book was dishonest because it did not distinguish fact from imagination. “It says right on the cover ‘a work of fiction,’” noted one student. When I countered that we ourselves were using it to discuss the actual Vietnam War, several students immediately defended the superiority of metaphorical truth because it “makes you think more.” I then asked students who had seen the film The Deer Hunter whether the famed Russian roulette scene was troubling, given that there was no recorded incident of such events taking place in Vietnam. None of them were bothered by this.
I mentioned John Sayles’ use of composite characters in the film Matewan. They had no problem with that, though none could tell me what actually happened during the bloody coal strikes that convulsed West Virginia in the early 1920s. When I probed whether writers or filmmakers have any responsibility to tell the truth, not a single student felt they did. “What about politicians?” I asked. While many felt that truth-telling politicians were no more likely than utopia, the consensus view was that they should tell the truth. I then queried, “So who gets to say who has to tell the truth and who gets to stretch it?” I was prepared to rest on my clever laurels, until I got the students’ rejoinder. Two of my very best students said, in essence, that all ethics are situational, with one remarking, “No one believes there’s an absolute standard of right and wrong.” I tentatively reminded him that many of the 40 million Americans who call themselves "evangelical Christians" believe rather firmly in moral absolutes. From the back of the room piped a voice, “They need to get over themselves.”
I should interject that this intense give-and-take was possible because I let my students know that their values are their own business. In this debate I went out of my way to let them know I wasn’t condemning their values; in fact, I share many of their views on moral relativism, the ambiguity of truth, and artistic license. But I felt I could not allow them to dismiss objective reality so cavalierly. Nor, if I am true to my professed belief in the academy as a place where various viewpoints must be engaged, could I allow them to refuse to consider anyone who holds fast to moral absolutism.
The stories have semi-happy endings. I eventually got my history students to consider the usefulness of utopian thinking. This happened after I suggested that people of the late 19th century had better imaginations than those of the early 21st, which challenged them to contemplate the link between utopian visions and reform, and to see how a moralist like Bellamy could inspire what they would deem more pragmatic social changes. My O’Brien class came through when I taught the concept of simulacra, showed them a clip from the film Wag the Dog and then asked them to contemplate why some see disguised fiction as dangerous. (Some made connections to the current war in Iraq, but that’s another story!)
My goal in both cases was to make students see points of view other than their own. Both incidents also reminded me it’s not just the religious or conservative kids who need to broaden their horizons. We need to get all students to see the world in Technicolor, even when their own social palettes are monochromatic. Indeed, the entire academy could do worse than remember the words of Dudley Field Malone, one of the lawyers who defended John T. Scopes. Malone remarked, “I have never in my life learned anything from any man who agreed with me.”
Robert E. Weir
Robert E. Weir is a visiting professor at Commonwealth College of the University of Massachusetts at Amherst and in the American studies program at Smith College.
Two images of William Jennings Bryan have settled into the public memory, neither of them flattering. One is the fundamentalist mountebank familiar to viewers of Inherit the Wind, with its fictionalized rendering of the Scopes trial. In it, the character based on Bryan proclaims himself “more interested in the Rock of Ages than the age of rocks.” He is, in short, a crowd-pleasing creationist numbskull, and nothing more.
The other portrait of Bryan is less cinematic, but darker. The classic version appears in Richard Hofstadter’s The American Political Tradition, first published in 1948 and still selling around 10,000 copies each year, according to a forthcoming biography of the historian. Hofstadter sketches the career of Bryan as a populist leader during the economic depression of the 1890s, when he emerged as the Midwest’s fierce and eloquent scourge of the Eastern bankers and industrial monopolies.
Yet this left-leaning Bryan had, in Hofstadter’s account, no meaningful program for change. He was merely a vessel of rage. Incapable of statesmanship, capable only of high-flown oratory, he was a relic of the agrarian past -- and the prototype of the fascistic demagogues who were discovering their own voices just as Bryan’s career was reaching its end.
Historians have been challenging these interpretations for decades -- beginning in earnest more than 40 years ago, with the scholarship of Lawrence W. Levine, who is now a professor of history and cultural studies at George Mason University. It was Levine who pointed out that when Bryan denounced evolution, he tended to be thinking more of Nietzsche than of Darwin. And the Nietzsche he feared was not today’s poststructuralist playboy, but the herald of a new age of militaristic brutality.
Still, old caricatures die hard. It may be difficult for the contemporary reader to pick up Michael Kazin’s new book, A Godly Hero: The Life of William Jennings Bryan (Knopf) without imagining that its title contains a snarl and a sneer. Isn’t the rhetoric of evangelical Christianity and anti-elitist sentiment always just a disguise for base motives and cruel intentions? To call someone godly is now, almost by default, to accuse them of hypocrisy.
But Kazin, who is a professor of history at Georgetown University, has a very different story to tell. Revisionist scholarship on Bryan -- the effort to dig beneath the stereotypes and excavate his deeper complexities -- has finally yielded a book that might persuade the general reader to rethink the political role played by “the Great Commoner.”
In an earlier study, The Populist Persuasion: An American History (Basic Books, 1995), Kazin described the emergence in the 19th century of an ideology he called “producerism” -- a moral understanding of politics as the struggle between those who built the nation’s infrastructure and those who exploited it. (The farmer or the honest businessman was as much a producer as the industrial worker. Likewise, land speculators and liquor dealers were part of the exploitive class, as were bankers and monopolistic scoundrels.)
The producerist ethos remains a strong undercurrent of American politics today. Bryan was its most eloquent spokesman. He wedded it to a powerful (and by Kazin’s account utterly sincere) belief that politics was a matter of following the commandment to love thy neighbor. As a man of his era, Bryan could be obtuse about how to apply that principle: His attitude toward black Americans was paternalistic, on a good day, and he was indifferent, though not hostile, concerning the specific problems facing immigrants. But Kazin points out that there is no sign of nativist malice in Bryan’s public or private communications. Some of his followers indulged in conspiratorial mutterings against the Catholics or the Jews, but Bryan himself did not. At the same time -- canny politician that he was -- he never challenged the growing power of the Klan during the 1920s.
It’s an absorbing book, especially for its picture of Bryan’s following. (He received an incredible amount of mail from his admirers, only about two percent of which, Kazin notes, has survived.) I contacted Kazin to ask a few questions by e-mail.
Q: By today's standards, Bryan seems like a bundle of contradictions. He was both a fundamentalist Christian and the spokesman for the left wing of the Democratic Party. He embodied a very 19th century notion of "character," but was also exceptionally shrewd about marketing his own personality. For many Americans, he was a beloved elder statesman -- despite losing his two presidential bids and otherwise spending very little time in elected office. How much of that contradictoriness is in Bryan himself, and how much in the eye of the beholder today?
A: Great question! The easiest part to answer is the first: for Bryan and many other reform-minded white Christians, there was no contradiction between their politics and their religion. The “revolution” being made by the Carnegies and Vanderbilts and Rockefellers was destroying the pious republic they knew, or wished to remember (slavery, of course, they forgot about). What Bryan called “applied Christianity” was the natural antidote to the poison of rampant capitalism. The rhetoric of Bellamy, the People’s Party, and the Knights of Labor was full of such notions -– as were the sermons and writings of many Social Gospelers, such as Washington Gladden and Charles Stelzle.
On the character-personality question -- I think Warren Susman and many historians he influenced over-dichotomize these two concepts. No serious Christian could favor the latter over the former. Yet the exigencies of the cultural marketplace, and of celebrity culture in particular, produced a fascination with the personal lives of the famous. So Bryan, who was as ego-obsessed as any politician, went with the flow, knowing his personality was boosting his political influence. Being a journalist himself, he understood the rules of the emerging game. Do you know Charles Ponce de Leon’s book about celebrity journalism in this period?
Q. Oddly enough, I do, actually. But let's talk about the people to whom Bryan appealed. From digging in the archives, you document that Bryan had a loyal following among professionals, and even successful businessmen, who saw themselves as part of the producing class threatened by the plutocratic elite. Was that surprising to discover? Normally you think of populism in that era as the politics of horny-handed toil.
A: As I argued in The Populist Persuasion, when “producerism” became a popular ideal in democratic politics, Americans from many classes were quite happy to embrace it. It thus became an essentially contested concept. But among a broad cross-section of people, the critique of finance capital was always stronger in the South and West, where Bryan had his most consistent support, than in places like Philly and NYC.
As for the letters -- I enjoyed that part of the research the most, although it was frustrating as hell to find almost no letters written during the campaign of 1908 and only a small number from then until the 1920s. If only WJB or his wife had revealed, somewhere, the criteria they used when dumping all that correspondence! That, at least, would have been a consolation. Of course, if they had kept nearly all of it, I’d still be there in the Manuscript Room at the Library of Congress, tapping away.
Q: I get the impression that Bryan might well have become president if women had been able to vote in 1896 or 1900. How much of his appeal came from expressing the moral and cultural ideals associated with women's "civilizing" role? And how much of it was sex appeal of his rugged personality and magnetic stage presence?
A: Ah, the counterfactuals! Bryan’s image as a “godly hero” certainly did appeal to many women, as did his eloquence and good looks (the latter, at least while he was in his 30s and early 40s). His support for prohibition and woman suffrage would have appealed to many women as well.
In 1896 and 1900, he carried most of the states where women then had the vote (in the Mountain West), though that may have been because of his free silver and anti-big-money stands, which is probably why most men in those states voted for him. On the other hand, his radical image could have limited his appeal to women elsewhere in the country. Women voters, before the 1960s, tended to vote for safe, conservative candidates.
Q: Another counterfactual.... What if Bryan had won? What sort of president would he have been? The man was great at making speeches; none better. But could he really have cut it as Chief Executive?
A: As president, he probably would have been a divisive figure, perhaps an American Hugo Chavez -- but without the benefit of oil revenues! If he had tried to carry out the 1896 platform, there might have been a capital strike against him, which would have brought on another depression. If he hadn’t, the Populists and many left Democrats would have deserted him. The sad fact is that he hadn’t built a strong enough constituency to govern, much less to win the election in the first place.
Q: Finally, a question about the subjective dimension of working on this biography. Any moments of profound disillusionment? Rapt admiration? Sudden epiphany?
A: I wish I had time to pursue this important question at length -- perhaps I’ll write an essay about it someday. But briefly: I started reading all those fan letters and experienced an epiphany. Millions of ordinary people adored this guy and thought he was a prophet! And he was certainly fighting the good fight -- against Mark Hanna and his friends, who were initiating the U.S. empire.
I also was impressed by his ability as a speech-writer as well as a performer. He could turn a phrase quite brilliantly. But after a year or so, I had to come to grips with his bigotry against black people and his general inability to overcome his mistrust of urban pols (although he didn’t share the anti-Catholicism and anti-Semitism of some of his followers).
The problem was, in the America of a century ago, Bryan would not have been a hero to the white evangelical grassroots if he had been as clever and cosmopolitan a pol as FDR. So I ended up with a historian’s sense of perspective about the limits of the perceptions and achievements of the past. In recent speeches, E.J. Dionne muses that perhaps we should ask “What Would William Jennings Bryan Do?” I’m not sure that’s a useful question.
During the early decades of the 20th century, a newspaper called The Avery Boomer served the 200 or so citizens of Avery, Iowa. It was irregular in frequency, and in other ways as well. Each issue was written and typeset by one Axel Peterson, a Swedish immigrant who described himself as "lame and crippled up," and who had to make time for his journalistic labors while growing potatoes. A member of the Socialist Party, he had once gained some notoriety within it for proposing that America’s radicals take over Mexico to show how they would run things. Peterson was well-read. He developed a number of interesting and unusual scientific theories -- also, it appears, certain distinctive ideas about punctuation.
Peterson regarded himself, as he put it, as "a Social Scientist ... developing Avery as a Social Experiment Station" through his newspaper. He sought to improve the minds and morals of the townspeople. This was not pure altruism. Several of them owed Peterson money; by reforming the town, he hoped to get it back.
But he also wanted citizens to understand that Darwin's theory of evolution was a continuation of Christ's work. He encouraged readers to accelerate the cause of social progress by constantly asking themselves a simple question: "What would Jesus do?"
I discovered the incomparable Peterson recently while doing research among some obscure pamphlets published around 1925. So it was a jolt to find that staple bit of contemporary evangelical Christian pop-culture -- sometimes reduced to an acronym and printed on bracelets -- in such an unusual context. But no accident, as it turns out: Peterson was a fan of the Rev. Charles M. Sheldon’s novel In His Steps (1896), which is credited as the source of the whole phenomenon, although he cannot have anticipated its mass-marketing a century later.
Like my wild potato-growing Darwinian socialist editor, Sheldon thought that asking WWJD? would have social consequences. It would make the person asking it “identify himself with the great causes of Humanity in some personal way that would call for self-denial and suffering,” as one character in the novel puts it.
Not so coincidentally, Garry Wills takes a skeptical look at WWJD in the opening pages of his new book, What Jesus Meant, published by Viking. He takes it as a variety of spiritual kitsch -- an aspect of the fundamentalist and Republican counterculture, centered around suburban mega-churches placing a premium on individual salvation.
In any case, says Wills, the question is misleading and perhaps dangerous. The gospels aren’t a record of exemplary moments; the actions of Jesus are not meant as a template. “He is a divine mystery walking among men,” writes Wills. “The only way we can directly imitate him is to act as if we were gods ourselves -- yet that is the very thing he forbids.”
Wills, a professor emeritus of history at Northwestern University, was on the way to becoming a Jesuit when he left the seminary, almost 50 years ago, to begin writing for William F. Buckley at National Review. At the time, that opinion magazine had a very impressive roster of conservative literary talent; its contributors included Joan Didion, Hugh Kenner, John Leonard, and Evelyn Waugh. (The mental firepower there has fallen off a good bit in the meantime.) Wills came to support the civil rights movement and oppose the Vietnam War, which made for a certain amount of tension; he parted ways with Buckley’s journal in the early 1970s. The story is told in his Confessions of a Conservative (1979) -- a fascinating memoir, intercalated with what is, for the nonspecialist anyway, an alarmingly close analysis of St. Augustine’s City of God.
Today -- many books and countless articles later -- Wills is usually described as a liberal in both politics and theology, though that characterization might not hold up under scrutiny. His outlook is sui generis, like that of some vastly more learned Axel Peterson.
His short book on Jesus is a case in point. You pick it up expecting (well, I did, anyway) that Wills might be at least somewhat sympathetic to the efforts of the Jesus Seminar to identify the core teachings of the historical Jesus. Over the years, scholars associated with the seminar cut away more and more of the events and sayings attributed to Jesus in the four gospels, arguing that they were additions, superimposed on the record later.
After all this winnowing, there remained a handful of teachings -- turn the other cheek, be a good Samaritan, love your enemies, have faith in God -- that seemed anodyne, if not actually bland. This is Jesus as groovy rabbi, urging everybody to just be nice. Which, under the circumstances, often seems to the limit of moral ambition available to the liberal imagination.
Wills draws a firm line between his approach and that of the Jesus Seminar. He has no interest in the scholarly quest for “the historical Jesus,” which he calls a variation of fundamentalism: “It believes in the literal sense of the Bible,” writes Wills; “it just reduces the Bible to what it can take as literal quotations from Jesus.” Picking and choosing among the parts of the textual record is anathema to him: “The only Jesus we have,” writes Wills, “is the Jesus of faith. If you reject the faith, there is no reason to trust anything the Gospels say.”
He comes very close to the position put forward by C.S. Lewis, that evangelical-Christian favorite. “A man who was merely a man and said the sort of things Jesus said,” as Lewis put it, “would not be a great moral teacher. He would either be a lunatic -- on a level with the man who says he is a poached egg -- or else he would be the Devil of Hell. You must make your choice. Either this man was, and is, the Son of God; or else a madman or something worse.”
That’s a pretty stark range of alternatives. For now I’ll just dodge the question and run the risk of an eternity in weasel hell. Taking it as a given that Jesus is what the Christian scriptures say he claimed to be -- “the only-begotten Son of the Father” -- Wills somehow never succumbs to the dullest consequence of piety, the idea that Jesus is easy to understand. “What he signified is always more challenging than we expect,” he writes, “more outrageous, more egregious.”
He was, as the expression goes, transgressive. He “preferred the company of the lowly and despised the rich and powerful. He crossed lines of ritual purity to deal with the unclean – with lepers, the possessed, the insane, with prostitutes and adulterers and collaborators with Rome. (Was he subtly mocking ritual purification when he filled the waters with wine?) He was called a bastard and was rejected by his own brothers and the rest of his family.”
Some of that alienation had come following his encounter with John the Baptist -- as strange a figure as any in ancient literature: “a wild man, raggedly clad in animal skins, who denounces those coming near to him as ‘vipers’ offspring.’” Wills writes that the effect on his family must have been dismaying: “They would have felt what families feel today when their sons or daughters join a ‘cult.’”
What emerges from the gospels, as Wills tells it, is a figure so abject as to embody a kind of permanent challenge to any established authority or code of propriety. (What would Jesus do? Hang out on skid row, that’s what.) His last action on earth is to tell a criminal being executed next to him that they will be together in paradise.
Wills says that he intends his book to be a work of devotion, not of scholarship. But the latter is not lacking. He just keeps it subdued. Irritated by the tendency for renderings of Christian scripture to have an elevated and elegant tone, Wills, a classicist by training, makes his own translations. He conveys the crude vigor of New Testament Greek, which has about as much in common with that of Plato as the prose of Mickey Spillane does with James Joyce. (As Nietzsche once put it: “It was cunning of God to learn Greek when He wished to speak to man, and not to learn it better.”)
Stripping away any trace of King James Version brocade, Wills leaves the reader with Jesus’s words in something close to the rough eloquence of the public square. “I say to all you who can hear me: Love your foes, help those who hate you, praise those who curse you, pray for those who abuse you. To one who punches your cheek, offer the other cheek. To one seizing your cloak, do not refuse the tunic under it. Whoever asks, give to him. Whoever seizes, do not resist. Exactly how you wish to be treated, in that way treat others.... Your great reward will be that you are the children of the Highest One, who also favors ingrates and scoundrels.”
A bit of sarcasm, perhaps, there at the end -- which is something I don’t remember from Sunday school, though admittedly it has been a while. The strangeness of Jesus comes through clearly; it is a message that stands all “normal” values on their head. And it gives added force to another remark by Nietzsche: “In truth, there was only one Christian, and he died on the cross.”
Thursday was a long day -- one spent with my brain marinating in historiography. I passed the morning with a stack of JSTOR printouts about Richard Hofstadter, whose The American Political Tradition (1948) still sells about 10,000 copies a year. Hofstadter died in 1970. Enough time has passed for his reputation to have been overthrown, restored, and overthrown again. (As someone who grew up listening to theories about the JFK assassination on talk radio in Texas, I can take anti-Hofstadter revisionism seriously only up to a certain point. The man who wrote a book diagnosing The Paranoid Style in American Politics seems like a strong candidate for immortality.)
Only now is there a full-length treatment of his life, Richard Hofstadter: An Intellectual Biography by David S. Brown, just published by the University of Chicago Press. Asked by a magazine to review it, I have been going over the footnotes and making a long march through the secondary literature. Which is easier than writing, of course, and a lot more fun -- the kind of serious-minded procrastination that requires hours. It sure ate up the morning.
Then, after lunch, I headed over to the Washington Hilton to pick up press credentials for the annual convention of the Organization of American Historians. Tens of thousands of bloodthirsty jihadist-commie professors are infesting the nation’s campuses, as you have probably been reading of late -- with the historians being a particularly vile lot, turning almost the entire discipline into one big Orwellian indoctrination camp. “Now this,” I thought, “I gotta see.”
Going over the program, it was particularly interesting to notice a session called “The Creation of the Christian Right.” If the rumors were even half true, it would be one long rant against the Bush administration. Each paper would (to renew the Orwell bit) provide the standard Fifteen Minutes Hate, right?
Maybe that should be Twenty Minutes. Who ever keeps within time limits?
Actually, no. Everybody was calm and nobody ran over. The first paper looked at how Protestant and Roman Catholic social conservatives overcame their mutual distrust during the 1950s. Another analyzed the relationship between Billy Graham and Richard Nixon. The third and final presentation argued that the anti-abortion movement played a very minor role in defining the conservative agenda until it got a plank in the GOP’s platform in 1976. (That same year, when Betty Ford told a reporter that she considered the Roe v. Wade decision to be a fine thing, her comment appeared in the 20th paragraph of an article on page 16 of The New York Times. A First Lady from the GOP making that statement anytime since then would have gotten a little more attention.)
Each presentation was the work of someone who had done substantial work among primary sources, including archival material. The researchers were alert to how the different factions and constituencies of the conservative movement interacted with one another.
But fervor, condemnation, editorializing by proxy? Not a bit of it.
For that matter, you couldn’t even hear the sort of ironic disdain that Hofstadter, writing decades ago, brought to analyzing McCarthyism or the Goldwater campaign. That tone had reflected the Mencken-esque judgment that American conservatism was just another manifestation of boobery and yahooism.
It was puzzling. If ever a session seemed likely to provide a concentrated dose of jihadist-commie propaganda, it would be one called “The Creation of the Christian Right.” Chances are, the young scholars giving papers did have political opinions. But they did not use the podium as a soapbox.
I guess they had been brainwashed by the OAH into practicing the most disinterested, rigorous sort of professional historical inquiry. Apart from being dangerous, those professors sure are sneaky. You’d almost think they were trying to make somebody look like a boob and a yahoo.
Later, another panel discussed the history of the idea of "the liberal establishment." Once again, I went expecting a strident call to destroy the Great Satan of the American Empire. And once again, it was all careful research and calm reason -- despite the fact that the scholar invited to respond to the papers was Michael Kazin, who had even made David Horowitz’s list.
Between sessions, there was time to visit the exhibit hall. It was a chance to gaze upon recent offerings from the university presses. All the while, a small but very persistent voice whispered in my ear. “You don’t need more books,” the voice said. “Where would you put them?”
It sounded a lot like my wife.
Other conference-goers were wandering the aisles, men and women of all ages; and some bore expressions suggesting that they, too, received similar wireless transmissions from significant others back home. And yet those people picked up the new books, even so. I took courage from their example.
That evening, at a Chinese restaurant a few blocks downhill, I joined a group of convention-goers, most of them associated with Cliopatria, the group blog published by the History News Network. The gathering was all "off the record" -- an occasion for conviviality, rather than for news-gathering. But the relaxed flow of the proceedings took an odd turn around the time the main course arrived.
That was when someone indicated that it might be time for historians to work on a topic that I know rather well -- that, indeed, I had witnessed and to some degree participated in. And that was the late and much-lamented magazine Lingua Franca, the subtitle of which called it “The Review of Academic Life.”
That day, on the Web site of The New York Observer, there had appeared an essay on LF by Ron Rosenbaum -- the author of, among other things, a brilliant and unnerving book called Explaining Hitler.
In his piece, Rosenbaum lauded the magazine as a place that did not merely report on university life, but encouraged "thinking about the nature of human nature and human society, the nature of the cosmos, the nature of the mind itself (thinking about factors that underlie all politics)." Similar tributes were being offered around the table as the dishes were delivered. Somebody compared LF to Partisan Review. One historian suggested that it was time for a monograph.
Meanwhile I chewed my tongue quietly. Between 1995 and 2001, I had been a regular contributor to the magazine. Not that many publications with large audiences would let you write about the literary criticism of Northrop Frye, the philosophical architectonics of Richard McKeon, or the strange little pamphlet that Immanuel Kant wrote about the mystical visions of Emmanuel Swedenborg. Even fewer would then pay you. Now it molders in “the elephants’ graveyard of dead magazines.”
Elephants are supposed to have powerful memories, of course. Now it seems to be time for the historians of journalism to do the remembering. But when I look back at that period, it’s not to recall the glory days. There are too many recollections of botched opportunities and missed deadlines, and the occasional wince-inducing editorial decision. A few droplets of bad blood are sprayed across the sepia-toned mental snapshots. If I tried to write about LF, the result would probably be a satirical novel instead of a eulogy.
It might sound vaguely flattering to imagine that part of one’s own experience will probably, sooner or later, be studied by intelligent people. But in fact it is a little disconcerting.
Scholars will notice aspects of the past that you did not. There will be things charged with indelible personal significance for you that nobody else will recognize. It is hard not to cling to those nuances. To assume that you have a privileged relationship to the past, simply by virtue of having been there. But that’s not how history works.
No, the right attitude is probably the one cultivated by Richard Hofstadter. He was a master at grasping the paradoxes defining his discipline. Few writers have better captured the gap between what people in the past *thought* they were doing, on the one hand, and what their actions actually meant, on the other.
Hofstadter once cited a passage from Nietzsche that summed up his own outlook. “Objection, evasion, joyous distrust, and love of irony are signs of health,” the quotation ran. “Everything absolute belongs to pathology.” It’s worth keeping in mind when thinking about the private history called memory -- not to mention the yet-unwritten history whizzing past, every hour of every day.
A young Web designer named Aaron Swartz has now created a mirror of the long-defunct Lingua Franca Web site.
For a considerably less impressionistic account of the convention, check out Rick Shenkman’s fine roundup of the OAH proceedings.
For better and for worse, the American reception of contemporary French thought has often followed a script that frames everything in terms of generational shifts. Lately, that has usually meant baby-boomer narcissism -- as if the youngsters of '68 don't have enough cultural mirrors already. Someone like Bernard-Henri Lévy, the roving playboy philosopher, lends himself to such branding without reserve. Most of his thinking is adequately summed up by a thumbnail biography -- something like, "BHL was a young Maoist radical in 1968, but then he denounced totalitarianism, and started wearing his shirts unbuttoned, and the French left has never recovered."
Nor are American academics altogether immune to such prepackaged blendings of theory and lifestyle. Hey, you -- the Foucauldian with the leather jacket that doesn't fit anymore....Yeah, well, you're complicit too.
But there are thinkers who don't really follow the standard scripts very well, and Pierre Rosanvallon is one of them. Democracy Past and Future, the selection of his writings just published by Columbia University Press, provides a long-overdue introduction to a figure who defies both sound bites and the familiar academic division of labor. Born in 1948, he spent much of the 1970s as a sort of thinker-in-residence for a major trade union, the Confédération Française Démocratique du Travail, for which he organized seminars and conferences seeking to create a non-Marxist "second left" within the Socialist Party. He emerged as a theoretical voice of the autogestion (self-management) movement. His continuing work on the problem of democracy was honored in 2001 when he became a professor at the Collège de France, where Rosanvallon lectures on the field he calls "the philosophical history of the political."
Rosanvallon has written about the welfare state. Still, he isn't really engaged in political science. He closely studies classical works in political philosophy -- but in a way that doesn't quite seem like intellectual history, since he's trying to use the ideas as much as analyze them. He has published a study of the emergence of universal suffrage that draws on social history. Yet his overall project -- that of defining the essence of democracy -- is quite distinct from that of most social historians. At the same time (and making things all the more complicated) he doesn't do the kind of normative political philosophy one now associates with John Rawls or Jürgen Habermas.
Intrigued by a short intellectual autobiography that Rosanvallon presented at a conference a few years ago, I was glad to see the Columbia volume, which offers a thoughtful cross-section of texts from the past three decades. The editor, Samuel Moyn, is an assistant professor of history at Columbia. He answered my questions on Rosanvallon by e-mail.
Q: Rosanvallon is of the same generation as BHL. They sometimes get lumped together. Is that inevitable? Is it misleading?
A: They are really figures of a different caliber and significance, though you are right to suggest that they lived through the same pivotal moment. Even when he first emerged, Bernard-Henri Lévy faced doubts that he mattered, and a suspicion that he had fabricated his own success through media savvy. One famous thinker asked whether the "new philosophy" that BHL championed was either new or philosophy; and Cornelius Castoriadis attacked BHL and others as "diversionists." Yet BHL drew on some of the same figures Rosanvallon did -- Claude Lefort for example -- in formulating his critique of Stalinist totalitarianism. But Lefort, like Castoriadis and Rosanvallon himself, regretted the trivialization that BHL's meteoric rise to prominence involved.
So the issue is what the reduction of the era to the "new philosophy" risks missing. In retrospect, there is a great tragedy in the fact that BHL and others constructed the "antitotalitarian moment" (as that pivotal era in the late 1970s is called) in a way that gave the impression that a sententious "ethics" and moral vigilance were the simple solution to the failures of utopian politics. And of course BHL managed to convince some people -- though chiefly in this country, if the reception of his recent book is any evidence -- that he incarnated the very "French intellectual" whose past excesses he often denounced.
In the process, other visions of the past and future of the left were ignored. The reception was garbled -- but it is always possible to undo old mistakes. I see the philosophy of democracy Rosanvallon is developing as neither specifically French nor of a past era. At the same time, the goal is not to substitute a true philosopher for a false guru. The point is to use foreign thinkers who are challenging to come to grips with homegrown difficulties.
Q: Rosanvallon's work doesn't fit very well into some of the familiar disciplinary grids. One advantage of being at the Collège de France is that you get to name your own field, which he calls "the philosophical history of the political." But where would he belong in terms of the academic terrain here?
A: You're right. It's plausible to see him as a trespasser across the various disciplinary boundaries. If that fact makes his work of potential interest to a great many people -- in philosophy, politics, sociology, and history -- it also means that readers might have to struggle to see that the protocols of their own disciplines may not exhaust all possible ways of studying their questions.
But it is not as if there have not been significant interventions in the past -- from Max Weber for example, or Michel Foucault in living memory -- that were recognized as doing something relevant to lots of different existing inquiries. In fact, that point suggests that it may miss the point to try to locate such figures on disciplinary maps that are ordinarily so useful. If I had to sum up briefly what Rosanvallon is doing as an intellectual project, I would say that the tradition of which he's a part -- which includes his teacher Lefort as well as some colleagues like Marcel Gauchet and others -- is trying to replace Marxism with a convincing alternative social theory.
Most people write about Marxism as a political program, and of course any alternative to it will also have programmatic implications. But Marxism exercised such appeal because it was also an explanatory theory, one that claimed, by fusing the disciplines, to make a chaotic modern history -- and perhaps history as a whole -- intelligible. Its collapse, as Lefort's own teacher Maurice Merleau-Ponty clearly saw, threatened to leave confusion in its wake, unless some alternative were available. (Recall Merleau-Ponty's famous proclamation: "Marxism is not a philosophy of history; it is the philosophy of history, and to renounce it is to dig the grave of reason in history.")
Rosanvallon seems to move about the disciplines because, along with others in the same school, he has been trying to put together a total social theory that would integrate all the aspects of experience into a convincing story. They call the new overall framework they propose "the political," and Rosanvallon personally has focused on making sense of democratic modernity in all its facets. Almost no one I know about in the Anglo-American world has taken up so ambitious and forbidding a transdisciplinary task, but it is a highly important project.
Q: As the title of your collection neatly sums up, Rosanvallon's definitive preoccupation is democracy. But he's not just giving two cheers for it, or drawing up calls for more of it. Nor is his approach, so far as I can tell, either descriptive or prescriptive. So what is left for a philosopher to do?
A: At the core of his conception of democracy, there is a definitive problem: The new modern sovereign (the "people" who now rule) is impossible to identify or locate with any assurance. Democracy is undoubtedly a liberatory event -- a happy tale of the death of kings. But it must also face the sadly intractable problem of what it means to replace them.
Of course, the history of political theory contains many proposals for discovering the general will. Yet empirical political scientists have long insisted that "the people" do not preexist the procedures chosen for knowing their will. In different words, "the people" is not a naturally occurring object. Rosanvallon's work is, in one way or another, always about this central modern paradox: If, as the U.S. Constitution for instance says, "We the people" are now in charge, it is nevertheless true that we the people have never existed together in one place, living at one time, speaking with one voice. Who, then, is to finally say who "we" are?
The point may seem either abstract or trivial. But the power of Rosanvallon's work comes from his documentation of the ways -- sometimes blatant and sometimes subtle -- that much of the course and many of the dilemmas of modern history can be read through the lens of this paradox. For example, the large options in politics can also be understood as rival answers to the impossible quandary or permanent enigma of the new ruler's identity. Individual politicians claim special access to the popular will either because they might somehow channel what everyone wants or because they think that a rational elite possesses ways of knowing what the elusive sovereign would or should want. Democracy has also been the story, of course, of competing interpretations of what processes or devices are most likely to lead to results approximating the sovereign will.
Recently, Rosanvallon has begun to add to this central story by suggesting that there have always been -- and increasingly now are -- many ways outside electoral representation in which the people can manifest their will, even as the very idea that there exists a coherent people with a single will has entered a profound crisis.
One of the more potent implications of Rosanvallon's premise that there is no right answer to the question of the people's identity is that political study has to be conceptual but also historical. Basic concepts like the people might suggest a range of possible ways for the sovereign will to be interpreted, but only historical study can uncover the rich variety of actual responses to the difficulty.
The point, Rosanvallon thinks, is especially relevant to political theorists, who often believe they can, simply by thinking hard about what democracy must mean, finally emerge with its true model, whether based on a hypothetical contract, an ideal of deliberation, or something else. But the premise also means that democracy's most basic question is not going to go away, even if there are better and worse responses.
Q: Now to consider the relationship between Rosanvallon's work and political reality "on the ground" right now. Let's start with a domestic topic: the debate over immigration. Or more accurately, the debate over the status of people who are now part of the U.S. economy, but are effectively outside the polity. I'm not asking "what would Rosanvallon do?" here, but rather wondering: Does his work shed any light on the situation? What kinds of questions or points would Rosanvallonists (assuming there are any) be likely to raise in the discussion?
A: It's fair to ask how such an approach might help in analyzing contemporary problems. But his approach always insists on restoring the burning issues of the day to a long historical perspective, and on relating them to democracy's foundational difficulties. Without pretending to guess what Rosanvallon might say about America's recent debate, I might offer a couple of suggestions about how his analysis might begin.
The controversy over immigrants is so passionate, this approach might begin by arguing, not simply because of economic and logistical concerns but also because it reopens (though it was never closed!) the question of the identity of the people in a democracy. The challenge immigrants pose, after all, is not one of inclusion simply in a cultural sense, as Samuel Huntington recently contended, but also and more deeply in a conceptual sense.
In a fascinating chapter of his longest work, on the history of suffrage, Rosanvallon takes up the history of French colonialism, including its immigrant aftermath. There he connects different historical experiences of immigrant inclusion to the conceptual question of what the criteria for exclusion are, arguing that if democracies do not come to a clear resolution about who is inside and outside their polity, they will vacillate between two unsatisfactory syndromes. One is the "liberal" response of taking mere presence on the ground as a proxy for citizenship, falsely converting a political problem into one of future social integration. The other is the "conservative" response of conceptualizing exclusion, having failed to resolve its meaning politically, in the false terms of cultural, religious, or even racial heterogeneity. Both responses avoid the real issue of the political boundaries of the people.
But Rosanvallon's more recent work allows for another way of looking at the immigration debate. In a new book coming out in French in the fall entitled "Counterdemocracy," whose findings are sketched in a preliminary and summary fashion in the fascinating postscript to the English-language collection, Rosanvallon tries to understand the proliferation of ways that popular expression occurs outside the classical parliamentary conception of representation. There, he notes that immigration is one of several issues around which historically "the people" have manifested their search for extraparliamentary voice.
For Rosanvallon, the point here is not so much to condemn populist backlash, as if it would help much simply to decry the breakdown of congressional lawmaking under pressure. Rather, one might have to begin by contemplating the historical emergence of a new form of democracy -- what he calls unpolitical democracy -- that often crystallizes today around such a hot-button topic as the status of immigrants. This reframing doesn't solve the problem, but it might help us see that its details are implicated in a general transformation of how democracy works.
Q: OK, now on to foreign policy. In some circles, the invasion of Iraq was justified as antitotalitarianism in action, and as the first stage of a process of building democracy. (Such are the beauty and inspiration of high ideals.) Does Rosanvallon's work lend itself to support for "regime change" via military means? Has he written anything about "nation building"?
A: This is a very important question. I write in my introduction to the collection about the contemporary uses of antitotalitarianism, and I do so mainly to criticize the recent drift in uses of that concept.
Of course, when the critique of totalitarianism activated a generation, it was the Soviet Union above all that drew their fire. But their critique was always understood to have its most salient implications for the imagination of reform at home, and especially for the renewal of the left. This is what has changed recently, in the works of those "liberal hawks," like Peter Beinart and Paul Berman, who made themselves apologists for the invasion of Iraq in the name of antitotalitarian values. Not only did they eviscerate the theoretical substance on which the earlier critique of totalitarianism drew -- from the work of philosophers like Hannah Arendt and Claude Lefort, among others -- but they wholly externalized the totalitarian threat, so that their critique of it no longer had any connection to a democratic program. It became purely a rhetoric for the overthrow of enemies rather than a program for the creation or reform of democracies. In the updated approach, what democracy is does not count as a problem.
It is clear that this ideological development, with all of its real-world consequences, has spelled the end of the antitotalitarian coalition that came together across borders thirty years ago, uniting the European left (Eastern and Western) with American liberalism. That the attempt to update and externalize that project had failed became obvious even before the Iraq adventure came to grief: the project garnered too few allies internationally.
Now it is perfectly true that the dissolution of this consensus leaves open the problem of how democrats should think about foreign policy, once spreading it evangelistically has been unmasked as delusional or imperialistic. A few passages in the collection suggest that Rosanvallon thinks the way to democratize the world is through democratization of existing democracies -- the reinvigoration of troubled democracies is prior to the project of their externalization and duplication. Clearly this response will not satisfy anyone who believes that the main problem in the world is democracy's failure to take root everywhere, rather than its profound difficulties where it already is. But clarifying the history and present of democracy inside is of undoubted relevance to its future outside.
Q: There are some very striking passages in the book that discuss the seeming eclipse of the political now. More is involved than the withdrawal from civic participation into a privatized existence. (At the same time, that's certainly part of it.) Does Rosanvallon provide an account of how this hollowing-out of democracy has come to pass? Can it be reversed? And would its reversal necessarily be a good thing?
A: One of the most typical responses to the apparent rise of political apathy in recent decades has been nostalgia for some prior society -- classical republics or early America are often cited -- supposed to have featured robust civic engagement. The fashion of "republicanism" in political theory, from Arendt to Michael Sandel or Quentin Skinner, is a good example. But Rosanvallon observes that the deep explanation for what is happening is a collapse of the model of democracy based on a powerful will.
The suggestion here is that the will of the people is not simply hard to locate or identify; its very existence as the foundation of democratic politics has become hard to credit. The challenge is to respond by taking this transformation as the starting point of the analysis. And there appears to be no return to what has been lost.
But in his new work, anticipated in the postscript, Rosanvallon shows that the diagnosis may be faulty anyway. What is really happening, he suggests, is not apathy towards or retreat from politics in a simple sense, but the rise of new forms of democracy -- or counterdemocracy -- outside the familiar model of participation and involvement. New forms seeking expression have multiplied, through an explosion of devices, even if they may seem an affront to politics as it has ordinarily been conceptualized.
Rosanvallon's current theory is devoted to the project of putting the multiplication of representative mechanisms -- ones that do not fit on existing diagrams of power -- into one picture. But the goal, he says, is not just to make sense of them but also to find a way for analysis to lead to reform. As one of Rosanvallon's countrymen and predecessors, Alexis de Tocqueville, might have put it: Democracy still requires a new political science, one that can take it by the hand and help to sanctify its striving.
For further reading: Professor Moyn is co-author (with Andrew Jainchill of the University of California at Berkeley) of an extensive analysis of the sources and inner tensions of Rosanvallon's thought on democracy, available online. And in an essay appearing on the openDemocracy Web site in 2004, Rosanvallon reflected on globalization, terrorism, and the war in Iraq.