Necessary Evils

  "In a time of war," wrote Cicero, "the laws are silent." (That's "inter arma silent leges," in case some nuance is missing from the usual English rendering.)

Well, perhaps not quite silent. Marouf A. Hasian's In the Name of Necessity: Military Tribunals and the Loss of American Civil Liberties, available next month from the University of Alabama Press, revisits more than 200 years of American argumentation for and against the legitimacy of "military justice."

That phrase merits the scare quotes because it is very much open to question whether the two words quite belong together. You don't need to be a pacifist, or even to harbor any doubts about liberal democracy, to have such concerns. The role of the military is, of course, to fight; and the legitimacy of its monopoly on violence derives (in modern societies, anyway) from its subordination to a lawful order. At best -- so the argument might go -- the military can pursue a just policy, subject to oversight and review by outside institutions. Hence the rise of what is called the "civilianization" of military law.

That's the theory, anyway. The actual record is a good bit messier, as Hasian, an associate professor of communications at the University of Utah, shows in some detail. His book presents a series of analytic retellings of events from the Revolutionary War through the detainments at Guantanamo Bay. To some degree, then, it overlaps with William Rehnquist's All the Laws But One: Civil Liberties in Wartime (1998), which focused mainly on cases from the Civil War and the two World Wars.

But the difference is not simply a matter of the opening of a whole new chapter in history over the past four years. Hasian's book is the work of a scholar who has taken "the rhetorical turn" -- drawing on the toolkit of concepts from one of the founding disciplines of humanistic study. A social historian or a law professor might also cover, as he does, the 1862 U.S.-Dakota war tribunal, which led to the execution of a group of Native Americans -- or the 1942 trial of several German saboteurs, captured shortly after they had been deposited on the coasts of New York and Florida, along with bomb-making materials, by U-boat. But Hasian treats these cases neither as events (as a historian would) nor as precedents (the lawyer's concern).

The emphasis in his book falls, rather, on how a particular element of persuasion took shape in each case: the argument of necessity. In each case, the claim was made that circumstances demanded the suspension of normal legal procedures and guarantees, and their replacement by military tribunals that practiced the warlike virtues of secrecy, efficiency, and swiftness.

A philosopher or legal theorist might want to dissect the validity, coherence, or applicability of "necessity" as a principle applied in such cases. Hasian's approach treats it, not as a concept, but as what rhetoric scholars have in recent years called an "ideograph" -- that is, "a key evocative term or phrase that illustrates the political allegiances of an individual and a community in a major social, political, economic, or legal controversy." Other ideographs include such terms as "equality," "progress," and "freedom."

The range of definitions and of emotional charge for each term varies. They have a rather timeless sound, but a complex history of mutations in meaning. And in the heat of debate, they can be made to perform a variety of functions. The meaning of an ideograph in a given context is marked by that context.

Perhaps the strongest version of the argument from necessity is the one that Lincoln made against those who criticized him for suspending habeas corpus during the Civil War: "Are all the laws, but one, to go unexecuted, and the government itself go to pieces, lest that one be violated?" In other words: Moments of extremity can require the temporary sacrifice of some civil liberties to preserve the rest.

Rehnquist signaled his basic agreement with this line of thought by titling his book All the Laws But One. "It is neither desirable nor is it remotely likely," he wrote there, "that civil liberty will occupy as favored a position in wartime as it does in peacetime."

But even the fairly straightforward affirmation of necessity as a legitimate ground for suspending civil liberties is the result of (and a moment of conflict within) a complicated history of arguments. In tracing out the history of necessity, Hasian identifies two strands of -- well, it's not clear what the expression would be. Perhaps "ideographic DNA"? One he calls the "Tory" concept of necessity; the other, the "Whig" version.

In the Tory framing, there are "many times when a society is called upon to defend itself against riots, revolutions, and rebellions," as Hasian puts it. It is the responsibility of the monarch or the executive branch to recognize the danger and respond accordingly. "Since this is an issue of survival, the military authorities should be given a great deal of discretion. In these situations, the 'will' of those in authority will be of paramount importance."

(In other words, an element of sovereign authority is handed over to the military. The commanding officer is then in the position to say, "I am the law." And legitimately so.)

By contrast, the Whiggish conception of necessity sees "relatively few times when a society has to worry about exigent circumstances." Responsibility for judging whether or not a real emergency exists should fall to the parliament or the legislative branch -- to which the military must remain accountable.

Appropriately enough, given a Whiggish sensibility, this means a certain degree of guardedness and jealousy about the degree of judicial authority delegated to the military. There will be a tendency towards suspicion that the trust might be abused. The Whig discourse on necessity wants to keep to a bare minimum the scope, duration, and degree of secrecy that military tribunals may claim.

The classic formulation of the Whig conception in American history is Ex parte Milligan, from 1866, in which the Supreme Court found that the Union military authorities had overstepped by arresting and trying a Confederate sympathizer in Indiana -- a state where the normal functioning of the court system had not been interrupted by the war.

Of course, Ex parte Milligan fans have taken some hits lately. We had a good run there, for a while. But lately all the swagger comes from enthusiasts for Ex parte Quirin (1942), which denied the claim of German saboteurs to appeal for civil trials.

What makes Hasian's account of Quirin so interesting is his suggestion that some Supreme Court justices "actually thought that their decision would be construed as falling in line with the precedents that placed limits on military commissions and executive power." But if that was the intention 60 years ago, you'd never know it to read the newspapers today.

This is an aerial overview of In the Name of Necessity. The real provocations are in the details. Perhaps the analytic category of the ideograph sounds a trifle thin -- turning bloody arguments into something rather anemic. But Hasian's book is ultimately more polemical than that. The framework is "value neutral" only in the most technical sense. He's got a position to stake out.

"In the very, very rare cases of extreme necessity," he writes, "when Congress and the United Nations have decided we need to impose martial law or have commissions in occupied lands, we may have situations where all of the civil courts are closed and where the military may need more discretion."

That much, Hasian will concede to the Tory worldview, and no more. But even then, such assertions of power "need to be held in check by recognizing that most of the time we should begin with the baseline 'Whig' assumption that we want to maintain the civilianization of the military, and not the other way around."

OK, fair enough. Now how will that play out in the courts under Chief Justice Roberts? And particularly in a circumstance in which the Tories are so powerful that nobody really doubts he will be presiding?

That Whig in extremis John Milton said that necessity is "ever the tyrant's plea." But we might be entering a period when the plea doesn't even have to be made -- when war doesn't silence law, but writes it.

Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. Suggestions and ideas for future columns are welcome.

Hitler -- the Classic?

It is disagreeable to approach the cashier with a book called How to Read Hitler. One way to take the stink off would be to purchase one or two other volumes in the new How to Read series published by W. W. Norton, which also includes short guides to Shakespeare, Nietzsche, Freud, and Wittgenstein. But at the time, standing in line at a neighborhood bookstore a couple weeks ago, I wasn't aware of those other titles. (The only thing mitigating the embarrassment was knowing that my days as a skinhead, albeit a non-Nazi one, are long over.) And anyway, the appearance of Adolf Hitler in such distinguished literary and philosophical company raises more troubling questions than it resolves.

"Intent on letting the reader experience the pleasures and intellectual stimulation in reading classic authors," according to the back cover, "the How to Read series will facilitate and enrich your understanding of texts vital to the canon." The series editor is Simon Critchley, a professor of philosophy at the New School in New York City, who looms ever larger as the guy capable of defending poststructuralist thought from its naysayers. Furthermore, he's sharp and lucid about it, in ways that might just persuade those naysayers to read Derrida before denouncing him. (Yeah, that'll happen.)

Somehow it is not that difficult to imagine members of the National Association of Scholars waving around the How to Read paperbacks during Congressional hearings, wildly indignant at Critchley's implicit equation of Shakespeare and Hitler as "classic authors" who are "vital to the canon."

False alarm! Sure, the appearance of the Führer alongside the Bard is a bit of a provocation. But Neil Gregor, the author of How to Read Hitler, is a professor of modern German history at the University of Southampton, and under no illusions about the Führer's originality as a thinker or competence as a writer.

About Mein Kampf, Gregor notes that there is "an unmistakably 'stream of consciousness' quality to the writing, which does not appear to have undergone even the most basic editing, let alone anything like polishing." Although Gregor does not mention it, the title Hitler originally gave to the book reveals his weakness for the turgid and the pompous: Four and a Half Years of Struggle against Lies, Stupidity and Cowardice. (The much snappier My Struggle was his publisher's suggestion.)

Incompetent writers make history, too. And learning to read them is not that easy. The fact that Hitler had ideas, rather than just obsessions, is disobliging to consider. Many of the themes and images in his writing reflect an immersion in the fringe literature of his day -- the large body of ephemeral material analyzed by Fritz Stern in his classic study The Politics of Cultural Despair: A Study in the Rise of the Germanic Ideology.

But Gregor for the most part ignores this influence on Hitler. He emphasizes, instead, the elements of Hitler's thinking that were, in their day, utterly mainstream. He could quote whole paragraphs of Carl von Clausewitz on strategy. And his racist world view drew out the most virulent consequences of the theories of Arthur de Gobineau and Houston Stewart Chamberlain. (While Hitler was dictating his memoirs in a prison following the Beer Hall Putsch, he could point with admiration to one effort to translate their doctrines into policy: The immigration restrictions imposed in the United States in the 1920s.)

Gregor's method is to select passages from Mein Kampf and from an untitled sequel, published posthumously as Hitler's Second Book. He then carefully unpacks them -- showing what else is going on within the text, beneath the level of readily paraphrasable content. With his political autobiography, Hitler was not just recycling the standard complaints of the extreme right, or indulging in Wagnerian arias of soapbox oratory. He was also competing with exponents of similar nationalist ideas. He wrote in order to establish himself as the (literally) commanding figure in the movement.

So there is an implicit dialogue going on, disguised as a rather bombastic monologue. "Long passages of Hitler's writings," as Gregor puts it, "take the form of an extended critique of the political decisions of the late nineteenth century.... Hitler reveals himself not only as a nationalist politician and racist thinker, but -- this is a central characteristic of fascist ideology -- as offering a vision of revitalization and rebirth following the perceived decay of the liberal era, whose failings he intends to overcome."

The means of that "overcoming" were, of course, murderous in practice. The vicious and nauseating imagery accompanying any mention of the Jews -- the obsessive way Hitler constantly returns to metaphors of disease, decay, and infestation -- is the first stage of a dehumanization that is itself an incipient act of terror. The genocidal implications of such language are clear enough. But Gregor is careful to distinguish between the racist stratum of Hitler's dogma (which was uncommonly virulent even compared to the "normal" anti-Semitism of his day) and the very widespread use of militarized imagery and rhetoric in German culture following World War I.

"Many of the anti-Semitic images in Hitler's writing can be found in, say, the work of Houston Stewart Chamberlain," writes Gregor. "Yet when reading Chamberlain's work we hardly sense that we are dealing with an advocate of murder. When reading Hitler, by contrast, we often do -- even before we have considered the detail of what he is discussing. This is because the message is not only to be found in the arguments of the text, but is embedded in the language itself."

How to Read Hitler is a compact book, and a work of "high popularization" rather than a monograph. The two short pages of recommended readings at the end are broad, pointing to works of general interest (for example, The Coming of the Third Reich by Richard Evans) rather than journal articles. It will find its way soon enough into high-school and undergraduate history classrooms -- not to mention the demimonde of "buffs" whose fascination with the Third Reich has kept the History Channel profitable over the years.

At the same time, Gregor's little book is an understated, but very effective, advertisement for the "cultural turn" in historical scholarship. It is an example, that is, of one way historians go about examining not just what documents tell us about the past, but how the language and assumptions of a text operated at the time. His presentation of this approach avoids grand displays of methodological intent. Instead the book just goes about its business -- very judiciously, I think.

But there is one omission that is bothersome. Perhaps it is just an oversight, or, more likely, a side effect of the barriers between disciplines. Either way, it is a great disservice that How to Read Hitler nowhere mentions the first effort by someone writing in English to analyze the language and inner logic of Mein Kampf -- the essay by Kenneth Burke called "The Rhetoric of Hitler's 'Battle,' " published in The Southern Review in 1939. (In keeping with my recent enthusing over the "golden age" of the academic literary quarterly, it is worth noting that the Review was published at Louisiana State University and edited by a professor there named Robert Penn Warren.)

Burke's essay was, at the time, an unusual experiment: An analysis of a political text using the tools of literary analysis that Burke had developed while studying Shakespeare and Coleridge. He had published the first translations of Thomas Mann's Death in Venice and of portions of Oswald Spengler's Decline of the West -- arguably a uniquely suitable preparation for the job of reading Hitler. And just as various German émigrés had tried to combine Marx and Freud in an effort to grasp "the mass psychology of fascism" (as Wilhelm Reich's title had it), so had Burke worked out his own combination of the two in a series of strange and brilliant writings published throughout the Depression.

But he kept all of that theoretical apparatus offstage, for the most part, in his long review-essay on a then-new translation of Mein Kampf. Instead, Burke read Hitler's narrative and imagery very closely -- showing how an "exasperating, even nauseating" book served to incite and inspire a mass movement.

This wasn't an abstract exercise. "Let us try," wrote Burke, "to discover what kind of 'medicine' this medicine man has concocted, that we may know, with greater accuracy, exactly what to guard against, if we are to forestall the concocting of similar medicine in America."

Burke's analysis is a tour de force. Revisiting it now, after Gregor's How to Read volume, it is striking how much they overlap in method and implication. In 1941, Burke reprinted it in his collection The Philosophy of Literary Form, which is now available from the University of California Press. You can also find it in a very useful anthology of Burke's writings called On Symbols and Society, which appears in the University of Chicago Press's series called "The Heritage of Sociology."

"Above all," wrote Burke in 1939, "I believe we must make it apparent that Hitler appeals by relying upon a bastardization of fundamentally religious patterns of thought. In this, if properly presented, there is no slight to religion. There is nothing in religion proper that requires a fascist state. There is much in religion, when misused, that does lead to a fascist state. There is a Latin proverb, Corruptio optimi pessima, 'the corruption of the best is the worst.' And it is the corruptors of religion who are a major menace to the world today, in giving the profound patterns of religious thought a crude and sinister distortion."

Scott McLemee

The Chosen Few

Jerome Karabel's The Chosen is the big meta-academic book of the season -- a scholarly epic reconstructing "the hidden history of admission and exclusion at Harvard, Yale, and Princeton," as the subtitle puts it. Karabel, who is a professor of sociology at the University of California at Berkeley, has fished documents out of the archive with a muckraking zeal worthy of an investigative journalist. And his book, published this month by Houghton Mifflin, is written in far brisker narrative prose than you might expect from somebody working in either sociology or education. That's not meant as a dis to those worthy fields. But in either, the emphasis on calibrating one's method does tend to make storytelling an afterthought.

For Karabel really does have a story to tell. The Chosen shows how the gentlemanly anti-Semitism of the early 20th century precipitated a deep shift in how the country's three most prestigious universities went about the self-appointed task of selecting and grooming an elite.

It is (every aspect of it, really) a touchy subject. The very title of the book is a kind of sucker-punch. It alludes, of course, to Jehovah's selection of the Jews as the Chosen People -- a phrase sometimes repeated, with a sarcastic tone, as an ethnic slur. But Karabel turns it back against the WASP establishment itself -- in ways too subtle, and certainly too well-researched, to be considered merely polemical. (I'm going to highlight some of the more rancor-inspiring implications below, but that is due to my lack of Professor Karabel's good manners.)

The element of exposé pretty much guarantees the book a readership among people fascinated or wounded by the American status system. Which is potentially, of course, a very large readership indeed. But The Chosen is also interesting as an example of sociology being done in almost classical vein. It is a study of what, almost a century ago, Vilfredo Pareto called "the circulation of elites" -- the process through which "the governing elite is always in a state of slow and continuous transformation ... never being today what it was yesterday."

In broad outline, the story goes something like this. Once upon a time, there were three old and distinguished universities on the east coast of the United States. The Big Three were each somewhat distinctive in character, but also prone to keeping an eye on one another's doings.

Harvard was the school with the most distinguished scholars on its faculty -- and it was also the scene of President Charles Eliot's daring experiment in letting undergraduates pick most of their courses as "electives." There were plenty of the "stupid young sons of the rich" on campus (as one member of the Board of Overseers put it in 1904), but the student body was also relatively diverse. At the other extreme, Princeton was the country club that F. Scott Fitzgerald later described in This Side of Paradise. (When asked how many students there were on campus, a Princeton administrator famously replied, "About 10 percent.")

Finally, there was Yale, which had crafted its institutional identity as an alternative to the regional provincialism of Harvard, or Princeton's warm bath of snobbery. It was "the one place where money makes no difference ... where you stand for what you are," in the words of Dink Stover, the clean-cut and charismatic hero of the then-beloved college novel Stover at Yale.

But by World War One, something was menacing these idyllic institutions: Namely, immigration in general and "the Hebrew invasion" in particular. A meeting of New England deans in the spring of 1918 took this on directly. A large and growing percentage of incoming students were the bright and driven children of Eastern European Jewish immigrants. This was particularly true at Harvard, where almost a fifth of the freshman class that year was Jewish. A few years later, the figure would reach 13 percent at Yale -- and even at Princeton, the number of Jewish students had doubled its prewar level.

At the same time, the national discussion over immigration was being shaped by three prominent advocates of "scientific" racism who worried about the decline of America's Nordic stock. They were Madison Grant (Yale 1887), Henry Fairfield Osborn (Princeton 1877), and Lothrop Stoddard (Harvard 1905).

There was, in short, an air of crisis at the Big Three. Even the less robustly bigoted administrators worried about (as one Harvard official put it) "the disinclination, whether justified or not, on the part of non-Jewish students to be thrown into contact with so large a proportion of Jewish undergraduates."

Such, then, was the catalyst for the emergence, at each university, of an intricate and slightly preposterous set of formulae governing the admissions process. Academic performance (the strong point of the Jewish applicants) would be a factor -- but one strictly subordinated to a systematic effort to weigh "character."

That was an elusive quality, of course. But administrators knew when they saw it. Karabel describes the "typology" that Harvard used to make an initial characterization of applicants. The code system included the Boondocker ("unsophisticated rural background"), the Taconic ("culturally depressed background," "low income"), and the Krunch ("main strength is athletic," "prospective varsity athlete"). One student at Yale was selected over an applicant with a stronger record and higher exam scores because, as an administrator put it, "we just thought he was more of a guy."

Now, there is a case to be made for a certain degree of flexibility in admissions criteria. If anything, given our reflex-like tendency to see diversity, as such, as an intrinsic good, it seems counterintuitive to suggest otherwise. There might be some benefit to the devil's-advocate exercise of trying to imagine the case for strictly academic standards.

But Karabel's meticulous and exhaustive record of how the admissions process changed is not presented as an argument for that sort of meritocracy. First of all, it never prevailed to begin with.

A certain gentlemanly disdain for mere study was always part of the Big Three ethos. Nor had there ever been any risk that the dim sons of wealthy alumni would go without the benefits of a prestigious education.

What the convoluted new admissions algorithms did, rather, was permit the institutions to exercise a greater -- but also a more deftly concealed -- authority over the composition of the student body.

"The cornerstones of the new system were discretion and opacity," writes Karabel; "discretion so that gatekeepers would be free to do what they wished and opacity so that how they used their discretion would not be subject to public scrutiny.... Once this capacity to adapt was established, a new admissions regime was in place that was governed by what might be called the 'iron law of admissions': a university will retain a particular admissions policy only so long as it produces outcomes that correspond to perceived institutional interests."

That arrangement allowed for adaptation to social change -- not just by restricting applicants of one minority status in the 1920s, but by incorporating underrepresented students of other backgrounds later. But Karabel's analysis suggests that this had less to do with administrators being "forward-looking and driven by high ideals" than it might appear.

"The Big Three," he writes, "were more often deeply conservative and surprisingly insecure about their status in the higher education pecking order.... Change, when it did come, almost always derived from one of two sources: the continuation of existing policies was believed to pose a threat either to vital institutional interests (above all, maintaining their competitive positions) or to the preservation of the social order of which they were an integral -- and privileged -- part."

Late in the book, Karabel quotes a blistering comment by the American Marxist economist Paul Sweezy (Exeter '27, Harvard '31, Harvard Ph.D. '37) who denounced C. Wright Mills for failing to grasp "the role of the preparatory schools and colleges as recruiters for the ruling class, sucking upwards the ablest elements of the lower classes." Universities such as the Big Three thus performed a double service to the order by "infusing new brains into the ruling class and weakening the potential leadership of the working class."

Undoubtedly so, once upon a time -- but today, perhaps, not so much. The neglect of their duties by the Big Three bourgeoisie is pretty clear from the statistics.

"By 2000," writes Karabel, "the cost of a year at Harvard, Yale, and Princeton had reached the staggering sum of more than $35,000 -- an amount that well under 10 percent of American families could afford....Yet at all three institutions, a majority of students were able to pay their expenses without financial assistance -- compelling testimony that, more than thirty years after the introduction of need-blind admissions, the Big Three continued to draw most of their students from the most affluent members of society." The number of students at the Big Three coming from families in the bottom half of the national income distribution averages out to about 10 percent.

All of which is (as the revolutionary orators used to say) no accident. It is in keeping with Karabel's analysis that the Big Three make only as many adjustments to their admissions criteria as they must to keep the status quo ante on track. Last year, in a speech at the American Council on Education, Harvard's president, Larry Summers, called for preferences for the economically disadvantaged. But in the absence of any strong political or social movement from below -- an active, noisy menace to business as usual -- it's hard to imagine an institutionalized preference for admitting students from working families into the Big Three. (This would have to include vigorous and fairly expensive campaigns of recruitment and retention.)

As Walter Benn Michaels writes in the latest issue of N+1 magazine, any discussion of class and elite education now is an exercise in the limits of the neoliberal imagination. (His essay was excerpted last weekend in the Ideas section of The Boston Globe.)

"Where the old liberalism was interested in mitigating the inequalities produced by the free market," writes Michaels, " neoliberalism -- with its complete faith in the beneficence of the free market -- is interested instead in justifying them. And our schools have a crucial role to play in this. They have become our primary mechanism for convincing ourselves that poor people deserve their poverty, or, to put the point the other way around, they have become our primary mechanism for convincing rich people that we deserve our wealth."

How does this work? Well, it's no secret that going to the Big Three pays off. If, in theory, the door is open to anyone smart and energetic, then everything is fair, right? That's equality of opportunity. And if students at the Big Three then turn out to be drawn mainly from families earning more than $100,000 per year....

Well, life is unfair. But the system isn't.

"But the justification will only work," writes Michaels, if "there really are significant class differences at Harvard. If there really aren't -- if it's your wealth (or your family's wealth) that makes it possible for you to go to an elite school in the first place -- then, of course, the real source
of your success is not the fact that you went to an elite school but the fact that your parents were rich enough to give you the kind of preparation that got you admitted to the elite school. The function of the (very few) poor people at Harvard is to reassure the (very many) rich people at Harvard that you can't just buy your way into Harvard."

Scott McLemee

A Child's Garden of Culture and Atrocity

"Whoever cannot give to himself an adequate account of the past three thousand years," said Goethe, "remains in darkness, without history, living from day to day." That is an expression of a bedrock principle of liberal humanism, European-style. It takes the existence of the educated individual as its basic unit of reference -- its gold standard. But it also judges the quality of that existence by how much the individual has spent in acquiring a sense of the past. That expenditure also means, in effect, going into debt: You’ll never repay everything you owe to previous generations.

That outlook is, when you get right down to it, pretty un-American. It goes against the ideal of unencumbered self-creation that Emerson taught us -- in which we are supposed to throw off the burdens of the past, living always in the vital present. Fortunately, this is not hard to do. The first step is not to learn much history to begin with. (We are good at this.)

Even so, there  may be an audience for E. H. Gombrich’s A Little History of the World, now available from Yale University Press, 70 years after it was first written. Imagine Goethe giving up the role of sage long enough to become a children’s author and you will have a reasonably good idea of the book’s content. It goes from prehistory up to the end of the (then-recent) Great War, with particular attention to ancient Greece, the Roman Empire, and the emergence of Judaism, Buddhism, Christianity, and Islam.

As for the style ... well, that is something even more remarkable. The tone is wry, at times, without ever being jokey -- a kind of light seriousness that is very respectful of its young audience. Each chapter is perfectly calibrated to suit the attention span and cognitive powers of a 10 year-old, without ever giving off a trace of condescension.

The effect, even for an adult reader, is incredibly charming -- and, indeed, instructive, at least for anyone with the occasional gap in that interior timeline. (Quick now: Who were the Hohenzollerns? And no, a vague sense that they were German doesn’t count.)

In his later and better-known role as art historian, Gombrich commanded a really humbling degree of erudition, but always with a certain generosity towards his audience. That combination is very much in evidence throughout his first book -- one written in what must have been very trying circumstances.

It was Vienna in 1935. Gombrich was 26 and had recently finished his dissertation. (Writing one "was considered very important," he told a presumably incredulous audience at Rutgers University in 1987, "yet it didn’t take more than a little over a year to write.") His immediate job prospects ranged from the nonexistent to the merely terrible. Besides, he was Jewish, and the writing was on the wall, usually in the form of a swastika.

He managed to find part-time employment with a publishing company. He was asked to evaluate a book on world history for children in English, to see if it might be worth translating. He recommended against it, but offered instead to write one himself, directly in German. It took him about six weeks, writing a chapter a day. The volume did quite well when it appeared in 1936, though the Nazis eventually stopped publication on the grounds of its "pacifism."

By then, he was in London, working at the Warburg Institute (a major art-history collection, where Gombrich in time became director) and aiding the war effort by translating German radio broadcasts into English. Before leaving Vienna, he had agreed to write another book, this one for adolescents, on the history of art. That project grew into a rather more ambitious work, The Story of Art (1950) -- long the standard overview of European art history, from which generations of museum tour-guides have cribbed.

He wrote it -- along with his more monographic works on iconography and on the psychology of perception -- in English. When his Little History was reprinted in Germany in the mid-1980s, he wrote an afterword for it; but he turned down offers to have it translated into English, preferring to do that himself, and to make some necessary revisions. It is not clear from the edition now available from Yale just how far Gombrich got with that effort at the time of his death in 2001. (The title page gives the translator as Caroline Mustill.) But he did add a postscript called "The Small Part of the History of the World Which I Have Lived Through" -- summing up the 20th century from World War I through the end of the Cold War, and trying to put as optimistic a spin on that record as possible.

The preface by Leonie Gombrich, his granddaughter, quotes some introductory remarks he prepared for the Turkish edition. His Little History, he wrote, "is not, and never was, intended to replace any textbooks of history that may serve a very different purpose at school. I would like my readers to relax, and to follow the story without having to take any notes or to memorize names and dates. In fact, I promise that I shall not examine them on what they have read."

But the book has a strong and serious pedagogical intent, even so. And it comes very directly from Goethe, whose work Gombrich read incessantly as a boy. Upon receiving the Goethe Prize in 1994, Gombrich said that it was the author’s life and writing that taught him "the consoling message ... of a universal citizenship that transcends the confines of nationhood." That seems very much the point of the Little History, which tries to squeeze all of global history into just under three hundred easily read pages -- and I strongly suspect it was just that cosmopolitanism that the Nazi censors really loathed.

Of course, there are gaps and oversights. One that is really troublesome is how the entire history of the Atlantic slave trade is reduced to the dimensions of a brief reference to the Civil War in the United States. This has the effect of making it seem like a distant and cruel episode in the New World, rather than what it really was: A vast and centuries-long process that enriched parts of Europe, depopulated parts of Africa, and anticipated every aspect of totalitarianism possible before the rise of industrialization and mass communications.

Not that Gombrich leaves the history of colonial atrocity entirely out of the picture, especially in recounting the conquest of the Americas: "This chapter in the history of mankind is so appalling and shameful to us Europeans that I would rather not say anything more about it."

In many ways, then, the book is at least as interesting as a specimen of a lost sensibility as it is in its own right, as a first introduction to history. Gombrich later spoke of how much he had been the product of that almost religious veneration of culture that prevailed among the European middle class of the 19th and early 20th centuries.

"I make no great claims for the universality of that tradition," he said during a lecture at Liverpool University in 1981. "Compared to the knowable, its map of knowledge was arbitrary and schematic in the extreme. As is true of all cultures, certain landmarks were supposed to be indispensable for orientation while whole stretches of land remained terra incognita, of relevance only to specialists..... But what I am trying to say is that at least there was a map."

Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. Suggestions and ideas for future columns are welcome.

Piled Higher and Deeper

Rick Perlstein, a friend from the days of Lingua Franca, is now working on a book about Richard Nixon. Last year, he published a series of in-depth articles about the Republican Party and the American conservative movement. (Those are not quite the same thing, though that distinction only becomes salient from time to time.) In short, Perlstein has had occasion to think about honesty and dissimulation -- and about the broad, swampy territory in between, where politicians finesse the difference. As do artists and used-car salesmen....

It’s the job of historians to map that territory. But philosophers wander there, too. “What is truth?” as Nietzsche once asked. “A mobile army of metaphors, metonymies, anthropomorphisms. Truths are illusions of which one has forgotten that they are illusions.” Kind of a Cheneyo-Rumsfeldian ring to that thought. It comes from an essay called “On Truth and Lie in an Extra-Moral Sense,” which does, too, come to think of it.  

So anyway, about a week ago, Rick pointed out a recent discussion of how the Bush Administration is dealing with critics who accuse it of fudging the intelligence that suggested Saddam Hussein had weapons of mass destruction. The link went to a comment by Joshua Micah Marshall, who is a liberal Democrat of the more temperate sort, not prone to hyperventilation. 

“Garden variety lying is knowing it’s Y and saying it’s X,” he wrote, giving Lyndon Johnson on the Gulf of Tonkin as an example. The present executive branch, he continued, shows “a much deeper indifference to factual information in itself.”

Rick posed an interesting question: “Isn't Josh Marshall here describing as the Administration's methodology exactly what that Princeton philosophy prof defines as ‘bullshit’?” That prof being, of course, Harry Frankfurt, whose short and best-selling treatise On Bullshit will probably cover everyone’s Christmas bonus at Princeton University Press this year. 

In February, The New York Times beat us by a day or so with its article on the book, which daintily avoided giving its title. But "Intellectual Affairs" first took a close look, not just at Frankfurt’s text -- noting that it remained essentially unchanged since its original publication as a scholarly paper in the 1980s -- but at the philosophical critique of it presented in G.A. Cohen’s essay “Deeper into Bullshit.” 

Since then, the call for papers for another volume of meditations on the theme of bull has appeared. Truly, we are living in a golden age.

The gist of Frankfurt’s argument, as you may recall, is that pitching BS is a very different form of activity from merely telling a lie. And Marshall’s comments do somewhat echo the philosopher’s point. Frankfurt would agree that “garden variety lying” is saying one thing when you know another to be true. The liar operates within a domain that acknowledges the difference between accuracy and untruth. The bullshitter, in Frankfurt’s analysis, does not. In a sense, then, the other feature of Marshall’s statement would seem to fit. Bullshit involves something like “indifference to factual information in itself.”

So does it follow, then, that in characterizing the Bush team’s state of mind three years ago, during the run-up to the war, we must choose between the options of incompetence, dishonesty, and bullshit? Please understand that I frame it in such terms, not from any political motive, but purely in the interest of conceptual rigor.

That said.... It seems to me that this range of terms is inadequate. One may agree that Bush et al. are profoundly indifferent to verifiable truth without concluding that the Frankfurt category necessarily applies.

Per G. A. Cohen’s analysis in “Deeper into Bullshit,” we must stress that Frankfurt’s model rests on a particular understanding of the consciousness of the liar. The mind of the bullshitter is defined by contrast to this state. For the liar, (1) the contrast between truth and untruth is clearly discerned, and (2) that difference would be grasped by the person to whom the liar speaks. But the liar’s intentionality also includes (3) some specific and lucidly grasped advantage over the listener made possible by the act of lying.

By contrast, the bullshitter is vague on (1) and radically unconcerned with (2). There is more work to be done on the elements of relationship and efficacy indicated by (3). We lack a carefully argued account of bullshit’s effect on the bullshitee.

There is, however, another possible state of consciousness not adequately described by Frankfurt’s paper. What might be called “the true believer” is someone possessing an intense concern with truth.

But it is a Higher Truth, which the listener may not (indeed, probably cannot) grasp. The true believer is speaking a truth that somehow exceeds the understanding of the person hearing it.

During the Moscow Trials of the late 1930s, Stalin’s prosecutor lodged numerous charges against the accused that were, by normal standards, absurd. In many cases, the “evidence” could be shown to be false. But so much worse for the facts, at least from the vantage point of the true believer. If you’ve ever known someone who got involved in EST or a multi-level marketing business, the same general principle applies. In each case, it is not quite accurate to say that the true believers are lying. Nor are they bullshitting, in the strictest sense, for they maintain a certain fidelity to the Higher Truth.

Similarly, it did not matter three years ago whether or not any evidence existed to link Saddam and Osama. To anyone possessing the Higher Truth, it was obvious that Iraq must be a training ground for Al Qaeda. And guess what? It is now. So why argue about it?

On a less world-historical scale, I see something interesting and apropos in Academe, the magazine of the American Association of University Professors. In the latest issue, David Horowitz makes clear that he is not a liar just because he told a national television audience something that he knew was not true. 

(This item was brought to my attention by a friend who teaches in a state undergoing one of Horowitz’s ideological rectification campaigns. My guess is that he’d rather not be thanked by name.)

Here’s the story so far: In February, while the Ward Churchill debate was heating up, Horowitz appeared on Bill O’Reilly’s program. It came up that Horowitz, like Churchill, had been invited to lecture at Hamilton College at some point. But he was not, he said, “a speaker paid by and invited by the faculty.” 

As we all know, university faculties are hotbeds of left-wing extremism. (Especially the business schools and engineering departments. And reports of how hotel-management students are forced to read speeches by Pol Pot are positively blood-curdling.) Anyway, whenever Horowitz appears on campus, it’s because some plucky youngsters invite him. He was at Hamilton because he had been asked by “the conservative kids.”

That came as a surprise to Maurice Isserman, a left-of-center historian who teaches at Hamilton College. When I saw him at a conference a few years ago, he seemed to have a little gray in his hair, and his last book, The Other American: The Life of Michael Harrington, was a biography of the founder of the Democratic Socialists of America. No doubt he’s been called all sorts of things over the years, but “conservative kid” is not one of them. And when Horowitz spoke at Hamilton a few years ago, it was as a guest lecturer in Isserman’s class on the 1960s. 

As Isserman put it in the September/October issue of Academe: “Contrary to the impression he gave on "The O’Reilly Factor," Horowitz was, in fact, an official guest of Hamilton College in fall 2002, invited by a faculty member, introduced at his talk by the dean of the faculty, and generously compensated for his time.”

I will leave to you the pleasure and edification of watching Horowitz explain himself in the latest issue of Academe. But in short, he could not tell the truth because that would have been a lie, so he had to say something untrue in order to speak a Higher Truth. 

My apologies for the pretzel-like twistiness of that paraphrase. It is all so much clearer in the original Newspeak: Thoughtcrime is doubleplus ungood.

Scott McLemee

Where Have All the Big Questions Gone?

Some months ago I started asking friends, colleagues from my teaching days, researchers in higher education, faculty members of various ages and ranks, deans, provosts and presidents, and focus groups of students: “What’s the status of the Big Questions on your campus?” Quite deliberately I avoided defining “Big Questions,” but I gave as examples such questions as “Who am I? Where do I come from? What am I going to do with my life? What are my values? Is there such a thing as evil? What does it mean to be human? How can I understand suffering and death? What obligations do I have to other people? What does it mean to be a citizen in a democracy? What makes work, or a life, meaningful and satisfying?” In other words, I wanted to know what was happening to questions of meaning and value that traditionally have been close to the heart of a liberal education.

Some of what I found puzzled me. People pointed out quite properly that some Big Questions were alive and well in academia today. These included some questions about the origin of the universe, the emergence of life, the nature of consciousness, and others that have been raised by the scientific breakthroughs of the past few decades.

In the humanities and related social sciences the situation was rather different. Some friends reminded me that not all big questions were in eclipse. Over the past generation faculty members have paid great attention to questions of racial, ethnic, gender, and sexual identity. Curricular structures, professional patterns, etc. continue to be transformed by this set of questions. Professors, as well as students, care about these questions, and as a result, write, teach, and learn with passion about them.

But there was wide agreement that other big questions, the ones about meaning, value, moral and civic responsibility, were in eclipse. To be sure, some individual faculty members addressed them, and when they did, students responded powerfully. In fact, in a recent Teagle-sponsored meeting on a related topic, participants kept using words such as “hungry,” “thirsty,” and “parched” to describe students’ eagerness to find ways in the curriculum, or outside it, to address these questions. But the old curricular structures that put these questions front and center have over the years often faded or been dismantled, including core curricula, great books programs, surveys “from Plato to NATO,” and general education requirements of various sorts. Only rarely have new structures emerged to replace them.

I am puzzled why. To be sure, these Big Questions are hot potatoes. Sensitivities are high. And faculty members always have the excuse that they have other more pressing things to do. Over two years ago, in an article entitled “Aim Low,” Stanley Fish attacked some of the gurus of higher education (notably, Ernest Boyer) and their insistence that college education should “go beyond the developing of intellectual and technical skills and … mastery of a scholarly domain. It should include the competence to act in the world and the judgment to do so wisely” (Chronicle of Higher Education, May 16, 2003). Fish hasn’t been the only one to point out that calls to “fashion” moral and civic-minded citizens, or to “go beyond” academic competency assume that students now routinely achieve such mastery of intellectual and scholarly skills. We all know that’s far from the case.

Minimalist approaches -- ones that limit teaching to what another friend calls “sectoral knowledge” -- are alluring. But if you are committed to a liberal education, it’s hard just to aim low and leave it at that. The fact that American university students need to develop basic competencies provides an excuse, not a reason, for avoiding the Big Questions. Students also need to be challenged, provoked, and helped to explore the issues they will inevitably face as citizens and as individuals. Why have we been so reluctant to develop the structures, in the curriculum or beyond it, that provide students with the intellectual tools they need to grapple thoughtfully over the course of a lifetime with these questions?

I see four possible reasons:

1. Faculty members are scared away by the straw man Stanley Fish and others have set up. Despite accusations of liberal bias and “brainwashing,” no faculty member I know wants to “mold,” “fashion,” or “proselytize” students. But that’s not what exploring the Big Questions is all about. Along with all the paraphernalia college students bring with them these days are Big Questions, often poorly formulated and approached with no clue that anyone in the history of humankind has ever had anything useful to say about any of them. There’s no need to answer those questions for students, or to try to fashion them into noble people or virtuous citizens for the republic. There is, however, every reason to help students develop the vocabularies, the metaphors, the exempla, the historical perspective, the patterns of analysis and argument that let them over time answer them for themselves.

2. A second possible reason is that faculty are put off by the feeling they are not “experts” in these matters. In a culture that quite properly values professional expertise, forays beyond one’s field of competence are understandably suspect. But one does not have to be a moral philosopher to raise the Big Questions and show some of the ways smart people in the past have struggled with them. I won’t pontificate about other fields, but in my own field -- classics and ancient history -- the Big Questions come bubbling up between the floor boards of any text I have ever taught. I don’t have to be a specialist in philosophy or political science to see that Thucydides has something to say about power and morality, or the Odyssey about being a father and a husband. A classicist’s job, as I see it, is to challenge students to think about what’s implicit in a text, help them make it explicit and use that understanding to think with.

3. Or is it that engaging with these “Big Questions” or anything resembling them is the third rail of a professional career? Senior colleagues don’t encourage it; professional journals don’t publish it; deans don’t reward it; and a half dozen disgruntled students might sink your tenure case with their teaching evaluations. You learn early on in an academic career not to touch the third rail. If this is right, do we need to rewire the whole reward system of academia?

4. Or, is a former student of mine, now teaching at a fine women’s college, correct when she says that on her campus “It tends to be that … those who talk about morality and the big questions come from such an entrenched far right position … that the rest of us … run for cover.”  

Some of the above? All of the above? None of the above? You tell me, but let’s not shrug our shoulders and walk away from the topic until we’ve dealt with one more issue: What happens if, for whatever reason, faculty members run for the hills when the Big Questions, including the ones about morality and civic responsibility, arise? Is this not to lose focus on what matters most in an education intended to last for a lifetime? In running away, do we not then leave the field to ideologues and others we cannot trust, and create a vacuum that may be filled by proselytizers, propagandists, or the unspoken but powerful manipulations of consumer culture? Does this not sever one of the roots that has over the centuries kept liberal education alive and flourishing? But, most serious of all, will we at each Commencement say farewell to another class of students knowing that for all they have learned, they are ill equipped to lead an examined life? And if we do, can we claim to be surprised and without responsibility if a few decades later these same graduates abuse the positions of power and trust in our corporate and civic life to which they have ascended?

W. Robert Connor

W. Robert Connor is president of the Teagle Foundation, which is dedicated to strengthening liberal education. More on the foundation's “Big Questions” project may be found on its Web site. This essay is based on remarks Connor recently made at a meeting of the Middle Atlantic Chapters of Phi Beta Kappa, at the University of Pennsylvania.

The Lowering of Higher Education

I just finished grading a hefty stack of final examinations for my introductory-level U.S. history survey course. The results were baleful.

On one section of the exam, for example, I supplied identification terms for events and personages covered in class, asking students to provide a definition, date, and significance for each. In response to “Scopes Monkey Trial,” one student wrote the following:

"The scopes monkey trial was a case in the supreme court that debated teaching evolution in the schools. It happened in 1925. Mr. Scope a teacher in a school wanted to teach about God and did not want to teach about evolution. The ACLU brought in lawyers to help with the case of Mr. Scopes. In the end Mr. Scopes side did not have the people's opinion. Evolution won. It is significant because now you have to teach evolution in school, you can't teach about God."

This answer might be considered a nearly perfect piece of evidence against intelligent design of the universe, since it gets just about everything (apart from the date) wrong: punctuation, spelling, grammar, and historical fact.

For those needing a refresher, Tennessee high school biology teacher John T. Scopes assigned a textbook informed by evolutionary theory, a subject prohibited by the state legislature. The court ruled against Scopes, who had, obviously, broken the law. But the defense won in the court of public opinion, especially after the ACLU’s lawyer, Clarence Darrow, tore apart William Jennings Bryan, the former Democratic presidential candidate, witness for the prosecution, and Biblical fundamentalist. The press dubbed it the "Scopes Monkey Trial" (inaccurately, since the theory of human evolution centered upon apes) and pilloried Bryan. As Will Rogers put it, "I see you can't say that man descended from the ape. At least that's the law in Tennessee. But do they have a law to keep a man from making a jackass of himself?"

An outside observer might ascribe my student’s mistakes to the political culture of this Midwestern city, where barely a day goes by without a letter to the editor in the local paper from some self-appointed foot soldier of the religious right.

That, however, wouldn’t explain another student who thought the 1898 war between the United States and Spain, fought heavily in Cuba, was about communism (not introduced into Cuba until after the 1959 revolution). Nor would it explain a third student who thought that the Scopes verdict condoned Jim Crow racial segregation.

A minority of students performed admirably, earning grades in the A range while hewing, of course, to varied interpretations. Their success proved the exam was based upon reasonable expectations. However, the median exam grade was a C -- the lowest I’ve yet recorded, and fairly devastating for a generation of students who typically aspire to a B.

I was wondering what to make of this dispiriting but solitary data set when I read about the Education Department study released late last week that shows that the average literacy of college-educated Americans declined precipitously between 1992 and 2003. Just 25 percent of college graduates scored high enough on the tests to be deemed “proficient” in literacy.

By this measure, literacy does not denote the mere ability to read and write, but comprehension, analysis, assessment, and reflection. While “proficiency” in such attributes ranks above “basic” or “intermediate,” it hardly denotes rocket science. It simply measures such tasks as comparing the viewpoints in two contrasting editorials.

The error-ridden response I received about the Scopes Monkey Trial speaks less to the ideological clash of science and faith than to a rather more elemental matter. As students in the 1960s used to say, the issue is not the issue. The issue is the declining ability to learn. The problem we face, in all but the most privileged institutions, is a pronounced and increasing deficiency of student readiness, knowledge, and capacity.

Neither right nor left has yet come to terms with the crisis of literacy and its impact on higher education. The higher education program of liberals revolves around access and diversity, laudable aims that do not speak to intellectual standards. Conservatives, for their part, are prone to wild fantasies about totalitarian leftist domination of the campuses. They cannot imagine a failure even more troubling than indoctrination -- the inability of students to assimilate information at all, whether delivered from a perspective of the left or the right.

It would be facile to blame the universities for the literacy crisis, since it pervades our entire culture at every level. The Education Department’s statistics found a 10-year decline in the ability to read and analyze prose in high school, college, and graduate students alike.

However, the crisis affects the university profoundly, and not only at open-enrollment institutions like the regional campus on which I teach. Under economic pressure from declining government funding and faced with market competition from low-bar institutions, many universities have increasingly felt compelled to take on students whose preparation, despite their high school diplomas, is wholly inadequate. This shores up tuition revenue, but the core project of the higher learning is increasingly threatened by the ubiquity of semi-literacy.

How can human thought, sustained for generations through the culture of the book, be preserved in the epoch of television and the computer? How can a university system dedicated to the public trust and now badly eroded by market forces carry out its civic and intellectual mission without compromising its integrity?

These questions cry out for answer if we are to stem a tide of semi-literacy that imports nothing less than the erosion of the American mind.

Christopher Phelps

 Christopher Phelps is associate professor of history at Ohio State University at Mansfield.

Technicolor Dreams

If my recent experiences are any indication, we professors face a daunting challenge: The polarized American political environment has conditioned our students to see life in monochrome. The Right tells them to view all as either black or white, while the Left insists that everything is a shade of gray.

We’ve long struggled with the either/or student, the one who writes a history essay in which events are stripped of nuance and presented as the working out of God’s preordained plan; or the sociology student who wants to view poverty as a modern variant of 19th century Social Darwinism. These students -- assuming they’re not acting out some ideological group’s agenda -- can be helped along simply by designing lessons that require them to argue opposing points of view.

Yet despite all the hoopla about the resurgence of conservatism, I get more students whose blinders are postmodern rather than traditional. This is to say that many of them don’t see the value of holding a steadfast position on much of anything, nor do they exhibit much understanding of those who do. They live in worlds of constant parsing and exceptions. Let me illustrate with two examples.

In history classes dealing with the Gilded Age I routinely assign Edward Bellamy’s utopian novel Looking Backward. In brief, protagonist Julian West employs a hypnotist for his insomnia and retires to an underground chamber. His Boston home burns in 1887 and West is not discovered until 2000, when he is revived by Dr. Leete. He awakens to a cooperative socialist utopia. West’s comments on his time say much about late 19th century social conflict, and Leete’s description of utopian Boston makes for interesting class discussion. I know that some students will complain about the novel’s didactic tone, others will argue that Bellamy’s utopia is too homogeneous, and a few will assert that Bellamy’s explanation of how utopia emerged is contrived. What I had not foreseen was how many students find the very notion of a utopia so far-fetched that they can’t move beyond incredulity to consider other themes.

When I paraphrase Oscar Wilde’s remark that a map of the world that doesn’t include Utopia isn’t worth glancing at, some students simply don’t get it. “Utopia is impossible” is the most common remark I hear. “Perhaps so,” I challenge, “but is an impossible quest the same as a worthless quest?” That sparks some debate, but the room lights up when I ask students to explain why a utopia is impossible. Their reasons are rooted more in contemporary frustration than in historical failure. Multiculturalism is often cited. “The world is too diverse to ever get people to agree” is one rejoinder I often receive.

It’s disturbing enough to contemplate that a social construct designed to promote global understanding can be twisted to justify existing social division, but far more unsettling is what often comes next. When I ask students if they can envision dystopia, the floodgates open. No problems on that score! In fact, they draw upon popular culture to chronicle various forms of it: Escape From New York, Blade Runner, Planet of the Apes…. “Could any of these happen?” I timidly ask. “Oh sure, these could happen easily,” I’m told.

My second jolt came in a different form: an interdisciplinary course I teach in which students read Tim O’Brien’s elegantly written Vietnam War novel The Things They Carried. O’Brien violates old novelistic standards; his book is both fictional and autobiographical, with the lines between the two left deliberately blurred. My students adored the book and looked at me as if they had just seen a Model-T Ford when I mentioned that a few critics felt that the book was dishonest because it did not distinguish fact from imagination. “It says right on the cover ‘a work of fiction,’” noted one student. When I countered that we ourselves were using it to discuss the actual Vietnam War, several students immediately defended the superiority of metaphorical truth because it “makes you think more.” I then asked students who had seen the film The Deer Hunter whether the famed Russian roulette scene was troubling, given that there was no recorded incident of such events taking place in Vietnam. None of them were bothered by this.

I mentioned John Sayles’ use of composite characters in the film Matewan. They had no problem with that, though none could tell me what actually happened during the bloody coal strikes that convulsed West Virginia in the early 1920s. When I probed whether writers or filmmakers have any responsibility to tell the truth, not a single student felt they did. “What about politicians?” I asked. While many felt that truth-telling politicians were no more likely than utopia, the consensus view was that they should tell the truth. I then queried, “So who gets to say who has to tell the truth and who gets to stretch it?” I was prepared to rest on my own clever laurels, until I got the students’ rejoinder! Two of my very best students said, in essence, that all ethics are situational, with one remarking, “No one believes there’s an absolute standard of right and wrong.” I tentatively reminded him that many of the 40 million Americans who call themselves "evangelical Christians" believe rather firmly in moral absolutes. From the back of the room piped a voice, “They need to get over themselves.”

I should interject that this intense give-and-take was possible because I let my students know that their values are their own business. In this debate I went out of my way to let them know I wasn’t condemning their values; in fact, I share many of their views on moral relativism, the ambiguity of truth, and artistic license. But I felt I could not allow them to dismiss objective reality so cavalierly. Nor, if I am true to my professed belief in the academy as a place where various viewpoints must be engaged, could I allow them to refuse to consider anyone who holds fast to moral absolutism.

The stories have semi-happy endings. I eventually got my history students to consider the usefulness of utopian thinking. This happened after I suggested that people of the late 19th century had better imaginations than those of the early 21st, which challenged them to contemplate the link between utopian visions and reform, and to see how a moralist like Bellamy could inspire what they would deem more pragmatic social changes. My O’Brien class came through when I taught the concept of simulacra, showed them a clip from the film Wag the Dog and then asked them to contemplate why some see disguised fiction as dangerous. (Some made connections to the current war in Iraq, but that’s another story!)

My goal in both cases was to make students see points of view other than their own. Both incidents also reminded me it’s not just the religious or conservative kids who need to broaden their horizons. We need to get all students to see the world in Technicolor, even when their own social palettes are monochromatic. Indeed, the entire academy could do worse than remember the words of Dudley Field Malone, one of the lawyers who defended John T. Scopes. Malone remarked, “I have never in my life learned anything from any man who agreed with me.” 

Robert E. Weir

Robert E. Weir is a visiting professor at Commonwealth College of the University of Massachusetts at Amherst and in the American studies program at Smith College.

Doing the Lord's Work

Two images of William Jennings Bryan have settled into the public memory, neither of them flattering. One is the fundamentalist mountebank familiar to viewers of Inherit the Wind, with its fictionalized rendering of the Scopes trial. In it, the character based on Bryan proclaims himself “more interested in the Rock of Ages than the age of rocks.” He is, in short, a crowd-pleasing creationist numbskull, and nothing more.

The other portrait of Bryan is less cinematic, but darker. The definitive version of it appears in Richard Hofstadter’s classic The American Political Tradition, first published in 1948 and still selling around 10,000 copies each year, according to a forthcoming biography of the historian. Hofstadter sketches the career of Bryan as a populist leader during the economic depression of the 1890s, when he emerged as the Midwest’s fierce and eloquent scourge of the Eastern bankers and industrial monopolies.

Yet this left-leaning Bryan had, in Hofstadter’s account, no meaningful program for change. He was merely a vessel of rage. Incapable of statesmanship, only of high-flown oratory, he was a relic of the agrarian past -- and the prototype of the fascistic demagogues who were discovering their own voices, just as Bryan’s career was reaching its end.

Historians have been challenging these interpretations for decades -- beginning in earnest more than 40 years ago, with the scholarship of Lawrence W. Levine, who is now a professor of history and cultural studies at George Mason University. It was Levine who pointed out that when Bryan denounced evolution, he tended to be thinking more of Nietzsche than of Darwin. And the Nietzsche he feared was not today’s poststructuralist playboy, but the herald of a new age of militaristic brutality.

Still, old caricatures die hard. It may be difficult for the contemporary reader to pick up Michael Kazin’s new book, A Godly Hero: The Life of William Jennings Bryan (Knopf) without imagining that its title contains a snarl and a sneer. Isn’t the rhetoric of evangelical Christianity and anti-elitist sentiment always just a disguise for base motives and cruel intentions? To call someone godly is now, almost by default, to accuse them of hypocrisy.

But Kazin, who is a professor of history at Georgetown University, has a very different story to tell. Revisionist scholarship on Bryan -- the effort to dig beneath the stereotypes and excavate his deeper complexities -- has finally yielded a book that might persuade the general reader to rethink the political role played by “the Great Commoner.”

In an earlier study, The Populist Persuasion: An American History (Basic Books, 1995), Kazin described the emergence in the 19th century of an ideology he called “producerism” – a moral understanding of politics as the struggle between those who built the nation’s infrastructure and those who exploited it. (The farmer or the honest businessman was as much a producer as the industrial worker. Likewise, land speculators and liquor dealers were part of the exploitive class, as were bankers and monopolistic scoundrels.)

The producerist ethos remains a strong undercurrent of American politics today. Bryan was its most eloquent spokesman. He wedded it to a powerful (and by Kazin’s account utterly sincere) belief that politics was a matter of following the commandment to love thy neighbor. As a man of his era, Bryan could be obtuse about how to apply that principle: His attitude toward black Americans was paternalistic, on a good day, and he was indifferent, though not hostile, concerning the specific problems facing immigrants. But Kazin points out that there is no sign of nativist malice in Bryan’s public or private communications. Some of his followers indulged in conspiratorial mutterings against the Catholics or the Jews, but Bryan himself did not. At the same time -- canny politician that he was -- he never challenged the growing power of the Klan during the 1920s.

It’s an absorbing book, especially for its picture of Bryan’s following. (He received an incredible amount of mail from them, only about two percent of which, Kazin notes, has survived.) I contacted Kazin to ask a few questions by e-mail.  

Q: By today's standards, Bryan seems like a bundle of contradictions. He was both a fundamentalist Christian and the spokesman for the left wing of the Democratic Party. He embodied a very 19th century notion of "character," but was also exceptionally shrewd about marketing his own personality. For many Americans, he was a beloved elder statesman -- despite losing his two presidential bids and otherwise spending very little time in elected office. How much of that contradictoriness is in Bryan himself, and how much in the eye of the beholder today?

A: Great question! The easiest part to answer is the first: for Bryan and many other reform-minded white Christians, there was no contradiction between their politics and their religion. The “revolution” being made by the Carnegies and Vanderbilts and Rockefellers was destroying the pious republic they knew, or wished to remember (slavery, of course, they forgot about). What Bryan called “applied Christianity” was the natural antidote to the poison of rampant capitalism. The rhetoric of Bellamy, the People’s Party, and the Knights of Labor was full of such notions -- as were the sermons and writings of many Social Gospelers, such as Washington Gladden and Charles Stelzle.

On the character-personality question – I think Warren Susman and many historians he influenced over-dichotomize these two concepts. No serious Christian could favor the latter over the former. Yet, the exigencies of the cultural marketplace and of celebrity culture in particular produced a fascination with the personal lives of the famous. So Bryan, who was as ego-obsessed as any politician, went with the flow, knowing his personality was boosting his political influence. Being a journalist himself, he understood the rules of the emerging game. Do you know Charles Ponce De Leon’s book about celebrity journalism in this period?

Q: Oddly enough, I do, actually. But let's talk about the people to whom Bryan appealed. From digging in the archives, you document that Bryan had a loyal following among professionals, and even successful businessmen, who saw themselves as part of the producing class threatened by the plutocratic elite. Was that surprising to discover? Normally you think of populism in that era as the politics of horny-handed toil.

A: As I argued in The Populist Persuasion, when “producerism” became a popular ideal in democratic politics, Americans from many classes were quite happy to embrace it. It thus became an essentially contested concept. But among a broad cross-section of people, the critique of finance capital was always stronger in the South and West, where Bryan had his most consistent support, than in places like Philly and NYC.

As for the letters -- I enjoyed that part of the research the most, although it was frustrating as hell to find almost no letters that were written during the campaign of 1908 and only a small number from then until the 1920s. If only WJB or his wife had revealed, somewhere, the criteria they used when dumping all that correspondence! That, at least, would have been a consolation. Of course, if they had kept nearly all of it, I’d still be there in the Manuscript Room at the Library of Congress, tapping away.

Q: I get the impression that Bryan might well have become president if women had been able to vote in 1896 or 1900. How much of his appeal came from expressing the moral and cultural ideals associated with women's "civilizing" role? And how much of it was the sex appeal of his rugged personality and magnetic stage presence?

A: Ah, the counterfactuals! Bryan’s image as a “godly hero” certainly did appeal to many women, as did his eloquence and good looks (the latter, at least while he was in his 30s and early 40s). His support for prohibition and woman suffrage would have appealed to many women as well.

In 1896 and 1900, he carried most of the states where women then had the vote (in the Mountain West). That may have been because of his free silver and anti-big money stands, though, which is probably why most men in those states voted for him. On the other hand, his radical image could have limited his appeal to women elsewhere in the country. Women voters, before the 1960s, tended to vote for safe, conservative candidates.

Q: Another counterfactual.... What if Bryan had won? What sort of president would he have been? The man was great at making speeches; none better. But could he really have cut it as Chief Executive?

A: As president, he probably would have been a divisive figure, perhaps an American Hugo Chavez -- but without the benefit of oil revenues! If he had tried to carry out the 1896 platform, there might have been a capital strike against him, which would have brought on another depression. If he hadn’t, the Populists and many left Democrats would have deserted him. The sad fact is that he hadn’t built a strong enough constituency to govern, much less to win the election in the first place.

Q: Finally, a question about the subjective dimension of working on this biography. Any moments of profound disillusionment? Rapt admiration? Sudden epiphany?

A: I wish I had time to pursue this important question at length -- perhaps I’ll write an essay about it someday. But briefly: I started reading all those fan letters and experienced an epiphany. Millions of ordinary people adored this guy and thought he was a prophet! And he was certainly fighting the good fight -- against Mark Hanna and his friends who were initiating the U.S. empire.

I also was impressed by his ability as a speech-writer as well as a performer. He could turn a phrase quite brilliantly. But after a year or so, I had to come to grips with his bigotry against black people and his general inability to overcome his mistrust of urban pols (although he didn’t share the anti-Catholicism and anti-Semitism of some of his followers).

The problem was, in the America of a century ago, Bryan would not have been a hero to the white evangelical grassroots if he had been as clever and cosmopolitan a pol as FDR. So I ended up with a historian’s sense of perspective about the limits of the perceptions and achievements of the past. In recent speeches, E.J. Dionne muses that perhaps we should ask “What Would William Jennings Bryan Do?” I’m not sure that’s a useful question. 

Scott McLemee

Oh, Good Lord...

During the early decades of the 20th century, a newspaper called The Avery Boomer served the 200 or so citizens of Avery, Iowa. It was irregular in frequency, and in other ways as well. Each issue was written and typeset by one Axel Peterson, a Swedish immigrant who described himself as "lame and crippled up," and who had to make time for his journalistic labors while growing potatoes. A member of the Socialist Party, he had once gained some notoriety within it for proposing that America’s radicals take over Mexico to show how they would run things. Peterson was well-read. He developed a number of interesting and unusual scientific theories -- also, it appears, certain distinctive ideas about punctuation.

Peterson regarded himself, as he put it, as "a Social Scientist ... developing Avery as a Social Experiment Station" through his newspaper. He sought to improve the minds and morals of the townspeople. This was not pure altruism. Several of them owed Peterson money; by reforming the town, he hoped to get it back.

But he also wanted citizens to understand that Darwin's theory of evolution was a continuation of Christ's work. He encouraged readers to accelerate the cause of social progress by constantly asking themselves a simple question: "What would Jesus do?"

I discovered the incomparable Peterson recently while doing research among some obscure pamphlets published around 1925. So it was a jolt to find that staple bit of contemporary evangelical Christian pop-culture -- sometimes reduced to an acronym and printed on bracelets -- in such an unusual context. But no accident, as it turns out: Peterson was a fan of the Rev. Charles M. Sheldon’s novel In His Steps (1896), which is credited as the source of the whole phenomenon, although he cannot have anticipated its mass-marketing a century later.

Like my wild potato-growing Darwinian socialist editor, Sheldon thought that asking WWJD? would have social consequences. It would make the person asking it “identify himself with the great causes of Humanity in some personal way that would call for self-denial and suffering,” as one character in the novel puts it.

Not so coincidentally, Garry Wills takes a skeptical look at WWJD in the opening pages of his new book, What Jesus Meant, published by Viking. He takes it as a variety of spiritual kitsch -- an aspect of the fundamentalist and Republican counterculture, centered around suburban mega-churches offering a premium on individual salvation.

In any case, says Wills, the question is misleading and perhaps dangerous. The gospels aren’t a record of exemplary moments; the actions of Jesus are not meant as a template. “He is a divine mystery walking among men,” writes Wills. “The only way we can directly imitate him is to act as if we were gods ourselves -- yet that is the very thing he forbids.”

Wills, a professor emeritus of history at Northwestern University, was on the way to becoming a Jesuit when he left the seminary, almost 50 years ago, to begin writing for William F. Buckley at The National Review. At the time, that opinion magazine had a very impressive roster of conservative literary talent; its contributors included Joan Didion, Hugh Kenner, John Leonard, and Evelyn Waugh. (The mental firepower there has fallen off a good bit in the meantime.) Wills came to support the civil rights movement and oppose the Vietnam war, which made for a certain amount of tension; he parted ways with Buckley’s journal in the early 1970s. The story is told in his Confessions of a Conservative (1979) – a fascinating memoir, intercalated with what is, for the nonspecialist anyway, an alarmingly close analysis of St. Augustine’s City of God.

Today -- many books and countless articles later -- Wills is usually described as a liberal in both politics and theology, though that characterization might not hold up under scrutiny. His outlook is sui generis, like that of some vastly more learned Axel Peterson.

His short book on Jesus is a case in point. You pick it up expecting (well, I did, anyway) that Wills might be at least somewhat sympathetic to the efforts of the Jesus Seminar to identify the core teachings of the historical Jesus. Over the years, scholars associated with the seminar cut away more and more of the events and sayings attributed to Jesus in the four gospels, arguing that they were additions, superimposed on the record later.

After all this winnowing, there remained a handful of teachings -- turn the other cheek, be a good Samaritan, love your enemies, have faith in God -- that seemed anodyne, if not actually bland. This is Jesus as groovy rabbi, urging everybody to just be nice. Which, under the circumstances, often seems to be the limit of moral ambition available to the liberal imagination.

Wills draws a firm line between his approach and that of the Jesus Seminar. He has no interest in the scholarly quest for “the historical Jesus,” which he calls a variation of fundamentalism: “It believes in the literal sense of the Bible,” writes Wills, “it just reduces the Bible to what it can take as literal quotations from Jesus.” Picking and choosing among the parts of the textual record is anathema to him: “The only Jesus we have,” writes Wills, “is the Jesus of faith. If you reject the faith, there is no reason to trust anything the Gospels say.”

He comes very close to the position put forward by C.S. Lewis, that evangelical-Christian favorite. “A man who was merely a man and said the sort of things Jesus said,” as Lewis put it, “would not be a great moral teacher. He would either be a lunatic -- on a level with the man who says he is a poached egg -- or else he would be the Devil of Hell. You must make your choice. Either this man was, and is, the Son of God; or else a madman or something worse.”

That’s a pretty stark range of alternatives. For now I’ll just dodge the question and run the risk of an eternity in weasel hell. Taking it as a given that Jesus is what the Christian scriptures say he claimed to be -- “the only-begotten Son of the Father” -- Wills somehow never succumbs to the dullest consequence of piety, the idea that Jesus is easy to understand. “What he signified is always more challenging than we expect,” he writes, “more outrageous, more egregious.”

He was, as the expression goes, transgressive. He “preferred the company of the lowly and despised the rich and powerful. He crossed lines of ritual purity to deal with the unclean – with lepers, the possessed, the insane, with prostitutes and adulterers and collaborators with Rome. (Was he subtly mocking ritual purification when he filled the waters with wine?) He was called a bastard and was rejected by his own brothers and the rest of his family.”

Some of that alienation had come following his encounter with John the Baptist -- as strange a figure as any in ancient literature: “a wild man, raggedly clad in animal skins, who denounces those coming near to him as ‘vipers offspring.’” Wills writes that the effect on his family must have been dismaying: “They would have felt what families feel today when their sons or daughters join a ‘cult.’”

What emerges from the gospels, as Wills tells it, is a figure so abject as to embody a kind of permanent challenge to any established authority or code of propriety. (What would Jesus do? Hang out on skid row, that’s what.) His last action on earth is to tell a criminal being executed next to him that they will be together in paradise.

Wills says that he intends his book to be a work of devotion, not of scholarship. But the latter is not lacking. He just keeps it subdued. Irritated by the tendency for renderings of Christian scripture to have an elevated and elegant tone, Wills, a classicist by training, makes his own translations. He conveys the crude vigor of New Testament Greek, which has about as much in common with that of Plato as the prose of Mickey Spillane does with James Joyce. (As Nietzsche once put it: “It was cunning of God to learn Greek when He wished to speak to man, and not to learn it better.”)

Stripping away any trace of King James Version brocade, Wills leaves the reader with Jesus’s words in something close to the rough eloquence of the public square. “I say to all you can hear me: Love your foes, help those who hate you, praise those who curse you, pray for those who abuse you. To one who punches your cheek, offer the other cheek. To one seizing your cloak, do not refuse the tunic under it. Whoever asks, give to him. Whoever seizes, do not resist. Exactly how you wish to be treated, in that way treat others.... Your great reward will be that you are the children of the Highest One, who also favors ingrates and scoundrels.”

A bit of sarcasm, perhaps, there at the end -- which is something I don’t remember from Sunday school, though admittedly it has been a while. The strangeness of Jesus comes through clearly; it is a message that stands all “normal” values on their head. And it gives added force to another remark by Nietzsche: “In truth, there was only one Christian, and he died on the cross.”

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

