History

Other Casualties

One part of Milovan Djilas's Conversations with Stalin lingers in the memory well after the rest of the book fades. The author himself calls it "a scene such as might be found only in Shakespeare's plays." Actually, it does have its parallels to Rabelais as well; for, like many another gathering of the Soviet elite that Djilas recounts amid the privations of World War II, it features an enormous feast and a marathon drinking session.

This particular miniature carnival occurs in the final months of the war. Stalin is hosting a reception at the Kremlin for the Yugoslav delegation. But before the partying begins, he must disburden himself; for Stalin has heard that Djilas (who would later become vice president under Marshal Tito) has criticized the behavior of some units of the Red Army as it has made its way across Europe.

"Does Djilas, who is himself a writer, not know what human suffering and the human heart are?" cries Stalin. "Can't he understand it if a soldier who has crossed thousands of kilometers through blood and fire and death has fun with a woman or takes a trifle?"

By "having fun," he was referring to well over two million rapes, by Soviet soldiers, of women of all ages and backgrounds. The very indiscriminateness of the sexual violence gives the lie to the idea that it was revenge for the suffering inflicted by the Germans. Inmates liberated from Nazi concentration camps were raped as well.

As for Djilas, it must have seemed, for a moment, as if Stalin's outburst were the kiss of death. Luckily for him, the dictator's mood changed. "He proposed frequent toasts," recalls the author, "flattered one person, joked with another, teased a third, kissed my wife because she was a Serb, and again shed tears over the hardships of the Red Army and Yugoslav ingratitude."

Perhaps in response to the criticism, Stalin issued a command that soldiers behave themselves. The Soviet officers read the proclamation to their troops with a smirk. Everyone knew it meant nothing. Boys will be boys.

The anonymous memoir A Woman in Berlin, now appearing in a new English translation from Metropolitan Books, is an extraordinary chronicle of life in the streets as the Thousand Year Reich turned into rubble and the advancing "Ivans" had their fun. The author was a German editor and journalist who died in 2001. Her book, based on a diary kept over two months during the spring of 1945, first appeared in English in 1954. It was not published in German until 1959, and in Germany it seems to have been regarded as an intolerable faux pas, a violation of the unstated rule that the events never be mentioned again.

The book's rediscovery now comes in the wake of Antony Beevor's massive documentation of the rape campaign in The Fall of Berlin 1945, published three years ago by Viking Press. To judge by the reservations of some military historians, Beevor's account may not be the last word on how Soviet forces advanced into Germany. (A reviewer for Parameters, the journal of the U.S. Army War College, praised it as a work of popular history, but lodged some complaints about certain gaps in the book's account of troop maneuvers.) Yet the book did take an unflinching look at the extent of the sexual terror.

Beevor supplies an introduction to the new edition of A Woman in Berlin, situating the document in historical context. He notes, for example, that the statistics about rape for Berlin "are probably the most reliable in all of Germany," falling somewhere between 95,000 and 130,000 victims "according to the two leading hospitals."

He also points out that there is no particular evidence that rape was treated as a deliberate strategy of war -- as human-rights activists have recently charged the Sudanese military with doing in Darfur. "No document from the Soviet archives indicates anything of the sort in 1945," writes Beevor. But he suggests that the scale of the attacks may have been a by-product of the Red Army's internal culture, even so: "Many soldiers had been so humiliated by their own officers and commissars during the four years of war that they felt driven to expiate bitterness, and German women presented the easiest target. Polish women and female slave laborers in Germany also suffered."

Reading the memoir itself, you find all such interpretive questions being put on hold. It is not just a document. The author, an urbane and articulate woman in her early 30s, writes about the fall of Berlin and her own repeated violation with an astounding coolness -- a bitter, matter-of-fact lucidity, the extreme candor of which is almost disconcerting, given the lack of even a hint of self-pity.

"No doubt about it," she writes after being raped several times in a row. "I have to find a single wolf  to keep away the pack. An officer, as high-ranking as possible, a commandant, a general, whatever I can manage. After all, what are my brains for, my little knowledge of the enemy's language?... My mind is firmly made up. I'll think of something when the time comes. I grin to myself in secret, feel as if I'm performing on the stage. I couldn't care less about the lot of them! I've never been so removed from myself, so alienated. All my feelings seem dead, except for the drive to live."

I've just reviewed the latest edition of A Woman in Berlin for Newsday, and will spare you a recycling of that effort (now available here). Since then, a look at other reviews has revealed some debate over the authenticity of the book. The comments of J.G. Ballard (no stranger to questions of sexuality in extreme conditions) are indicative.

"It is hard to believe, as the author claims, that it was jotted down with a pencil stub on old scraps of paper while she crouched on her bed between bouts of rape," wrote Ballard in The New Statesman a few weeks ago. "The tone is so dispassionate, scenes described in so literary a way, with poignant references to the strangeness of silence and the plaintive cry of a distant bird. We live at a time that places an almost sentimental value on the unsparing truth, however artfully deployed. But the diary seems convincingly real, whether assembled later from the testimonies of a number of women or recorded at first hand by the author."

Given that concern, it is worth looking up the original edition of A Woman in Berlin, now more than 50 years old. It came with an introduction by C.W. Ceram, whose book Gods, Graves, and Scholars, first published in 1951, remains one of the best introductions to the history of archeology. Ceram recalls meeting the author of A Woman in Berlin not long after the war.

"From some hints that she dropped," he wrote, "I learned of this diary's existence. When, after another six months passed, I was permitted to read it, I found described in detail what I already knew from the accounts of others."

That means Ceram saw the book in 1947, at the latest. "It took me more than five years, however, to persuade the author that her diary was unique, that it simply had to be published."

She had, he writes, "jotted down in old ledgers and on loose pages what happened to her.... These pages lie before me as I write. Their vividness as expressed in the furtiveness of the short penciled notes; the excitement they emanate whenever the pencil refuses to describe the facts; the combination of shorthand, longhand, and secret code ... all of this will probably be lost in the depersonalizing effect of the printed word."

Ceram's introduction is interesting for its testimony about the book's provenance. But that remark about "the depersonalizing effect of the printed word" will seem odd to anyone who has read A Woman in Berlin.

In many ways, of course, the book is an account of brutality. (War is a force that turns people into things, as Simone Weil once put it; and killing them is just one of the ways.) But the anonymous author also created a record of what is involved in resisting depersonalization. At times, she is able to see the occupiers, too, as human beings. You cannot put the book down without wondering about the rest of her life.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Necessary Evils

  "In a time of war," wrote Cicero, "the laws are silent." (That's "inter arma silent leges," in case some nuance is missing from the usual English rendering.)

Well, perhaps not quite silent. Marouf A. Hasian's In the Name of Necessity: Military Tribunals and the Loss of American Civil Liberties, available next month from the University of Alabama Press, revisits more than 200 years of American argumentation for and against the legitimacy of "military justice."

That phrase merits the scare quote marks because it is very much open to question whether the two words quite belong together. You don't need to be a pacifist, or even to harbor any doubt about liberal democracy, to have such concerns. The role of the military is, of course, to fight; and the legitimacy of its monopoly on violence derives (in modern societies, anyway) from its subordination to a lawful order. At best -- so the argument might go -- the military can pursue a just policy, subject to oversight and review by outside institutions. Hence the rise of what is called the "civilianization" of military law.

That's the theory, anyway. The actual record is a good bit messier, as Hasian, an associate professor of communications at the University of Utah, shows in some detail. His book presents a series of analytic retellings of events from the Revolutionary War through the detainments at Guantanamo Bay. To some degree, then, it overlaps with William Rehnquist's All the Laws But One: Civil Liberties in Wartime (1998), which focused mainly on cases from the Civil War and the two World Wars.

But the difference is not simply a matter of the opening of a whole new chapter in history over the past four years. Hasian's book is the work of a scholar who has taken "the rhetorical turn" -- drawing on the toolkit of concepts from one of the founding disciplines of humanistic study. A social historian or a law professor might also cover, as he does, the 1862 U.S.-Dakota War tribunal, which led to the execution of a group of Native Americans -- or the 1942 trial of several German saboteurs, captured shortly after they had been deposited on the coasts of New York and Florida, along with bomb-making materials, by U-boat. But Hasian treats these cases neither as events (as a historian would) nor as precedents (the lawyer's concern).

The emphasis in his book falls, rather, on how a particular element of persuasion took shape in each case: the argument of necessity. In each case, the claim was made that circumstances demanded the suspension of normal legal procedures and guarantees, and their replacement by military tribunals that practiced the warlike virtues of secrecy, efficiency, and swiftness.

A philosopher or legal theorist might want to dissect the validity, coherence, or applicability of "necessity" as a principle applied in such cases. Hasian's approach treats it, not as a concept, but as what rhetoric scholars have in recent years called an "ideograph" -- that is, "a key evocative term or phrase that illustrates the political allegiances of an individual and a community in a major social, political, economic, or legal controversy." Other ideographs include such terms as "equality," "progress," and "freedom."

The range of definitions and of emotional charge for each term varies. They have a rather timeless sound, but a complex history of mutations in meaning. And in the heat of debate, they can be made to perform a variety of functions. The meaning of an ideograph in a given context is marked by that context.

Perhaps the strongest version of the argument from necessity is the one that Lincoln made against those who criticized him for suspending habeas corpus during the Civil War: "Are all the laws, but one, to go unexecuted, and the government go to pieces, lest that one be violated?" In other words: Moments of extremity can require the temporary sacrifice of some civil liberties to preserve the rest.

Rehnquist signaled his basic agreement with this line of thought by titling his book All the Laws But One. "It is neither desirable nor is it remotely likely," he wrote there, "that civil liberty will occupy as favored a position in wartime as it does in peacetime."

But even the fairly straightforward affirmation of necessity as a legitimate ground for suspending civil liberties is the result of (and a moment of conflict within) a complicated history of arguments. In tracing out the history of necessity, Hasian identifies two strands of -- well, it's not clear what the expression would be. Perhaps "ideographic DNA"? One he calls the "Tory" concept of necessity; the other, the "Whig" version.

In the Tory framing, there are "many times when a society is called upon to defend itself against riots, revolutions, and rebellions," as Hasian puts it. It is the responsibility of the monarch or the executive branch to recognize the danger and respond accordingly. "Since this is an issue of survival, the military authorities should be given a great deal of discretion. In these situations, the 'will' of those in authority will be of paramount importance."

(In other words, an element of sovereign authority is handed over to the military. The commanding officer is then in the position to say, "I am the law." And legitimately so.)

By contrast, the Whiggish conception of necessity sees "relatively few times when a society has to worry about exigent circumstances." Responsibility for judging whether or not a real emergency exists should fall to the parliament or the legislative branch -- to which the military must remain accountable.

Appropriately enough, given a Whiggish sensibility, this means a certain degree of guardedness and jealousy about the degree of judicial authority delegated to the military. There will be a tendency towards suspicion that the trust might be abused. The Whig discourse on necessity wants to keep to a bare minimum the scope, duration, and degree of secrecy that military tribunals may claim.

The classic formulation of the Whig conception in American history is Ex parte Milligan, from 1866, in which the Supreme Court found that the Union military authorities had overstepped by arresting and trying a Confederate sympathizer in Indiana -- a state where the normal functioning of the court system had not been interrupted by the war.

Of course, Ex parte Milligan fans have taken some hits lately. We had a good run there, for a while. But these days all the swagger comes from enthusiasts for Ex parte Quirin (1942), which denied the claim of German saboteurs to appeal for civil trials.

What makes Hasian's account of Quirin so interesting is his suggestion that some Supreme Court justices "actually thought that their decision would be construed as falling in line with the precedents that placed limits on military commissions and executive power." But if that was the intention 60 years ago, you'd never know it to read the newspapers today.

This is an aerial overview of In the Name of Necessity. The real provocations are in the details. Perhaps the analytic category of ideograph sounds a trifle thin -- turning bloody arguments into something rather anemic. But Hasian's book is ultimately more polemical than that. The framework is only technically "value neutral." He's got a position to stake out.

"In the very, very rare cases of extreme necessity," he writes, "when Congress and the United Nations have decided we need to impose martial law or have commissions in occupied lands, we may have situations where all of the civil courts are closed and where the military may need more discretion."

That much, Hasian will concede to the Tory worldview, and no more. But even then, such assertions of power "need to be held in check by recognizing that most of the time we should begin with the baseline 'Whig' assumption that we want to maintain the civilianization of the military, and not the other way around."

OK, fair enough. Now how will that play out in the courts under Chief Justice Roberts? And particularly under a circumstance in which the Tories are so powerful that nobody really doubts that Chief Justice Roberts will be presiding?

That Whig in extremis John Milton said that necessity is "ever the tyrant's plea." But we might be entering a period when the plea doesn't even have to be made -- when war doesn't silence law, but writes it.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. Suggestions and ideas for future columns are welcome.

The Media World as It Is

I direct a journalism school known for its support of the First Amendment, which we celebrate annually with speeches and case studies. As such, I am a source on free press issues. Reporters contact me about such cases as the Ward Churchill fiasco at the University of Colorado, asking if his “little Eichmanns” depiction of 9/11 victims is protected speech -- perhaps speech that should be protected by me. I deflect those calls, believing such controversies are less about free speech and more about a media culture that values opinion more than fact.

There are many reasons for this, but nearly all point to new technology and profit margins. For starters, opinion costs less to disseminate than fact and can be aligned with a target market. Media chains that care more about revenue than reputation have purchased outlets from families that had safeguarded rights in hometowns for generations. Computerization and the downsizing of newsrooms removed reporters from the scene, making them less visible and, therefore, less vital. Meanwhile, communication technology became affordable, so that consumers had in their palms the power to broadcast or publish at will.

The news industry has changed so much, so quickly. To be sure, some of that change has been refreshing and long in coming. The Internet, and blogging in particular, has created digital dialogues about media, challenging corporate business as usual. But the promise of technology -- that it would build social networks, democratize news and generally enhance information in two-way flows -- has always hinged on the presumption of readily available and verifiable information.

What are the consequences, not only for media, but for academe, when opinion displaces fact?

The Social Idea

I worked as a correspondent and state editor for United Press International in the 1970s. Members of my generation built on the Edward R. Murrow legacy of intermingling fact with experience. Murrow, an original "embedded" journalist, went on a World War II bombing mission over Germany, reporting for the CBS radio network. According to Murrow’s code of ethics, reality was the observed state of people and the world. In other words, he thought reporters had to be on the scene to report fact reliably. Practicing that, he brought the war in Europe to America, just as my generation brought home another war with a different outcome -- the war in Vietnam.

Because universities dealt with fact, they played a role in the social protests of that era. Although organizations and movements such as Students for a Democratic Society (led by student editor Tom Hayden at Michigan) and the Free Speech Movement (University of California at Berkeley) began in the early to mid-1960s, they came together and spurred protests after news coverage of the 1968 Democratic Convention in Chicago. In 1970, coverage of the invasion of Cambodia sparked a protest at Kent State University at which National Guardsmen killed four students and wounded nine. More protests followed nationwide, with two more students killed at Jackson State University. Hundreds of colleges and universities shut down as students went on strike, with subsequent protests often tied to a specific news event. While those protests were political, they were usually in response to factual reporting. Lest we forget, 63 journalists died covering the wars in Vietnam and Cambodia. More recently, 25 journalists died in 2004 alone covering the war in Iraq. One has to ask oneself why that fact alone is scarcely known on college campuses.

Journalists fed Vietnam-era protests simply by reporting fact in a culture that still appreciated its power. We differed from Murrow-era journalists, our mentors: we relied less on emotion and more on anonymous sources (for which we caught, and still catch, hell), and we filed reports in a detached, impartial voice. We practiced objectivity, realizing it could not be fully attained, but amassed factual fragments so that a discernible mosaic could emerge.

We tried to see the world as it was, not as we wished it were.

That definition is derived from the British poet Matthew Arnold, who wrote about "genuine scientific passion" in the 1869 essay, “Culture and Anarchy.” Arnold maintained that people who saw things as they were also apprehended a culture beyond the consciousness of science -- a moral culture, informed by conscience. This, Arnold wrote, was the "social idea" that made such people "the true apostles of equality" who "have a passion for diffusing, for making prevail … from one end of society to the other, the best knowledge, the best ideas of their time."

I read “Culture and Anarchy” during the adversarial days of the Watergate era, and it seemed an appropriate title. Reading it, I came to understand the role of journalism in promoting the “social idea.” The most popular television news show then was “60 Minutes,” on Murrow’s old network, CBS. The show had a novel segment called “Point/Counterpoint.” The most heated segments featured the liberal Shana Alexander against the conservative James J. Kilpatrick.

Their debates heralded a hitherto unexplored news phenomenon in which sources could pit one constellation of facts against an opposing constellation. This media milieu existed well into the 1990s, diluting the previous culture of fact and transforming it into one of factoid (partial/pseudo-fact).  But fact maintained some power.

“Point/Counterpoint” soon changed that. Keep in mind that this segment was so startling at the time that a new satire show, “Saturday Night Live,” ridiculed it regularly with Jane Curtin assailing the viewpoint of Dan Aykroyd who, invariably, would respond, “Jane, you ignorant …” -- and then he said a word that a savvy source, knowing today’s opinionated media, would not tell a reporter, if sharing this anecdote, fully aware of free speech rights, cognizant that the omitted word is a matter of record and also a matter of fact. This is not political correctness but what occurs in a culture of knee-jerk opinions. Responsible people, or people responsible for others, are aware of that culture and wary about adding their informed voices to the social debate, leaving that to those who would seek celebrity or who would entertain or worse, strike fear and outrage in others.

Fear and outrage are byproducts of an uninformed society. Perhaps that is why Americans increasingly embrace entertainment. James Howard Kunstler, in his prescient 1996 book Home from Nowhere, maintains that no other society but ours has been so preoccupied with instantaneous make-believe and on-demand fantasy. Because we fear so much, Kunstler writes, “we must keep the TVs turned on at all waking hours and at very high volume.” To escape fear, we amuse ourselves to death -- a phrase associated with a 1985 book by the great communication scholar Neil Postman, who died in 2003, although many, perhaps some reading this column, were never informed of his passing.

Just Another Viewpoint

When families who lost relatives in the 9/11 attacks were still grieving, Ward Churchill, the Colorado professor, was comparing their loved ones to “little Eichmanns.” His inflammatory essay lay dormant on the Internet until only recently. The controversy over Churchill’s opinions grew so intense that Elizabeth Hoffman, the president at Colorado, announced her resignation amid the media clamor. To be sure, Hoffman was dealing with other controversies at the time, but the coverage of Churchill might well have contributed to that resignation.

A few years ago I could have invited Ward Churchill to Ames, Iowa, during a First Amendment event for a debate about his views. To do so now would create a media circus, bringing controversy to my journalism school. And what good would my counterpoint to his opinions accomplish, however factual I could make such an argument, when my invitation, and my motive for making it, would be the news rather than the substance of any rebuttal?

In the new media environment, fact -- even all-inclusive, verifiable, comprehensive fact -- is seen as just another viewpoint, just another opinion in the menu of fame on demand facilitated by the Internet and cable television. So when a professor writes an essay (or a phrase in that essay) so sensational that it sparks a nationwide debate about free speech or academic freedom, journalists are missing the point. Such controversies, shaped by media practice, merely amuse the opinionated public.

Case in point: Fox’s "American Idol" reportedly inspired 500 million votes this season, quadrupling the number of ballots cast in the last U.S. presidential election. True, many viewers voted more than once for favorite contestants, but that only documents the culture of opinion, especially popular with younger viewers.

David Mindich, author of Just the Facts and Tuned Out: Why Americans Under 40 Don't Follow the News, says journalists have to compete now with shows like “Fear Factor” and “Friends” and so are overemphasizing humor, conflict and sex. Mindich, chair of the journalism and mass communication department at Saint Michael’s College, believes that the power of fact has diminished in this media universe. “One of the most powerful things about journalism itself is that it can communicate to a large audience and then we can have discussions about facts and where the facts bring us; but if we no longer are paying attention, then the facts don’t have the same weight. In the absence of fact opinion becomes more powerful. It’s not only the journalists themselves; it’s the culture apart from the news that has abandoned political discourse based on commonly agreed upon facts.”

In our day, points and counterpoints may be passionate but often also uninformed and usually accusatory. Who wants to participate in a media spectacle where audience and other sources, rather than the reporters, instinctively go for the jugular? Too often in this environment, the only people willing to speak out -- to contribute to the social debate -- are those with special interests or with nothing to lose and celebrity to gain.

The New Silent Majority

Sources who can explain the complex issues of our era, including biotechnology and bioterrorism, often opt out of the social debate. This includes scientists at our best universities. They see the media world as it is … and so have refrained from commenting on it. Increasingly the new silent majority will not go public with their facts or informed perspectives because, they realize, they will be pilloried for doing so by the omnipresent fear-mongers and sensationalists who provide a diet of conflict and provocation in the media.

And that creates a crisis for the First Amendment, which exists because the founders believed that truth would rise to the top -- provided people could read. That is also why education is associated with free speech and why, for generations, equal access to education has been an issue in our country and continues to be in our time. Education and information are requisite in a republic where we elect our representatives; to downsize or cut allocations for either puts the country at risk. Society is experiencing the consequences of cuts to the classroom and the newsroom, and we are getting the governments that we deserve, including blue vs. red states in a divided, point/counterpoint electorate.

What will become of journalism in this perplexing milieu? What happens when profit rather than truth rises to the top? According to David Mindich, “When profit trumps truth, journalism values are diluted, and then people start to wonder about the value of journalism in the first place.” Without facts, he says, people "start to forget the purpose of the First Amendment and then that, in turn, weakens journalism, and it’s a downward spiral from there."

The only way to stop the spiral is through reinvestment in journalism and education. As for me, a journalism educator, my highest priority is training students for the media environment that used to exist, the one concerned about fact holding government accountable -- no matter what the cost. Sooner or later, there will be a place again for fact-gathering journalists. There will be a tipping point when profit plummets for lack of newsroom personnel and technology fails to provide the fix. That day is coming quickly for newspaper publishers, in particular, who are struggling to compete online without realizing there are no competitors on the front doors and welcome mats of American homes, their erstwhile domain. They will realize that the best way to attract new readers is to hire more reporters and place them where citizens can see them on the scene as witnesses, disseminating the verifiable truths of the day.

Author/s: 
Michael Bugeja
Author's email: 
info@insidehighered.com

Michael Bugeja directs the Greenlee School of Journalism and Communication at Iowa State University. He is the author of Interpersonal Divide: The Search for Community in a Technological Age (Oxford University Press, 2005).

Hitler -- the Classic?

It is disagreeable to approach the cashier with a book called How to Read Hitler. One way to take the stink off would be to purchase one or two other volumes in the new How to Read series published by W. W. Norton, which also includes short guides to Shakespeare, Nietzsche, Freud, and Wittgenstein. But at the time, standing in line at a neighborhood bookstore a couple weeks ago, I wasn't aware of those other titles. (The only thing mitigating the embarrassment was knowing that my days as a skinhead, albeit a non-Nazi one, are long over.) And anyway, the appearance of Adolf Hitler in such distinguished literary and philosophical company raises more troubling questions than it resolves.

"Intent on letting the reader experience the pleasures and intellectual stimulation in reading classic authors," according to the back cover, "the How to Read series will facilitate and enrich your understanding of texts vital to the canon." The series editor is Simon Critchley, a professor of philosophy at the New School in New York City, who looms ever larger as the guy capable of defending poststructuralist thought from its naysayers. Furthermore, he's sharp and lucid about it, in ways that might just persuade those naysayers to read Derrida before denouncing him. (Yeah, that'll happen.)

Somehow it is not that difficult to imagine members of the National Association of Scholars waving around the How to Read paperbacks during Congressional hearings, wildly indignant at Critchley's implicit equation of Shakespeare and Hitler as "classic authors" who are "vital to the canon."

False alarm! Sure, the appearance of the Fuhrer alongside the Bard is a bit of a provocation. But Neil Gregor, the author of How to Read Hitler, is a professor of modern German history at the University of Southampton, and under no illusions about the Fuhrer's originality as a thinker or competence as a writer.

About Mein Kampf, Gregor notes that there is "an unmistakably 'stream of consciousness' quality to the writing, which does not appear to have undergone even the most basic editing, let alone anything like polishing." Although Gregor does not mention it, the title Hitler originally gave to the book reveals his weakness for the turgid and the pompous: Four and a Half Years of Struggle against Lies, Stupidity and Cowardice. (The much snappier My Struggle was his publisher's suggestion.)

Incompetent writers make history, too. And learning to read them is not that easy. The fact that Hitler had ideas, rather than just obsessions, is disobliging to consider. Many of the themes and images in his writing reflect an immersion in the fringe literature of his day -- the large body of ephemeral material analyzed by Fritz Stern in his classic study The Politics of Cultural Despair: The Rise of the Germanic Ideology.

But Gregor for the most part ignores this influence on Hitler. He emphasizes, instead, the elements of Hitler's thinking that were, in their day, utterly mainstream. He could quote whole paragraphs of Carl von Clausewitz on strategy. And his racist world view drew out the most virulent consequences of the theories of Arthur de Gobineau and Houston Stewart Chamberlain. (While Hitler was dictating his memoirs in prison following the Beer Hall Putsch, he could point with admiration to one effort to translate their doctrines into policy: The immigration restrictions imposed in the United States in the 1920s.)

Gregor's method is to select passages from Mein Kampf and from an untitled sequel, published posthumously as Hitler's Second Book. He then carefully unpacks them -- showing what else is going on within the text, beneath the level of readily paraphrasable content. With his political autobiography, Hitler was not just recycling the standard complaints of the extreme right, or indulging in Wagnerian arias of soapbox oratory. He was also competing with exponents of similar nationalist ideas. He wrote in order to establish himself as the (literally) commanding figure in the movement.

So there is an implicit dialogue going on, disguised as a rather bombastic monologue. "Long passages of Hitler's writings," as Gregor puts it, "take the form of an extended critique of the political decisions of the late nineteenth century.... Hitler reveals himself not only as a nationalist politician and racist thinker, but -- this is a central characteristic of fascist ideology -- as offering a vision of revitalization and rebirth following the perceived decay of the liberal era, whose failings he intends to overcome."

The means of that "overcoming" were, of course, murderous in practice. The vicious and nauseating imagery accompanying any mention of the Jews -- the obsessive way Hitler constantly returns to metaphors of disease, decay, and infestation -- is the first stage of a dehumanization that is itself an incipient act of terror. The genocidal implications of such language are clear enough. But Gregor is careful to distinguish between the racist stratum of Hitler's dogma (which was uncommonly virulent even compared to the "normal" anti-Semitism of his day) and the very widespread use of militarized imagery and rhetoric in German culture following World War I.

"Many of the anti-Semitic images in Hitler's writing can be found in, say, the work of Houston Stewart Chamberlain," writes Gregor. "Yet when reading Chamberlain's work we hardly sense that we are dealing with an advocate of murder. When reading Hitler, by contrast, we often do -- even before we have considered the detail of what he is discussing. This is because the message is not only to be found in the arguments of the text, but is embedded in the language itself."

How to Read Hitler is a compact book, and a work of "high popularization" rather than a monograph. The two short pages of recommended readings at the end are broad, pointing to works of general interest (for example, The Coming of the Third Reich by Richard Evans) rather than journal articles. It will find its way soon enough into high-school and undergraduate history classrooms -- not to mention the demimonde of "buffs" whose fascination with the Third Reich has kept the History Channel profitable over the years.

At the same time, Gregor's little book is an understated, but very effective, advertisement for the "cultural turn" in historical scholarship. It is an example, that is, of one way historians go about examining not just what documents tell us about the past, but how the language and assumptions of a text operated at the time. His presentation of this approach avoids grand displays of methodological intent. Instead the book just goes about its business -- very judiciously, I think.

But there is one omission that is bothersome. Perhaps it is just an oversight, or, more likely, a side effect of the barriers between disciplines. Either way, it is a great disservice that How to Read Hitler nowhere points out the original effort by someone writing in English to analyze the language and inner logic of Mein Kampf --  the essay by Kenneth Burke called "The Rhetoric of Hitler's 'Battle,' " published in The Southern Review in 1939. (In keeping with my recent enthusing over the "golden age" of the academic literary quarterly, it is worth noting that the Review was published at Louisiana State University and edited by a professor there named Robert Penn Warren.)

Burke's essay was, at the time, an unusual experiment: An analysis of a political text using the tools of literary analysis that Burke had developed while studying Shakespeare and Coleridge. He had published the first translations of Thomas Mann's Death in Venice and of portions of Oswald Spengler's Decline of the West -- arguably a uniquely suitable preparation for the job of reading Hitler. And just as various German émigrés had tried to combine Marx and Freud in an effort to grasp "the mass psychology of fascism" (as Wilhelm Reich's title had it), so had Burke worked out his own combination of the two in a series of strange and brilliant writings published throughout the Depression.

But he kept all of that theoretical apparatus offstage, for the most part, in his long review-essay on a then-new translation of Mein Kampf. Instead, Burke read Hitler's narrative and imagery very closely -- showing how an "exasperating, even nauseating" book served to incite and inspire a mass movement.

This wasn't an abstract exercise. "Let us try," wrote Burke, "to discover what kind of 'medicine' this medicine man has concocted, that we may know, with greater accuracy, exactly what to guard against, if we are to forestall the concocting of similar medicine in America."

Burke's analysis is a tour de force. Revisiting it now, after Gregor's How to Read volume, it is striking how much the two overlap in method and implication. In 1941, Burke reprinted it in his collection The Philosophy of Literary Form, which is now available from the University of California Press. You can also find it in a very useful anthology of Burke's writings called On Symbols and Society, which appears in the University of Chicago Press's series called "The Heritage of Sociology."

"Above all," wrote Burke in 1939, "I believe we must make it apparent that Hitler appeals by relying upon a bastardization of fundamentally religious patterns of thought. In this, if properly presented, there is no slight to religion. There is nothing in religion proper that requires a fascist state. There is much in religion, when misused, that does lead to a fascist state. There is a Latin proverb, Corruptio optimi pessima, 'the corruption of the best is the worst.' And it is the corruptors of religion who are a major menace to the world today, in giving the profound patterns of religious thought a crude and sinister distortion."

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

The Chosen Few

Jerome Karabel's The Chosen is the big meta-academic book of the season -- a scholarly epic reconstructing "the hidden history of admission and exclusion at Harvard, Yale, and Princeton," as the subtitle puts it. Karabel, who is a professor of sociology at the University of California at Berkeley, has fished documents out of the archive with a muckraking zeal worthy of an investigative journalist. And his book, published this month by Houghton Mifflin, is written in far brisker narrative prose than you might expect from somebody working in either sociology or education. That's not meant as a dis to those worthy fields. But in either, the emphasis on calibrating one's method does tend to make storytelling an afterthought.

For Karabel really does have a story to tell. The Chosen shows how the gentlemanly anti-Semitism of the early 20th century precipitated a deep shift in how the country's three most prestigious universities went about the self-appointed task of selecting and grooming an elite.

It is (every aspect of it, really) a touchy subject. The very title of the book is a kind of sucker-punch. It's an allusion to Jehovah's selection of the Jews as the Chosen People, of course; it is also a term sometimes used, with a sarcastic tone, as an old anti-Jewish slur. But Karabel turns it back against the WASP establishment itself -- in ways too subtle, and certainly too well-researched, to be considered merely polemical. (I'm going to highlight some of the more rancor-inspiring implications below, but that is due to my lack of Professor Karabel's good manners.)

The element of exposé pretty much guarantees the book a readership among people fascinated or wounded by the American status system. Which is potentially, of course, a very large readership indeed. But The Chosen is also interesting as an example of sociology being done in an almost classical vein. It is a study of what, almost a century ago, Vilfredo Pareto called "the circulation of elites" -- the process through which "the governing elite is always in a state of slow and continuous transformation ... never being today what it was yesterday."

In broad outline, the story goes something like this. Once upon a time, there were three old and distinguished universities on the east coast of the United States. The Big Three were each somewhat distinctive in character, but also prone to keeping an eye on one another's doings.

Harvard was the school with the most distinguished scholars on its faculty -- and it was also the scene of President Charles Eliot's daring experiment in letting undergraduates pick most of their courses as "electives." There were plenty of the "stupid young sons of the rich" on campus (as one member of the Board of Overseers put it in 1904), but the student body was also relatively diverse. At the other extreme, Princeton was the country club that F. Scott Fitzgerald later described in This Side of Paradise. (When asked how many students there were on campus, a Princeton administrator famously replied, "About 10 percent.")

Finally, there was Yale, which had crafted its institutional identity as an alternative to the regional provincialism of Harvard, or Princeton's warm bath of snobbery. It was "the one place where money makes no difference ... where you stand for what you are," in the words of the then-beloved college novel Stover at Yale, whose hero, Dink Stover, is a clean-cut and charismatic Yalie.

But by World War One, something was menacing these idyllic institutions: Namely, immigration in general and "the Hebrew invasion" in particular. A meeting of New England deans in the spring of 1918 took this on directly. A large and growing percentage of incoming students were the bright and driven children of Eastern European Jewish immigrants. This was particularly true at Harvard, where almost a fifth of the freshman class that year was Jewish. A few years later, the figure would reach 13 percent at Yale -- and even at Princeton, the number of Jewish students had doubled its prewar level.

At the same time, the national discussion over immigration was being shaped by three prominent advocates of "scientific" racism who worried about the decline of America's Nordic stock. They were Madison Grant (Yale 1887), Henry Fairfield Osborn (Princeton 1877), and Lothrop Stoddard (Harvard 1905).

There was, in short, an air of crisis at the Big Three. Even the less robustly bigoted administrators worried about (as one Harvard official put it) "the disinclination, whether justified or not, on the part of non-Jewish students to be thrown into contact with so large a proportion of Jewish undergraduates."

Such, then, was the catalyst for the emergence, at each university, of an intricate and slightly preposterous set of formulae governing the admissions process. Academic performance (the strong point of the Jewish applicants) would be a factor -- but one strictly subordinated to a systematic effort to weigh "character."

That was an elusive quality, of course. But administrators knew when they saw it. Karabel describes the "typology" that Harvard used to make an initial characterization of applicants. The code system included the Boondocker ("unsophisticated rural background"), the Taconic ("culturally depressed background," "low income"), and the Krunch ("main strength is athletic," "prospective varsity athlete"). One student at Yale was selected over an applicant with a stronger record and higher exam scores because, as an administrator put it, "we just thought he was more of a guy."

Now, there is a case to be made for a certain degree of flexibility in admissions criteria. If anything, given our reflex-like tendency to see diversity, as such, as an intrinsic good, it seems counterintuitive to suggest otherwise. There might be some benefit to the devil's-advocate exercise of trying to imagine the case for strictly academic standards.

But Karabel's meticulous and exhaustive record of how the admissions process changed is not presented as an argument for that sort of meritocracy. First of all, it never prevailed to begin with.

A certain gentlemanly disdain for mere study was always part of the Big Three ethos. Nor had there ever been any risk that the dim sons of wealthy alumni would go without the benefits of a prestigious education.

What the convoluted new admissions algorithms did, rather, was permit the institutions to exercise a greater -- but also a more deftly concealed -- authority over the composition of the student body.

"The cornerstones of the new system were discretion and opacity," writes Karabel; "discretion so that gatekeepers would be free to do what they wished and opacity so that how they used their discretion would not be subject to public scrutiny.... Once this capacity to adapt was established, a new admissions regime was in place that was governed by what might be called the 'iron law of admissions': a university will retain a particular admissions policy only so long as it produces outcomes that correspond to perceived institutional interests."

That arrangement allowed for adaptation to social change -- not just by restricting applicants of one minority status in the 1920s, but by incorporating underrepresented students of other backgrounds later. But Karabel's analysis suggests that this had less to do with administrators being "forward-looking and driven by high ideals" than it might appear.

"The Big Three," he writes, "were more often deeply conservative and surprisingly insecure about their status in the higher education pecking order.... Change, when it did come, almost always derived from one of two sources: the continuation of existing policies was believed to pose a threat either to vital institutional interests (above all, maintaining their competitive positions) or to the preservation of the social order of which they were an integral -- and privileged -- part."

Late in the book, Karabel quotes a blistering comment by the American Marxist economist Paul Sweezy (Exeter '27, Harvard '31, Harvard Ph.D. '37) who denounced C. Wright Mills for failing to grasp "the role of the preparatory schools and colleges as recruiters for the ruling class, sucking upwards the ablest elements of the lower classes." Universities such as the Big Three thus performed a double service to the order by "infusing new brains into the ruling class and weakening the potential leadership of the working class."

Undoubtedly so, once upon a time -- but today, perhaps, not so much. The neglect of their duties by the Big Three bourgeoisie is pretty clear from the statistics.

"By 2000," writes Karabel, "the cost of a year at Harvard, Yale, and Princeton had reached the staggering sum of more than $35,000 -- an amount that well under 10 percent of American families could afford....Yet at all three institutions, a majority of students were able to pay their expenses without financial assistance -- compelling testimony that, more than thirty years after the introduction of need-blind admissions, the Big Three continued to draw most of their students from the most affluent members of society." The number of students at the Big Three coming from families in the bottom half of the national income distribution averages out to about 10 percent.

All of which is (as the revolutionary orators used to say) no accident. It is in keeping with Karabel's analysis that the Big Three make only as many adjustments to their admissions criteria as they must to keep the status quo ante on track. Last year, in a speech at the American Council on Education, Harvard's president, Larry Summers, called for preferences for the economically disadvantaged. But in the absence of any strong political or social movement from below -- an active, noisy menace to business as usual -- it's hard to imagine an institutionalized preference for admitting students from working families into the Big Three. (This would have to include vigorous and fairly expensive campaigns of recruitment and retention.)

As Walter Benn Michaels writes in the latest issue of N+1 magazine, any discussion of class and elite education now is an exercise in the limits of the neoliberal imagination. (His essay was excerpted last weekend in the Ideas section of The Boston Globe.)

"Where the old liberalism was interested in mitigating the inequalities produced by the free market," writes Michaels, " neoliberalism -- with its complete faith in the beneficence of the free market -- is interested instead in justifying them. And our schools have a crucial role to play in this. They have become our primary mechanism for convincing ourselves that poor people deserve their poverty, or, to put the point the other way around, they have become our primary mechanism for convincing rich people that we deserve our wealth."

How does this work? Well, it's no secret that going to the Big Three pays off. If, in theory, the door is open to anyone smart and energetic, then everything is fair, right? That's equality of opportunity. And if students at the Big Three then turn out to be drawn mainly from families earning more than $100,000 per year....

Well, life is unfair. But the system isn't.

"But the justification will only work," writes Michaels, if "there really are significant class differences at Harvard. If there really aren't -- if it's your wealth (or your family's wealth) that makes it possible for you to go to an elite school in the first place -- then, of course, the real source
of your success is not the fact that you went to an elite school but the fact that your parents were rich enough to give you the kind of preparation that got you admitted to the elite school. The function of the (very few) poor people at Harvard is to reassure the (very many) rich people at Harvard that you can't just buy your way into Harvard."

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

A Child's Garden of Culture and Atrocity

"Whoever cannot give to himself an adequate account of the past three thousand years," said Goethe, "remains in darkness, without history, living from day to day." That is an expression of a bedrock principle of liberal humanism, European-style. It takes the existence of the educated individual as its basic unit of reference -- its gold standard. But it also judges the quality of that existence by how much the individual has spent in acquiring a sense of the past. That expenditure also means, in effect, going into debt: You’ll never repay everything you owe to previous generations.

That outlook is, when you get right down to it, pretty un-American. It goes against the ideal of unencumbered self-creation that Emerson taught us -- in which we are supposed to throw off the burdens of the past, living always in the vital present. Fortunately, this is not hard to do. The first step is not to learn much history to begin with. (We are good at this.)

Even so, there may be an audience for E. H. Gombrich’s A Little History of the World, now available from Yale University Press, 70 years after it was first written. Imagine Goethe giving up the role of sage long enough to become a children’s author and you will have a reasonably good idea of the book’s content. It goes from prehistory up to the end of the (then-recent) Great War, with particular attention to ancient Greece, the Roman Empire, and the emergence of Judaism, Buddhism, Christianity, and Islam.

As for the style ... well, that is something even more remarkable. The tone is wry, at times, without ever being jokey -- a kind of light seriousness that is very respectful of its young audience. Each chapter is perfectly calibrated to suit the attention span and cognitive powers of a 10 year-old, without ever giving off a trace of condescension.

The effect, even for an adult reader, is incredibly charming -- and, indeed, instructive, at least for anyone with the occasional gap in that interior timeline. (Quick now: Who were the Hohenzollerns? And no, a vague sense that they were German doesn’t count.)

In his later and better-known role as art historian, Gombrich commanded a really humbling degree of erudition, but always with a certain generosity towards his audience. That combination is very much in evidence throughout his first book -- one written in what must have been very trying circumstances.

It was Vienna in 1935. Gombrich was 26 and had recently finished his dissertation. (Writing one "was considered very important," he told a presumably incredulous audience at Rutgers University in 1987, "yet it didn’t take more than a little over a year to write.") His immediate job prospects ranged from the nonexistent to the merely terrible. Besides, he was Jewish, and the writing was on the wall, usually in the form of a swastika.

He managed to find part-time employment with a publishing company. He was asked to evaluate an English-language book on world history for children, to see if it might be worth translating. He recommended against it, but offered instead to write one directly in German. It took him about six weeks, writing a chapter a day. The volume did quite well when it appeared in 1936, though the Nazis eventually stopped publication on the grounds of its "pacifism."

By then, he was in London, working at the Warburg Institute (a major art-history collection, where Gombrich in time became director) and aiding the war effort by translating German radio broadcasts into English. Before leaving Vienna, he had agreed to write another book, this one for adolescents, on the history of art. That project grew into a rather more ambitious work, The Story of Art (1950) -- long the standard overview of European art history, from which generations of museum tour-guides have cribbed.

He wrote it -- along with his more monographic works on iconography and on the psychology of perception -- in English. When his Little History was reprinted in Germany in the mid-1980s, he wrote an afterword for it; but he turned down offers to have it translated into English, preferring to do that himself, and to make some necessary revisions. It is not clear from the edition now available from Yale just how far Gombrich got with that effort at the time of his death in 2001. (The title page gives the translator as Caroline Mustill.) But he did add a postscript called "The Small Part of the History of the World Which I Have Lived Through" -- summing up the 20th century from World War I through the end of the Cold War, and trying to put as optimistic a spin on that record as possible.

The preface by Leonie Gombrich, his granddaughter, quotes some introductory remarks he prepared for the Turkish edition. His Little History, he wrote, "is not, and never was, intended to replace any textbooks of history that may serve a very different purpose at school. I would like my readers to relax, and to follow the story without having to take any notes or to memorize names and dates. In fact, I promise that I shall not examine them on what they have read."

But the book has a strong and serious pedagogical intent, even so. And it comes very directly from Goethe, whose work Gombrich read incessantly as a boy. Upon receiving the Goethe Prize in 1994, Gombrich said that it was the author’s life and writing that taught him "the consoling message ... of a universal citizenship that transcends the confines of nationhood." That seems very much the point of the Little History, which tries to squeeze all of global history into just under three hundred easily read pages -- and I strongly suspect it was just that cosmopolitanism that the Nazi censors really loathed.

Of course, there are gaps and oversights. One that is really troublesome is how the entire history of the Atlantic slave trade is reduced to the dimensions of a brief reference to the Civil War in the United States. This has the effect of making it seem like a distant and cruel episode in the New World, rather than what it really was: a vast and centuries-long process that enriched parts of Europe, depopulated parts of Africa, and anticipated every aspect of totalitarianism possible before the rise of industrialization and mass communications.

Not that Gombrich leaves the history of colonial atrocity entirely out of the picture, especially in recounting the conquest of the Americas: "This chapter in the history of mankind is so appalling and shameful to us Europeans that I would rather not say anything more about it."

In many ways, then, the book is at least as interesting as a specimen of a lost sensibility as it is in its own right, as a first introduction to history. Gombrich later spoke of how much he had been the product of that almost religious veneration of culture that prevailed among the European middle class of the 19th and early 20th centuries.

"I make no great claims for the universality of that tradition," he said during a lecture at Liverpool University in 1981. "Compared to the knowable, its map of knowledge was arbitrary and schematic in the extreme. As is true of all cultures, certain landmarks were supposed to be indispensable for orientation while whole stretches of land remained terra incognita, of relevance only to specialists..... But what I am trying to say is that at least there was a map."

Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. Suggestions and ideas for future columns are welcome.

Piled Higher and Deeper

Rick Perlstein, a friend from the days of Lingua Franca, is now working on a book about Richard Nixon. Last year, he published a series of in-depth articles about the Republican Party and the American conservative movement. (Those are not quite the same thing, though that distinction only becomes salient from time to time.) In short, Perlstein has had occasion to think about honesty and dissimulation -- and about the broad, swampy territory in between, where politicians finesse the difference. As do artists and used-car salesmen....

It’s the job of historians to map that territory. But philosophers wander there, too. “What is truth?” as Nietzsche once asked. “A mobile army of metaphors, metonymies, anthropomorphisms. Truths are illusions of which one has forgotten that they are illusions.” Kind of a Cheneyo-Rumsfeldian ring to that thought. It comes from an essay called “On Truth and Lie in an Extra-Moral Sense,” which does, too, come to think of it.  

So anyway, about a week ago, Rick pointed out a recent discussion of how the Bush Administration is dealing with critics who accuse it of fudging the intelligence that suggested Saddam Hussein had weapons of mass destruction. The link went to a comment by Joshua Micah Marshall, who is a liberal Democrat of the more temperate sort, not prone to hyperventilation. 

“Garden variety lying is knowing it’s Y and saying it’s X,” he wrote, giving Lyndon Johnson on the Gulf of Tonkin as an example. The present executive branch, he continued, shows “a much deeper indifference to factual information in itself.”

Rick posed an interesting question: “Isn't Josh Marshall here describing as the Administration's methodology exactly what that Princeton philosophy prof defines as ‘bullshit’?” That prof being, of course, Harry Frankfurt, whose short and best-selling treatise On Bullshit will probably cover everyone’s Christmas bonus at Princeton University Press this year. 

In February, The New York Times beat us by a day or so with its article on the book, which daintily avoided giving its title. But "Intellectual Affairs" first took a close look, not just at Frankfurt’s text -- noting that it remained essentially unchanged since its original publication as a scholarly paper in the 1980s -- but at the philosophical critique of it presented in G.A. Cohen’s essay “Deeper into Bullshit.” 

Since then, the call for papers for another volume of meditations on the theme of bull has appeared. Truly, we are living in a golden age.

The gist of Frankfurt’s argument, as you may recall, is that pitching BS is a very different form of activity from merely telling a lie. And Marshall’s comments do somewhat echo the philosopher’s point. Frankfurt would agree that “garden variety lying” is saying one thing when you know another to be true. The liar operates within a domain that acknowledges the difference between accuracy and untruth. The bullshitter, in Frankfurt’s analysis, does not. In a sense, then, the other feature of Marshall’s statement would seem to fit. Bullshit involves something like “indifference to factual information in itself.”

So does it follow, then, that in characterizing the Bush team’s state of mind three years ago, during the run-up to the war, we must choose among incompetence, dishonesty, and bullshit? Please understand that I frame it in such terms not from any political motive, but purely in the interest of conceptual rigor.

That said.... It seems to me that this range of terms is inadequate. One may agree that Bush et al. are profoundly indifferent to verifiable truth without concluding that the Frankfurt category necessarily applies.

Per G. A. Cohen’s analysis in “Deeper into Bullshit,” we must stress that Frankfurt’s model rests on a particular understanding of the consciousness of the liar. The mind of the bullshitter is defined by contrast to this state. For the liar, (1) the contrast between truth and untruth is clearly discerned, and (2) that difference would be grasped by the person to whom the liar speaks. But the liar’s intentionality also includes (3) some specific and lucidly grasped advantage over the listener made possible by the act of lying.

By contrast, the bullshitter is vague on (1) and radically unconcerned with (2). There is more work to be done on the elements of relationship and efficacy indicated by (3). We lack a carefully argued account of bullshit’s effect on the bullshitee.

There is, however, another possible state of consciousness not adequately described by Frankfurt’s paper. What might be called “the true believer” is someone possessing an intense concern with truth.

But it is a Higher Truth, which the listener may not (indeed, probably cannot) grasp. The true believer is speaking a truth that somehow exceeds the understanding of the person hearing it.

During the Moscow Trials of the late 1930s, Stalin’s prosecutor lodged numerous charges against the accused that were, by normal standards, absurd. In many cases, the “evidence” could be shown to be false. But so much worse for the facts, at least from the vantage point of the true believer. If you’ve ever known someone who got involved in EST or a multi-level marketing business, the same general principle applies. In each case, it is not quite accurate to say that the true believers are lying. Nor are they bullshitting, in the strictest sense, for they maintain a certain fidelity to the Higher Truth.

Similarly, it did not matter three years ago whether or not any evidence existed to link Saddam and Osama. To anyone possessing the Higher Truth, it was obvious that Iraq must be a training ground for Al Qaeda. And guess what? It is now. So why argue about it?

On a less world-historical scale, I see something interesting and apropos in Academe, the magazine of the American Association of University Professors. In the latest issue, David Horowitz makes clear that he is not a liar just because he told a national television audience something that he knew was not true. 

(This item was brought to my attention by a friend who teaches in a state undergoing one of Horowitz’s ideological rectification campaigns. My guess is that he’d rather not be thanked by name.)

Here’s the story so far: In February, while the Ward Churchill debate was heating up, Horowitz appeared on Bill O’Reilly’s program. It came up that Horowitz, like Churchill, had been invited to lecture at Hamilton College at some point. But he was not, he said, “a speaker paid by and invited by the faculty.” 

As we all know, university faculties are hotbeds of left-wing extremism. (Especially the business schools and engineering departments. And reports of how hotel-management students are forced to read speeches by Pol Pot are positively blood-curdling.) Anyway, whenever Horowitz appears on campus, it’s because some plucky youngsters invite him. He was at Hamilton because he had been asked by “the conservative kids.”

That came as a surprise to Maurice Isserman, a left-of-center historian who teaches at Hamilton College. When I saw him at a conference a few years ago, he seemed to have a little gray in his hair, and his last book, The Other American: The Life of Michael Harrington, was a biography of the founder of the Democratic Socialists of America. No doubt he’s been called all sorts of things over the years, but “conservative kid” is not one of them. And when Horowitz spoke at Hamilton a few years ago, it was as a guest lecturer in Isserman’s class on the 1960s. 

As Isserman put it in the September/October issue of Academe: “Contrary to the impression he gave on ‘The O’Reilly Factor,’ Horowitz was, in fact, an official guest of Hamilton College in fall 2002, invited by a faculty member, introduced at his talk by the dean of the faculty, and generously compensated for his time.”

I will leave to you the pleasure and edification of watching Horowitz explain himself in the latest issue of Academe. But in short, he could not tell the truth because that would have been a lie, so he had to say something untrue in order to speak a Higher Truth. 

My apologies for the pretzel-like twistiness of that paraphrase. It is all so much clearer in the original Newspeak: Thoughtcrime is doubleplusungood.

Scott McLemee

Where Have All the Big Questions Gone?

Some months ago I started asking friends, colleagues from my teaching days, researchers in higher education, faculty members of various ages and ranks, deans, provosts and presidents, and focus groups of students: “What’s the status of the Big Questions on your campus?” Quite deliberately I avoided defining “Big Questions,” but I gave as examples such questions as “Who am I? Where do I come from? What am I going to do with my life? What are my values? Is there such a thing as evil? What does it mean to be human? How can I understand suffering and death? What obligations do I have to other people? What does it mean to be a citizen in a democracy? What makes work, or a life, meaningful and satisfying?” In other words, I wanted to know what was happening to questions of meaning and value that traditionally have been close to the heart of a liberal education.

Some of what I found puzzled me. People pointed out quite properly that some Big Questions were alive and well in academia today. These included some questions about the origin of the universe, the emergence of life, the nature of consciousness, and others that have been raised by the scientific breakthroughs of the past few decades.

In the humanities and related social sciences the situation was rather different. Some friends reminded me that not all big questions were in eclipse. Over the past generation faculty members have paid great attention to questions of racial, ethnic, gender, and sexual identity. Curricular structures, professional patterns, and the like continue to be transformed by this set of questions. Professors, as well as students, care about these questions, and as a result, write, teach, and learn about them with passion.

But there was wide agreement that other big questions, the ones about meaning, value, moral and civic responsibility, were in eclipse. To be sure, some individual faculty members addressed them, and when they did, students responded powerfully. In fact, in a recent Teagle-sponsored meeting on a related topic, participants kept using words such as “hungry,” “thirsty,” and “parched” to describe students’ eagerness to find ways in the curriculum, or outside it, to address these questions. But the old curricular structures that put these questions front and center have over the years often faded or been dismantled, including core curricula, great books programs, surveys “from Plato to NATO,” and general education requirements of various sorts. Only rarely have new structures emerged to replace them.

I am puzzled why. To be sure, these Big Questions are hot potatoes. Sensitivities are high. And faculty members always have the excuse that they have other more pressing things to do. Over two years ago, in an article entitled “Aim Low,” Stanley Fish attacked some of the gurus of higher education (notably, Ernest Boyer) and their insistence that college education should “go beyond the developing of intellectual and technical skills and … mastery of a scholarly domain. It should include the competence to act in the world and the judgment to do so wisely” (Chronicle of Higher Education, May 16, 2003). Fish hasn’t been the only one to point out that calls to “fashion” moral and civic-minded citizens, or to “go beyond” academic competency, assume that students now routinely achieve such mastery of intellectual and scholarly skills. We all know that’s far from the case.

Minimalist approaches -- ones that limit teaching to what another friend calls “sectoral knowledge” -- are alluring. But if you are committed to a liberal education, it’s hard just to aim low and leave it at that. The fact that American university students need to develop basic competencies provides an excuse, not a reason, for avoiding the Big Questions. Students also need to be challenged, provoked, and helped to explore the issues they will inevitably face as citizens and as individuals. Why have we been so reluctant to develop the structures, in the curriculum or beyond it, that provide students with the intellectual tools they need to grapple thoughtfully over the course of a lifetime with these questions?

I see four possible reasons:

1. Faculty members are scared away by the straw man Stanley Fish and others have set up. Despite accusations of liberal bias and “brainwashing,” no faculty member I know wants to “mold,” “fashion” or “proselytize” students. But that’s not what exploring the Big Questions is all about. Along with all the paraphernalia college students bring with them these days are Big Questions, often poorly formulated and approached with no clue that anyone in the history of humankind has ever had anything useful to say about any of them. There’s no need to answer those questions for students, or to try to fashion them into noble people or virtuous citizens for the republic. There is, however, every reason to help students develop the vocabularies, the metaphors, the exempla, the historical perspective, and the patterns of analysis and argument that let them, over time, answer those questions for themselves.

2. A second possible reason is that faculty are put off by the feeling that they are not “experts” in these matters. In a culture that quite properly values professional expertise, forays beyond one’s field of competence are understandably suspect. But one does not have to be a moral philosopher to raise the Big Questions and show some of the ways smart people in the past have struggled with them. I won’t pontificate about other fields, but in my own field -- classics and ancient history -- the Big Questions come bubbling up between the floorboards of any text I have ever taught. I don’t have to be a specialist in philosophy or political science to see that Thucydides has something to say about power and morality, or the Odyssey about being a father and a husband. A classicist’s job, as I see it, is to challenge students to think about what’s implicit in a text, help them make it explicit, and use that understanding to think with.

3. Or is it that engaging with these “Big Questions,” or anything resembling them, is the third rail of a professional career? Senior colleagues don’t encourage it; professional journals don’t publish it; deans don’t reward it; and a half dozen disgruntled students might sink your tenure case with their teaching evaluations. You learn early on in an academic career not to touch the third rail. If this is right, do we need to rewire the whole reward system of academia?

4. Or is a former student of mine, now teaching at a fine women’s college, correct when she says that on her campus “It tends to be that … those who talk about morality and the big questions come from such an entrenched far right position … that the rest of us … run for cover”?

Some of the above? All of the above? None of the above? You tell me, but let’s not shrug our shoulders and walk away from the topic until we’ve dealt with one more issue: What happens if, for whatever reason, faculty members run for the hills when the Big Questions, including the ones about morality and civic responsibility, arise? Is this not to lose focus on what matters most in an education intended to last for a lifetime? In running away, do we not then leave the field to ideologues and others we cannot trust, and create a vacuum that may be filled by proselytizers, propagandists, or the unspoken but powerful manipulations of consumer culture? Does this not sever one of the roots that has over the centuries kept liberal education alive and flourishing? But, most serious of all, will we at each Commencement say farewell to another class of students knowing that for all they have learned, they are ill equipped to lead an examined life? And if we do, can we claim to be surprised and without responsibility if a few decades later these same graduates abuse the positions of power and trust in our corporate and civic life to which they have ascended?
              
    

W. Robert Connor

W. Robert Connor is president of the Teagle Foundation, which is dedicated to strengthening liberal education. More on the foundation's “Big Questions” project may be found on its Web site. This essay is based on remarks Connor recently made at a meeting of the Middle Atlantic Chapters of Phi Beta Kappa, at the University of Pennsylvania.

The Lowering of Higher Education

I just finished grading a hefty stack of final examinations for my introductory-level U.S. history survey course. The results were baleful.

On one section of the exam, for example, I gave students a set of identification terms covering events and personages from class, asking them to supply a definition, date, and significance for each. In response to “Scopes Monkey Trial,” one student offered the following:

"The scopes monkey trial was a case in the supreme court that debated teaching evolution in the schools. It happened in 1925. Mr. Scope a teacher in a school wanted to teach about God and did not want to teach about evolution. The ACLU brought in lawyers to help with the case of Mr. Scopes. In the end Mr. Scopes side did not have the people's opinion. Evolution won. It is significant because now you have to teach evolution in school, you can't teach about God."

This answer might be considered a nearly perfect piece of evidence against intelligent design of the universe, since it gets just about everything (apart from the date) wrong: punctuation, spelling, grammar, and historical fact.

For those needing a refresher, Tennessee high school biology teacher John T. Scopes assigned a textbook informed by evolutionary theory, a subject prohibited by the state legislature. The court ruled against Scopes, who had, obviously, broken the law. But the defense won in the court of public opinion, especially after the ACLU’s lawyer, Clarence Darrow, tore apart William Jennings Bryan, the former Democratic presidential candidate, witness for the prosecution, and Biblical fundamentalist. The press dubbed it the "Scopes Monkey Trial" (inaccurately, since the theory of human evolution centered upon apes) and pilloried Bryan. As Will Rogers put it, "I see you can't say that man descended from the ape. At least that's the law in Tennessee. But do they have a law to keep a man from making a jackass of himself?"

An outside observer might ascribe my student’s mistakes to the political culture of this Midwestern city, where barely a day goes by without a letter to the editor in the local paper from some self-appointed foot soldier of the religious right.

That, however, wouldn’t explain another student who thought the 1898 war between the United States and Spain, fought heavily in Cuba, was about communism (not introduced into Cuba until after the 1959 revolution). Nor would it explain a third student who thought that the Scopes verdict condoned Jim Crow racial segregation.

A minority of students performed admirably, receiving grades in the range of A, while hewing, of course, to varied interpretations. Their success proved the exam was based upon reasonable expectations. However, the median exam grade was a C -- the lowest I’ve yet recorded, and fairly devastating for a generation of students who typically aspire to a B.

I was wondering what to make of this dispiriting but solitary data set when I read about the Education Department study released late last week that shows that the average literacy of college-educated Americans declined precipitously between 1992 and 2003. Just 25 percent of college graduates scored high enough on the tests to be deemed “proficient” in literacy.

By this measure, literacy does not denote the mere ability to read and write, but comprehension, analysis, assessment, and reflection. While “proficiency” in such attributes ranks above “basic” or “intermediate,” it hardly denotes rocket science. It simply measures such tasks as comparing the viewpoints in two contrasting editorials.

The error-ridden response I received about the Scopes Monkey Trial speaks less to the ideological clash of science and faith than to a rather more elemental matter. As students in the 1960s used to say, the issue is not the issue. The issue is the declining ability to learn. The problem we face, in all but the most privileged institutions, is a pronounced and increasing deficiency of student readiness, knowledge, and capacity.

Neither right nor left has yet come to terms with the crisis of literacy and its impact on higher education. The higher education program of liberals revolves around access and diversity, laudable aims that do not speak to intellectual standards. Conservatives, for their part, are prone to wild fantasies about totalitarian leftist domination of the campuses. They cannot imagine a failure even more troubling than indoctrination -- the inability of students to assimilate information at all, whether delivered from a perspective of the left or the right.

It would be facile to blame the universities for the literacy crisis, since it pervades our entire culture at every level. The Education Department’s statistics found a 10-year decline in the ability to read and analyze prose in high school, college, and graduate students alike.

However, the crisis affects the university profoundly, and not only at open-enrollment institutions like the regional campus on which I teach. Under economic pressure from declining government funding and faced with market competition from low-bar institutions, many universities have increasingly felt compelled to take on students whose preparation, despite their possession of a high school diploma, is wholly inadequate. This shores up tuition revenue, but the core project of the higher learning is increasingly threatened by the ubiquity of semi-literacy.

How can human thought, sustained for generations through the culture of the book, be preserved in the epoch of television and the computer? How can a university system dedicated to the public trust and now badly eroded by market forces carry out its civic and intellectual mission without compromising its integrity?

These questions cry out for answer if we are to stem a tide of semi-literacy that imports nothing less than the erosion of the American mind.

Christopher Phelps

Christopher Phelps is associate professor of history at Ohio State University at Mansfield.

Technicolor Dreams

If my recent experiences are any indication, we professors face a daunting challenge: The polarized American political environment has conditioned our students to see life in monochrome. The Right tells them to view all as either black or white, while the Left insists that everything is a shade of gray.

We’ve long struggled with the either/or student, the one who writes a history essay in which events are stripped of nuance and presented as the working out of God’s preordained plan; or the sociology student who wants to view poverty as a modern variant of 19th century Social Darwinism. These students -- assuming they’re not acting out some ideological group’s agenda -- can be helped along simply by designing lessons that require them to argue opposing points of view.

Yet despite all the hoopla about the resurgence of conservatism, I get more students whose blinders are postmodern rather than traditional. This is to say that many of them don’t see the value of holding a steadfast position on much of anything, nor do they exhibit much understanding of those who do. They live in worlds of constant parsing and exceptions. Let me illustrate with two examples.

In history classes dealing with the Gilded Age I routinely assign Edward Bellamy’s utopian novel Looking Backward. In brief, protagonist Julian West employs a hypnotist for his insomnia and retires to an underground chamber. His Boston home burns in 1887, and West is not discovered until 2000, when he is revived by Dr. Leete. He awakens to a cooperative socialist utopia. West’s comments on his time say much about late 19th century social conflict, and Leete’s description of utopian Boston makes for interesting class discussion. I know that some students will complain about the novel’s didactic tone, others will argue that Bellamy’s utopia is too homogeneous, and a few will assert that Bellamy’s explanation of how utopia emerged is contrived. What I had not foreseen is how many students find the very notion of a utopia so far-fetched that they can’t move beyond incredulity to consider other themes.

When I paraphrase Oscar Wilde that a map of the world that doesn’t include Utopia isn’t worth glancing at, some students simply don’t get it. "Utopia is impossible” is the most common remark I hear. “Perhaps so,” I challenge, “but is an impossible quest the same as a worthless quest?” That sparks some debate, but the room lights up when I ask students to explain why a utopia is impossible. Their reasons are rooted more in contemporary frustration than historical failure. Multiculturalism is often cited. “The world is too diverse to ever get people to agree” is one rejoinder I often receive.

It’s disturbing enough to contemplate that a social construct designed to promote global understanding can be twisted to justify existing social division, but far more unsettling is what often comes next. When I ask students if they can envision dystopia, the floodgates open. No problems on that score! In fact, they draw upon popular culture to chronicle various forms of it: Escape From New York, Blade Runner, Planet of the Apes…. “Could any of these happen?” I timidly ask. “Oh sure, these could happen easily,” I’m told.

My second jolt came in a different form, an interdisciplinary course I teach in which students read Tim O’Brien’s elegantly written Vietnam War novel The Things They Carried. O’Brien violates old novelistic standards; his book is both fictional and autobiographical, with the lines between the two left deliberately blurred. My students adored the book and looked at me as if they had just seen a Model-T Ford when I mentioned that a few critics felt the book was dishonest because it did not distinguish fact from imagination. “It says right on the cover ‘a work of fiction,’” noted one student. When I countered that we ourselves were using it to discuss the actual Vietnam War, several students immediately defended the superiority of metaphorical truth because it “makes you think more.” I then asked students who had seen the film The Deer Hunter whether the famed Russian roulette scene was troubling, given that there is no recorded incident of such events taking place in Vietnam. None of them were bothered by this.

I mentioned John Sayles’ use of composite characters in the film Matewan. They had no problem with that, though none could tell me what actually happened during the bloody coal strikes that convulsed West Virginia in the early 1920s. When I probed whether writers or filmmakers have any responsibility to tell the truth, not a single student felt they did. “What about politicians?” I asked. While many felt that truth-telling politicians were no more likely than utopia, the consensus view was that they should tell the truth. I then queried, “So who gets to say who has to tell the truth and who gets to stretch it?” I was prepared to rest on my own clever laurels, until I got the students’ rejoinder! Two of my very best students said, in essence, that all ethics are situational, with one remarking, “No one believes there’s an absolute standard of right and wrong.” I tentatively reminded him that many of the 40 million Americans who call themselves "evangelical Christians" believe rather firmly in moral absolutes. From the back of the room piped a voice, “They need to get over themselves.”

I should interject that this intense give-and-take was possible because I let my students know that their values are their own business. In this debate I went out of my way to let them know I wasn’t condemning their values; in fact, I share many of their views on moral relativism, the ambiguity of truth, and artistic license. But I felt I could not allow them to dismiss objective reality so cavalierly. Nor, if I am true to my professed belief in the academy as a place where various viewpoints must be engaged, could I allow them to refuse to consider anyone who holds fast to moral absolutism.

The stories have semi-happy endings. I eventually got my history students to consider the usefulness of utopian thinking. This happened after I suggested that people of the late 19th century had better imaginations than those of the early 21st, which challenged them to contemplate the link between utopian visions and reform, and to see how a moralist like Bellamy could inspire what they would deem more pragmatic social changes. My O’Brien class came through when I taught the concept of simulacra, showed them a clip from the film Wag the Dog and then asked them to contemplate why some see disguised fiction as dangerous. (Some made connections to the current war in Iraq, but that’s another story!)

My goal in both cases was to make students see points of view other than their own. Both incidents also reminded me it’s not just the religious or conservative kids who need to broaden their horizons. We need to get all students to see the world in Technicolor, even when their own social palettes are monochromatic. Indeed, the entire academy could do worse than remember the words of Dudley Field Malone, one of the lawyers who defended John T. Scopes. Malone remarked, “I have never in my life learned anything from any man who agreed with me.” 

Robert E. Weir

Robert E. Weir is a visiting professor at Commonwealth College of the University of Massachusetts at Amherst and in the American studies program at Smith College.
