History

Ambiguous Legacy

There will be a meeting tonight in Washington to celebrate the life of James Weinstein, the radical historian and publisher who died in Chicago last Thursday. The news was by no means unexpected. But the gathering is impromptu, and it will probably be small.

I suppose one thing we will all have in common is an inability to refer to the deceased as "James Weinstein." He was Jimmy. It's a fair guess that the turnout will include union organizers and progressive lobbyists and a few journalists. There will undoubtedly be an academic or two -- or several, if you count the defrocked, the ABDs, and the folks who otherwise decided (contra David Horowitz) that university life is not necessarily conducive to being a leftist.

Many people know that Weinstein's book The Decline of Socialism in America, 1912-1925 (first published in 1967 and reprinted by Rutgers University Press in 1984) started out as his dissertation. After all this time, it remains a landmark work in the scholarship on U.S. radicalism. But only this weekend, in talking with a mutual friend, did I learn that he never actually bothered to get the Ph.D.

Diagnosed with brain cancer, Jimmy spent the final weeks of his life in bed at home. He gave a series of interviews to Miles Harvey, an author and former managing editor at In These Times, the progressive magazine that Jimmy founded. The body of reminiscences is now being transcribed, and will join the collection of the Oral History Research Office at Columbia University.

"We both knew we were in a race against time," Miles said when we talked by phone over the weekend. "We mined a lot of interesting stuff. Jimmy was the Zelig of the American left."

The son of a prosperous businessman, he worked for years in electronics factories as a rank-and-file Communist union member. One of his anecdotes from that era is something of a legend -- has become, even, a part of history. One day a comrade asked Jimmy to give a ride to a taciturn fellow doing party business of an undisclosed nature. A few years later, he recognized the passenger as Julius Rosenberg. (Suffice it to say that Weinstein's future biographer will probably find a day-by-day account of his life during the early 1950s in the FBI surveillance files.)

Jimmy left the party in 1956, as part of a major exodus in the wake of Khrushchev's denunciation of the crimes of Stalin. He was never apologetic about his membership. But neither was he even slightly sentimental about it.

Well before massive documentation from the Russian archives settled the question, he dismissed the arguments of those who insisted that the American CP and the Soviet spy apparatus in the U.S. had to be considered completely distinct entities. Any good party member would have been glad to help out, he said: "We would have considered it an honor." (Jimmy himself never received that distinction. According to Miles Harvey, the request that he chauffeur Julius Rosenberg had less to do with Jimmy's reliability as a revolutionary than with the fact that he was one of the Communists on hand who owned a car.)

The fact that he once said this at a public event, where non-leftists could hear him -- and that he did so during the Reagan administration, no less -- is still held against him in some circles.

The usual pattern, of course, is to abandon a rigid, dogmatic political ideology -- and then to adopt another one. People spend entire careers boldly denouncing other people for their own previous mistakes. It's easy work, and the market for it is steady.

Jimmy followed a different course. To begin with, he had never been all that keen on the ideological nuances of the Communist movement. He certainly knew his Marx and Lenin from studying at the party's famous Jefferson School of Social Science, in New York. But somehow the doctrinal points counted less than what he'd picked up from all those years as a union activist. At least that's the impression of his friend Jim McNeill, another former managing editor at In These Times. (McNeill is now an organizer for the Service Employees International Union.)

Nearing 30, Weinstein decided to go to graduate school to study history; and his instinct was to dig into an earlier period of American radicalism -- when it spoke an idiom that was much less purely Marxist, and a lot more influential. Up through World War I, the Socialists successfully fielded candidates in local elections and even got the occasional member into Congress. And Eugene Debs, a figure beloved even by those who didn't share his vision of the proletarian commonwealth, could win nearly a million votes for president while imprisoned for an antiwar speech.

Weinstein's research was, in short, a glimpse of an alternative that had been lost. It wasn't simply a matter of government repression, either. There were streaks of doctrinal puritanism, of apocalyptic revolutionism, that eventually proved corrosive. "In large part," as he later put it, "the failure of the American left has been internal." (Whether or not he made the connection isn't clear, but his own experience in the CP would tend to confirm this. As bad as McCarthyism had been for the party, members started quitting en masse once they had to face the truth about Stalin.)

Boiled down, his conclusions amounted to a demand for a major upheaval in the culture of the left. What it needed for the long term, in effect, was a healthy dose of pragmatism. It would also mean learning to think of reforms as part of the process of undermining the power of the profit system -- rather than implicitly seeing reforms as, at best, a kind of compromise with capitalism.

Had he done only that initial study of the Socialist Party (finished in 1962, though only published five years later), Jimmy Weinstein would merit a small but honorable spot in the history of the American left. But in fact he did a lot more.

Today's academic left is very much a star system. Jimmy never had a place in it. If that bothered him, he did a good job of keeping quiet about it. But just for the record, it's worth mentioning that he was present at the creation.

He was part of the group in Madison, Wisconsin that published Studies on the Left between 1959 and 1967. It was the first scholarly journal of Marxist analysis to appear in the United States since at least the end of World War II, and an important point of connection between the American New Left and international currents in radical thought. (The first translation of Walter Benjamin's "The Work of Art in the Age of Mechanical Reproduction," for example, appeared in Studies.)

Jimmy's brief memoir of this period can be found in a volume edited by the radical historian Paul Buhle called History and the New Left: Madison, Wisconsin, 1950-1970 (Temple University Press, 1990). There has long been a tendency to treat the intellectual history of the American left as unfolding primarily in New York City. This is understandable, in some ways, but it introduces gross distortions. It's worth remembering that one of the major publications serving to revitalize radical scholarship was the product of a group of graduate students at the University of Wisconsin. It appears that Buhle's anthology is now out of print. But what's more surprising, I think, is that more research hasn't been done on "the Madison intellectuals" in the meantime.

In keeping with Miles Harvey's characterization of Weinstein as "the Zelig of the American left," we next find him at the Chicago convention of Students for a Democratic Society in 1969. That was the one where -- just as the antiwar movement was starting to get a hearing on Main Street USA -- rival factions waved copies of the Little Red Book in the air and expelled one another. (Want evidence that the left's deepest wounds are self-inflicted? There you go.)

Repelled by the wild-eyed hysteria and terrorist romanticism of the Weather Underground (of which one of his cousins was a member), Jimmy helped start another journal, Socialist Revolution, which was always more cerebral than its up-against-the-wall title might suggest. In 1978, it changed its name to Socialist Review. (This abandonment of "revolution" inspired a certain amount of hand-wringing in some quarters.) It was the venue where, in 1985, Donna Haraway first published her "Cyborg Manifesto." For years afterward, the rumor went around that SR was about to drop "Socialist" from its title, to be replaced by "Postmodern." But in fact it continues now as Radical Society -- a distant descendant, by now, though it still bears a family resemblance to the publications that Jimmy worked on long ago.

Jimmy's last major venture as a publisher -- the culmination of his dream of converting the lessons of radical history into something practical and effective, here and now -- was In These Times, which started as a newspaper in 1976 and turned into a magazine sometime around 1990. A collection of articles from the magazine's first quarter century appeared in 2002 as the book Appeal to Reason -- a title echoing the name of the most widely circulated newspaper of the old Socialist Party.

Pat Aufderheide, now a professor of communications at American University, was ITT's culture editor from 1978 through 1982. She writes about the experience in her book The Daily Planet: A Critic on the Capitalist Culture Beat (University of Minnesota Press, 2000). A whole generation of people were entranced by the countercultural idea that "the personal is the political" -- or its academic doppelganger, the Foucauldian notion that power was everywhere and inescapable. These were recipes, she notes, for "self-marginalization and political fundamentalism" on the left.

"For In These Times," writes Aufderheide, "politics is the prosaic complex of institutions, structures and actions through which people organize consciously for social change.... Richard Rorty would put it in the reformist left category. It is read largely by leftists who do organizing or other practical political work, through labor unions, universities and schools, churches, nonprofit organizations and local and regional government. These are smart people, many of whom are not intellectuals, and who mostly come home late and tired."

The importance of reaching that public -- indeed, the very possibility of doing so -- tends to be overlooked by many people engaged in left-wing academic discourse. ("Our comrades in armchairs," as activists sometimes put it.)

In her book, Aufderheide recalls dealing with "a vocal contingent of academics" who were "always ready to pounce on lack of subtlety, creeping cheerleading, or sentimentality" in the magazine's cultural coverage. "Their critical acuteness, however, often seemed exercised for the satisfaction of intellectual one-upmanship," she writes. "When I begged them to write, to point me to other writers, to serve on the board, there was almost always a stunned silence."

The problem is self-perpetuating. Perhaps it comes down to a lack of good examples. And in that regard, Jimmy's death is more than a personal loss to his friends and family.

It's worth mentioning that, along the way, he wrote a number of other books, with The Long Detour: The History and Failure of the American Left  (Westview, 2003) being his last. It was also his favorite, according to Miles Harvey, whose series of deathbed  interviews will, in time, serve as the starting point for some historical researcher who has perhaps not yet heard of James Weinstein.

To be candid, I didn't care for his final book quite as much as the one he published in 1975 called Ambiguous Legacy: The Left in American Politics. The books are similar in a lot of ways. I'm not sure that my preference for one over the other is entirely defensible.

But it was Ambiguous Legacy that Jimmy inscribed when we met, about 10 years ago. My copy of his first book, the one on the Socialist Party, he dedicated "with hope for our future." Only later did I look at the other volume. Beneath the greeting -- and before his signature -- he wrote: "The legacy is more ambiguous than ever."

Scott McLemee was a contributing editor for In These Times between 1995 and 2001. His column Intellectual Affairs appears here each Tuesday and Thursday.

Throat Culture

For the past few days, I've been waiting for a review copy of Bob Woodward's book The Secret Man: The Story of Watergate's Deep Throat to arrive from Simon and Schuster. So there has been some time to contemplate the way that (no longer quite so) mysterious figure has been "inscribed" in a "double register" of "the historical imaginary," as the cult-stud lingo has it. (Sure hope there's a chance to use "imbricated discourse" soon. Man, that would be sweet.)

Putting it in slightly more commonplace terms: Two versions of Deep Throat have taken shape in the past 30 years or so. They correspond to two different ways of experiencing the odd, complex relationship between media and historical memory.

On the one hand, there was Deep Throat as a participant in a real historical event -- making the question of his motivation an important factor in making sense of what happened. It was even, perhaps, the key to understanding the "deep politics" of Watergate, the hidden forces behind Richard Nixon's fall. The element of lasting secrecy made it all kind of blurry, but in a fascinating way, like some especially suggestive Rorschach blot.

On the other hand, there was Deep Throat as pure icon -- a reference you could recognize (sort of) even without possessing any clear sense of his role in Watergate. It started out with Hal Holbrook's performance in All the President's Men -- which, in turn, was echoed by "the cigarette-smoking man" on "The X Files," as well as the mysterious source of insider information about the Springfield Republican Party on "The Simpsons." And so Deep Throat (whose pseudonym was itself originally a movie title) becomes a mediatic signifier unmoored from any historical signified. (An allusion to an allusion to a secret thus forgotten.)

Different as they might be, these two versions of Deep Throat aren't mutually exclusive. The discourses can indeed become imbricated (yes!), as in the memorable film Dick, which reveals Deep Throat as a pair of idealistic schoolgirls who guide the cluelessly bumbling Woodward and Bernstein through the mysteries of the Nixon White House.

There is something wonderful about this silly premise: In rewriting the history of Watergate, Dick follows the actual events, yet somehow neutralizes their dire logic by just the slightest shift of emphasis. The deepest secret of an agonizing national crisis turns out to be something absurd.

That perspective is either comically subversive or deeply cynical. Either way, it's been less anticlimactic, somehow, than the revelation of Deep Throat's real identity as the former FBI official Mark Felt. So much for the more elaborate theories about Watergate -- that it was, for example, a "silent coup" by a hard-right anticommunist faction of the U.S. military, upset by the administration's dealings with the Soviets and the Chinese. And Deep Throat's role as emblem of noir-ish intrigue may never recover from the impact of the recent, brightly lit video footage of Mark Felt -- half-dazed, half mugging for the camera.

And there have been other disappointments. This week, I had an interesting exchange by e-mail with Bill Gaines, a professor of journalism at the University of Illinois at Urbana-Champaign and two-time winner of the Pulitzer, not counting his two other times as finalist. His part in the Deep Throat saga came late in the story, and it's caused him a certain amount of grief.

But it was also -- this seems to me obvious -- quite honorable. If anything, it is even more worthy of note now that Bob Woodward is telling his side of the story. (While Carl Bernstein also has a chapter in the book, it was Woodward who had the connection with Felt.)

In 1999, Gaines and his students began an investigation designed to determine the identity of Deep Throat. The project lasted four years. It involved sifting through thousands of pages of primary documents and reading acres of Watergate memoir and analysis -- as well as comparing the original articles by Woodward and Bernstein from The Washington Post to the narrative they provided in their book All the President's Men. Gaines also tracked down earlier versions of the manuscript for that volume -- drafted before Woodward decided to reveal that he had a privileged source of inside information.

Gaines and his students compiled a database they used to determine which of the likely candidates would have actually been in a position to leak the information that Deep Throat provided. In April 2003, they held a press conference at the Watergate complex in Washington, DC, where they revealed ... the wrong guy.

After a period of thinking that Deep Throat must have been Patrick Buchanan (once a speechwriter for Nixon), the researchers concluded that it had actually been Fred Fielding, an attorney who had worked as assistant to John Dean. The original report from the project making the case for Fielding is still available online -- now updated with a text from Gaines saying, "We were wrong."

The aftermath of Felt's revelation, in late May, was predictably unpleasant for Gaines. There were hundreds of e-mail messages, and his phone rang off the hook. "Some snickered as if we had run the wrong way with the football," he told me.

But he added, "My students were extremely loyal and have told anyone who will listen that they were thrilled with being a part of this project even though it failed." Some of those who worked on the project came around to help Gaines with the deluge of correspondence, and otherwise lend moral support.

As mistaken deductions go, the argument offered by Gaines and his students two years ago is pretty rigorous. Its one major error seems to have come at an early stage, with the assumption that Woodward's account of Deep Throat was as exact as discretion would allow. That was in keeping with Woodward's own statements, over the years. "It's okay to leave things out to protect the identity of a source," he told the San Francisco Chronicle in 2002, "but to add something affirmative that isn't true is to publish something you know to be an inaccuracy. I don't believe that's ethical for a reporter."

The problem is that the original account of Deep Throat doesn't line up quite perfectly with what is known about Mark Felt. Some of the discrepancies are small, but puzzling even so. Deep Throat is a chain smoker, while Felt claimed to have given up the demon weed in 1943. "The idea that Felt only smokes in the garage [during his secretive rendezvous with Woodward] is a little hard to swallow," says Gaines. "I cannot picture him buying a pack and throwing the rest away for the drama it will provide." By contrast, Fielding was a smoker.

More substantive, perhaps, are questions about what Deep Throat knew and how he knew it. Gaines and his students noted that statements attributed to Deep Throat in All the President's Men were credited to a White House source in the original newspaper articles by Woodward and Bernstein. (Felt was second in command at the FBI, not someone working directly for the White House, as was Fielding.)

Deep Throat provided authoritative information gleaned from listening to Nixon's secret recordings during a meeting in November 1973. That was several months after Felt left the FBI. And to complicate things still more, no one from the FBI had been at the meeting where the recordings were played.

According to Gaines, that means Felt could only have learned about the contents of the recordings at third hand, at best. Felt was, as Gaines put it in an e-mail note, "so far removed that his comments to Woodward would have to be considered hearsay, and not the kind of thing a reporter could write for fact by quoting an anonymous source."

When I ask Gaines if there is anything he hopes to learn from Bob Woodward's new book, he mentions hoping for some insight into one of the more memorable descriptions of the secret source -- the one about how Deep Throat "knew too much literature too well." In any case, Gaines makes a strong argument that Woodward himself took a certain amount of literary license in transforming Felt into Deep Throat.

"We know from our copy of an earlier manuscript that Woodward changed some direct quotes attributed to Throat," he notes. "They were not major changes, but enough to tell us that he was loose with the quotes. There is information attributed to Throat that Felt would not have had, or that doesnot agree with what we found in FBI files."

As the saying has it, journalists write a first draft of history. One of the ethical questions involves trying to figure out just how much discretion they get in polishing the manuscript. Gaines seems careful not to say anything too forceful on this score -- though he does make clear that he isn't charging Woodward with creating a composite character.

That has long been one of the suspicions about Deep Throat. Even the new revelation hasn't quite dispelled it. Just after Felt went public with his announcement, Jon Wiener, a professor of history at the University of California at Irvine, reviewed some of the grounds for thinking that "several people who provided key information ... were turned into a composite figure for dramatic purposes" by Woodward and Bernstein. (You can find more of Wiener's comments here, at the very end of the article.)

For his part, Gaines says that the Deep Throat investigation isn't quite closed -- although he wishes it were. "I have always wanted to move on to something more important for the class project," he told me, "but the students and the media have caused us to keep going back to the Throat story."

Maybe now they should look into the mystery surrounding Deep Throat's most famous line: his memorable injunction to Woodward, "Follow the money."

It appears in the movie version of All the President's Men, though it can't be found in the book. When asked about it in an interview some years ago, Woodward guessed that it was an embellishment by William Goldman, the screenwriter. But Goldman has insisted that he got the line from Woodward.

Now it's part of the national mythology. But it may never have actually happened. Sometimes I wish the discourses would stop imbricating long enough to get this kind of thing sorted out.

Real Knowledge

During the heyday of American economic and geographical expansion, in the late 19th century, the men who sold real estate occupied a distinct vocational niche. They were slightly less respectable than, say, riverboat gamblers -- but undoubtedly more so than pirates on the open seas. It was a good job for someone who didn’t mind leaving town quickly.

But about 100 years ago, something important began to happen, as Jeffrey M. Hornstein recounts in A Nation of Realtors: A Cultural History of the Twentieth Century American Middle Class, published this spring by Duke University Press.  Some of those engaged in the trade started to understand themselves as professionals.

They created local realty boards and introduced licensing as a means by which reputable practitioners could distinguish themselves from grifters. And in time, they were well enough organized to lobby the federal government on housing policy -- favoring developments that encouraged the building of single-family units, rather than public housing. Their efforts, as Hornstein writes, "would effectively create a broad new white middle class haven in the suburbs, while leaving behind the upper class and the poor in cities increasingly polarized by race and wealth."

I picked up A Nation of Realtors expecting a mixture of social history and Glengarry Glen Ross. It's actually something different: a contribution to understanding how certain aspects of middle-class identity took shape -- both among the men (and later, increasingly, women) who identified themselves as Realtors and among their customers. Particularly interesting is the chapter "Applied Realology," which recounts the early efforts of a handful of academics to create a field of study that would then (in turn) bolster the profession’s claims to legitimacy and rigor.

Hornstein recently answered a series of questions about his book -- a brief shift of his attention back to scholarly concerns, since he is now organizing director of Service Employees International Union, Local 36, in Philadelphia.

Q: Before getting to your book, let me ask about your move from historical research to union organizing. What's the story behind that?

A: I was applying to graduate school in my senior year of college and my advisor told me that while he was sure I could handle grad school, he saw me as more of "a politician than a political scientist." I had always been involved in organizing people and was a campus leader. But I also enjoyed academic work, and went on to get two graduate degrees, one in political science from Penn, another in history from the University of Maryland.

While I was doing the history Ph.D. at Maryland, a group of teaching assistants got together and realized that we were an exploited group that could benefit from a union. Helping to form an organizing committee, affiliating with a national union, getting to know hard-boiled organizers (many of whom were also intellectuals), and attempting to persuade my peers that they needed to take control of their own working conditions through collective action captured my imagination and interest much more than research, writing, or teaching.  

After a long intellectual and personal journey, I finally defended my dissertation. The academic job market looked bleak, particularly as a graduate of a non-elite institution. And when I was honest with myself, I realized that my experience forming a graduate employee union engaged me far more than the intellectual work.

Armed with this insight, I put the diss in a box, and two weeks later, I was at the AFL-CIO’s Organizing Institute getting my first taste of what it would be like to organize workers as a vocation. In the dark barroom in the basement of the George Meany Center for Labor Studies, a recruiter from an SEIU local in Ohio approached me and asked me if I’d like to spend the next few years of my life living in Red Roof Inns, trying to help low-wage workers improve their lives. Two weeks later, I landed in Columbus, Ohio and I was soon hooked.  

And I would add this: The supply of talented and committed organizers is far outstripped by the demand. The labor movement’s current crisis is, frankly, a huge opportunity for energetic and idealistic people to make a real difference. Hard work and commitment is really rewarded in the labor movement, and one can move quickly into positions of responsibility. It’s very demanding and often frustrating work, but it’s about as fulfilling a vocation as I could imagine.

Q: You discuss the emergence of realtors as the rise of a new kind of social identity, "the business professional." But I'm left wondering about early local real-estate boards. They sound kind of like lodges or fraternal groups, as much as anything else. In what sense are they comparable to today's professional organizations, as opposed to, say, the Elks or the Jaycees?

A: Indeed, early boards were very much like fraternal organizations. They were all male and clubby, there was often a "board home" that offered a retreat space, and so on. Early real estate board newsletters are rife with the sorts of jokes about women and minorities that were standard fare in the 1910s and 1920s -- jokes that, I argue, help to police the boundaries of masculinity.  

In the early chapters of the book, I provide brief sketches of the workings of the Chicago and Philadelphia real estate boards, as well as a sort of anthropological view of early real estate conventions. My favorite was the 1915 Los Angeles convention, during which the main social event was a drag party. In my view, the conventions, the board meetings, the social events, the publications, all formed a homosocial space in which a particular sort of masculinity was performed, where the conventions of middle-class masculinity were established and reinforced.  

In the early 1920s, the emphasis began to shift from fraternalism to a more technocratic, professional modality. Herbert Nelson took the helm at the National Association of Real Estate Boards in 1923, and he started to make NAREB look much more like a modern professional organization. In some respects he created the mold. He made long-term strategic plans, asserted the necessity for a permanent Realtor presence in Washington, D.C., pushed for standards for licensing, worked with Herbert Hoover’s Commerce Department to promulgate a standard zoning act, and linked up with Professor Richard T. Ely [of the University of Wisconsin at Madison] to help "scientize" the field.

Nelson served as executive director of NAREB for over 30 years. During his tenure, the organization grew, differentiated, specialized, and became a powerful national political actor. In sum, it became a true modern professional association in most ways. Yet like most other professional organizations prior to the ascendancy of feminism and the major incursion of women into the professions, masculine clubbiness remained an important element in the organizational culture well into the 1970s.    

In sum, the story I tell about the complex interdependencies of class, gender, and work identities is largely about the Realtors’ attempts to transform an Elks-like organization into a modern, "professional" business association.

Q: On the one hand, they see what they are doing as a kind of applied social science -- also creating, as you put it, "a professional metanarrative." On the other hand, you note that Ely's Institute for Research in Land Economics was a casualty of the end of the real estate bubble. Doesn't that justify some cynicism about realtors' quest for academic legitimacy?

A: I don’t see the Realtors or the social scientists like Ely in cynical terms at all. In fact, both parties are quite earnest about what they’re doing, in my view. Ely was nothing if not a true believer in the socially transformative power of his research and of social scientific research in general. He managed to persuade a faction of influential Realtors, primarily large-scale developers ("community-builders") such as J.C. Nichols, that research was the key to professionalism, prosperity, and high-quality real estate development.  

Ely’s Institute was not a casualty of the implosion of the 1926 Florida real estate bubble as such. But the real estate collapse and the ensuing Depression made it much harder for the Realtors to make claims to authority based on disinterested science.

It’s not that the grounding of the whole field of Land Economics was problematic -- at least no more so than any other field of social or human science, particularly one that produces knowledge that can be used for commercial purposes.

The academic field was in its infancy in the 1910s and 1920s, and there were intra-disciplinary squabbles between the older, more historical economists like Ely and the younger generation, which was much more model- and mathematics-driven. At the same time, there were sharp divisions among Realtors between those who believed that professionalism required science (and licensing, and zoning, and so on) and those who rejected this idea.  

So, yes, the Elyian attempt at organizing the real estate industry on a purely ‘scientific’ basis, operating primarily in the interest of the social good, was largely a failure. However, the 1920s mark a watershed in that the National Association became a major producer and consumer of social scientific knowledge. Business schools began to offer real estate as a course of study. Textbooks, replete with charts and graphs and economic equations, proliferated. Prominent academics threw their lot in with the Realtors.

In the end, the industry established its own think tank, the Urban Land Institute, the motto of which is “Under All, The Land” -- taken straight from Ely’s work. But the profession itself remained divided over the value of ‘science’ -- the community-builders generally supported efforts to scientize the field, while those on the more speculative end of the profession were generally opposed.

But again, I don’t think that the grounding of the field of land economics is any more questionable than any other subfield of economics, such as finance or accounting.

Q: Your book left me with a sort of chicken-and-egg question. You connect the growth of the profession with certain cultural norms -- the tendency to define oneself as middle-class, the expectation of private home ownership, etc. Didn't those aspirations have really deep roots in American culture, which the Realtors simply appealed to as part of their own legitimization? Or were they more the result of lobbying, advertising, and other activities of the real-estate profession?

A: Absolutely, these tendencies have roots deep in American culture. The term "middle class" was not really used until the late 19th century -- "middling sorts" was the more prevalent term before then. The "classless society" has long been a trope in American culture, the idea that with hard work, perseverance, and a little luck, anyone can "make it" in America, that the boundaries between social positions are fluid, etc.  

But it’s not until the early-to-mid 20th century that homeownership and middle-class identity come to be conflated.  The "American Dream" is redefined from being about political freedom to being about homeownership. At around the same time, debt is redefined as "credit" and "equity."

So, yes, I’d agree to some extent that the Realtors tapped into longstanding cultural norms as part of their efforts at self-legitimization. Like most successful political actors, they harnessed cultural commonsense for their own ends -- namely, to make homeownership integral to middle-class identity. Their political work enabled them, in the midst of the Depression, to get the National Housing Act passed as they wrote it -- with provisions that greatly privileged just the sort of single-family, suburban homes leading members of NAREB were intent on building.

The Realtors used the cultural material at hand to make their interests seem to be the interests of the whole society. But, as we know from many fine studies of suburban development, many people and many competing visions of the American landscape were marginalized in the process.

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Other Casualties

One part of Milovan Djilas's Conversations with Stalin lingers in the memory well after the rest of the book fades. The author himself calls it "a scene such as might be found only in Shakespeare's plays." Actually, it does have its parallels to Rabelais, as well; for like many another gathering of the Soviet elite amidst the privations of World War II that Djilas recounts, there is an enormous feast, and a marathon drinking session.

This particular miniature carnival occurs in the final months of the war. Stalin is hosting a reception at the Kremlin for the Yugoslavian delegation. But before the partying begins, he must disburden himself; for Stalin has heard that Djilas (who would later become vice president under Marshal Tito) has criticized the behavior of some units of the Red Army as it has made its way across Europe.

"Does Djilas, who is himself a writer, not know what human suffering and the human heart are?" cries Stalin. "Can't he understand it if a soldier who has crossed thousands of kilometers through blood and fire and death has fun with a woman or takes a trifle?"

By "having fun," he was referring to well over two million rapes, by Soviet soldiers, of women of all ages and backgrounds. The very indiscriminateness of the sexual violence gives the lie to the idea that it was revenge for the suffering inflicted by the Germans. Inmates liberated from Nazi concentration camps were raped as well.

As for Djilas, it must have seemed, for a moment, as if Stalin's outburst were the kiss of death. Luckily for him, the dictator's mood changed. "He proposed frequent toasts," recalls the author, "flattered one person, joked with another, teased a third, kissed my wife because she was a Serb, and again shed tears over the hardships of the Red Army and Yugoslav ingratitude."

Perhaps in response to the criticism, Stalin issued a command that soldiers behave themselves. The Soviet officers read the proclamation to their troops with a smirk. Everyone knew it meant nothing. Boys will be boys.

The anonymous memoir A Woman in Berlin, now appearing in a new English translation from Metropolitan Books, is an extraordinary chronicle of life in the streets as the Thousand Year Reich turned into rubble and the advancing "Ivans" had their fun. The author was a German editor and journalist who died in 2001. Her book, based on a diary kept over two months during the spring of 1945, first appeared in English in 1954. It was only published in German in 1959, where it seems to have been regarded as an intolerable faux pas, a violation of the unstated rule that the events never be mentioned again.

The book's rediscovery now comes in the wake of Antony Beevor's massive documentation of the rape campaign in The Fall of Berlin 1945, published three years ago by Viking Press. To judge by the reservations of some military historians, Beevor's account may not be the last word on how Soviet forces advanced into Germany. (A reviewer for Parameters, the journal of the U.S. Army War College, praised it as a work of popular history, but lodged some complaints about certain gaps in the book's account of troop maneuvers.) Yet the book did take an unflinching look at the extent of the sexual terror.

Beevor supplies an introduction to the new edition of A Woman in Berlin, situating the document in historical context. He notes, for example, that the statistics about rape for Berlin "are probably the most reliable in all of Germany," falling somewhere between 95,000 and 130,000 victims "according to the two leading hospitals."

He also points out that there is no particular evidence that rape was treated as a deliberate strategy of war -- as human-rights activists have recently charged the Sudanese military with doing in Darfur. "No document from the Soviet archives indicates anything of the sort in 1945," writes Beevor. But he suggests that the scale of the attacks may have been a by-product of the Red Army's internal culture, even so: "Many soldiers had been so humiliated by their own officers and commissars during the four years of war that they felt driven to expiate bitterness, and German women presented the easiest target. Polish women and female slave laborers in Germany also suffered."

Reading the memoir itself, you find all such interpretive questions being put on hold. It is not just a document. The author, an urbane and articulate woman in her early 30s, writes about the fall of Berlin and her own repeated violation with an astounding coolness -- a bitter, matter-of-fact lucidity, the extreme candor of which is almost disconcerting, given the lack of even a hint of self-pity.

"No doubt about it," she writes after being raped several times in a row. "I have to find a single wolf  to keep away the pack. An officer, as high-ranking as possible, a commandant, a general, whatever I can manage. After all, what are my brains for, my little knowledge of the enemy's language?... My mind is firmly made up. I'll think of something when the time comes. I grin to myself in secret, feel as if I'm performing on the stage. I couldn't care less about the lot of them! I've never been so removed from myself, so alienated. All my feelings seem dead, except for the drive to live."

I've just reviewed the latest edition of A Woman in Berlin for Newsday, and will spare you a recycling of that effort (now available here). Since then, a look at other reviews has revealed some debate over the authenticity of the book. The comments of J.G. Ballard (no stranger to questions of sexuality in extreme conditions) are indicative.

"It is hard to believe, as the author claims, that it was jotted down with a pencil stub on old scraps of paper while she crouched on her bed between bouts of rape," wrote Ballard in The New Statesman a few weeks ago. "The tone is so dispassionate, scenes described in so literary a way, with poignant references to the strangeness of silence and the plaintive cry of a distant bird. We live at a time that places an almost sentimental value on the unsparing truth, however artfully deployed. But the diary seems convincingly real, whether assembled later from the testimonies of a number of women or recorded at first hand by the author."

Given that concern, it is worth looking up the original edition of A Woman in Berlin, now more than 50 years old. It came with an introduction by C.W. Ceram, whose book Gods, Graves, and Scholars, first published in 1951, remains one of the best introductions to the history of archeology. Ceram recalls meeting the author of A Woman in Berlin not long after the war.

"From some hints that she dropped," he wrote, "I learned of this diary's existence. When, after another six months passed, I was permitted to read it, I found described in detail what I already knew from the accounts of others."

That means Ceram saw the book in 1947, at the latest. "It took me more than five years, however, to persuade the author that her diary was unique, that it simply had to be published."

She had, he writes, "jotted down in old ledgers and on loose pages what happened to her.... These pages lie before me as I write. Their vividness as expressed in the furtiveness of the short penciled notes; the excitement they emanate whenever the pencil refuses to describe the facts; the combination of shorthand, longhand, and secret code ... all of this will probably be lost in the depersonalizing effect of the printed word."

Ceram's introduction is interesting for its testimony about the book's provenance. But that remark about "the depersonalizing effect of the printed word" will seem odd to anyone who has read A Woman in Berlin.

In many ways, of course, the book is an account of brutality. (War is a force that turns people into things, as Simone Weil once put it; and killing them is just one of the ways.) But the anonymous author also created a record of what is involved in resisting depersonalization. At times, she is able to see the occupiers, too, as human beings. You cannot put the book down without wondering about the rest of her life.

Necessary Evils

  "In a time of war," wrote Cicero, "the laws are silent." (That's "inter arma silent leges," in case some nuance is missing from the usual English rendering.)

Well, perhaps not quite silent. Marouf A. Hasian's In the Name of Necessity: Military Tribunals and the Loss of American Civil Liberties, available next month from the University of Alabama Press, revisits more than 200 years of American argumentation for and against the legitimacy of "military justice."

That phrase merits the scare quote marks because it is very much open to question whether the two words quite belong together. You don't need to be a pacifist, or even to harbor any doubt about liberal democracy, to have such concerns. The role of the military is, of course, to fight; and the legitimacy of its monopoly on violence derives (in modern societies anyway) from its subordination to a lawful order. At best -- so the argument might go -- the military can pursue a just policy, subject to oversight and review by outside institutions. Hence the rise of what is called the "civilianization" of military law.

That's the theory, anyway. The actual record is a good bit messier, as Hasian, an associate professor of communications at the University of Utah, shows in some detail. His book presents a series of analytic retellings of events from the Revolutionary War through the detainments at Guantanamo Bay. To some degree, then, it overlaps with William Rehnquist's All the Laws But One: Civil Liberties in Wartime (1998), which focused mainly on cases from the Civil War and the two World Wars.

But the difference is not simply a matter of the opening of a whole new chapter in history over the past four years. Hasian's book is the work of a scholar who has taken "the rhetorical turn" -- drawing on the toolkit of concepts from one of the founding disciplines of humanistic study. A social historian or a law professor might also cover, as he does, the 1862 U.S.-Dakota war tribunal, which led to the execution of a group of Native Americans -- or the 1942 trial of several German saboteurs, captured shortly after they had been deposited on the coasts of New York and Florida, along with bomb-making materials, by U-boat. But Hasian treats these cases neither as events (as a historian would) nor as precedents (the lawyer's concern).

The emphasis in his book falls, rather, on how a particular element of persuasion took shape in each case: the argument of necessity. In each case, the claim was made that circumstances demanded the suspension of normal legal procedures and guarantees, and their replacement by military tribunals that practiced the warlike virtues of secrecy, efficiency, and swiftness.

A philosopher or legal theorist might want to dissect the validity, coherence, or applicability of "necessity" as a principle applied in such cases. Hasian's approach treats it, not as a concept, but as what rhetoric scholars have in recent years called an "ideograph" -- that is, "a key evocative term or phrase that illustrates the political allegiances of an individual and a community in a major social, political, economic, or legal controversy." Other ideographs include such terms as "equality," "progress," and "freedom."

The range of definitions and of emotional charge for each term varies. They have a rather timeless sound, but a complex history of mutations in meaning. And in the heat of debate, they can be made to perform a variety of functions. The meaning of an ideograph in a given context is marked by that context.

Perhaps the strongest version of the argument from necessity is the one that Lincoln made against those who criticized him for suspending habeas corpus during the Civil War: "Are all the laws, but one, to go unexecuted, and the government go to pieces, lest that one be violated?" In other words: Moments of extremity can require the temporary sacrifice of some civil liberties to preserve the rest.

Rehnquist signaled his basic agreement with this line of thought by titling his book All the Laws But One. "It is neither desirable nor is it remotely likely," he wrote there, "that civil liberty will occupy as favored a position in wartime as it does in peacetime."

But even the fairly straightforward affirmation of necessity as a legitimate ground for suspending civil liberties is the result of (and a moment of conflict within) a complicated history of arguments. In tracing out the history of necessity, Hasian identifies two strands of -- well, it's not clear what the expression would be. Perhaps "ideographic DNA"? One he calls the "Tory" concept of necessity; the other, the "Whig" version.

In the Tory framing, there are "many times when a society is called upon to defend itself against riots, revolutions, and rebellions," as Hasian puts it. It is the responsibility of the monarch or the executive branch to recognize the danger and respond accordingly. "Since this is an issue of survival, the military authorities should be given a great deal of discretion. In these situations, the 'will' of those in authority will be of paramount importance."

(In other words, an element of sovereign authority is handed over to the military. The commanding officer is then in the position to say, "I am the law." And legitimately so.)

By contrast, the Whiggish conception of necessity sees "relatively few times when a society has to worry about exigent circumstances." Responsibility for judging whether or not a real emergency exists should fall to the parliament or the legislative branch -- to which the military must remain accountable.

Appropriately enough, given a Whiggish sensibility, this means a certain degree of guardedness and jealousy about the degree of judicial authority delegated to the military. There will be a tendency towards suspicion that the trust might be abused. The Whig discourse on necessity wants to keep to a bare minimum the scope, duration, and degree of secrecy that military tribunals may claim.

The classic formulation of the Whig conception in American history is Ex parte Milligan, from 1866, in which the Supreme Court found that the Union military authorities had overstepped by arresting and trying a Confederate sympathizer in Indiana -- a state where the normal functioning of the court system had not been interrupted by the war.

Of course, Ex parte Milligan fans have taken some hits lately. We had a good run there, for a while. But lately all the swagger comes from enthusiasts for Ex parte Quirin (1942), which denied the claim of German saboteurs to appeal for civil trials.

What makes Hasian's account of Quirin so interesting is his suggestion that some Supreme Court justices "actually thought that their decision would be construed as falling in line with the precedents that placed limits on military commissions and executive power." But if that was the intention 60 years ago, you'd never know it to read the newspapers today.

This is an aerial overview of In the Name of Necessity. The real provocations are in the details. Perhaps the analytic category of ideograph sounds a trifle thin -- turning bloody arguments into something rather anemic. But Hasian's book is ultimately more polemical than that. The framework is only just technically "value neutral." He's got a position to stake out.

"In the very, very rare cases of extreme necessity," he writes, "when Congress and the United Nations have decided we need to impose martial law or have commissions in occupied lands, we may have situations where all of the civil courts are closed and where the military may need more discretion."

That much, Hasian will concede to the Tory worldview, and no more. But even then, such assertions of power "need to be held in check by recognizing that most of the time we should begin with the baseline 'Whig' assumption that we want to maintain the civilianization of the military, and not the other way around."

OK, fair enough. Now how will that play out in the courts under Chief Justice Roberts? And particularly under a circumstance in which the Tories are so powerful that nobody really doubts that Chief Justice Roberts will be presiding?

That Whig in extremis John Milton said that necessity is "ever the tyrant's plea." But we might be entering a period when the plea doesn't even have to be made -- when war doesn't silence law, but writes it.

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. Suggestions and ideas for future columns are welcome.

The Media World as It Is

I direct a journalism school known for its support of the First Amendment, which we celebrate annually with speeches and case studies. As such, I am a source on free press issues. Reporters contact me about such cases as the Ward Churchill fiasco at the University of Colorado, asking if his “little Eichmanns” depiction of 9/11 victims is protected speech -- perhaps speech that should be protected by me. I deflect those calls, believing such controversies are less about free speech and more about a media culture that values opinion more than fact.

There are many reasons for this, but nearly all point to new technology and profit margin. For starters, opinion costs less to disseminate than fact and can be aligned with a target market. Media chains that care more about revenue than reputation have purchased outlets from families that had safeguarded rights in hometowns for generations. Computerization and downsizing of newsrooms deleted reporters from the scene, so that they became less visible and, therefore, less vital. Meanwhile, communication technology became affordable, so that consumers had in their palms the power to broadcast or publish at will.

The news industry has changed so much, so quickly. To be sure, some of that change has been refreshing and long in coming. The Internet, and blogging in particular, have created digital dialogues about media, challenging corporate business as usual. But the promise of technology -- that it would build social networks, democratize news and generally enhance information in two-way flows -- has always hinged on the presumption of readily available and verifiable information.

What are the consequences, not only for media, but for academe, when opinion displaces fact?

The Social Idea

I worked as a correspondent and state editor for United Press International in the 1970s. Members of my generation built on the Edward R. Murrow legacy of intermingling fact with experience. Murrow, an original "embedded" journalist, went on a World War II bombing mission over Germany, reporting for the CBS radio network. According to Murrow’s code of ethics, reality was the observed state of people and the world. In other words, he thought reporters had to be on the scene to report fact reliably. Practicing that, he brought the war in Europe to America, just as my generation brought home another war with a different outcome -- the war in Vietnam.

Because universities dealt with fact, they played a role in the social protests of that era. Although organizations and movements such as Students for a Democratic Society (led by student editor Tom Hayden at Michigan) and the Free Speech Movement (University of California-Berkeley) began in the early to mid 1960s, they came together and spurred protests after news coverage of the 1968 Democratic Convention in Chicago. In 1970, coverage of the invasion of Cambodia sparked a protest at Kent State University at which four students were killed and eight injured. More protests followed nationwide, with two more students killed at Jackson State University. Hundreds of colleges and universities shut down as students went on strike, with subsequent protests often tied to a specific news event. While those protests were political, they were usually in response to factual reporting. Lest we forget, 63 journalists died covering the wars in Vietnam and Cambodia. More recently, 25 journalists died in 2004 alone covering the war in Iraq. One has to ask oneself why that fact alone is scarcely known on college campuses.

Journalists fed Vietnam-era protests simply by reporting fact in a culture that still appreciated its power. We differed from Murrow-era journalists, our mentors, relying less on emotion and more on anonymous sources, for which we caught (and still catch) hell, filing reports in a detached, impartial voice. We practiced objectivity, realizing it could not be fully attained but amassing factual fragments so that a discernable mosaic could emerge.

We tried to see the world as it was, not as we wished it were.

That definition is derived from the British poet Matthew Arnold, who wrote about "genuine scientific passion" in the 1869 essay, “Culture and Anarchy.” Arnold maintained that people who saw things as they were also apprehended a culture beyond the consciousness of science -- a moral culture, informed by conscience. This, Arnold wrote, was the "social idea" that made such people "the true apostles of equality" who "have a passion for diffusing, for making prevail … from one end of society to the other, the best knowledge, the best ideas of their time."

I read “Culture and Anarchy” during the adversary days of the Watergate era. It seemed an appropriate title. Doing so I understood the role of journalism in promoting the “social idea.” The most popular television news show then was “60 Minutes,” on Murrow’s old network, CBS. The show had a novel segment called “Point/Counterpoint.” The most heated segments featured liberal Shana Alexander against conservative James J. Kilpatrick.

Their debates heralded a hitherto unexplored news phenomenon in which sources could pit one constellation of facts against an opposing constellation. This media milieu existed well into the 1990s, diluting the previous culture of fact and transforming it into one of factoid (partial/pseudo-fact).  But fact maintained some power.

“Point/Counterpoint” soon changed that. Keep in mind that this segment was so startling at the time that a new satire show, “Saturday Night Live,” ridiculed it regularly, with Jane Curtin assailing the viewpoint of Dan Aykroyd, who, invariably, would respond, “Jane, you ignorant …” -- and then say a word that a savvy source, knowing today’s opinionated media, would not repeat to a reporter when sharing this anecdote, even though fully aware of free speech rights and cognizant that the omitted word is a matter of record and also a matter of fact. This is not political correctness but what occurs in a culture of knee-jerk opinions. Responsible people, or people responsible for others, are aware of that culture and wary of adding their informed voices to the social debate, leaving that to those who would seek celebrity, or who would entertain, or, worse, strike fear and outrage in others.

Fear and outrage are byproducts of an uninformed society. Perhaps that is why Americans increasingly embrace entertainment. James Howard Kunstler, in his prescient 1996 book Home from Nowhere, maintains that no other society but ours has been so preoccupied with instantaneous make-believe and on-demand fantasy. Because we fear so much, Kunstler writes, “we must keep the TVs turned on at all waking hours and at very high volume.” To escape fear, we amuse ourselves to death -- a phrase associated with a 1985 book by the great communication scholar Neil Postman, who died in 2003, although many, perhaps some reading this column, were never informed of the fact of his passing.

Just Another Viewpoint

When families who lost relatives in the 9/11 attacks were still grieving, Ward Churchill, the Colorado professor, was comparing their loved ones to “little Eichmanns.” His inflammatory essay lay dormant on the Internet until only recently. The controversy over Churchill’s opinions grew so intense that Elizabeth Hoffman, the University of Colorado's president, announced her resignation amid the media clamor. To be sure, Hoffman was dealing with other controversies at the time, but the coverage of Churchill may well have contributed to that resignation.

A few years ago I could have invited Ward Churchill to Ames, Iowa, during a First Amendment event for a debate about his views. To do so now would create a media circus, bringing controversy to my journalism school. And what good would my counterpoint to his opinions accomplish, however factual I could make such an argument, when my invitation, and my motive for making it, would be the news rather than the substance of any rebuttal?

In the new media environment, fact -- even all-inclusive, verifiable, comprehensive fact -- is seen as just another viewpoint, just another opinion on the menu of fame on demand facilitated by the Internet and cable television. So when a professor writes an essay (or a phrase in that essay) so sensational that it sparks a nationwide debate about free speech or academic freedom, journalists are missing the point. Such controversies, shaped by media practice, merely amuse the opinionated public.

Case in point: Fox’s "American Idol" reportedly inspired 500 million votes this season, quadrupling the number of ballots cast in the last U.S. presidential election. True, many viewers voted more than once for favorite contestants, but that only documents the culture of opinion, especially popular with younger viewers.

David Mindich, author of Just the Facts and Tuned Out: Why Americans Under 40 Don't Follow the News, says journalists have to compete now with shows like “Fear Factor” and “Friends” and so are overemphasizing humor, conflict and sex. Mindich, chair of the journalism and mass communication department at Saint Michael’s College, believes that the power of fact has diminished in this media universe. “One of the most powerful things about journalism itself is that it can communicate to a large audience and then we can have discussions about facts and where the facts bring us; but if we no longer are paying attention, then the facts don’t have the same weight. In the absence of fact opinion becomes more powerful. It’s not only the journalists themselves; it’s the culture apart from the news that has abandoned political discourse based on commonly agreed upon facts.”

In our day, points and counterpoints may be passionate but often also uninformed and usually accusatory. Who wants to participate in a media spectacle where audience and other sources, rather than the reporters, instinctively go for the jugular? Too often in this environment, the only people willing to speak out -- to contribute to the social debate -- are those with special interests or with nothing to lose and celebrity to gain.

The New Silent Majority

Sources who can explain the complex issues of our era, including biotechnology and bioterrorism, often opt out of the social debate. This includes scientists at our best universities. They see the media world as it is … and so have refrained from commenting on it. Increasingly the new silent majority will not go public with their facts or informed perspectives because, they realize, they will be pilloried for doing so by the omnipresent fear-mongers and sensationalists who provide a diet of conflict and provocation in the media.

And that creates a crisis for the First Amendment, which exists because the founders believed that truth would rise to the top -- provided people could read. That is also why education is associated with free speech and why equal access to education has been an issue in our country for generations and continues to be in our time. Education and information are requisite in a republic where we elect our representatives; to downsize or cut allocations for either puts the country at risk. Society is experiencing the consequences of cuts to the classroom and the newsroom, and we are getting the governments that we deserve, including blue vs. red states in a divided, point/counterpoint electorate.

What will become of journalism in this perplexing milieu? What happens when profit rather than truth rises to the top? According to David Mindich, “When profit trumps truth, journalism values are diluted, and then people start to wonder about the value of journalism in the first place.” Without facts, he says, people "start to forget the purpose of the First Amendment and then that, in turn, weakens journalism, and it’s a downward spiral from there."

The only way to stop the spiral is through re-investment in journalism and education. As for me, a journalism educator, my highest priority is training students for the media environment that used to exist, the one in which fact held government accountable -- no matter what the cost. Sooner or later, there will be a place again for fact-gathering journalists. There will be a tipping point when profit plummets for lack of newsroom personnel and technology fails to provide the fix. That day is coming quickly for newspaper publishers, in particular, who are struggling to compete online without realizing that there are no competitors on the front doors and welcome mats of American homes, their erstwhile domain. They will realize that the best way to attract new readers is to hire more reporters and place them where citizens can see them on the scene as witnesses, disseminating the verifiable truths of the day.

Author/s: 
Michael Bugeja
Author's email: 
info@insidehighered.com

Michael Bugeja directs the Greenlee School of Journalism and Communication at Iowa State University. He is the author of Interpersonal Divide: The Search for Community in a Technological Age (Oxford University Press, 2005).

Hitler -- the Classic?

It is disagreeable to approach the cashier with a book called How to Read Hitler. One way to take the stink off would be to purchase one or two other volumes in the new How to Read series published by W. W. Norton, which also includes short guides to Shakespeare, Nietzsche, Freud, and Wittgenstein. But at the time, standing in line at a neighborhood bookstore a couple weeks ago, I wasn't aware of those other titles. (The only thing mitigating the embarrassment was knowing that my days as a skinhead, albeit a non-Nazi one, are long over.) And anyway, the appearance of Adolf Hitler in such distinguished literary and philosophical company raises more troubling questions than it resolves.

"Intent on letting the reader experience the pleasures and intellectual stimulation in reading classic authors," according to the back cover, "the How to Read series will facilitate and enrich your understanding of texts vital to the canon." The series editor is Simon Critchley, a professor of philosophy at the New School in New York City, who looms ever larger as the guy capable of defending poststructuralist thought from its naysayers. Furthermore, he's sharp and lucid about it, in ways that might just persuade those naysayers to read Derrida before denouncing him. (Yeah, that'll happen.)

Somehow it is not that difficult to imagine members of the National Association of Scholars waving around the How to Read paperbacks during Congressional hearings, wildly indignant at Critchley's implicit equation of Shakespeare and Hitler as "classic authors" who are "vital to the canon."

False alarm! Sure, the appearance of the Fuhrer alongside the Bard is a bit of a provocation. But Neil Gregor, the author of How to Read Hitler, is a professor of modern German history at the University of Southampton, and under no illusions about the Fuhrer's originality as a thinker or competence as a writer.

About Mein Kampf, Gregor notes that there is "an unmistakably 'stream of consciousness' quality to the writing, which does not appear to have undergone even the most basic editing, let alone anything like polishing." Although Gregor does not mention it, the title Hitler originally gave to the book reveals his weakness for the turgid and the pompous: Four and a Half Years of Struggle against Lies, Stupidity and Cowardice. (The much snappier My Struggle was his publisher's suggestion.)

Incompetent writers make history, too. And learning to read them is not that easy. The fact that Hitler had ideas, rather than just obsessions, is disobliging to consider. Many of the themes and images in his writing reflect an immersion in the fringe literature of his day -- the large body of ephemeral material analyzed by Fritz Stern in his classic study The Politics of Cultural Despair: The Rise of the Germanic Ideology.

But Gregor for the most part ignores this influence on Hitler. He emphasizes, instead, the elements of Hitler's thinking that were, in their day, utterly mainstream. He could quote whole paragraphs of Carl von Clausewitz on strategy. And his racist world view drew out the most virulent consequences of the theories of Arthur de Gobineau and Houston Stewart Chamberlain. (While Hitler was dictating his memoirs in prison following the Beer Hall Putsch, he could point with admiration to one effort to translate their doctrines into policy: the immigration restrictions imposed in the United States in the 1920s.)

Gregor's method is to select passages from Mein Kampf and from an untitled sequel, published posthumously as Hitler's Second Book. He then carefully unpacks them -- showing what else is going on within the text, beneath the level of readily paraphrasable content. With his political autobiography, Hitler was not just recycling the standard complaints of the extreme right, or indulging in Wagnerian arias of soapbox oratory. He was also competing with exponents of similar nationalist ideas. He wrote in order to establish himself as the (literally) commanding figure in the movement.

So there is an implicit dialogue going on, disguised as a rather bombastic monologue. "Long passages of Hitler's writings," as Gregor puts it, "take the form of an extended critique of the political decisions of the late nineteenth century.... Hitler reveals himself not only as a nationalist politician and racist thinker, but -- this is a central characteristic of fascist ideology -- as offering a vision of revitalization and rebirth following the perceived decay of the liberal era, whose failings he intends to overcome."

The means of that "overcoming" were, of course, murderous in practice. The vicious and nauseating imagery accompanying any mention of the Jews -- the obsessive way Hitler constantly returns to metaphors of disease, decay, and infestation -- is the first stage of a dehumanization that is itself an incipient act of terror. The genocidal implications of such language are clear enough. But Gregor is careful to distinguish between the racist stratum of Hitler's dogma (which was uncommonly virulent even compared to the "normal" anti-Semitism of his day) and the very widespread use of militarized imagery and rhetoric in German culture following World War I.

"Many of the anti-Semitic images in Hitler's writing can be found in, say, the work of Houston Stewart Chamberlain," writes Gregor. "Yet when reading Chamberlain's work we hardly sense that we are dealing with an advocate of murder. When reading Hitler, by contrast, we often do -- even before we have considered the detail of what he is discussing. This is because the message is not only to be found in the arguments of the text, but is embedded in the language itself."

How to Read Hitler is a compact book, and a work of "high popularization" rather than a monograph. The two short pages of recommended readings at the end are broad, pointing to works of general interest (for example, The Coming of the Third Reich by Richard Evans) rather than journal articles. It will find its way soon enough into high-school and undergraduate history classrooms -- not to mention the demimonde of "buffs" whose fascination with the Third Reich has kept the History Channel profitable over the years.

At the same time, Gregor's little book is an understated, but very effective, advertisement for the "cultural turn" in historical scholarship. It is an example, that is, of one way historians go about examining not just what documents tell us about the past, but how the language and assumptions of a text operated at the time. His presentation of this approach avoids grand displays of methodological intent. Instead the book just goes about its business -- very judiciously, I think.

But there is one omission that is bothersome. Perhaps it is just an oversight, or, more likely, a side effect of the barriers between disciplines. Either way, it is a great disservice that How to Read Hitler nowhere mentions the first effort by someone writing in English to analyze the language and inner logic of Mein Kampf -- the essay by Kenneth Burke called "The Rhetoric of Hitler's 'Battle,'" published in The Southern Review in 1939. (In keeping with my recent enthusing over the "golden age" of the academic literary quarterly, it is worth noting that the Review was published at Louisiana State University and edited by a professor there named Robert Penn Warren.)

Burke's essay was, at the time, an unusual experiment: An analysis of a political text using the tools of literary analysis that Burke had developed while studying Shakespeare and Coleridge. He had published the first translations of Thomas Mann's Death in Venice and of portions of Oswald Spengler's Decline of the West -- arguably a uniquely suitable preparation for the job of reading Hitler. And just as various German émigrés had tried to combine Marx and Freud in an effort to grasp "the mass psychology of fascism" (as Wilhelm Reich's title had it), so had Burke worked out his own combination of the two in a series of strange and brilliant writings published throughout the Depression.

But he kept all of that theoretical apparatus offstage, for the most part, in his long review-essay on a then-new translation of Mein Kampf. Instead, Burke read Hitler's narrative and imagery very closely -- showing how an "exasperating, even nauseating" book served to incite and inspire a mass movement.

This wasn't an abstract exercise. "Let us try," wrote Burke, "to discover what kind of 'medicine' this medicine man has concocted, that we may know, with greater accuracy, exactly what to guard against, if we are to forestall the concocting of similar medicine in America."

Burke's analysis is a tour de force. Revisiting it now, after Gregor's How to Read volume, it is striking how much they overlap in method and implication. In 1941, Burke reprinted it in his collection The Philosophy of Literary Form, which is now available from the University of California Press. You can also find it in a very useful anthology of Burke's writings called On Symbols and Society, which appears in the University of Chicago Press's series called "The Heritage of Sociology."

"Above all," wrote Burke in 1939, "I believe we must make it apparent that Hitler appeals by relying upon a bastardization of fundamentally religious patterns of thought. In this, if properly presented, there is no slight to religion. There is nothing in religion proper that requires a fascist state. There is much in religion, when misused, that does lead to a fascist state. There is a Latin proverb, Corruptio optimi pessima, 'the corruption of the best is the worst.' And it is the corruptors of religion who are a major menace to the world today, in giving the profound patterns of religious thought a crude and sinister distortion."

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

The Chosen Few

Jerome Karabel's The Chosen is the big meta-academic book of the season -- a scholarly epic reconstructing "the hidden history of admission and exclusion at Harvard, Yale, and Princeton," as the subtitle puts it. Karabel, who is a professor of sociology at the University of California at Berkeley, has fished documents out of the archive with a muckraking zeal worthy of an investigative journalist. And his book, published this month by Houghton Mifflin, is written in far brisker narrative prose than you might expect from somebody working in either sociology or education. That's not meant as a dis to those worthy fields. But in either, the emphasis on calibrating one's method does tend to make storytelling an afterthought.

For Karabel really does have a story to tell. The Chosen shows how the gentlemanly anti-Semitism of the early 20th century precipitated a deep shift in how the country's three most prestigious universities went about the self-appointed task of selecting and grooming an elite.

It is (every aspect of it, really) a touchy subject. The very title of the book is a kind of sucker-punch. It's an allusion to Jehovah's selection of the Jews as the Chosen People, of course; it's also a term sometimes used, with a sarcastic tone, as an anti-Jewish slur. But Karabel turns it back against the WASP establishment itself -- in ways too subtle, and certainly too well-researched, to be considered merely polemical. (I'm going to highlight some of the more rancor-inspiring implications below, but that is due to my lack of Professor Karabel's good manners.)

The element of exposé pretty much guarantees the book a readership among people fascinated or wounded by the American status system. Which is potentially, of course, a very large readership indeed. But The Chosen is also interesting as an example of sociology being done in an almost classical vein. It is a study of what, almost a century ago, Vilfredo Pareto called "the circulation of elites" -- the process through which "the governing elite is always in a state of slow and continuous transformation ... never being today what it was yesterday."

In broad outline, the story goes something like this. Once upon a time, there were three old and distinguished universities on the east coast of the United States. The Big Three were each somewhat distinctive in character, but also prone to keeping an eye on one another's doings.

Harvard was the school with the most distinguished scholars on its faculty -- and it was also the scene of President Charles Eliot's daring experiment in letting undergraduates pick most of their courses as "electives." There were plenty of the "stupid young sons of the rich" on campus (as one member of the Board of Overseers put it in 1904), but the student body was also relatively diverse. At the other extreme, Princeton was the country club that F. Scott Fitzgerald later described in This Side of Paradise. (When asked how many students there were on campus, a Princeton administrator famously replied, "About 10 percent.")

Finally, there was Yale, which had crafted its institutional identity as an alternative to the regional provincialism of Harvard, or Princeton's warm bath of snobbery. It was "the one place where money makes no difference ... where you stand for what you are," in the words of the then-beloved college novel Stover at Yale, about a clean-cut and charismatic Yalie named Dink Stover.

But by World War One, something was menacing these idyllic institutions: Namely, immigration in general and "the Hebrew invasion" in particular. A meeting of New England deans in the spring of 1918 took this on directly. A large and growing percentage of incoming students were the bright and driven children of Eastern European Jewish immigrants. This was particularly true at Harvard, where almost a fifth of the freshman class that year was Jewish. A few years later, the figure would reach 13 percent at Yale -- and even at Princeton, the number of Jewish students had doubled its prewar level.

At the same time, the national discussion over immigration was being shaped by three prominent advocates of "scientific" racism who worried about the decline of America's Nordic stock. They were Madison Grant (Yale 1887), Henry Fairfield Osborn (Princeton 1877), and Lothrop Stoddard (Harvard 1905).

There was, in short, an air of crisis at the Big Three. Even the less robustly bigoted administrators worried about (as one Harvard official put it) "the disinclination, whether justified or not, on the part of non-Jewish students to be thrown into contact with so large a proportion of Jewish undergraduates."

Such, then, was the catalyst for the emergence, at each university, of an intricate and slightly preposterous set of formulae governing the admissions process. Academic performance (the strong point of the Jewish applicants) would be a factor -- but one strictly subordinated to a systematic effort to weigh "character."

That was an elusive quality, of course. But administrators knew when they saw it. Karabel describes the "typology" that Harvard used to make an initial characterization of applicants. The code system included the Boondocker ("unsophisticated rural background"), the Taconic ("culturally depressed background," "low income"), and the Krunch ("main strength is athletic," "prospective varsity athlete"). One student at Yale was selected over an applicant with a stronger record and higher exam scores because, as an administrator put it, "we just thought he was more of a guy."

Now, there is a case to be made for a certain degree of flexibility in admissions criteria. If anything, given our reflex-like tendency to see diversity as an intrinsic good in itself, it seems counterintuitive to suggest otherwise. There might be some benefit to the devil's-advocate exercise of trying to imagine the case for strictly academic standards.

But Karabel's meticulous and exhaustive record of how the admissions process changed is not presented as an argument for that sort of meritocracy. First of all, it never prevailed to begin with.

A certain gentlemanly disdain for mere study was always part of the Big Three ethos. Nor had there ever been any risk that the dim sons of wealthy alumni would go without the benefits of a prestigious education.

What the convoluted new admissions algorithms did, rather, was permit the institutions to exercise a greater -- but also a more deftly concealed -- authority over the composition of the student body.

"The cornerstones of the new system were discretion and opacity," writes Karabel; "discretion so that gatekeepers would be free to do what they wished and opacity so that how they used their discretion would not be subject to public scrutiny.... Once this capacity to adapt was established, a new admissions regime was in place that was governed by what might be called the 'iron law of admissions': a university will retain a particular admissions policy only so long as it produces outcomes that correspond to perceived institutional interests."

That arrangement allowed for adaptation to social change -- not just by restricting applicants of one minority status in the 1920s, but by incorporating underrepresented students of other backgrounds later. But Karabel's analysis suggests that this had less to do with administrators being "forward-looking and driven by high ideals" than it might appear.

"The Big Three," he writes, "were more often deeply conservative and surprisingly insecure about their status in the higher education pecking order.... Change, when it did come, almost always derived from one of two sources: the continuation of existing policies was believed to pose a threat either to vital institutional interests (above all, maintaining their competitive positions) or to the preservation of the social order of which they were an integral -- and privileged -- part."

Late in the book, Karabel quotes a blistering comment by the American Marxist economist Paul Sweezy (Exeter '27, Harvard '31, Harvard Ph.D. '37) who denounced C. Wright Mills for failing to grasp "the role of the preparatory schools and colleges as recruiters for the ruling class, sucking upwards the ablest elements of the lower classes." Universities such as the Big Three thus performed a double service to the order by "infusing new brains into the ruling class and weakening the potential leadership of the working class."

Undoubtedly so, once upon a time -- but today, perhaps, not so much. The neglect of their duties by the Big Three bourgeoisie is pretty clear from the statistics.

"By 2000," writes Karabel, "the cost of a year at Harvard, Yale, and Princeton had reached the staggering sum of more than $35,000 -- an amount that well under 10 percent of American families could afford....Yet at all three institutions, a majority of students were able to pay their expenses without financial assistance -- compelling testimony that, more than thirty years after the introduction of need-blind admissions, the Big Three continued to draw most of their students from the most affluent members of society." The number of students at the Big Three coming from families in the bottom half of the national income distribution averages out to about 10 percent.

All of which is (as the revolutionary orators used to say) no accident. It is in keeping with Karabel's analysis that the Big Three make only as many adjustments to their admissions criteria as they must to keep the status quo on track. Last year, in a speech at the American Council on Education, Harvard's president, Larry Summers, called for preferences for the economically disadvantaged. But in the absence of any strong political or social movement from below -- an active, noisy menace to business as usual -- it’s hard to imagine an institutionalized preference for admitting students from working families into the Big Three. (This would have to include vigorous and fairly expensive campaigns of recruitment and retention.)

As Walter Benn Michaels writes in the latest issue of N+1 magazine, any discussion of class and elite education now is an exercise in the limits of the neoliberal imagination. (His essay was excerpted last weekend in the Ideas section of The Boston Globe.)

"Where the old liberalism was interested in mitigating the inequalities produced by the free market," writes Michaels, " neoliberalism -- with its complete faith in the beneficence of the free market -- is interested instead in justifying them. And our schools have a crucial role to play in this. They have become our primary mechanism for convincing ourselves that poor people deserve their poverty, or, to put the point the other way around, they have become our primary mechanism for convincing rich people that we deserve our wealth."

How does this work? Well, it's no secret that going to the Big Three pays off. If, in theory, the door is open to anyone smart and energetic, then everything is fair, right? That's equality of opportunity. And if students at the Big Three then turn out to be drawn mainly from families earning more than $100,000 per year....

Well, life is unfair. But the system isn't.

"But the justification will only work," writes Michaels, if "there really are significant class differences at Harvard. If there really aren't -- if it's your wealth (or your family's wealth) that makes it possible for you to go to an elite school in the first place -- then, of course, the real source
of your success is not the fact that you went to an elite school but the fact that your parents were rich enough to give you the kind of preparation that got you admitted to the elite school. The function of the (very few) poor people at Harvard is to reassure the (very many) rich people at Harvard that you can't just buy your way into Harvard."

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

A Child's Garden of Culture and Atrocity

"Whoever cannot give to himself an adequate account of the past three thousand years," said Goethe, "remains in darkness, without history, living from day to day." That is an expression of a bedrock principle of liberal humanism, European-style. It takes the existence of the educated individual as its basic unit of reference -- its gold standard. But it also judges the quality of that existence by how much the individual has spent in acquiring a sense of the past. That expenditure also means, in effect, going into debt: You’ll never repay everything you owe to previous generations.

That outlook is, when you get right down to it, pretty un-American. It goes against the ideal of unencumbered self-creation that Emerson taught us -- in which we are supposed to throw off the burdens of the past, living always in the vital present. Fortunately, this is not hard to do. The first step is not to learn much history to begin with. (We are good at this.)

Even so, there may be an audience for E. H. Gombrich’s A Little History of the World, now available from Yale University Press, 70 years after it was first written. Imagine Goethe giving up the role of sage long enough to become a children’s author and you will have a reasonably good idea of the book’s content. It goes from prehistory up to the end of the (then-recent) Great War, with particular attention to ancient Greece, the Roman Empire, and the emergence of Judaism, Buddhism, Christianity, and Islam.

As for the style ... well, that is something even more remarkable. The tone is wry, at times, without ever being jokey -- a kind of light seriousness that is very respectful of its young audience. Each chapter is perfectly calibrated to suit the attention span and cognitive powers of a 10-year-old, without ever giving off a trace of condescension.

The effect, even for an adult reader, is incredibly charming -- and, indeed, instructive, at least for anyone with the occasional gap in that interior timeline. (Quick now: Who were the Hohenzollerns? And no, a vague sense that they were German doesn’t count.)

In his later and better-known role as art historian, Gombrich commanded a really humbling degree of erudition, but always with a certain generosity towards his audience. That combination is very much in evidence throughout his first book -- one written in what must have been very trying circumstances.

It was Vienna in 1935. Gombrich was 26 and had recently finished his dissertation. (Writing one "was considered very important," he told a presumably incredulous audience at Rutgers University in 1987, "yet it didn’t take more than a little over a year to write.") His immediate job prospects ranged from the nonexistent to the merely terrible. Besides, he was Jewish, and the writing was on the wall, usually in the form of a swastika.

He managed to find part-time employment with a publishing company. He was asked to evaluate a book on world history for children in English, to see if it might be worth translating. He recommended against it, but offered instead to write one himself, directly in German. It took him about six weeks, writing a chapter a day. The volume did quite well when it appeared in 1936, though the Nazis eventually stopped publication on the grounds of its "pacifism."

By then, he was in London, working at the Warburg Institute (a major art-history collection, where Gombrich in time became director) and aiding the war effort by translating German radio broadcasts into English. Before leaving Vienna, he had agreed to write another book, this one for adolescents, on the history of art. That project grew into a rather more ambitious work, The Story of Art (1950) -- long the standard overview of European art history, from which generations of museum tour-guides have cribbed.

He wrote it -- along with his more monographic works on iconography and on the psychology of perception -- in English. When his Little History was reprinted in Germany in the mid-1980s, he wrote an afterword for it; but he turned down offers to have it translated into English, preferring to do that himself, and to make some necessary revisions. It is not clear from the edition now available from Yale just how far Gombrich got with that effort at the time of his death in 2001. (The title page gives the translator as Caroline Mustill.) But he did add a postscript called "The Small Part of the History of the World Which I Have Lived Through" -- summing up the 20th century from World War I through the end of the Cold War, and trying to put as optimistic a spin on that record as possible.

The preface by Leonie Gombrich, his granddaughter, quotes some introductory remarks he prepared for the Turkish edition. His Little History, he wrote, "is not, and never was, intended to replace any textbooks of history that may serve a very different purpose at school. I would like my readers to relax, and to follow the story without having to take any notes or to memorize names and dates. In fact, I promise that I shall not examine them on what they have read."

But the book has a strong and serious pedagogical intent, even so. And it comes very directly from Goethe, whose work Gombrich read incessantly as a boy. Upon receiving the Goethe Prize in 1994, Gombrich said that it was the author’s life and writing that taught him "the consoling message ... of a universal citizenship that transcends the confines of nationhood." That seems very much the point of the Little History, which tries to squeeze all of global history into just under three hundred easily read pages -- and I strongly suspect it was just that cosmopolitanism that the Nazi censors really loathed.

Of course, there are gaps and oversights. One that is really troublesome is how the entire history of the Atlantic slave trade is reduced to the dimensions of a brief reference to the Civil War in the United States. This has the effect of making it seem like a distant and cruel episode in the New World, rather than what it really was: A vast and centuries-long process that enriched parts of Europe, depopulated parts of Africa, and anticipated every aspect of totalitarianism possible before the rise of industrialization and mass communications.

Not that Gombrich leaves the history of colonial atrocity entirely out of the picture, especially in recounting the conquest of the Americas: "This chapter in the history of mankind is so appalling and shameful to us Europeans that I would rather not say anything more about it."

In many ways, then, the book is at least as interesting as a specimen of a lost sensibility as it is in its own right, as a first introduction to history. Gombrich later spoke of how much he had been the product of that almost religious veneration of culture that prevailed among the European middle class of the 19th and early 20th centuries.

"I make no great claims for the universality of that tradition," he said during a lecture at Liverpool University in 1981. "Compared to the knowable, its map of knowledge was arbitrary and schematic in the extreme. As is true of all cultures, certain landmarks were supposed to be indispensable for orientation while whole stretches of land remained terra incognita, of relevance only to specialists..... But what I am trying to say is that at least there was a map."

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. Suggestions and ideas for future columns are welcome.

Piled Higher and Deeper

Rick Perlstein, a friend from the days of Lingua Franca, is now working on a book about Richard Nixon. Last year, he published a series of in-depth articles about the Republican Party and the American conservative movement. (Those are not quite the same thing, though that distinction only becomes salient from time to time.) In short, Perlstein has had occasion to think about honesty and dissimulation -- and about the broad, swampy territory in between, where politicians finesse the difference. As do artists and used-car salesmen....

It’s the job of historians to map that territory. But philosophers wander there, too. “What is truth?” as Nietzsche once asked. “A mobile army of metaphors, metonymies, anthropomorphisms. Truths are illusions of which one has forgotten that they are illusions.” Kind of a Cheneyo-Rumsfeldian ring to that thought. It comes from an essay called “On Truth and Lie in an Extra-Moral Sense,” which does, too, come to think of it.  

So anyway, about a week ago, Rick pointed out a recent discussion of how the Bush Administration is dealing with critics who accuse it of fudging the intelligence that suggested Saddam Hussein had weapons of mass destruction. The link went to a comment by Joshua Micah Marshall, who is a liberal Democrat of the more temperate sort, not prone to hyperventilation. 

“Garden variety lying is knowing it’s Y and saying it’s X,” he wrote, giving Lyndon Johnson on the Gulf of Tonkin as an example. The present executive branch, he continued, shows “a much deeper indifference to factual information in itself.”

Rick posed an interesting question: “Isn't Josh Marshall here describing as the Administration's methodology exactly what that Princeton philosophy prof defines as ‘bullshit’?” That prof being, of course, Harry Frankfurt, whose short and best-selling treatise On Bullshit will probably cover everyone’s Christmas bonus at Princeton University Press this year. 

In February, The New York Times beat us by a day or so with its article on the book, which daintily avoided giving its title. But "Intellectual Affairs" first took a close look, not just at Frankfurt’s text -- noting that it remained essentially unchanged since its original publication as a scholarly paper in the 1980s -- but at the philosophical critique of it presented in G.A. Cohen’s essay “Deeper into Bullshit.” 

Since then, the call for papers for another volume of meditations on the theme of bull has appeared. Truly, we are living in a golden age.

The gist of Frankfurt’s argument, as you may recall, is that pitching BS is a very different form of activity from merely telling a lie. And Marshall’s comments do somewhat echo the philosopher’s point. Frankfurt would agree that “garden variety lying” is saying one thing when you know another to be true. The liar operates within a domain that acknowledges the difference between accuracy and untruth. The bullshitter, in Frankfurt’s analysis, does not. In a sense, then, the other feature of Marshall’s statement would seem to fit. Bullshit involves something like “indifference to factual information in itself.”

So does it follow, then, that in characterizing the Bush team’s state of mind three years ago, during the run-up to the war, we must choose between the options of incompetence, dishonesty, and bullshit? Please understand that I frame it in such terms, not from any political motive, but purely in the interest of conceptual rigor.

That said.... It seems to me that this range of terms is inadequate. One may agree that Bush et al. are profoundly indifferent to verifiable truth without concluding that the Frankfurt category necessarily applies.

Per G. A. Cohen’s analysis in “Deeper into Bullshit,” we must stress that Frankfurt’s model rests on a particular understanding of the consciousness of the liar. The mind of the bullshitter is defined by contrast to this state. For the liar, (1) the contrast between truth and untruth is clearly discerned, and (2) that difference would be grasped by the person to whom the liar speaks. But the liar’s intentionality also includes (3) some specific and lucidly grasped advantage over the listener made possible by the act of lying.

By contrast, the bullshitter is vague on (1) and radically unconcerned with (2). There is more work to be done on the elements of relationship and efficacy indicated by (3). We lack a carefully argued account of bullshit’s effect on the bullshitee.

There is, however, another possible state of consciousness not adequately described by Frankfurt’s paper. What might be called “the true believer” is someone possessing an intense concern with truth.

But it is a Higher Truth, which the listener may not (indeed, probably cannot) grasp. The true believer is speaking a truth that somehow exceeds the understanding of the person hearing it.

During the Moscow Trials of the late 1930s, Stalin’s prosecutor lodged numerous charges against the accused that were, by normal standards, absurd. In many cases, the “evidence” could be shown to be false. But so much the worse for the facts, at least from the vantage point of the true believer. If you’ve ever known someone who got involved in EST or a multi-level marketing business, the same general principle applies. In each case, it is not quite accurate to say that the true believers are lying. Nor are they bullshitting, in the strictest sense, for they maintain a certain fidelity to the Higher Truth.

Similarly, it did not matter three years ago whether or not any evidence existed to link Saddam and Osama. To anyone possessing the Higher Truth, it was obvious that Iraq must be a training ground for Al Qaeda. And guess what? It is now. So why argue about it?

On a less world-historical scale, I see something interesting and apropos in Academe, the magazine of the American Association of University Professors. In the latest issue, David Horowitz makes clear that he is not a liar just because he told a national television audience something that he knew was not true. 

(This item was brought to my attention by a friend who teaches in a state undergoing one of Horowitz’s ideological rectification campaigns. My guess is that he’d rather not be thanked by name.)

Here’s the story so far: In February, while the Ward Churchill debate was heating up, Horowitz appeared on Bill O’Reilly’s program. It came up that Horowitz, like Churchill, had been invited to lecture at Hamilton College at some point. But he was not, he said, “a speaker paid by and invited by the faculty.” 

As we all know, university faculties are hotbeds of left-wing extremism. (Especially the business schools and engineering departments. And reports of how hotel-management students are forced to read speeches by Pol Pot are positively blood-curdling.) Anyway, whenever Horowitz appears on campus, it’s because some plucky youngsters invite him. He was at Hamilton because he had been asked by “the conservative kids.”

That came as a surprise to Maurice Isserman, a left-of-center historian who teaches at Hamilton College. When I saw him at a conference a few years ago, he seemed to have a little gray in his hair, and his last book, The Other American: The Life of Michael Harrington, was a biography of the founder of the Democratic Socialists of America. No doubt he’s been called all sorts of things over the years, but “conservative kid” is not one of them. And when Horowitz spoke at Hamilton a few years ago, it was as a guest lecturer in Isserman’s class on the 1960s. 

As Isserman put it in the September/October issue of Academe: “Contrary to the impression he gave on "The O’Reilly Factor," Horowitz was, in fact, an official guest of Hamilton College in fall 2002, invited by a faculty member, introduced at his talk by the dean of the faculty, and generously compensated for his time.”

I will leave to you the pleasure and edification of watching Horowitz explain himself in the latest issue of Academe. But in short, he could not tell the truth because that would have been a lie, so he had to say something untrue in order to speak a Higher Truth. 

My apologies for the pretzel-like twistiness of that paraphrase. It is all so much clearer in the original Newspeak: Thoughtcrime is doubleplus ungood.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com
