History

Falling Into the Generation Gap

A few weeks ago, sitting over a cup of coffee, a writer in his twenties told me what it had been like to attend a fairly sedate university (I think he used the word "dull") that had a few old-time New Left activists on its faculty.

"If they thought you were interested in anything besides just your career," he said, "if you cared about ideas or issues, they got really excited. They sort of jumped on you."

Now, I expected this to be the prelude to a little tribute to his professors – how they had taken him seriously, opened his mind to an earlier generation’s experience, etc. But no.

"It was like they wanted to finish their youth through you, somehow," he said. "They needed your energy. They needed you to admire them. They were hungry for it. It felt like I had wandered into a crypt full of vampires. After a while, I just wanted to flee."

It was disconcerting to hear. My friend is not a conservative. And in any case, this was not the usual boilerplate about tenured radicals seeking to brainwash their students. He was not complaining about their ideas and outlook. This vivid appraisal of his teachers was not so much ideological as visceral. It tapped into an undercurrent of generational conflict that the endless "culture wars" seldom acknowledge.

You could sum it up neatly by saying that his professors, mostly in their fifties and sixties by now, had been part of the "Baby Boom," while he belonged to "Generation X."

Of course, there was a whole segment of the population that fell between those two big cultural bins -- people born at the end of the 1950s and the start of the 1960s. Our cohort never had a name, which is probably just as well. (For one thing, we’ve never really believed that we are a "we." And besides, the whole idea of a prepackaged identity based on what year you were born seems kind of tacky.)

One effect of living in this no-man’s-land between Boomers and Xers is a tendency to feel both fascinated and repulsed by moments when people really did have a strong sense of belonging to a generation. The ambivalence is confusing. But after a while it seems preferable to nostalgia -- because nostalgia is always rather simple-minded, if not dishonest.

The recent documentary The Weather Underground (a big hit with the young-activist/antiglobalization crowd) expressed doe-eyed sadness that the terrible Amerikan War Machine had forced young idealists to plant bombs. But it somehow never mentioned that group’s enthusiasm for the Charles Manson "family." (Instead of the two-fingered hippie peace sign, Weather members flashed a three-finger salute, in honor of the fork used to carve the word "war" into one of the victims’ stomachs.) Robert McNamara and Henry Kissinger have a lot of things to answer for – but that particular bit of insanity is not one of them.

Paul Berman, who was a member of Students for a Democratic Society at Columbia University during the strike of 1968, has been writing about the legacy of the 1960s for a long time. Sometimes he does so in interesting ways, as in parts of his book A Tale of Two Utopias; and sometimes he draws lessons from history that make an otherwise placid soul pull out his hair with irritation. He has tried to sort the positive aspects of the 1960s out from the negative -- claiming all the good for a revitalized liberalism, while treating the rest as symptoms of a lingering totalitarian mindset and/or psychological immaturity.

Whatever the merits of that analysis, it runs into trouble the minute Berman writes about world history -- which he always paints in broad strokes, using bright and simple colors. In his latest book, Terror and Liberalism, he summed up the last 300 years in terms that suggested Europe and the United States had grabbed their colonies in a fit of progress-minded enthusiasm. (Economic exploitation, by Berman’s account, had nothing to do with it, or not much.) Terror and Liberalism is a small book, and easy to throw.

His essay in the new issue of Bookforum is, to my mind, part of the thoughtful, reflective, valuable side of Berman’s work. In other words, I did not lose much hair reading it.

The essay has none of that quality my friend mentioned over coffee – the morbid hunger to feast off the fresh blood of a younger generation’s idealism. Berman has fond recollections of the Columbia strike. But that is not the same as being fond of the mentality that it fostered. "Nothing is more bovine than a student movement," he writes, "with the uneducated leading the anti-educated and mooing all the way."

The foil for Berman’s reflections is the sociologist Daniel Bell, who left Columbia in the wake of the strike. At the time, Bell’s book The End of Ideology was the bête noire of young radicals. (It was the kind of book that made people so furious that they refused to read it – always the sign of the true-believer mentality in full effect.) But it was Bell’s writing on the history of the left in the United States that had the deepest effect on Berman’s own thinking.

Bell noticed, as Berman puts it, "a strange and repeated tendency on the part of the American Left to lose the thread of continuity from one generation to the next, such that each new generation feels impelled to reinvent the entire political tradition."

There is certainly something to this. It applies to Berman himself. After all, Terror and Liberalism is pretty much a jerry-rigged version of the Whig interpretation of history,  updated for duty in the War on Terror. And the memoiristic passages in his Bookforum essay are, in part, a record of his own effort to find "the thread of continuity from one generation to the next."

But something else may be implicit in Bell’s insight about the "strange and repeated tendency" to lose that thread. It is a puzzle for which I have no solution readily at hand. Namely: Why is this tendency limited to the left?

Why is it that young conservatives tend to know who Russell Kirk was, and what Hayek thought, and how Barry Goldwater’s defeat in 1964 prepared the way for Reagan’s victory in 1980? Karl Marx once wrote that "the tradition of all the dead generations weighs like a nightmare on the brain of the living." So how come the conservatives are so well-rested and energetic, while the left has all the bad dreams?


Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Impure Literature

The publication, 100 years ago, of The Jungle, by Upton Sinclair, in the popular American socialist newspaper Appeal to Reason had an enormous effect -- if not quite the one that its author intended. "I aimed at the public’s heart," Sinclair later said, "and by accident I hit it in the stomach."

Drawing on interviews with workers in Chicago and his own covert explorations of the city’s meat-processing factories, Sinclair intended the novel to be an expose of brutal working conditions. By the time it appeared as a book the following year, The Jungle’s nauseating revelations were the catalyst for a reform movement culminating in the Pure Food and Drug Act. In portraying the life and struggles of Jurgis Rudkus, a Lithuanian immigrant, Sinclair wanted to write (as he put it), “The Uncle Tom’s Cabin of wage slavery,” thereby ushering in an age of proletarian emancipation. Instead, he obliged the bourgeoisie to regulate itself -- if only to keep from feeling disgust at its breakfast sausages.

In his introduction to a new edition of The Jungle, just published by Bedford/St. Martin’s, Christopher Phelps traces the origins and effects of Sinclair’s novel. Phelps, an associate professor of history at Ohio State University in Mansfield, is currently on a Fulbright fellowship in Poland, where he occupies a distinguished chair in American studies and literature at the University of Lodz. The following is the transcript of an e-mail interview conducted this month.

Q: At one of the major chain bookstores the other day, I noticed at least four editions of The Jungle on the shelf.  Yours wasn’t one of them. Presumably it's just a matter of time. What’s the need, or the added value, of your edition? Some of the versions available are pretty cheap, after all. The book is now in the public domain.

A:  Yes, it’s even available for free online these days, if all you want is the text. This new edition is for readers seeking context. It has a number of unique aspects. I’m pleased about the appendix, a report written by the inspectors President Theodore Roosevelt dispatched to Chicago to investigate Upton Sinclair’s claims about the meatpacking industry. In one workplace, they watch as a pig slides off the line into a latrine, only to be returned to the hook, unwashed, for processing. No other version of The Jungle includes this report, which before now had lapsed into obscurity. The new edition also features an introduction in which I survey the scholarship on the novel and provide findings from my research in Sinclair’s papers held by the Lilly Library at Indiana University. Finally, there are a lot of features aimed at students, including a cartoon, a map, several photographs, a bibliography, a chronology of Sinclair’s life, and a list of questions for discussion. So it doubles as scholarly edition and teaching edition.

Q: Let me ask about teaching the book, then. How does The Jungle go over in the classroom?

A:  Extremely well. Students love it. The challenge of teaching history, especially the survey, is to get students who think history is boring to imagine the past so that it comes alive for them. The Jungle has a compelling story line that captures readers’ attention from its very first scene, a wedding celebration shaded in financial anxiety and doubts about whether Old World cultural traditions can survive in America. From then on, students just want to learn what will befall Jurgis and his family. Along the way, of course, Sinclair injects so much social commentary and description that teachers can easily use students’ interest in the narrative as a point of departure for raising a whole range of issues about the period historians call the Progressive Era.

Q:  As you've said, the new edition includes a government report that appeared in the wake of the novel, confirming the nauseating details. What are the grounds for reading and studying Sinclair's fiction, rather than the government report?

A:  Well, Teddy Roosevelt’s inspectors had the singular mission of determining whether the industry’s slaughtering and processing practices were wholesome. Sinclair, for his part, had many other concerns. What drew him to write about the meatpacking industry in the first place was the crushing of a massive strike of tens of thousands of workers led by the Amalgamated Meat Cutters and Butcher Workmen of North America in 1904. In other words, he wanted to advance the cause of labor by exposing the degradation of work and exploitation of the immigrant poor.

When The Jungle became a bestseller, Sinclair was frustrated that the public furor centered almost exclusively on whether the companies were grinding up rats into sausage or disguising malodorous tinned beef with dyes. These were real concerns, but Sinclair cared most of all about the grinding up of workers. I included this government report, therefore, not only because it confirms Sinclair’s portrait of unsanitary meat processing, but because it exemplifies the constriction of Sinclair’s panorama of concerns to the worries of the middle-class consumer.

It further shows how Sinclair’s socialist proposal of public ownership was set aside in favor of regulatory measures like the Pure Food and Drug Act and Meat Inspection Act of 1906. Of course, that did not surprise Sinclair. He was proud, rightly so, of having been a catalyst for reform. Now, just as the report must be read with this kind of critical eye, so too the novel ought not be taken literally.

Q:  Right. All kinds of problems come from taking any work of literature, even the most intentionally documentary, as giving the reader direct access to history.

A: Nowadays The Jungle is much more likely to be assigned in history courses than in literature courses, and yet it is a work of fiction. You point to a major problem, which we might call the construction of realism. I devote a good deal of attention to literary form and genre in my introduction, because I think they are crucial and should not be shunted aside. I note the influence upon The Jungle of the sentimentalism of Harriet Beecher Stowe, of naturalist and realist writers like William Dean Howells and Frank Norris, and of the popular dime novels of Horatio Alger. Sinclair was writing a novel, not a government report. He fancied himself an American Zola, the Stowe of wage slavery.

A good teacher ought to be able to take into account this status of the text as a work of creative literature while still drawing out its historical value. We might consider Jurgis, for example, as the personification of a class. He receives far more lumps in life than any single worker would in 1906, but the problems he encounters, such as on-the-job injury or the compulsion to make one’s children work, were in fact dilemmas for the working class of the time.

In my introduction, I contrast the novel with what historians now think about immigrant enclaves, the labor process, gender relations, and race. There is no determinate answer to the question of how well The Jungle represented such social realities. Many things it depicted extremely well, others abominably, race being in the latter category. If we keep in mind that realism is literary, fabricated, we can see that Sinclair’s background afforded him a discerning view of many social developments, making him a visionary, even while he was blind in other ways. Those failings are themselves revelatory of phenomena of the period, such as the racism then commonplace among white liberals, socialists, and labor activists. It’s important that we read the novel on all these levels.

Q: Sinclair wrote quite a few other novels, most of them less memorable than The Jungle. Well, OK, to be frank,  what I've heard is that they were, for the most part, awful. Is that an unfair judgment? Was The Jungle a case of the right author handling the right subject at the right time?

A:  That's precisely it, I think. Sinclair was uniquely inspired at the moment of writing The Jungle. I've been reading a lot of his other books, and although some have their moments, they sure can give you a headache. Many of them read like failed attempts to recapture that past moment of glory. He lived to be ninety and cranked out a book for every year of his life, so it's a cautionary tale about allowing prolixity to outpace quality. The book of his that I like best after The Jungle is his 1962 autobiography, a book that is wry and whimsical in a surprising and attractive, even disarming, way.


Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Show Clio the Money!

A member of Congress who says “history” is not necessarily thinking of the same enterprise as a professional historian. This is no Beltway-induced conceptual blockage: For civilians, the important thing about history is story, not methodology. (Even the most devoted viewers of the History Channel have no sense of the century-long debates over "the objectivity question.")

But the stakes of mutual incomprehension are higher when the federal budget is involved -- when the member of Congress is voting on whether or not to fund initiatives designed to improve history education, mainly at the primary and secondary levels. For example, there is the $11.2 million that the National Endowment for the Humanities has requested for next year for We the People. And then there's the $119 million in the president’s budget slotted for Teaching American History, a program of the Department of Education.

In such cases, it really matters whether legislators understand history to mean (1) a field producing new knowledge about the past or (2) a really cool holographic diorama of the Pilgrims at prayer.

The smart money would, of course, bet on the diorama. But history in the other sense is represented in Washington by the National Coalition for History, which speaks for more than 70 professional organizations for historians and archivists. As it happens, all of this lobbying clout is exercised by one person, Bruce Craig, usually with the assistance of an intern.

Craig took over as director (and de facto staff) of the coalition in early 2003 -- just as it was shedding its earlier, clunkier identity as the National Coordinating Committee for the Promotion of History. Like its predecessor, the NCH runs out of an office in the American Historical Association building on Capitol Hill.

I recently interviewed Craig by telephone from his home in West Virginia -- an excellent choice of residence, since it makes him a constituent of Sen. Robert Byrd, whose baby Teaching American History really is. But I happen to know that is a coincidence. It turns out that we met a dozen years ago, when his wife and I both worked as archival technicians in the manuscript division of the Library of Congress. (Our job was history at the lowliest level: sorting dead people’s mail.)

Back then, Craig was working on a dissertation about Harry Dexter White, a Treasury official and co-founder of the World Bank and the International Monetary Fund, who was accused by Whittaker Chambers of being a Soviet operative.

Craig's findings (available in a book published last year) were that White engaged in “a species of espionage” for the Russians, yet was not guilty of subverting American policy in their favor. It is a nice distinction -- one likely to offend those who prefer a simpler estimate, one way or the other, of Joseph McCarthy’s place in history.

But that studied indifference to ideological default settings is not just a scholarly stance. Listening to Craig, it sounds like the best tool in the lobbyist’s kit.

In the course of our discussion, I tried to draw Craig out on whether the mid-1990s battles over multiculturalism, the Enola Gay exhibit, and such still echo around Capitol Hill. His response is ... well, not evasive, exactly. But he has an impressive knack for finding terms that are practical, nonpartisan, and diplomatic.

The culture war "doesn’t come up often," he said. "Congress is very concerned with school kids, with whether or not they know American history. And of course they should be concerned with that. Part of our role is to make sure that ‘history’ doesn’t end up being defined narrowly, as just American history -- that the ancient world, and comparative history, also get included."

With the Teaching American History program, of course, the national (if not nationalistic) focus is evident from the very name. Craig says the challenge is to keep the program “from too narrow an emphasis on particular types of American history, so that it just becomes a kind of civics lesson.”

By meeting with Congressional staff and getting historians to testify in committee, the National Coalition for History is trying to recalibrate what legislators mean by “traditional American history.” It's a matter, in effect, of making sure that the term covers both the doings of white guys in powdered wigs at the Constitutional Convention and the slave revolts that sometimes kept them from getting a good night's sleep.

Quite a bit of the NCH’s activity concerns matters that are upstream from the classroom – with issues, that is, affecting how history gets “done” by researchers. Craig lobbies in support of the Open Government Act, designed to bolster access to documents under the Freedom of Information Act. Organizations belonging to the coalition are up in arms, understandably enough, about a renewed effort to zero out the budget for the National Historical Publications and Records Commission, which provides grants for the preparation of editions of historical documents. And the NCH appears to be making progress in saving the program.

And in preparing the coalition’s weekly electronic newsletter, The NCH Washington Update (archived here), Craig keeps up with the corridor politics of government agencies involved in historical matters. Did you know, for example, that the National Park Service is a hotbed of internal conflict over grants for historical preservation projects? Chances are that, no, you did not know that -- let alone that a recent major reorganization of one section of the Park Service is known as "the May 3 massacre." (Read all about it here.) It's the sort of inside-the-beltway news that helps keep historians connected with the bureaucratic developments indirectly shaping their field.

From talking to Craig and reading the coalition’s press, the impression forms of a lobby that is, as the saying goes, “post-ideological.”

You know the drill: Pragmatism is all. Politics is the art of compromise in pursuit of the possible. That sort of thing.

But my own instinct is always to historicize such “post-ideological” thinking. To see it, first of all, as taking shape in a specific historical period (the 1990s, pretty much), and to understand it as reflecting a particular set of vested interests. In short, the "post-ideological" outlook is precisely the ideology of the professional-managerial class, i.e., extremely skilled brain workers who want to do their jobs without having to dread weird lurches in political governance.  

Now, some of my lingo here (“historicize,” “class”) is faintly marxisant, of course. But for what it’s worth, similar notions do pop up even when conservatives think about the recent past. As a case in point, check out the conservative historian Richard Jensen’s analysis of the culture wars.

Historians don’t all share the same, presumably leftist, politics -- no matter what the polemicists say. But they do share the same interest in seeing that libraries and archives stay open, and that “history” be understood to embrace a range of periods and topics. And also that new generations be encouraged to develop an appetite for learning about the past.

Given all that, there is an incentive to play down ideological fractiousness, as the National Coalition for History does with some finesse. The consequences are a little paradoxical -- creating “an ironic role [for] Washington, D.C.,” in the words of Rick Shenkman, editor of the History News Network.

“The larger story here in my opinion,” Shenkman told me in an e-mail note a couple of weeks ago, “is the ironic role of Washington, D.C. in the history wars.  It has been the Right that has largely been behind the fantastic increase in appropriations for history over the last few years. Lynne Cheney has played a role as has Sen. Lamar Alexander. Robert Byrd, though a liberal of sorts, has pressed his history agenda on quite conservative grounds. And the beneficiary of the funds?  It's those liberal historians across the country whom David Horowitz thinks are undermining the Republic!”  

Not that the National Coalition for History, or anybody else for that matter, is being exactly Machiavellian about any of it. In the end, it’s all about the dead presidents. At the risk of being crass, you might best understand even the politics of scholarship by following the money.

“We have no space Hubble to rally around,” as Shenkman puts it. “So historians have used the easiest arguments at hand in support of their projects -- and that happens to be the patriotic argument.”


Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. His last column explored the intellectual legacy of Paul Ricoeur.

Ambiguous Legacy

There will be a meeting tonight in Washington to celebrate the life of James Weinstein, the radical historian and publisher who died in Chicago last Thursday. The news was by no means unexpected. But the gathering is impromptu, and it will probably be small.

I suppose one thing we will all have in common is an inability to refer to the deceased as "James Weinstein." He was Jimmy. It's a fair guess that the turnout will include union organizers and progressive lobbyists and a few journalists. There will undoubtedly be an academic or two -- or several, if you count the defrocked, the ABD's, and the folks who otherwise decided (contra David Horowitz) that university life is not necessarily conducive to being a leftist.

Many people know that Weinstein's book The Decline of Socialism in America, 1912-1925 (first published in 1967 and reprinted by Rutgers University Press in 1984) started out as his dissertation. After all this time, it remains a landmark work in the scholarship on U.S. radicalism. But only this weekend, in talking with a mutual friend, did I learn that he never actually bothered to get the Ph.D.

Diagnosed with brain cancer, Jimmy spent the final weeks of his life in bed at home. He gave a series of interviews to Miles Harvey, an author and former managing editor at In These Times, the progressive magazine that Jimmy founded. The body of reminiscences is now being transcribed, and will join the collection of the Oral History Research Office at Columbia University.

"We both knew we were in a race against time," Miles said when we talked by phone over the weekend. "We mined a lot of interesting stuff. Jimmy was the Zelig of the American left."

The son of a prosperous businessman, he worked for years in electronics factories as a rank-and-file Communist union member. One of his anecdotes from that era is something of a legend -- has become, even, a part of history. One day a comrade asked Jimmy to give a ride to a taciturn fellow doing party business of an undisclosed nature. A few years later, he recognized the passenger as Julius Rosenberg. (Suffice it to say that Weinstein's future biographer will probably find a day-by-day account of his life during the early 1950s in the FBI surveillance files.)

Jimmy left the party in 1956, as part of a major exodus in the wake of Khrushchev's denunciation of the crimes of Stalin. He was never apologetic about his membership. But neither was he even slightly sentimental about it.

Well before massive documentation from the Russian archives settled the question, he dismissed the arguments of those who insisted that the American CP and the Soviet spy apparatus in the U.S. had to be considered as completely distinct entities. Any good party member would have been glad to help out, he said: "We would have considered it an honor." (Jimmy himself never received that distinction. According to Miles Harvey, the request that he chauffeur Julius Rosenberg had less to do with Jimmy's reliability as a revolutionary than it did with the fact that he was one of the Communists on hand who owned a car.)

The fact that he once said this at a public event, where non-leftists could hear him -- and that he did so during the Reagan administration, no less -- is still held against him in some circles.

The usual pattern, of course, is to abandon a rigid, dogmatic political ideology -- and then to adopt another one. People spend entire careers boldly denouncing other people for their own previous mistakes. It's easy work, and the market for it is steady.

Jimmy followed a different course. To begin with, he had never been all that keen on the ideological nuances of the Communist movement. He certainly knew his Marx and Lenin from studying at the party's famous Jefferson School of Social Science, in New York. But somehow the doctrinal points counted less than what he'd picked up from all those years as a union activist. At least that's the impression of his friend Jim McNeill, another former managing editor at In These Times. (McNeill is now an organizer for the Service Employees International Union.)

Nearing 30, Weinstein decided to go to graduate school to study history; and his instinct was to dig into an earlier period of American radicalism -- when it spoke an idiom that was much less purely Marxist, and a lot more influential. Up through World War I, the Socialists successfully fielded candidates in local elections and even got the occasional member into Congress. And Eugene Debs, a figure beloved even by those who didn't share his vision of the proletarian commonwealth, could win nearly a million votes for president while imprisoned for an antiwar speech.

Weinstein's research was, in short, a glimpse of an alternative that had been lost. It wasn't simply a matter of government repression, either. There were streaks of doctrinal puritanism, of apocalyptic revolutionism, that eventually proved corrosive. "In large part," as he later put it, "the failure of the American left has been internal." (Whether or not he made the connection isn't clear, but his own experience in the CP would tend to confirm this. As bad as McCarthyism had been for the party, members started quitting en masse once they had to face the truth about Stalin.)

Boiled down, his conclusions amounted to a demand for a major upheaval in the culture of the left. What it needed for the long term, in effect, was a healthy dose of pragmatism. It would also mean learning to think of reforms as part of the process of undermining the power of the profit system -- rather than implicitly seeing reforms as, at best, a kind of compromise with capitalism.

Had he done only that initial study of the Socialist Party (finished in 1962, though only published five years later), Jimmy Weinstein would merit a small but honorable spot in the history of the American left. But in fact he did a lot more.

Today's academic left is very much a star system. Jimmy never had a place in it. If that bothered him, he did a good job of keeping quiet about it. But just for the record, it's worth mentioning that he was present at the creation.

He was part of the group in Madison, Wisconsin that published Studies on the Left between 1959 and 1967. It was the first scholarly journal of Marxist analysis to appear in the United States since at least the end of World War II, and an important point of connection between the American New Left and international currents in radical thought. (The first translation of Walter Benjamin's "The Work of Art in the Age of Mechanical Reproduction," for example, appeared in Studies.)

Jimmy's brief memoir of this period can be found in a volume edited by the radical historian Paul Buhle called History and the New Left: Madison, Wisconsin, 1950-1970 (Temple University Press, 1990). There has long been a tendency to treat the intellectual history of the American left as unfolding primarily in New York City. This is understandable, in some ways, but it introduces gross distortions. It's worth remembering that one of the major publications serving to revitalize radical scholarship was the product of a group of graduate students at the University of Wisconsin. It appears that Buhle's anthology is now out of print. But what's more surprising, I think, is that more research hasn't been done on "the Madison intellectuals" in the meantime.

In keeping with Miles Harvey's characterization of Weinstein as "the Zelig of the American left," we next find him at the Chicago convention of Students for a Democratic Society in 1969. That was the one where -- just as the antiwar movement was starting to get a hearing on Main Street USA -- rival factions waved copies of the Little Red Book in the air and expelled one another. (Want evidence that the left's deepest wounds are self-inflicted? There you go.)

Repelled by the wild-eyed hysteria and terrorist romanticism of the Weather Underground (of which one of his cousins was a member), Jimmy helped start another journal, Socialist Revolution, which was always more cerebral than its up-against-the-wall title might suggest. In 1978, it changed its name to Socialist Review. (This abandonment of "revolution" inspired a certain amount of hand-wringing in some quarters.) It was the venue where, in 1985, Donna Haraway first published her "Cyborg Manifesto." For years afterward, the rumor went around that SR was about to drop "Socialist" from its title, to be replaced by "Postmodern." But in fact it continues now as Radical Society -- a distant descendant of its ancestor, by now, though it still bears a family resemblance to the publications that Jimmy worked on long ago.

Jimmy's last major venture as a publisher -- the culmination of his dream of converting the lessons of radical history into something practical and effective, here and now -- was In These Times, which started as a newspaper in 1976 and turned into a magazine sometime around 1990. A collection of articles from the magazine's first quarter century appeared in 2002 as the book Appeal to Reason -- a title echoing the name of the most widely circulated newspaper of the old Socialist Party.

Pat Aufderheide, now a professor of communications at American University, was ITT's culture editor from 1978 through 1982. She writes about the experience in her book The Daily Planet: A Critic on the Capitalist Culture Beat (University of Minnesota Press, 2000). A whole generation of people were entranced by the countercultural idea that "the personal is the political" -- or its academic doppelganger, the Foucauldian notion that power was everywhere and inescapable. These were recipes, she notes, for "self-marginalization and political fundamentalism" on the left.

"For In These Times," writes Aufderheide, "politics is the prosaic complex of institutions, structures and actions through which people organize consciously for social change.... Richard Rorty would put it in the reformist left category. It is read largely by leftists who do organizing or other practical political work, through labor unions, universities and schools, churches, nonprofit organizations and local and regional government. These are smart people, many of whom are not intellectuals, and who mostly come home late and tired."

The importance of reaching that public -- indeed, the very possibility of doing so -- tends to be overlooked by many people engaged in left-wing academic discourse. ("Our comrades in armchairs," as activists sometimes put it.)

In her book, Aufderheide recalls dealing with "a vocal contingent of academics" who were "always ready to pounce on lack of subtlety, creeping cheerleading, or sentimentality" in the magazine's cultural coverage. "Their critical acuteness, however, often seemed exercised for the satisfaction of intellectual one-upmanship," she writes. "When I begged them to write, to point me to other writers, to serve on the board, there was almost always a stunned silence."

The problem is self-perpetuating. Perhaps it comes down to a lack of good examples. And in that regard, Jimmy's death is more than a personal loss to his friends and family.

It's worth mentioning that, along the way, he wrote a number of other books, with The Long Detour: The History and Failure of the American Left  (Westview, 2003) being his last. It was also his favorite, according to Miles Harvey, whose series of deathbed  interviews will, in time, serve as the starting point for some historical researcher who has perhaps not yet heard of James Weinstein.

To be candid, I didn't care for his final book quite as much as the one he published in 1975 called Ambiguous Legacy: The Left in American Politics. The books are similar in a lot of ways. I'm not sure that my preference for one over the other is entirely defensible.

But it was Ambiguous Legacy that Jimmy inscribed when we met, about 10 years ago. My copy of his first book, the one on the Socialist Party, he dedicated "with hope for our future." Only later did I look at the other volume. Beneath the greeting -- and before his signature -- he wrote: "The legacy is more ambiguous than ever."


Scott McLemee was a contributing editor for In These Times between 1995 and 2001. His column Intellectual Affairs appears here on each Tuesday and Thursday.

Throat Culture

For the past few days, I've been waiting for a review copy of Bob Woodward's book The Secret Man: The Story of Watergate's Deep Throat to arrive from Simon and Schuster. So there has been some time to contemplate the way that (no longer quite so) mysterious figure has been "inscribed" in a "double register" of "the historical imaginary," as the cult-stud lingo has it. (Sure hope there's a chance to use "imbricated discourse" soon. Man, that would be sweet.)

Putting it in slightly more commonplace terms: Two versions of Deep Throat have taken shape in the past 30 years or so. They correspond to two different ways of experiencing the odd, complex relationship between media and historical memory.

On the one hand, there was Deep Throat as a participant in a real historical event -- making the question of his motivation an important factor in making sense of what happened. It was even, perhaps, the key to understanding the "deep politics" of Watergate, the hidden forces behind Richard Nixon's fall. The element of lasting secrecy made it all kind of blurry, but in a fascinating way, like some especially suggestive Rorschach blot.

On the other hand, there was Deep Throat as pure icon -- a reference you could recognize (sort of) even without possessing any clear sense of his role in Watergate. It started out with Hal Holbrook's performance in All the President's Men -- which, in turn, was echoed by "the cigarette-smoking man" on "The X Files," as well as the mysterious source of insider information about the Springfield Republican Party on "The Simpsons." And so Deep Throat (whose pseudonym was itself originally a movie title) becomes a mediatic signifier unmoored to any historical signified. (An allusion to an allusion to a secret thus forgotten.)

Different as they might be, these two versions of Deep Throat aren't mutually exclusive. The discourses can indeed become imbricated (yes!), as in the memorable film Dick, which reveals Deep Throat as a pair of idealistic schoolgirls who guide the cluelessly bumbling Woodward and Bernstein through the mysteries of the Nixon White House.

There is something wonderful about this silly premise: In rewriting the history of Watergate, Dick follows the actual events, yet somehow neutralizes their dire logic by just the slightest shift of emphasis. The deepest secret of an agonizing national crisis turns out to be something absurd.

That perspective is either comically subversive or deeply cynical. Either way, it's been less anticlimactic, somehow, than the revelation of Deep Throat's real identity as the former FBI official Mark Felt. So much for the more elaborate theories about Watergate -- that it was, for example, a "silent coup" by a hard-right anticommunist faction of the U.S. military, upset by the administration's dealings with the Soviets and the Chinese. And Deep Throat's role as emblem of noir-ish intrigue may never recover from the impact of the recent, brightly lit video footage of Mark Felt -- half-dazed, half mugging for the camera.

And there have been other disappointments. This week, I had an interesting exchange by e-mail with Bill Gaines, a professor of journalism at the University of Illinois at Urbana-Champaign and two-time winner of the Pulitzer, not counting his two other times as finalist. His part in the Deep Throat saga came late in the story, and it's caused him a certain amount of grief.

But it was also -- this seems to me obvious -- quite honorable. If anything, it is even more worthy of note now that Bob Woodward is telling his side of the story. (While Carl Bernstein also has a chapter in the book, it was Woodward who had the connection with Felt.)

In 1999, Gaines and his students began an investigation designed to determine the identity of Deep Throat. The project lasted four years. It involved sifting through thousands of pages of primary documents and reading acres of Watergate memoir and analysis -- as well as comparing the original articles by Woodward and Bernstein from The Washington Post to the narrative they provided in their book All the President's Men. Gaines also tracked down earlier versions of the manuscript for that volume -- drafted before Woodward decided to reveal that he had a privileged source of inside information.

Gaines and his students compiled a database they used to determine which of the likely candidates would have actually been in a position to leak the information that Deep Throat provided. In April 2003, they held a press conference at the Watergate complex in Washington, DC, where they revealed ... the wrong guy.

After a period of thinking that Deep Throat must have been Patrick Buchanan (once a speechwriter for Nixon), the researchers concluded that it had actually been Fred Fielding, an attorney who had worked as assistant to John Dean. The original report from the project making the case for Fielding is still available online -- now updated with a text from Gaines saying, "We were wrong."

The aftermath of Felt's revelation, in late May, was predictably unpleasant for Gaines. There were hundreds of e-mail messages, and his phone rang off the hook. "Some snickered as if we had run the wrong way with the football," he told me.

But he added, "My students were extremely loyal and have told anyone who will listen that they were thrilled with being a part of this project even though it failed." Some of those who worked on the project came around to help Gaines with the deluge of correspondence, and otherwise lend moral support.

As mistaken deductions go, the argument offered by Gaines and his students two years ago is pretty rigorous. Its one major error seems to have come at an early stage, with the assumption that Woodward's account of Deep Throat was as exact as discretion would allow. That was in keeping with Woodward's own statements, over the years. "It's okay to leave things out to protect the identity of a source," he told the San Francisco Chronicle in 2002, "but to add something affirmative that isn't true is to publish something you know to be an inaccuracy. I don't believe that's ethical for a reporter."

The problem is that the original account of Deep Throat doesn't line up quite perfectly with what is known about Mark Felt. Some of the discrepancies are small, but puzzling even so. Deep Throat is a chain smoker, while Felt claimed to have given up the demon weed in 1943. "The idea that Felt only smokes in the garage [during his secretive rendezvous with Woodward] is a little hard to swallow," says Gaines. "I cannot picture him buying a pack and throwing the rest away for the drama it will provide." By contrast, Fielding was a smoker.

More substantive, perhaps, are questions about what Deep Throat knew and how he knew it. Gaines and his students noted that statements attributed to Deep Throat in All the President's Men were credited to a White House source in the original newspaper articles by Woodward and Bernstein. (Felt was second in command at the FBI, not someone working directly for the White House, as was Fielding.)

Deep Throat provided authoritative information gleaned from listening to Nixon's secret recordings during a meeting in November 1973. That was several months after Felt left the FBI. And to complicate things still more, no one from the FBI had been at the meeting where the recordings were played.

According to Gaines, that means Felt could only have learned about the contents of the recordings at third hand, at best. Felt was, as Gaines put it in an e-mail note, "so far removed that his comments to Woodward would have to be considered hearsay, and not the kind of thing a reporter could write for fact by quoting an anonymous source."

When I ask Gaines if there is anything he hopes to learn from Bob Woodward's new book, he mentions hoping for some insight into one of the more memorable descriptions of the secret source -- the one about how Deep Throat "knew too much literature too well." In any case, Gaines makes a strong argument that Woodward himself took a certain amount of literary license in transforming Felt into Deep Throat.

"We know from our copy of an earlier manuscript that Woodward changed some direct quotes attributed to Throat," he notes. "They were not major changes, but enough to tell us that he was loose with the quotes. There is information attributed to Throat that Felt would not have had, or that doesnot agree with what we found in FBI files."

As the saying has it, journalists write a first draft of history. One of the ethical questions involves trying to figure out just how much discretion they get in polishing the manuscript. Gaines seems careful not to say anything too forceful on this score -- though he does make clear that he isn't charging Woodward with creating a composite character.

That has long been one of the suspicions about Deep Throat. Even the new revelation hasn't quite dispelled it. Just after Felt went public with his announcement, Jon Wiener, a professor of history at the University of California at Irvine, reviewed some of the grounds for thinking that "several people who provided key information ... were turned into a composite figure for dramatic purposes" by Woodward and Bernstein. (You can find more of Wiener's comments here, at the very end of the article.)

For his part, Gaines says that the Deep Throat investigation isn't quite closed -- although he wishes it were. "I have always wanted to move on to something more important for the class project," he told me, "but the students and the media have caused us to keep going back to the Throat story."

Maybe now they should look into the mystery surrounding Deep Throat's most famous line: his memorable injunction to Woodward, "Follow the money."

It appears in the movie version of All the President's Men, though it can't be found in the book. When asked about it in an interview some years ago, Woodward guessed that it was an embellishment by William Goldman, the screenwriter. But Goldman has insisted that he got the line from Woodward.

Now it's part of the national mythology. But it may never have actually happened. Sometimes I wish the discourses would stop imbricating long enough to get this kind of thing sorted out.


Real Knowledge

During the heyday of American economic and geographical expansion, in the late 19th century, the men who sold real estate occupied a distinct vocational niche. They were slightly less respectable than, say, riverboat gamblers -- but undoubtedly more so than pirates on the open seas. It was a good job for someone who didn’t mind leaving town quickly.

But about 100 years ago, something important began to happen, as Jeffrey M. Hornstein recounts in A Nation of Realtors: A Cultural History of the Twentieth-Century American Middle Class, published this spring by Duke University Press. Some of those engaged in the trade started to understand themselves as professionals.

They created local realty boards and introduced licensing as a means by which reputable practitioners could distinguish themselves from grifters. And in time, they were well enough organized to lobby the federal government on housing policy -- favoring developments that encouraged the building of single-family units, rather than public housing. Their efforts, as Hornstein writes, "would effectively create a broad new white middle class haven in the suburbs, while leaving behind the upper class and the poor in cities increasingly polarized by race and wealth."

I picked up A Nation of Realtors expecting a mixture of social history and Glengarry Glen Ross. It's actually something different: a contribution to understanding how certain aspects of middle-class identity took shape -- both among the men (and later, increasingly, women) who identified themselves as Realtors and among their customers. Particularly interesting is the chapter "Applied Realology," which recounts the early efforts of a handful of academics to create a field of study that would then (in turn) bolster the profession’s claims to legitimacy and rigor.

Hornstein recently answered a series of questions about his book -- a brief shift of his attention back to scholarly concerns, since he is now organizing director of Service Employees International Union, Local 36, in Philadelphia.

Q: Before getting to your book, let me ask about your move from historical research to union organizing. What's the story behind that?

A: I was applying to graduate school in my senior year of college and my advisor told me that while he was sure I could handle grad school, he saw me as more of "a politician than a political scientist." I had always been involved in organizing people and was a campus leader. But I also enjoyed academic work, and went on to get two graduate degrees, one in political science from Penn, another in history from the University of Maryland.

While I was doing the history Ph.D. at Maryland, a group of teaching assistants got together and realized that we were an exploited group that could benefit from a union. Helping to form an organizing committee, affiliating with a national union, getting to know hard-boiled organizers (many of whom were also intellectuals), and attempting to persuade my peers that they needed to take control of their own working conditions through collective action captured my imagination and interest much more than research, writing, or teaching.  

After a long intellectual and personal journey, I finally defended my dissertation. The academic job market looked bleak, particularly as a graduate of a non-elite institution. And when I was honest with myself, I realized that my experience forming a graduate employee union engaged me far more than the intellectual work.

Armed with this insight, I put the diss in a box, and two weeks later, I was at the AFL-CIO’s Organizing Institute getting my first taste of what it would be like to organize workers as a vocation. In the dark barroom in the basement of the George Meany Center for Labor Studies, a recruiter from an SEIU local in Ohio approached me and asked me if I’d like to spend the next few years of my life living in Red Roof Inns, trying to help low-wage workers improve their lives. Two weeks later, I landed in Columbus, Ohio and I was soon hooked.  

And I would add this: The supply of talented and committed organizers is far outstripped by the demand. The labor movement’s current crisis is, frankly, a huge opportunity for energetic and idealistic people to make a real difference. Hard work and commitment is really rewarded in the labor movement, and one can move quickly into positions of responsibility. It’s very demanding and often frustrating work, but it’s about as fulfilling a vocation as I could imagine.

Q: You discuss the emergence of realtors as the rise of a new kind of social identity, "the business professional." But I'm left wondering about early local real-estate boards. They sound kind of like lodges or fraternal groups, as much as anything else. In what sense are they comparable to today's professional organizations, as opposed to, say, the Elks or the Jaycees?

A: Indeed, early boards were very much like fraternal organizations. They were all male and clubby, there was often a "board home" that offered a retreat space, and so on. Early real estate board newsletters are rife with the sorts of jokes about women and minorities that were standard fare in the 1910s and 1920s -- jokes that, I argue, help to police the boundaries of masculinity.  

In the early chapters of the book, I provide brief sketches of the workings of the Chicago and Philadelphia real estate boards, as well as a sort of anthropological view of early real estate conventions. My favorite was the 1915 Los Angeles convention, during which the main social event was a drag party. In my view, the conventions, the board meetings, the social events, the publications, all formed a homosocial space in which a particular sort of masculinity was performed, where the conventions of middle-class masculinity were established and reinforced.  

In the early 1920s, the emphasis began to shift from fraternalism to a more technocratic, professional modality. Herbert Nelson took the helm at the National Association of Real Estate Boards in 1923, and he started to make NAREB look much more like a modern professional organization. In some respects he created the mold. He made long-term strategic plans, asserted the necessity for a permanent Realtor presence in Washington, D.C., pushed for standards for licensing, worked with Herbert Hoover’s Commerce Department to promulgate a standard zoning act, and linked up with Professor Richard T. Ely [of the University of Wisconsin at Madison] to help "scientize" the field.

Nelson served as executive director of NAREB for over 30 years. During his tenure, the organization grew, differentiated, specialized, and became a powerful national political actor. In sum, it became a true modern professional association in most ways. Yet like most other professional organizations prior to the ascendancy of feminism and the major incursion of women into the professions, masculine clubbiness remained an important element in the organizational culture well into the 1970s.    

In sum, the story I tell about the complex interdependencies of class, gender, and work identities is largely about the Realtors’ attempts to transform an Elks-like organization into a modern, "professional" business association.

Q: On the one hand, they see what they are doing as a kind of applied social science -- also creating, as you put it, "a professional metanarrative." On the other hand, you note that Ely's Institute for Research in Land Economics was a casualty of the end of the real estate bubble. Doesn't that justify some cynicism about realtors' quest for academic legitimacy?

A: I don’t see the Realtors or the social scientists like Ely in cynical terms at all. In fact, both parties are quite earnest about what they’re doing, in my view. Ely was nothing if not a true believer in the socially transformative power of his research and of social scientific research in general. He managed to persuade a faction of influential Realtors, primarily large-scale developers ("community-builders") such as J.C. Nichols, that research was the key to professionalism, prosperity, and high-quality real estate development.

Ely’s Institute was not a casualty of the implosion of the 1926 Florida real estate bubble as such. But the real estate collapse and the ensuing Depression made it much harder for the Realtors to make claims to authority based on disinterested science.

It’s not that the grounding of the whole field of Land Economics was problematic – at least no more so than any other field of social or human science, particularly one that produces knowledge that can be used for commercial purposes.  

The academic field was in its infancy in the 1910s and 1920s, and there were intra-disciplinary squabbles between the older, more historical economists like Ely and the younger generation, which was much more model- and mathematics-driven. At the same time, there were sharp divisions among Realtors between those who believed that professionalism required science (and licensing, and zoning, and so on) and those who rejected this idea.  

So, yes, the Elyian attempt at organizing the real estate industry on a purely ‘scientific’ basis, operating primarily in the interest of the social good, was largely a failure. However, the 1920s mark a watershed in that the National Association became a major producer and consumer of social scientific knowledge. Business schools began to offer real estate as a course of study. Textbooks, replete with charts and graphs and economic equations, proliferated. Prominent academics threw their lot in with the Realtors.

In the end, the industry established its own think tank, the Urban Land Institute, the motto of which is “Under All, The Land” -- taken straight from Ely’s work. But the profession itself remained divided over the value of ‘science’ – the community-builders generally supported efforts to scientize the field, while those on the more speculative end of the profession were generally opposed.  

But again, I don’t think that the grounding of the field of land economics is any more questionable than any other subfield of economics, such as finance or accounting.

Q: Your book left me with a sort of chicken-and-egg question. You connect the growth of the profession with certain cultural norms -- the tendency to define oneself as middle-class, the expectation of private home ownership, etc. Didn't those aspirations have really deep roots in American culture, which the Realtors simply appealed to as part of their own legitimization? Or were they more the result of lobbying, advertising, and other activities of the real-estate profession?

A: Absolutely, these tendencies have roots deep in American culture. The term "middle class" was not really used until the late 19th century -- "middling sorts" was the more prevalent term before then. The "classless society" has long been a trope in American culture, the idea that with hard work, perseverance, and a little luck, anyone can "make it" in America, that the boundaries between social positions are fluid, etc.  

But it’s not until the early-to-mid 20th century that homeownership and middle-class identity come to be conflated.  The "American Dream" is redefined from being about political freedom to being about homeownership. At around the same time, debt is redefined as "credit" and "equity."

So, yes, I'd agree to some extent that the Realtors tapped into longstanding cultural norms as part of their efforts at self-legitimization. Like most successful political actors, they harnessed cultural common sense for their own ends -- namely, to make homeownership integral to middle-class identity. Their political work enabled them, in the midst of the Depression, to get the National Housing Act passed as they wrote it -- with provisions that greatly privileged just the sort of single-family, suburban homes leading members of NAREB were intent on building.

The Realtors used the cultural material at hand to make their interests seem to be the interests of the whole society. But, as we know from many fine studies of suburban development, many people and many competing visions of the American landscape were marginalized in the process.


Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Other Casualties

One part of Milovan Djilas's Conversations with Stalin lingers in the memory well after the rest of the book fades. The author himself calls it "a scene such as might be found only in Shakespeare's plays." Actually, it does have its parallels to Rabelais, as well; for like many another gathering of the Soviet elite amidst the privations of World War II that Djilas recounts, there is an enormous feast, and a marathon drinking session.

This particular miniature carnival occurs in the final months of the war. Stalin is hosting a reception at the Kremlin for the Yugoslavian delegation. But before the partying begins, he must disburden himself; for Stalin has heard that Djilas (who would later become vice president under Marshal Tito) has criticized the behavior of some units of the Red Army as it has made its way across Europe.

"Does Djilas, who is himself a writer, not know what human suffering and the human heart are?" cries Stalin. "Can't he understand it if a soldier who has crossed thousands of kilometers through blood and fire and death has fun with a woman or takes a trifle?"

By "having fun," he was referring to well over two million rapes, by Soviet soldiers, of women of all ages and backgrounds. The very indiscriminateness of the sexual violence gives the lie to the idea that it was revenge for the suffering inflicted by the Germans. Inmates liberated from Nazi concentration camps were raped as well.

As for Djilas, it must have seemed, for a moment, as if Stalin's outburst were the kiss of death. Luckily for him, the dictator's mood changed. "He proposed frequent toasts," recalls the author, "flattered one person, joked with another, teased a third, kissed my wife because she was a Serb, and again shed tears over the hardships of the Red Army and Yugoslav ingratitude."

Perhaps in response to the criticism, Stalin issued a command that soldiers behave themselves. The Soviet officers read the proclamation to their troops with a smirk. Everyone knew it meant nothing. Boys will be boys.

The anonymous memoir A Woman in Berlin, now appearing in a new English translation from Metropolitan Books, is an extraordinary chronicle of life in the streets as the Thousand Year Reich turned into rubble and the advancing "Ivans" had their fun. The author was a German editor and journalist who died in 2001. Her book, based on a diary kept over two months during the spring of 1945, first appeared in English in 1954. It was not published in German until 1959, and in Germany it seems to have been regarded as an intolerable faux pas, a violation of the unstated rule that the events never be mentioned again.

The book's rediscovery now comes in the wake of Antony Beevor's massive documentation of the rape campaign in The Fall of Berlin 1945, published three years ago by Viking Press. To judge by the reservations of some military historians, Beevor's account may not be the last word on how Soviet forces advanced into Germany. (A reviewer for Parameters, the journal of the U.S. Army War College, praised it as a work of popular history, but lodged some complaints about certain gaps in the book's account of troop maneuvers.) Yet the book did take an unflinching look at the extent of the sexual terror.

Beevor supplies an introduction to the new edition of A Woman in Berlin, situating the document in historical context. He notes, for example, that the statistics about rape for Berlin "are probably the most reliable in all of Germany," falling somewhere between 95,000 and 130,000 victims "according to the two leading hospitals."

He also points out that there is no particular evidence that rape was treated as a deliberate strategy of war -- as human-rights activists have recently charged the Sudanese military with doing in Darfur. "No document from the Soviet archives indicates anything of the sort in 1945," writes Beevor. But he suggests that the scale of the attacks may have been a by-product of the Red Army's internal culture, even so: "Many soldiers had been so humiliated by their own officers and commissars during the four years of war that they felt driven to expiate bitterness, and German women presented the easiest target. Polish women and female slave laborers in Germany also suffered."

Reading the memoir itself, you find all such interpretive questions being put on hold. It is not just a document. The author, an urbane and articulate woman in her early 30s, writes about the fall of Berlin and her own repeated violation with an astounding coolness -- a bitter, matter-of-fact lucidity, the extreme candor of which is almost disconcerting, given the lack of even a hint of self-pity.

"No doubt about it," she writes after being raped several times in a row. "I have to find a single wolf  to keep away the pack. An officer, as high-ranking as possible, a commandant, a general, whatever I can manage. After all, what are my brains for, my little knowledge of the enemy's language?... My mind is firmly made up. I'll think of something when the time comes. I grin to myself in secret, feel as if I'm performing on the stage. I couldn't care less about the lot of them! I've never been so removed from myself, so alienated. All my feelings seem dead, except for the drive to live."

I've just reviewed the latest edition of A Woman in Berlin for Newsday, and will spare you a recycling of that effort. Since then, a look at other reviews has revealed some debate over the authenticity of the book. The comments of J.G. Ballard (no stranger to questions of sexuality in extreme conditions) are indicative.

"It is hard to believe, as the author claims, that it was jotted down with a pencil stub on old scraps of paper while she crouched on her bed between bouts of rape," wrote Ballard in The New Statesman a few weeks ago. "The tone is so dispassionate, scenes described in so literary a way, with poignant references to the strangeness of silence and the plaintive cry of a distant bird. We live at a time that places an almost sentimental value on the unsparing truth, however artfully deployed. But the diary seems convincingly real, whether assembled later from the testimonies of a number of women or recorded at first hand by the author."

Given that concern, it is worth looking up the original edition of A Woman in Berlin, now more than 50 years old. It came with an introduction by C.W. Ceram, whose book Gods, Graves, and Scholars, first published in 1951, remains one of the best introductions to the history of archeology. Ceram recalls meeting the author of A Woman in Berlin not long after the war.

"From some hints that she dropped," he wrote, "I learned of this diary's existence. When, after another six months passed, I was permitted to read it, I found described in detail what I already knew from the accounts of others."

That means Ceram saw the book in 1947, at the latest. "It took me more than five years, however, to persuade the author that her diary was unique, that it simply had to be published."

She had, he writes, "jotted down in old ledgers and on loose pages what happened to her.... These pages lie before me as I write. Their vividness as expressed in the furtiveness of the short penciled notes; the excitement they emanate whenever the pencil refuses to describe the facts; the combination of shorthand, longhand, and secret code ... all of this will probably be lost in the depersonalizing effect of the printed word."

Ceram's introduction is interesting for its testimony about the book's provenance. But that remark about "the depersonalizing effect of the printed word" will seem odd to anyone who has read A Woman in Berlin.

In many ways, of course, the book is an account of brutality. (War is a force that turns people into things, as Simone Weil once put it; and killing them is just one of the ways.) But the anonymous author also created a record of what is involved in resisting depersonalization. At times, she is able to see the occupiers, too, as human beings. You cannot put the book down without wondering about the rest of her life.


Necessary Evils

  "In a time of war," wrote Cicero, "the laws are silent." (That's "inter arma silent leges," in case some nuance is missing from the usual English rendering.)

Well, perhaps not quite silent. Marouf A. Hasian's In the Name of Necessity: Military Tribunals and the Loss of American Civil Liberties, available next month from the University of Alabama Press, revisits more than 200 years of American argumentation for and against the legitimacy of "military justice."

That phrase merits the scare quote marks because it is very much open to question whether the two words quite belong together. You don't need to be a pacifist, or even to harbor any doubt about liberal democracy, to have such concerns. The role of the military is, of course, to fight; and the legitimacy of its monopoly on violence derives (in modern societies, anyway) from its subordination to a lawful order. At best -- so the argument might go -- the military can pursue a just policy, subject to oversight and review by outside institutions. Hence the rise of what is called the "civilianization" of military law.

That's the theory, anyway. The actual record is a good bit messier, as Hasian, an associate professor of communications at the University of Utah, shows in some detail. His book presents a series of analytic retellings of events from the Revolutionary War through the detainments at Guantanamo Bay. To some degree, then, it overlaps with William Rehnquist's All the Laws But One: Civil Liberties in Wartime (1998), which focused mainly on cases from the Civil War and the two World Wars.

But the difference is not simply a matter of the opening of a whole new chapter in history over the past four years. Hasian's book is the work of a scholar who has taken "the rhetorical turn" -- drawing on the toolkit of concepts from one of the founding disciplines of humanistic study. A social historian or a law professor might also cover, as he does, the 1862 U.S.-Dakota war tribunal, which led to the execution of a group of Native Americans -- or the 1942 trial of several German saboteurs, captured shortly after they had been deposited on the coasts of New York and Florida, along with bomb-making materials, by U-boat. But Hasian treats these cases neither as events (as a historian would) nor as precedents (the lawyer's concern).

The emphasis in his book falls, rather, on how a particular element of persuasion took shape in each case: the argument of necessity. In each case, the claim was made that circumstances demanded the suspension of normal legal procedures and guarantees, and their replacement by military tribunals that practiced the warlike virtues of secrecy, efficiency, and swiftness.

A philosopher or legal theorist might want to dissect the validity, coherence, or applicability of "necessity" as a principle applied in such cases. Hasian's approach treats it, not as a concept, but as what rhetoric scholars have in recent years called an "ideograph" -- that is, "a key evocative term or phrase that illustrates the political allegiances of an individual and a community in a major social, political, economic, or legal controversy." Other ideographs include such terms as "equality," "progress," and "freedom."

The range of definitions and of emotional charge for each term varies. They have a rather timeless sound, but a complex history of mutations in meaning. And in the heat of debate, they can be made to perform a variety of functions. The meaning of an ideograph in a given context is marked by that context.

Perhaps the strongest version of the argument from necessity is the one that Lincoln made against those who criticized him for suspending habeas corpus during the Civil War: "Are all the laws, but one, to go unexecuted, and the government go to pieces, lest that one be violated?" In other words: Moments of extremity can require the temporary sacrifice of some civil liberties to preserve the rest.

Rehnquist signaled his basic agreement with this line of thought by titling his book All the Laws But One. "It is neither desirable nor is it remotely likely," he wrote there, "that civil liberty will occupy as favored a position in wartime as it does in peacetime."

But even the fairly straightforward affirmation of necessity as a legitimate ground for suspending civil liberties is the result of (and a moment of conflict within) a complicated history of arguments. In tracing out the history of necessity, Hasian identifies two strands of -- well, it's not clear what the expression would be. Perhaps "ideographic DNA"? One he calls the "Tory" concept of necessity; the other, the "Whig" version.

In the Tory framing, there are "many times when a society is called upon to defend itself against riots, revolutions, and rebellions," as Hasian puts it. It is the responsibility of the monarch or the executive branch to recognize the danger and respond accordingly. "Since this is an issue of survival, the military authorities should be given a great deal of discretion. In these situations, the 'will' of those in authority will be of paramount importance."

(In other words, an element of sovereign authority is handed over to the military. The commanding officer is then in the position to say, "I am the law." And legitimately so.)

By contrast, the Whiggish conception of necessity sees "relatively few times when a society has to worry about exigent circumstances." Responsibility for judging whether or not a real emergency exists should fall to the parliament or the legislative branch -- to which the military must remain accountable.

Appropriately enough, given a Whiggish sensibility, this means a certain guardedness and jealousy about the degree of judicial authority delegated to the military. There will be a tendency toward suspicion that the trust might be abused. The Whig discourse on necessity wants to keep to a bare minimum the scope, duration, and degree of secrecy that military tribunals may claim.

The classic formulation of the Whig conception in American history is Ex parte Milligan, from 1866, in which the Supreme Court found that the Union military authorities had overstepped by arresting and trying a Confederate sympathizer in Indiana -- a state where the normal functioning of the court system had not been interrupted by the war.

Of course, Ex parte Milligan fans have taken some hits lately. We had a good run there, for a while. But lately all the swagger comes from enthusiasts for Ex parte Quirin (1942), which denied the claim of German saboteurs to appeal for civil trials.

What makes Hasian's account of Quirin so interesting is his suggestion that some Supreme Court justices "actually thought that their decision would be construed as falling in line with the precedents that placed limits on military commissions and executive power." But if that was the intention 60 years ago, you'd never know it to read the newspapers today.

This is an aerial overview of In the Name of Necessity. The real provocations are in the details. Perhaps the analytic category of the ideograph sounds a trifle thin -- turning bloody arguments into something rather anemic. But Hasian's book is ultimately more polemical than that. The framework is "value neutral" only in a technical sense. He's got a position to stake out.

"In the very, very rare cases of extreme necessity," he writes, "when Congress and the United Nations have decided we need to impose martial law or have commissions in occupied lands, we may have situations where all of the civil courts are closed and where the military may need more discretion."

That much, Hasian will concede to the Tory worldview, and no more. But even then, such assertions of power "need to be held in check by recognizing that most of the time we should begin with the baseline 'Whig' assumption that we want to maintain the civilianization of the military, and not the other way around."

OK, fair enough. Now how will that play out in the courts under Chief Justice Roberts? And particularly under a circumstance in which the Tories are so powerful that nobody really doubts that Chief Justice Roberts will be presiding?

That Whig in extremis John Milton said that necessity is "ever the tyrant's plea." But we might be entering a period when the plea doesn't even have to be made -- when war doesn't silence law, but writes it.


Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. Suggestions and ideas for future columns are welcome.

The Media World as It Is

I direct a journalism school known for its support of the First Amendment, which we celebrate annually with speeches and case studies. As such, I am a source on free press issues. Reporters contact me about such cases as the Ward Churchill fiasco at the University of Colorado, asking if his "little Eichmanns" depiction of 9/11 victims is protected speech -- perhaps speech that should be protected by me. I deflect those calls, believing such controversies are less about free speech and more about a media culture that values opinion more than fact.

There are many reasons for this, but nearly all point to new technology and profit margins. For starters, opinion costs less to disseminate than fact and can be aligned with a target market. Media chains that care more about revenue than reputation have purchased outlets from families that had safeguarded rights in hometowns for generations. Computerization and downsizing of newsrooms deleted reporters from the scene, so that they became less visible and, therefore, less vital. Meanwhile, communication technology became affordable, so that consumers had in their palms the power to broadcast or publish at will.

The news industry has changed so much, so quickly. To be sure, some of that change has been refreshing and long in coming. The Internet, and blogging in particular, have created digital dialogues about media, challenging corporate business as usual. But the promise of technology -- that it would build social networks, democratize news and generally enhance information in two-way flows -- has always hinged on the presumption of readily available and verifiable information.

What are the consequences, not only for media, but for academe, when opinion displaces fact?

The Social Idea

I worked as a correspondent and state editor for United Press International in the 1970s. Members of my generation built on the Edward R. Murrow legacy of intermingling fact with experience. Murrow, an original "embedded" journalist, went on a World War II bombing mission over Germany, reporting for the CBS radio network. According to Murrow's code of ethics, reality was the observed state of people and the world. In other words, he thought reporters had to be on the scene to report fact reliably. Practicing that, he brought the war in Europe to America, just as my generation brought home another war with a different outcome -- the war in Vietnam.

Because universities dealt with fact, they played a role in the social protests of that era. Although organizations and movements such as Students for a Democratic Society (led by student editor Tom Hayden at Michigan) and the Free Speech Movement (University of California at Berkeley) began in the early to mid-1960s, they came together and spurred protests after news coverage of the 1968 Democratic Convention in Chicago. In 1970, coverage of the invasion of Cambodia sparked a protest at Kent State University at which National Guardsmen killed four students and wounded nine. More protests followed nationwide, with two more students killed at Jackson State University. Hundreds of colleges and universities shut down as students went on strike, with subsequent protests often tied to a specific news event. While those protests were political, they were usually in response to factual reporting. Lest we forget, 63 journalists died covering the wars in Vietnam and Cambodia. More recently, 25 journalists died in 2004 alone covering the war in Iraq. One has to ask oneself why that fact alone is scarcely known on college campuses.

Journalists fed Vietnam-era protests simply by reporting fact in a culture that still appreciated its power. We differed from the Murrow-era journalists who were our mentors: we relied less on emotion and more on anonymous sources (for which we caught, and still catch, hell), filing reports in a detached, impartial voice. We practiced objectivity, realizing it could not be fully attained, but amassed factual fragments so that a discernible mosaic could emerge.

We tried to see the world as it was, not as we wished it were.

That definition is derived from the British poet Matthew Arnold, who wrote about "genuine scientific passion" in the 1869 essay, “Culture and Anarchy.” Arnold maintained that people who saw things as they were also apprehended a culture beyond the consciousness of science -- a moral culture, informed by conscience. This, Arnold wrote, was the "social idea" that made such people "the true apostles of equality" who "have a passion for diffusing, for making prevail … from one end of society to the other, the best knowledge, the best ideas of their time."

I read “Culture and Anarchy” during the adversarial days of the Watergate era. It seemed an appropriate title. In doing so, I understood the role of journalism in promoting the “social idea.” The most popular television news show then was “60 Minutes,” on Murrow’s old network, CBS. The show had a novel segment called “Point/Counterpoint.” The most heated segments featured liberal Shana Alexander against conservative James J. Kilpatrick.

Their debates heralded a hitherto unexplored news phenomenon in which sources could pit one constellation of facts against an opposing constellation. This media milieu existed well into the 1990s, diluting the previous culture of fact and transforming it into one of factoid (partial/pseudo-fact).  But fact maintained some power.

“Point/Counterpoint” soon changed that. Keep in mind that this segment was so startling at the time that a new satire show, “Saturday Night Live,” ridiculed it regularly with Jane Curtin assailing the viewpoint of Dan Aykroyd who, invariably, would respond, “Jane, you ignorant …” -- and then he said a word that a savvy source, knowing today’s opinionated media, would not tell a reporter, if sharing this anecdote, fully aware of free speech rights, cognizant that the omitted word is a matter of record and also a matter of fact. This is not political correctness but what occurs in a culture of knee-jerk opinions. Responsible people, or people responsible for others, are aware of that culture and wary about adding their informed voices to the social debate, leaving that to those who would seek celebrity or who would entertain or worse, strike fear and outrage in others.

Fear and outrage are byproducts of an uninformed society. Perhaps that is why Americans increasingly embrace entertainment. James Howard Kunstler, in his prescient 1996 book Home from Nowhere, maintains that no other society but ours has been so preoccupied with instantaneous make-believe and on-demand fantasy. Because we fear so much, Kunstler writes, “we must keep the TVs turned on at all waking hours and at very high volume.” To escape fear, we amuse ourselves to death -- a phrase associated with a 1985 book by the great communication scholar Neil Postman, who died in 2003, although many, perhaps some reading this column, were not informed of his passing.

Just Another Viewpoint

When families who lost relatives in the 9/11 attacks were still grieving, Ward Churchill, the Colorado professor, was comparing their loved ones to “little Eichmanns.” His inflammatory essay lay dormant on the Internet until only recently. The controversy that arose because of Churchill’s opinions became so intense that Elizabeth Hoffman, president at Colorado, announced her resignation amid the media clamor. To be sure, Hoffman was dealing with other controversies at the time, but coverage of Churchill might well have contributed to that resignation.

A few years ago I could have invited Ward Churchill to Ames, Iowa, during a First Amendment event for a debate about his views. To do so now would assemble a media circus, bringing controversy to my journalism school. And what good would my counterpoint to his opinions accomplish, however factual I could make such an argument, when my invitation, and my motive for making it, would be the news rather than the substance of any rebuttal?

In the new media environment, fact -- even all-inclusive, verifiable, comprehensive fact -- is seen as just another viewpoint, just another opinion in the menu of fame on demand facilitated by Internet and cable television. So when a professor writes an essay (or a phrase in that essay) so sensational that it sparks a nationwide debate about free speech or academic freedom, journalists are missing the point. Such controversies, shaped by media practice, merely amuse the opinionated public.

Case in point: Fox’s "American Idol" reportedly inspired 500 million votes this season, quadrupling the number of ballots cast in the last U.S. presidential election. True, many viewers voted more than once for favorite contestants, but that only documents the culture of opinion, especially popular with younger viewers.

David Mindich, author of Just the Facts and Tuned Out: Why Americans Under 40 Don't Follow the News, says journalists have to compete now with shows like “Fear Factor” and “Friends” and so are overemphasizing humor, conflict and sex. Mindich, chair of the journalism and mass communication department at Saint Michael’s College, believes that the power of fact has diminished in this media universe. “One of the most powerful things about journalism itself is that it can communicate to a large audience and then we can have discussions about facts and where the facts bring us; but if we no longer are paying attention, then the facts don’t have the same weight. In the absence of fact opinion becomes more powerful. It’s not only the journalists themselves; it’s the culture apart from the news that has abandoned political discourse based on commonly agreed upon facts.”

In our day, points and counterpoints may be passionate but often also uninformed and usually accusatory. Who wants to participate in a media spectacle where audience and other sources, rather than the reporters, instinctively go for the jugular? Too often in this environment, the only people willing to speak out -- to contribute to the social debate -- are those with special interests or with nothing to lose and celebrity to gain.

The New Silent Majority

Sources who can explain the complex issues of our era, including biotechnology and bioterrorism, often opt out of the social debate. This includes scientists at our best universities. They see the media world as it is … and so have refrained from commenting on it. Increasingly the new silent majority will not go public with their facts or informed perspectives because, they realize, they will be pilloried for doing so by the omnipresent fear-mongers and sensationalists who provide a diet of conflict and provocation in the media.

And that creates a crisis for the First Amendment, which exists because the founders believed that truth would rise to the top -- provided people could read. That is also why education is associated with free speech and why, for generations, equal access to education has been an issue in our country and continues to be in our time. Education and information are requisite in a republic where we elect our representatives; to downsize or cut allocations for either puts the country at risk. Society is experiencing the consequences of cuts to the classroom and the newsroom, and we are getting the governments that we deserve, including blue vs. red states in a divided, point/counterpoint electorate.

What will become of journalism in this perplexing milieu? What happens when profit rather than truth rises to the top? According to David Mindich, “When profit trumps truth, journalism values are diluted, and then people start to wonder about the value of journalism in the first place.” Without facts, he says, people "start to forget the purpose of the First Amendment and then that, in turn, weakens journalism, and it’s a downward spiral from there."

The only way to stop the spiral is through re-investment in journalism and education. As for me, a journalism educator, my highest priority is training students for the media environment that used to exist, the one concerned about fact holding government accountable -- no matter what the cost. Sooner or later, there will be a place again for fact-gathering journalists. There will be a tipping point when profit plummets for lack of newsroom personnel and technology fails to provide the fix. That day is coming quickly for newspaper publishers, in particular, who are struggling to compete online without realizing there are no competitors on the front doors and welcome mats of American homes, their erstwhile domain. They will realize that the best way to attract new readers is to hire more reporters and place them where citizens can see them on the scene as witnesses, disseminating verifiable truths of the day.


Michael Bugeja directs the Greenlee School of Journalism and Communication at Iowa State University. He is the author of Interpersonal Divide: The Search for Community in a Technological Age (Oxford University Press, 2005).

Hitler -- the Classic?

It is disagreeable to approach the cashier with a book called How to Read Hitler. One way to take the stink off would be to purchase one or two other volumes in the new How to Read series published by W. W. Norton, which also includes short guides to Shakespeare, Nietzsche, Freud, and Wittgenstein. But at the time, standing in line at a neighborhood bookstore a couple weeks ago, I wasn't aware of those other titles. (The only thing mitigating the embarrassment was knowing that my days as a skinhead, albeit a non-Nazi one, are long over.) And anyway, the appearance of Adolf Hitler in such distinguished literary and philosophical company raises more troubling questions than it resolves.

"Intent on letting the reader experience the pleasures and intellectual stimulation in reading classic authors," according to the back cover, "the How to Read series will facilitate and enrich your understanding of texts vital to the canon." The series editor is Simon Critchley, a professor of philosophy at the New School in New York City, who looms ever larger as the guy capable of defending poststructuralist thought from its naysayers. Furthermore, he's sharp and lucid about it, in ways that might just persuade those naysayers to read Derrida before denouncing him. (Yeah, that'll happen.)

Somehow it is not that difficult to imagine members of the National Association of Scholars waving around the How to Read paperbacks during Congressional hearings, wildly indignant at Critchley's implicit equation of Shakespeare and Hitler as "classic authors" who are "vital to the canon."

False alarm! Sure, the appearance of the Fuhrer alongside the Bard is a bit of a provocation. But Neil Gregor, the author of How to Read Hitler, is a professor of modern German history at the University of Southampton, and under no illusions about the Fuhrer's originality as a thinker or competence as a writer.

About Mein Kampf, Gregor notes that there is "an unmistakably 'stream of consciousness' quality to the writing, which does not appear to have undergone even the most basic editing, let alone anything like polishing." Although Gregor does not mention it, the title Hitler originally gave to the book reveals his weakness for the turgid and the pompous: Four and a Half Years of Struggle against Lies, Stupidity and Cowardice. (The much snappier My Struggle was his publisher's suggestion.)

Incompetent writers make history, too. And learning to read them is not that easy. The fact that Hitler had ideas, rather than just obsessions, is disobliging to consider. Many of the themes and images in his writing reflect an immersion in the fringe literature of his day -- the large body of ephemeral material analyzed by Fritz Stern in his classic study The Politics of Cultural Despair: A Study in the Rise of the Germanic Ideology.

But Gregor for the most part ignores this influence on Hitler. He emphasizes, instead, the elements of Hitler's thinking that were, in their day, utterly mainstream. He could quote whole paragraphs of Carl von Clausewitz on strategy. And his racist world view drew out the most virulent consequences of the theories of Arthur de Gobineau and Houston Stewart Chamberlain. (While Hitler was dictating his memoirs in prison following the Beer Hall Putsch, he could point with admiration to one effort to translate their doctrines into policy: The immigration restrictions imposed in the United States in the 1920s.)

Gregor's method is to select passages from Mein Kampf and from an untitled sequel, published posthumously as Hitler's Second Book. He then carefully unpacks them -- showing what else is going on within the text, beneath the level of readily paraphrasable content. With his political autobiography, Hitler was not just recycling the standard complaints of the extreme right, or indulging in Wagnerian arias of soapbox oratory. He was also competing with exponents of similar nationalist ideas. He wrote in order to establish himself as the (literally) commanding figure in the movement.

So there is an implicit dialogue going on, disguised as a rather bombastic monologue. "Long passages of Hitler's writings," as Gregor puts it, "take the form of an extended critique of the political decisions of the late nineteenth century.... Hitler reveals himself not only as a nationalist politician and racist thinker, but -- this is a central characteristic of fascist ideology -- as offering a vision of revitalization and rebirth following the perceived decay of the liberal era, whose failings he intends to overcome."

The means of that "overcoming" were, of course, murderous in practice. The vicious and nauseating imagery accompanying any mention of the Jews -- the obsessive way Hitler constantly returns to metaphors of disease, decay, and infestation -- is the first stage of a dehumanization that is itself an incipient act of terror. The genocidal implications of such language are clear enough. But Gregor is careful to distinguish between the racist stratum of Hitler's dogma (which was uncommonly virulent even compared to the "normal" anti-Semitism of his day) and the very widespread use of militarized imagery and rhetoric in German culture following World War I.

"Many of the anti-Semitic images in Hitler's writing can be found in, say, the work of Houston Stewart Chamberlain," writes Gregor. "Yet when reading Chamberlain's work we hardly sense that we are dealing with an advocate of murder. When reading Hitler, by contrast, we often do -- even before we have considered the detail of what he is discussing. This is because the message is not only to be found in the arguments of the text, but is embedded in the language itself."

How to Read Hitler is a compact book, and a work of "high popularization" rather than a monograph. The two short pages of recommended readings at the end are broad, pointing to works of general interest (for example, The Coming of the Third Reich by Richard Evans) rather than journal articles. It will find its way soon enough into high-school and undergraduate history classrooms -- not to mention the demimonde of "buffs" whose fascination with the Third Reich has kept the History Channel profitable over the years.

At the same time, Gregor's little book is an understated, but very effective, advertisement for the "cultural turn" in historical scholarship. It is an example, that is, of one way historians go about examining not just what documents tell us about the past, but how the language and assumptions of a text operated at the time. His presentation of this approach avoids grand displays of methodological intent. Instead the book just goes about its business -- very judiciously, I think.

But there is one omission that is bothersome. Perhaps it is just an oversight, or, more likely, a side effect of the barriers between disciplines. Either way, it is a great disservice that How to Read Hitler nowhere points out the original effort by someone writing in English to analyze the language and inner logic of Mein Kampf --  the essay by Kenneth Burke called "The Rhetoric of Hitler's 'Battle,' " published in The Southern Review in 1939. (In keeping with my recent enthusing over the "golden age" of the academic literary quarterly, it is worth noting that the Review was published at Louisiana State University and edited by a professor there named Robert Penn Warren.)

Burke's essay was, at the time, an unusual experiment: An analysis of a political text using the tools of literary analysis that Burke had developed while studying Shakespeare and Coleridge. He had published the first translations of Thomas Mann's Death in Venice and of portions of Oswald Spengler's Decline of the West -- arguably a uniquely suitable preparation for the job of reading Hitler. And just as various German émigrés had tried to combine Marx and Freud in an effort to grasp "the mass psychology of fascism" (as Wilhelm Reich's title had it), so had Burke worked out his own combination of the two in a series of strange and brilliant writings published throughout the Depression.

But he kept all of that theoretical apparatus offstage, for the most part, in his long review-essay on a then-new translation of Mein Kampf. Instead, Burke read Hitler's narrative and imagery very closely -- showing how an "exasperating, even nauseating" book served to incite and inspire a mass movement.

This wasn't an abstract exercise. "Let us try," wrote Burke, "to discover what kind of 'medicine' this medicine man has concocted, that we may know, with greater accuracy, exactly what to guard against, if we are to forestall the concocting of similar medicine in America."

Burke's analysis is a tour de force. Revisiting it now, after Gregor's How to Read volume, it is striking how much they overlap in method and implication. In 1941, Burke reprinted it in his collection The Philosophy of Literary Form, which is now available from the University of California Press. You can also find it in a very useful anthology of Burke's writings called On Symbols and Society, which appears in the University of Chicago Press's series called "The Heritage of Sociology."

"Above all," wrote Burke in 1939, "I believe we must make it apparent that Hitler appeals by relying upon a bastardization of fundamentally religious patterns of thought. In this, if properly presented, there is no slight to religion. There is nothing in religion proper that requires a fascist state. There is much in religion, when misused, that does lead to a fascist state. There is a Latin proverb, Corruptio optimi pessima, 'the corruption of the best is the worst.' And it is the corruptors of religion who are a major menace to the world today, in giving the profound patterns of religious thought a crude and sinister distortion."

