
End Large Conferences

I’ll play Marc Antony: I have come not to praise large conferences, but to bury them. Mega humanities conferences are way past their sell-by date. For senior faculty, the only reason to go is to schmooze with old friends; for junior faculty, they are an onerous duty; and for graduate students, they are a rip-off for which professional organizations ought to be collectively ashamed.

First codicil: I speak exclusively of humanities conferences, as they are the only ones I know firsthand. Friends in computing and the sciences tell me that collaborative efforts arise from their conferences. I’m willing to believe them. Maybe it’s a cultural thing. Most humanities people find it so hard to collaborate that their wills stipulate that their notes go with them to the grave.

Second codicil: I have only myself to blame for recent travails. I didn't need to go to my unnamed conference, but I got it into my head that it would be fun. I was wrong. It serves me right for violating my principles.

Five years ago I concluded that humanities conferences were out of touch with the times and vowed to attend only smaller regional meetings with less cachet but more satisfaction. But I didn’t listen to me. Instead I spent four days and a considerable wad of cash jostling among a throng of more than three thousand. I returned home feeling rather like Ponce de Leon, who sought the Fountain of Youth and found mostly dismal swampland. Sound harsh? See if any of these observations resonate with your own.

Problem One: Outmoded Presentations

We live in the communications age, but the memo apparently never circulated among those studying the liberal arts. For reasons arcane and mysterious, humanities scholars still read papers. That’s tedious enough at a small conference where one might attend six three-paper sessions. At my recent conference, sessions commenced at 8 a.m. and ran past 10 p.m. One could conceivably have attended 30 sessions and heard 90 or more papers, though the only ones with the stamina to sit through more than six or seven sessions were either posturing or desperate.

I wanted my four-day sojourn to introduce me to new ideas, concepts, and teaching modules, but the reality of such a grueling schedule was that I was running on fumes by the end of day one. It would have helped if presenters had taken advantage of new technology, but things seldom got flashier than PowerPoint, a program that, alas, seems to encourage more reading. Let me reiterate something I’ve said for years: the death penalty should apply to those who read anything from a PowerPoint slide other than a direct quote. It's an academic conference, for crying out loud; assume your audience is reasonably proficient at reading! Seriously, does anyone need to fly across the country to listen to a paper? Why not do as science conferences have done for years: post papers online and gather to have a serious discussion of them?

The mind-numbing tedium of being read to for four days is exacerbated by the fact that many humanities scholars have little idea about the differences between hearing and reading. If you construct a paper that’s so highly nuanced that understanding it rests upon subtle turns of phrase or complicated linguistic shifts, do not look up from your paper with a wan smile indicating you are enamored of your own cleverness; go back to your room and rewrite the damn thing. Audience, clarity, and coherence are pretty much the Big Three for speech and composition, unless one's audience is the International Mindreaders' Society. By the way, is there something wrong with using a map, providing a chart, or summarizing a work that few in the room are likely to have read? And do bother to tell me why your paper matters.

I actually heard several very exciting papers, but most of the offerings were dreadful. Note to young scholars: stop relying on the Internet and check out journals that predate 1995 before you proclaim a “discovery.” And if you really want to stand out, work on your shtick. Guess which papers I remember? Yep -- those in which the presenter did more than read to me.

Critical note to young scholars: Want to turn off everyone in the room? Be the person who doesn’t think that the 20-minute limit applies to you. Nothing says "non-collegial" more clearly.

Problem Two: Expense

Another reason to rethink conferences is that they cost an arm and a leg to attend. I had partial funding from my university because I was presenting -- and no, I bloody well did not read my paper -- but I was still out of pocket for quite a hunk of cash. If you attend a humanities conference and want to stay anywhere near the actual site of the event, plan on $150 per night for lodging in a soulless franchise hotel with windowless conference rooms and quirky technology, $20 per day for Internet access, another $200 for conference fees, roughly $500 for airfare, at least $50 for taxis to and from the airport -- almost no U.S. city has a convenient shuttle service anymore -- and money for whatever you plan on eating.

Budget plenty for the latter if your conference is in what is glibly called a Destination City. That’s shorthand for a theme area marketing itself as unique, though it’s actually a slice of Generica surrounded by shops and restaurants identical to those found in suburban malls in every way except one: captive audiences equal higher prices. (One small example: the Starbucks inside the pedestrian precinct at my hotel charged a buck more per cup than the one on the street 100 yards away.) Do the math and you can see that you can easily drop a few grand on a megaconference. (That’s what some adjuncts are paid per course!)
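Since the author invites readers to "do the math," here is a minimal back-of-the-envelope sketch (in Python) that totals the figures quoted above. The four-night stay and the $60-a-day food budget are my assumptions, not numbers from the piece.

```python
# Back-of-the-envelope total for the conference costs quoted above.
# Assumptions (mine, not the author's): four hotel nights, four days
# on site, and a $60/day food budget.
nights = 4
days = 4

costs = {
    "lodging": 150 * nights,   # $150/night, soulless franchise hotel
    "internet": 20 * days,     # $20/day for hotel Internet access
    "registration": 200,       # conference fees
    "airfare": 500,            # round-trip flight
    "taxis": 50,               # to and from the airport
    "food": 60 * days,         # assumed daily food budget
}

total = sum(costs.values())
print(f"Total: ${total:,}")
# Prints: Total: $1,670 -- before drinks, connecting flights,
# or Destination City markups push it toward "a few grand."
```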

An immediate cost-saving adjustment would be to confine conferences to airline hub cities such as New York, Chicago, Los Angeles, Atlanta, and Houston. The moment the conference relocates to a (not my term) "second-tier" city, allot another few hundred dollars for "connecting flights," a term the airline industry uses because it sounds nicer than saying you’ll spend six hours waiting in a hub, after you’ve sprinted through the airport like Usain Bolt for your next flight, found the gate closed, and retreated to the rebooking counter.

Problem Three: Victimized Grad Students

I'm a parsimonious Scot who resents spending money on boring hotels and lousy food, but I can afford it when I have to. Grad students can’t. A major way in which megaconferences have changed in the past several decades is that there’s considerably less balance among senior scholars, junior colleagues, and graduate students. (Senior scholars used to accompany the latter two in a mentoring capacity.) Now there is just a smattering of senior and junior scholars, and they’re often holed up in hotel suites conducting interviews. Whenever they can, search committee members flee the conference and rendezvous with old friends. They might attend a session or two. Junior colleagues who don't have to be there scarcely attend at all: they're busy getting material into print, they can meet presentation expectations at cheaper regional meetings, or they can save their dough for prestigious(-sounding) international gatherings.

So who’s left? Graduate students. Lots of graduate students. So many that conservationists would recommend culling the herd if they were wild mustangs. Grad students have always gone to conferences in hopes of making their mark, attracting attention, and meeting people who can help them advance. That was the way it was done -- 20 years ago. Now networking opportunities are slimmer. Whom do they meet? Mostly other grad students, often those massed outside of interview rooms.

Of all the antiquated things about large conferences, the "cattle call" interview is the most perverse. These were barbaric back in the days when there were jobs; now they’re simply cruel. At least a third of attendees at my conference were grad students from a single discipline: English. As has been discussed many times on this site, most of them shouldn't be in grad school in the first place. How many of the thousand-plus English grad students can realistically hope to get an academic job of any sort?

The Modern Language Association predicts that only 900 English jobs will come open for all of 2011. That’s 900 in all specialties of English, the bulk of which will be in writing and rhetoric, not Austen and Proust. Will a fifth of those at the conference get a job? The odds are long. It's probably more like half of that, and if we're talking about a good job, slice it in half once more. So why ask strapped grad students to attend expensive conferences for 15-minute preliminary interviews? Do a telephone interview, for heaven’s sake; it’s kinder on both grad students and search committees.

As I did in my own grad school days, many young hopefuls pooled resources and economized where they could, but the sad truth is that the vast majority of attendees spent a small fortune on a gamble whose odds aren't much better than a lottery ticket's. Are associations playing the role of enabler to grad student delusions? Yes. Here’s another thought: Instead of holding a big conference, sponsor a teleconference. Charge a fee for uploads, but give speakers one-year access to the URL, which they can make available to potential employers. Use the association's savings to lobby for more tenure-track faculty lines.

Problem Four: No-Shows

You spend lots of money, you sit through desultory talks, and you head off to the one or two sessions that made you want to attend the conference in the first place. What do you find? It’s been canceled because only one of the presenters showed up, and that paper was combined with several others from sessions that suffered the same fate. Didn’t you see the 3x5 card tacked to the conference bulletin board?

As noted above, I’m in favor of putting large conferences to rest. But if we insist on having them, let’s at least make sure they’re as advertised. O.K., things do happen, but in most cases missing presenters are simply AWOL. I know it smacks of McCarthyism, but I’ve come to support the idea of a data bank of no-shows that employers, conference planners, and deans can check.

Problem Five: Urban Sprawl

What’s the point of a conference that’s so big it’s inaccessible? I walked between two different hotels to attend sessions and pored over a Britannica-sized program to locate them. Conference attendees were housed in four "official" hotels and untold numbers of others. With round-the-clock sessions and decentralization, the few networking opportunities that did exist were logistically difficult. It took me two entire days to find my old friends, let alone new folks I wanted to engage. I met two interesting people at the airport. I never saw them again.

In Praise of Small Conferences

There are other problems I’ll leave for now, including the gnawing suspicion that some big conferences have become sinecures for "insiders" who have become "players" within associations. Let’s just say that there is a serious disconnect between how the big conferences operate and what makes sense in the changing world of academe.

Teleconferences with real-time discussion groups and online forums would be one good starting point for reform; providing more resources for regional and local conferences would be another. Small gatherings have issues of their own -- no-shows, sparsely attended sessions, overreliance on volunteers -- but they compensate by offering intimacy, good value, face-to-face feedback, and easier opportunities to network. It's time to give them the cachet they deserve. The big conference is like a one-size-fits-all t-shirt; it simply doesn’t flatter most people. I’m done. For real. Unless I get funding for an exotic overseas meeting. (Just kidding!)

Rob Weir

Rob Weir, who writes the "Instant Mentor" column for Inside Higher Ed's Career Advice section, has published six books and numerous articles on social and cultural history, and has been cited for excellence in teaching on numerous occasions during his 20 years in the college classroom.

Conference Session Question

I was pleased to see this session in the conference program, organized around a topic to which I’ve dedicated much of my professional life, and I think you presenters have done a wonderful job to an extent. I think we all know what a labor of love organizing a conference session can be, especially when it is on a topic that is fairly complicated – a topic, perhaps, that only a handful of scholars have truly engaged and perhaps upon which only one or two have done any truly definitive work. The panel organizers might have thought to invite a central figure in this field to anchor the session, someone who has covered much of this ground already and is acknowledged to have done the first and still the best work on this question, but I’m told the organizers wanted fresh (as opposed to what, I don’t know!) voices and they invited some, well, emerging scholars to contribute. I think we’d all agree they did a fine job after a fashion; we hardly missed the usual contributors that often present papers on this topic.

But to return to my question -- I promise there’s a question in here! -- as I sat, rapt, listening to these fine presentations, I started wondering if the panelists were perhaps giving short shrift to some of the definitive findings on this topic that have proved quite sound and durable for almost two decades; I’m sure everyone in the room can tick off the titles of the groundbreaking publications that helped define this field -- and, with all due respect, I began to suspect that some of the presenters were taking a rather ... cavalier ... direction, given the enduring centrality of those seminal works of scholarship with which all of us are familiar. So as I listened I began to formulate a response -- we can’t help but notice that a panel this good often cries out for a respondent, a prominent scholar to draw all the presentations together under the existing -- and still quite valid -- paradigm.

Thus my query, which should be prefaced with a reminder that in our discipline conference panels like this one ought to be informed by a thorough understanding of, if not respect for, the earlier work that created the very conditions that allow for the continued study of this issue. I hesitate to say "standing on the shoulders of giants," but I would hazard that some of the panel participants have failed to accommodate, much less cite -- yes, I said cite -- the key sources, which are as relevant today as they were when first published. At the risk of detracting from all this freshness, I can’t help but note that the papers I heard today can do no more than elaborate upon disciplinary principles already well established -- footnotes to Plato and all that. And yet, certain experts went unmentioned. Certain still-relevant and available authorities could have spoken today, had one been invited to this session. My question, at last:

Don't you know who I am?

Daniel J. Ennis

Daniel J. Ennis is a professor of English at Coastal Carolina University.

Introducing Myself

Teresa Mangum offers perspective on the job search -- having launched hers during a previous economic downturn.

Abuse of Power

Very rarely do I wish that a professor would write his or her memoirs. Even saying “very rarely” may overstate the frequency of the wish. But if ever there were an exception to be made, it would be for Athan G. Theoharis -- the dean of Freedom of Information Act scholarship.

Just to clarify, he is not actually a dean. According to the back cover of his new book, Abuse of Power: How Cold War Surveillance and Secrecy Policy Shaped the Response to 9/11 (Temple University Press), he is now professor emeritus of history at Marquette University. In the 1970s, on behalf of the Senate body best known as the Church Committee, Theoharis dug around in presidential libraries to find Federal Bureau of Investigation records, and he’s spent decades filing FOIA requests. At Marquette, he has “supervised a stable of masters and doctoral students who wrote about the civil liberties record of the FBI,” as one chronicle of the bureau puts it.

At the very least, someone needs to sit down with Theoharis for a series of in-depth interviews on how he conducted his research, and trained others to continue the work. From passing references in Abuse of Power, it’s clear that the job requires extraordinary patience and tenacity -- but also a systematic understanding of the bureaucratic mind at its sneakiest. What’s that like? Does the frustration ever get to you? J. Edgar Hoover is named in the titles of three books by Theoharis, and central to several others. How do you learn to think like him without becoming a bit paranoid yourself? These questions blur the line between historiography and autobiography, though in a good way.

Abuse of Power, the author’s 20th book, is nowhere near so personal or ruminative as the one we might, with luck, get out of him one day. Subtitle notwithstanding, it has fairly little to do with 9/11, as such. Nor, for that matter, does Theoharis really make an argument about how Cold War policy “shaped the response” to that day’s attacks. He suggests that the Bush administration’s approach to domestic surveillance was an especially gung-ho version of Hoover’s attitude. But precedent is not cause.

There are a couple of ways to understand what Theoharis is actually doing in this book. One is to treat it as a challenge to American citizens of the present day. The other is to consider it a warning to future generations of historians.

Its contemporary challenge seems aimed at the liberal wing of public opinion in particular; for Theoharis makes the rather provocative case that George W. Bush was (at least in one respect) the fulfiller of FDR’s legacy, rather than its dim negation.

In the mid-1930s, faced with the aggressive pursuit of influence by Germany and the Soviet Union, Roosevelt made “a fundamental shift in the role of the Federal Bureau of Investigation … from [being] a law enforcement agency that sought to develop evidence to prosecute violators of federal laws to an intelligence agency that would seek to acquire advance information about the plans and capabilities of suspected spies and saboteurs.”

Rather than propose legislation to that effect, the president “instead opted for secret executive directives, a method that had as its central purpose the foreclosure of a potentially divisive and contentious debate.” And the director of the FBI was hardly going to object if things were done in secret, at least if he were the one doing them. Like the New Deal, “this profound shift was effected not through a well-defined blueprint but through a series of ad hoc responses” -- creating their own complex dynamics.

Just before the U.S. entered World War II, for example, FBI agents had interviewed almost 33,000 members of the American Legion, looking for people willing to infiltrate targeted organizations or monitor “persons of German, Italian, and Communist sympathies.” The program continued and grew throughout the war. Recruitment efforts intensified during the early 1950s despite grumbling by FBI field agents that maintaining contact with the Legionnaires took up a lot of time without producing much of value. But by then, the whole thing counted as a public relations effort for the FBI.

Meanwhile, more serious intelligence-gathering operations developed with scarcely any oversight. They included wiretaps and break-ins, campaigns to infiltrate and disrupt various organizations, and investigations into the private lives of public figures. Much of this is now common knowledge, of course, though only through the work of Theoharis and other researchers. Less widely known (and even more Ashcroft-y) is the program called Custodial Detention that Hoover launched in September 1939.

Renamed the Security Index in 1943, this program created a list of candidates for “preventative detention” in case of emergency. There was also “a plan for the suspension of the Writ of Habeas Corpus,” in the words of FBI assistant director D. Milton Ladd. In 1950, conservative members of Congress were able to override President Truman’s veto to pass an internal security act that included its own provisions for rounding up potential subversives. But it defined the pool of detainees more narrowly than the FBI had, and mandated that they receive a hearing within 48 hours of being taken into custody. The bureau ignored the legislation, and by the 1960s was adding civil-rights and antiwar activists to the list. The program was phased out following Hoover’s death in 1972. By then it was called the Administrative Index: a case of blandness as disguise.

In his final chapter, Theoharis quickly reviews the domestic-surveillance operations that emerged in the wake of the 9/11 attacks, or at least the ones now part of the public record. Their resemblance to Cold War-era programs is not a matter of Hoover’s influence. The NSA has resources that old-school gumshoes could never imagine. According to Theoharis, the FBI had about 200 wiretaps going at any one time throughout the entire country, while intercepting (that is to say, opening) thousands of pieces of correspondence. Under the USA PATRIOT Act, that would count as a slow morning.

The point of Abuse of Power is that “a broad consensus that Congress should enact legislative charters for the intelligence agencies rather than defer to the executive branch” existed for only a brief period, roughly the 1970s. Even then, the “consensus” Theoharis invokes was hardly robust. Following 9/11, the old habits kicked in again -- technologically fortified, with all the ingenuity that highly skilled lawyers can bring to rationalizing decisions that have already been made.

As noted, the book also serves as a warning to those who, down the line, try to research this past decade.

Theoharis describes Hoover's techniques for routing and storing information. Agents would not submit material gathered from break-ins, wiretaps, or highly confidential sources in their official reports, but via letters forwarded to him directly with a code on the envelope. Besides the FBI files, he kept auxiliary archives full of especially sensitive documents. That way, if forced to turn over the bureau’s files on a given topic, he could limit the exposure of what intelligence he had gathered -- and, just as importantly, how he gathered it.

Assembling information was not enough; he had to dissemble it as well. The power to do either has grown exponentially in the meantime. Hoover's material was recorded on paper and audiotape. Just learning to find his way into the maze of Hoover’s evasions cost Theoharis much time and effort. The comparable documents covering the past few years will be far less tangible, more heavily encrypted -- stored in sticks, chips, or clouds. And the people creating (and hiding) information have the warning of Hoover's example. Within a few years of his death, the architecture of secrecy began to crumble. Things will be better hidden, and learning to track them down will be harder.

Abuse of power usually implies confidence that the abuser will escape any consequences. In that case, the historian is able to enforce some kind of accountability, however minimal and belated. Will it still be possible to do that in the future? Theoharis doesn’t seem to be prone to speculation; the occasional references to his own career are rather perfunctory, even self-effacing. But I hope he overcomes his reticence and explains how he found his way through the old labyrinth -- and how he sizes up the one on the horizon.

Scott McLemee

Falling Into the Generation Gap

A few weeks ago, sitting over a cup of coffee, a writer in his twenties told me what it had been like to attend a fairly sedate university (I think he used the word "dull") that had a few old-time New Left activists on its faculty.

"If they thought you were interested in anything besides just your career," he said, "if you cared about ideas or issues, they got really excited. They sort of jumped on you."

Now, I expected this to be the prelude to a little tribute to his professors – how they had taken him seriously, opened his mind to an earlier generation’s experience, etc. But no.

"It was like they wanted to finish their youth through you, somehow," he said. "They needed your energy. They needed you to admire them. They were hungry for it. It felt like I had wandered into a crypt full of vampires. After a while, I just wanted to flee."

It was disconcerting to hear. My friend is not a conservative. And in any case, this was not the usual boilerplate about tenured radicals seeking to brainwash their students. He was not complaining about their ideas and outlook. This vivid appraisal of his teachers was not so much ideological as visceral. It tapped into an undercurrent of generational conflict that the endless "culture wars" seldom acknowledge.

You could sum it up neatly by saying that his professors, mostly in their fifties and sixties by now, had been part of the "Baby Boom," while he belonged to "Generation X."

Of course, there was a whole segment of the population that fell between those two big cultural bins -- people born at the end of the 1950s and the start of the 1960s. Our cohort never had a name, which is probably just as well. (For one thing, we’ve never really believed that we are a "we." And besides, the whole idea of a prepackaged identity based on what year you were born seems kind of tacky.)

One effect of living in this no-man’s-land between Boomers and Xers is a tendency to feel both fascinated and repulsed by moments when people really did have a strong sense of belonging to a generation. The ambivalence is confusing. But after a while it seems preferable to nostalgia -- because nostalgia is always rather simple-minded, if not dishonest.

The recent documentary The Weather Underground (a big hit with the young-activist/antiglobalization crowd) expressed doe-eyed sadness that the terrible Amerikan War Machine had forced young idealists to plant bombs. But it somehow never mentioned that group’s enthusiasm for the Charles Manson "family." (Instead of the two-fingered hippie peace sign, Weather members flashed a three-finger salute, in honor of the fork used to carve the word "war" into one victim’s stomach.) Robert McNamara and Henry Kissinger have a lot of things to answer for -- but that particular bit of insanity is not one of them.

Paul Berman, who was a member of Students for a Democratic Society at Columbia University during the strike of 1968, has been writing about the legacy of the 1960s for a long time. Sometimes he does so in interesting ways, as in parts of his book A Tale of Two Utopias; and sometimes he draws lessons from history that make an otherwise placid soul pull out his hair with irritation. He has tried to sort the positive aspects of the 1960s out from the negative -- claiming all the good for a revitalized liberalism, while treating the rest as symptoms of a lingering totalitarian mindset and/or psychological immaturity.

Whatever the merits of that analysis, it runs into trouble the minute Berman writes about world history -- which he always paints in broad strokes, using bright and simple colors. In his latest book, Terror and Liberalism, he summed up the last 300 years in terms that suggested Europe and the United States had grabbed their colonies in a fit of progress-minded enthusiasm. (Economic exploitation, by Berman’s account, had nothing to do with it, or not much.) Terror and Liberalism is a small book, and easy to throw.

His essay in the new issue of Bookforum is, to my mind, part of the thoughtful, reflective, valuable side of Berman’s work. In other words, I did not lose much hair reading it.

The essay has none of that quality my friend mentioned over coffee – the morbid hunger to feast off the fresh blood of a younger generation’s idealism. Berman has fond recollections of the Columbia strike. But that is not the same as being fond of the mentality that it fostered. "Nothing is more bovine than a student movement," he writes, "with the uneducated leading the anti-educated and mooing all the way."

The foil for Berman’s reflections is the sociologist Daniel Bell, who left Columbia in the wake of the strike. At the time, Bell’s book The End of Ideology was the bête noire of young radicals. (It was the kind of book that made people so furious that they refused to read it -- always the sign of the true-believer mentality in full effect.) But it was Bell’s writing on the history of the left in the United States that had the deepest effect on Berman’s own thinking.

Bell noticed, as Berman puts it, "a strange and repeated tendency on the part of the American Left to lose the thread of continuity from one generation to the next, such that each new generation feels impelled to reinvent the entire political tradition."

There is certainly something to this. It applies to Berman himself. After all, Terror and Liberalism is pretty much a jury-rigged version of the Whig interpretation of history, updated for duty in the War on Terror. And the memoiristic passages in his Bookforum essay are, in part, a record of his own effort to find "the thread of continuity from one generation to the next."

But something else may be implicit in Bell’s insight about the "strange and repeated tendency" to lose that thread. It is a puzzle for which I have no solution readily at hand. Namely: Why is this tendency limited to the left?

Why is it that young conservatives tend to know who Russell Kirk was, and what Hayek thought, and how Barry Goldwater’s defeat in 1964 prepared the way for Reagan’s victory in 1980? Karl Marx once wrote that "the tradition of all the dead generations weighs like a nightmare on the brain of the living." So how come the conservatives are so well-rested and energetic, while the left has all the bad dreams?

Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Impure Literature

The publication, 100 years ago, of  The Jungle, by Upton Sinclair, in the popular American socialist newspaper Appeal to Reason had an enormous effect -- if not quite the one that its author intended. "I aimed at the public’s heart," Sinclair later said, “and by accident I hit it in the stomach.”

Drawing on interviews with workers in Chicago and his own covert explorations of the city’s meat-processing factories, Sinclair intended the novel to be an expose of brutal working conditions. By the time it appeared as a book the following year, The Jungle’s nauseating revelations were the catalyst for a reform movement culminating in the Pure Food and Drug Act. In portraying the life and struggles of Jurgis Rudkus, a Lithuanian immigrant, Sinclair wanted to write (as he put it), “The Uncle Tom’s Cabin of wage slavery,” thereby ushering in an age of proletarian emancipation. Instead, he obliged the bourgeoisie to regulate itself -- if only to keep from feeling disgust at its breakfast sausages.

In his introduction to a new edition of The Jungle, just published by Bedford/St. Martin’s, Christopher Phelps traces the origins and effects of Sinclair’s novel. Phelps, an associate professor of history at Ohio State University in Mansfield, is currently on a Fulbright fellowship in Poland, where he occupies a distinguished chair in American studies and literature at the University of Lodz. The following is the transcript of an e-mail interview conducted this month.

Q: At one of the major chain bookstores the other day, I noticed at least four editions of The Jungle on the shelf.  Yours wasn’t one of them. Presumably it's just a matter of time. What’s the need, or the added value, of your edition? Some of the versions available are pretty cheap, after all. The book is now in the public domain.

A:  Yes, it’s even available for free online these days, if all you want is the text. This new edition is for readers seeking context. It has a number of unique aspects. I’m pleased about the appendix, a report written by the inspectors President Theodore Roosevelt dispatched to Chicago to investigate Upton Sinclair’s claims about the meatpacking industry. In one workplace, they watch as a pig slides off the line into a latrine, only to be returned to the hook, unwashed, for processing. No other version of The Jungle includes this report, which before now had lapsed into obscurity. The new edition also features an introduction in which I survey the scholarship on the novel and provide findings from my research in Sinclair’s papers held by the Lilly Library at Indiana University. Finally, there are a lot of features aimed at students, including a cartoon, a map, several photographs, a bibliography, a chronology of Sinclair’s life, and a list of questions for discussion. So it doubles as scholarly edition and teaching edition.

Q: Let me ask about teaching the book, then. How does The Jungle go over in the classroom?

A:  Extremely well. Students love it. The challenge of teaching history, especially the survey, is to get students who think history is boring to imagine the past so that it comes alive for them. The Jungle has a compelling story line that captures readers’ attention from its very first scene, a wedding celebration shaded in financial anxiety and doubts about whether Old World cultural traditions can survive in America. From then on, students just want to learn what will befall Jurgis and his family. Along the way, of course, Sinclair injects so much social commentary and description that teachers can easily use students’ interest in the narrative as a point of departure for raising a whole range of issues about the period historians call the Progressive Era.

Q:  As you've said, the new edition includes a government report that appeared in the wake of the novel, confirming the nauseating details. What are the grounds for reading and studying Sinclair's fiction, rather than the government report?

A:  Well, Teddy Roosevelt’s inspectors had the singular mission of determining whether the industry’s slaughtering and processing practices were wholesome. Sinclair, for his part, had many other concerns. What drew him to write about the meatpacking industry in the first place was the crushing of a massive strike of tens of thousands of workers led by the Amalgamated Meat Cutters and Butcher Workmen of North America in 1904. In other words, he wanted to advance the cause of labor by exposing the degradation of work and exploitation of the immigrant poor.

When The Jungle became a bestseller, Sinclair was frustrated that the public furor centered almost exclusively on whether the companies were grinding up rats into sausage or disguising malodorous tinned beef with dyes. These were real concerns, but Sinclair cared most of all about the grinding up of workers. I included this government report, therefore, not only because it confirms Sinclair’s portrait of unsanitary meat processing, but because it exemplifies the constriction of Sinclair’s panorama of concerns to the worries of the middle-class consumer.

It further shows how Sinclair’s socialist proposal of public ownership was set aside in favor of regulatory measures like the Pure Food and Drug Act and Meat Inspection Act of 1906. Of course, that did not surprise Sinclair. He was proud, rightly so, of having been a catalyst for reform. Now, just as the report must be read with this kind of critical eye, so too the novel ought not be taken literally.

Q:  Right. All kinds of problems come from taking any work of literature, even the most intentionally documentary, as giving the reader direct access to history.

A: Nowadays The Jungle is much more likely to be assigned in history courses than in literature courses, and yet it is a work of fiction. You point to a major problem, which we might call the construction of realism. I devote a good deal of attention to literary form and genre in my introduction, because I think they are crucial and should not be shunted aside. I note the influence upon The Jungle of the sentimentalism of Harriet Beecher Stowe, of naturalist and realist writers like William Dean Howells and Frank Norris, and of the popular dime novels of Horatio Alger. Sinclair was writing a novel, not a government report. He fancied himself an American Zola, the Stowe of wage slavery.

A good teacher ought to be able to take into account this status of the text as a work of creative literature while still drawing out its historical value. We might consider Jurgis, for example, as the personification of a class. He receives far more lumps in life than any single worker would in 1906, but the problems he encounters, such as on-the-job injury or the compulsion to make one’s children work, were in fact dilemmas for the working class of the time.

In my introduction, I contrast the novel with what historians now think about immigrant enclaves, the labor process, gender relations, and race. There is no determinate answer to the question of how well The Jungle represented such social realities. Many things it depicted extremely well, others abominably, race being in the latter category. If we keep in mind that realism is literary, fabricated, we can see that Sinclair’s background afforded him a discerning view of many social developments, making him a visionary, even while he was blind in other ways. Those failings are themselves revelatory of phenomena of the period, such as the racism then commonplace among white liberals, socialists, and labor activists. It’s important that we read the novel on all these levels.

Q: Sinclair wrote quite a few other novels, most of them less memorable than The Jungle. Well, OK, to be frank,  what I've heard is that they were, for the most part, awful. Is that an unfair judgment? Was The Jungle a case of the right author handling the right subject at the right time?

A:  That's precisely it, I think. Sinclair was uniquely inspired at the moment of writing The Jungle. I've been reading a lot of his other books, and although some have their moments, they sure can give you a headache. Many of them read like failed attempts to recapture that past moment of glory. He lived to be ninety and cranked out a book for every year of his life, so it's a cautionary tale about allowing prolixity to outpace quality. The book of his that I like best after The Jungle is his 1962 autobiography, a book that is wry and whimsical in a surprising and attractive, even disarming, way.

Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Show Clio the Money!

A member of Congress who says “history” is not necessarily thinking of the same enterprise as a professional historian. This is no Beltway-induced conceptual blockage: For civilians, the important thing about history is story, not methodology. (Even the most devoted viewers of the History Channel have no sense of the century-long debates over "the objectivity question.")

But the stakes of mutual incomprehension are higher when the federal budget is involved -- when the member of Congress is voting on whether or not to fund initiatives designed to improve history education, mainly at the primary and secondary levels. For example, there is the $11.2 million that the National Endowment for the Humanities has requested for next year for We the People. And then there's the $119 million in the president’s budget slotted for Teaching American History, a program of the Department of Education.

In such cases, it really matters whether legislators understand history to mean (1) a field producing new knowledge about the past or (2) a really cool holographic diorama of the Pilgrims at prayer.

The smart money would, of course, bet on the diorama. But history in the other sense is represented in Washington by the National Coalition for History, which speaks for more than 70 professional organizations for historians and archivists. As it happens, all of this lobbying clout is exercised by one person, Bruce Craig, usually with the assistance of an intern.

Craig took over as director (and de facto staff) of the coalition in early 2003 -- just as it was shedding its earlier, clunkier identity as the National Coordinating Committee for the Promotion of History. Like its predecessor, the NCH runs out of an office in the American Historical Association building on Capitol Hill.

I recently interviewed Craig by telephone from his home in West Virginia -- an excellent choice of residence, since it makes him a constituent of Sen. Robert Byrd, whose baby Teaching American History really is. But I happen to know that is a coincidence. It turns out that we met a dozen years ago, when his wife and I both worked as archival technicians in the manuscript division of the Library of Congress. (Our job was history at the lowliest level: sorting dead people’s mail.)

Back then, Craig was working on a dissertation about Harry Dexter White, a Treasury official and co-founder of the World Bank and the International Monetary Fund, who was accused by Whittaker Chambers of being a Soviet operative.

Craig's findings (available in a book published last year) were that White engaged in “a species of espionage” for the Russians, yet was not guilty of subverting American policy in their favor. It is a nice distinction -- one likely to offend those who prefer a simpler estimate, one way or the other, of Joseph McCarthy’s place in history.

But that studied indifference to ideological default settings is not just a scholarly stance. Listening to Craig, it sounds like the best tool in the lobbyist’s kit.

In the course of our discussion, I tried to draw Craig out on whether the mid-1990s battles over multiculturalism, the Enola Gay exhibit, and such still echo around Capitol Hill. His response is ... well, not evasive, exactly. But he has an impressive knack for finding terms that are practical, nonpartisan, and diplomatic.

The culture war "doesn’t come up often," he said. "Congress is very concerned with school kids, with whether or not they know American history. And of course they should be concerned with that. Part of our role is to make sure that ‘history’ doesn’t end up being defined narrowly, as just American history -- that the ancient world, and comparative history, also get included."

With the Teaching American History program, of course, the national (if not nationalistic) focus is evident from the very name. Craig says the challenge is to keep “from too narrow an emphasis on particular types of American history, so that it just becomes a kind of civics lesson.”

By meeting with Congressional staff and getting historians to testify in committee, the National Coalition for History is trying to recalibrate what legislators mean by “traditional American history.” It's a matter, in effect, of making sure that the term covers both the doings of white guys in powdered wigs at the Constitutional Convention and the slave revolts that sometimes kept them from getting a good night's sleep.

Quite a bit of the NCH’s activity concerns matters that are upstream from the classroom -- with issues, that is, affecting how history gets “done” by researchers. Craig lobbies in support of the Open Government Act, designed to bolster access to documents under the Freedom of Information Act. Organizations belonging to the coalition are up in arms, understandably enough, about a renewed effort to zero out the budget for the National Historical Publications and Records Commission, which provides grants for the preparation of editions of historical documents. And the NCH appears to be making progress in saving the program.

And in preparing the coalition’s weekly electronic newsletter, The NCH Washington Update (archived here), Craig keeps up with the corridor politics of government agencies involved in historical matters. Did you know, for example, that the National Park Service is a hotbed of internal conflict over grants for historical preservation projects? Chances are that, no, you did not know that -- let alone that a recent major reorganization of one section of the Park Service is known as "the May 3 massacre.” (Read all about it here.) It's the sort of inside-the-Beltway news that helps keep historians connected with the bureaucratic developments indirectly shaping their field.

From talking to Craig and reading the coalition’s press, the impression forms of a lobby that is, as the saying goes, “post-ideological.”

You know the drill: Pragmatism is all. Politics is the art of compromise in pursuit of the possible. That sort of thing.

But my own instinct is always to historicize such “post-ideological” thinking. To see it, first of all, as taking shape in a specific historical period (the 1990s, pretty much), and to understand it as reflecting a particular set of vested interests. In short, the "post-ideological" outlook is precisely the ideology of the professional-managerial class, i.e., extremely skilled brain workers who want to do their jobs without having to dread weird lurches in political governance.  

Now, some of my lingo here (“historicize,” “class”) is faintly marxisant, of course. But for what it’s worth, similar notions do pop up even when conservatives think about the recent past. As a case in point, check out the conservative historian Richard Jensen’s analysis of the culture wars.

Historians don’t all share the same, presumably leftist, politics -- no matter what the polemicists say. But they do share the same interest in seeing that libraries and archives stay open, and that “history” be understood to embrace a range of periods and topics. And also that new generations be encouraged to develop an appetite for learning about the past.

Given all that, there is an incentive to play down ideological fractiousness, as the National Coalition for History does with some finesse. The consequences are a little paradoxical -- creating “an ironic role [for] Washington, D.C.,” in the words of Rick Shenkman, editor of the History News Network.

“The larger story here in my opinion,” Shenkman told me in an e-mail note a couple of weeks ago, “is the ironic role of Washington, D.C. in the history wars. It has been the Right that has largely been behind the fantastic increase in appropriations for history over the last few years. Lynne Cheney has played a role as has Sen. Lamar Alexander. Robert Byrd, though a liberal of sorts, has pressed his history agenda on quite conservative grounds. And the beneficiary of the funds? It's those liberal historians across the country whom David Horowitz thinks are undermining the Republic!”

Not that the National Coalition for History, or anybody else for that matter, is being exactly Machiavellian about any of it. In the end, it’s all about the dead presidents. At the risk of being crass, you might best understand even the politics of scholarship by following the money.

“We have no space Hubble to rally around,” as Shenkman puts it. “So historians have used the easiest arguments at hand in support of their projects -- and that happens to be the patriotic argument.”

Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. His last column explored the intellectual legacy of Paul Ricoeur.

Ambiguous Legacy

There will be a meeting tonight in Washington to celebrate the life of James Weinstein, the radical historian and publisher who died in Chicago last Thursday. The news was by no means unexpected. But the gathering is impromptu, and it will probably be small.

I suppose one thing we will all have in common is an inability to refer to the deceased as "James Weinstein." He was Jimmy. It's a fair guess that the turnout will include union organizers and progressive lobbyists and a few journalists. There will undoubtedly be an academic or two -- or several, if you count the defrocked, the ABDs, and the folks who otherwise decided (contra David Horowitz) that university life is not necessarily conducive to being a leftist.

Many people know that Weinstein's book The Decline of Socialism in America, 1912-1925 (first published in 1967 and reprinted by Rutgers University Press in 1984) started out as his dissertation. After all this time, it remains a landmark work in the scholarship on U.S. radicalism. But only this weekend, in talking with a mutual friend, did I learn that he never actually bothered to get the Ph.D.

Diagnosed with brain cancer, Jimmy spent the final weeks of his life in bed at home. He gave a series of interviews to Miles Harvey, an author and former managing editor at In These Times, the progressive magazine that Jimmy founded. The body of reminiscences is now being transcribed, and will join the collection of the Oral History Research Office at Columbia University.

"We both knew we were in a race against time," Miles said when we talked by phone over the weekend. "We mined a lot of interesting stuff. Jimmy was the Zelig of the American left."

The son of a prosperous businessman, he worked for years in electronics factories as a rank-and-file Communist union member. One of his anecdotes from that era is something of a legend -- has become, even, a part of history. One day a comrade asked Jimmy to give a ride to a taciturn fellow doing party business of an undisclosed nature. A few years later, he recognized the passenger as Julius Rosenberg. (Suffice it to say that Weinstein's future biographer will probably find a day-by-day account of his life during the early 1950s in the FBI surveillance files.)

Jimmy left the party in 1956, as part of a major exodus in the wake of Khrushchev's denunciation of the crimes of Stalin. He was never apologetic about his membership. But neither was he even slightly sentimental about it.

Well before massive documentation from the Russian archives settled the question, he dismissed the arguments of those who insisted that the American CP and the Soviet spy apparatus in the U.S. had to be considered as completely distinct entities. Any good party member would have been glad to help out, he said: "We would have considered it an honor." (Jimmy himself never received that distinction. According to Miles Harvey, the request that he chauffeur Julius Rosenberg had less to do with Jimmy's reliability as a revolutionary than with the fact that he was one of the Communists on hand who owned a car.)

The fact that he once said this at a public event, where non-leftists could hear him -- and that he did so during the Reagan administration, no less -- is still held against him in some circles.

The usual pattern, of course, is to abandon a rigid, dogmatic political ideology -- and then to adopt another one. People spend entire careers boldly denouncing other people for their own previous mistakes. It's easy work, and the market for it is steady.

Jimmy followed a different course. To begin with, he had never been all that keen on the ideological nuances of the Communist movement. He certainly knew his Marx and Lenin from studying at the party's famous Jefferson School of Social Science, in New York. But somehow the doctrinal points counted less than what he'd picked up from all those years as a union activist. At least that's the impression of his friend Jim McNeill, another former managing editor at In These Times. (McNeill is now an organizer for the Service Employees International Union.)

Nearing 30, Weinstein decided to go to graduate school to study history, and his instinct was to dig into an earlier period of American radicalism -- when it spoke an idiom that was much less purely Marxist, and a lot more influential. Up through World War I, the Socialists successfully fielded candidates in local elections and even got the occasional member into Congress. And Eugene Debs, a figure beloved even by those who didn't share his vision of the proletarian commonwealth, could win nearly a million votes for president while imprisoned for an antiwar speech.

Weinstein's research was, in short, a glimpse of an alternative that had been lost. It wasn't simply a matter of government repression, either. There were streaks of doctrinal puritanism, of apocalyptic revolutionism, that eventually proved corrosive. "In large part," as he later put it, "the failure of the American left has been internal." (Whether or not he made the connection isn't clear, but his own experience in the CP would tend to confirm this. As bad as McCarthyism had been for the party, members started quitting en masse once they had to face the truth about Stalin.)

Boiled down, his conclusions amounted to a demand for a major upheaval in the culture of the left. What it needed for the long term, in effect, was a healthy dose of pragmatism. It would also mean learning to think of reforms as part of the process of undermining the power of the profit system -- rather than implicitly seeing reforms as, at best, a kind of compromise with capitalism.

Had he done only that initial study of the Socialist Party (finished in 1962, though only published five years later), Jimmy Weinstein would merit a small but honorable spot in the history of the American left. But in fact he did a lot more.

Today's academic left is very much a star system. Jimmy never had a place in it. If that bothered him, he did a good job of keeping quiet about it. But just for the record, it's worth mentioning that he was present at the creation.

He was part of the group in Madison, Wisconsin, that published Studies on the Left between 1959 and 1967. It was the first scholarly journal of Marxist analysis to appear in the United States since at least the end of World War II, and an important point of connection between the American New Left and international currents in radical thought. (The first translation of Walter Benjamin's "The Work of Art in the Age of Mechanical Reproduction," for example, appeared in Studies.)

Jimmy's brief memoir of this period can be found in a volume edited by the radical historian Paul Buhle called History and the New Left: Madison, Wisconsin, 1950-1970 (Temple University Press, 1990). There has long been a tendency to treat the intellectual history of the American left as unfolding primarily in New York City. This is understandable, in some ways, but it introduces gross distortions. It's worth remembering that one of the major publications serving to revitalize radical scholarship was the product of a group of graduate students at the University of Wisconsin. It appears that Buhle's anthology is now out of print. But what's more surprising, I think, is that more research hasn't been done on "the Madison intellectuals" in the meantime.

In keeping with Miles Harvey's characterization of Weinstein as "the Zelig of the American left," we next find him at the Chicago convention of Students for a Democratic Society in 1969. That was the one where -- just as the antiwar movement was starting to get a hearing on Main Street USA -- rival factions waved copies of the Little Red Book in the air and expelled one another. (Want evidence that the left's deepest wounds are self-inflicted? There you go.)

Repelled by the wild-eyed hysteria and terrorist romanticism of the Weather Underground (of which one of his cousins was a member), Jimmy helped start another journal, Socialist Revolution, which was always more cerebral than its up-against-the-wall title might suggest. In 1978, it changed its name to Socialist Review. (This abandonment of "revolution" inspired a certain amount of hand-wringing in some quarters.) It was the venue where, in 1985, Donna Haraway first published her "Cyborg Manifesto." For years afterward, the rumor went around that SR was about to drop "Socialist" from its title, to be replaced by "Postmodern." But in fact it continues as Radical Society -- a distant descendant by now, though it still bears a family resemblance to the publications that Jimmy worked on long ago.

Jimmy's last major venture as a publisher -- the culmination of his dream of converting the lessons of radical history into something practical and effective, here and now -- was In These Times, which started as a newspaper in 1976 and turned into a magazine sometime around 1990. A collection of articles from the magazine's first quarter century appeared in 2002 as the book Appeal to Reason -- a title echoing the name of the most widely circulated newspaper of the old Socialist Party.

Pat Aufderheide, now a professor of communications at American University, was ITT's culture editor from 1978 through 1982. She writes about the experience in her book The Daily Planet: A Critic on the Capitalist Culture Beat (University of Minnesota Press, 2000). A whole generation of people were entranced by the countercultural idea that "the personal is the political" -- or its academic doppelganger, the Foucauldian notion that power was everywhere and inescapable. These were recipes, she notes, for "self-marginalization and political fundamentalism" on the left.

"For In These Times," writes Aufderheide, "politics is the prosaic complex of institutions, structures and actions through which people organize consciously for social change.... Richard Rorty would put it in the reformist left category. It is read largely by leftists who do organizing or other practical political work, through labor unions, universities and schools, churches, nonprofit organizations and local and regional government. These are smart people, many of whom are not intellectuals, and who mostly come home late and tired."

The importance of reaching that public -- indeed, the very possibility of doing so -- tends to be overlooked by many people engaged in left-wing academic discourse. ("Our comrades in armchairs," as activists sometimes put it.)

In her book, Aufderheide recalls dealing with "a vocal contingent of academics" who were "always ready to pounce on lack of subtlety, creeping cheerleading, or sentimentality" in the magazine's cultural coverage. "Their critical acuteness, however, often seemed exercised for the satisfaction of intellectual one-upmanship," she writes. "When I begged them to write, to point me to other writers, to serve on the board, there was almost always a stunned silence."

The problem is self-perpetuating. Perhaps it comes down to a lack of good examples. And in that regard, Jimmy's death is more than a personal loss to his friends and family.

It's worth mentioning that, along the way, he wrote a number of other books, with The Long Detour: The History and Failure of the American Left (Westview, 2003) being his last. It was also his favorite, according to Miles Harvey, whose series of deathbed interviews will, in time, serve as the starting point for some historical researcher who has perhaps not yet heard of James Weinstein.

To be candid, I didn't care for his final book quite as much as the one he published in 1975 called Ambiguous Legacy: The Left in American Politics. The books are similar in a lot of ways. I'm not sure that my preference for one over the other is entirely defensible.

But it was Ambiguous Legacy that Jimmy inscribed when we met, about 10 years ago. My copy of his first book, the one on the Socialist Party, he dedicated "with hope for our future." Only later did I look at the other volume. Beneath the greeting -- and before his signature -- he wrote: "The legacy is more ambiguous than ever."

Scott McLemee was a contributing editor for In These Times between 1995 and 2001. His column Intellectual Affairs appears here each Tuesday and Thursday.

Throat Culture

For the past few days, I've been waiting for a review copy of Bob Woodward's book The Secret Man: The Story of Watergate's Deep Throat to arrive from Simon and Schuster. So there has been some time to contemplate the way that (no longer quite so) mysterious figure has been "inscribed" in a "double register" of "the historical imaginary," as the cult-stud lingo has it. (Sure hope there's a chance to use "imbricated discourse" soon. Man, that would be sweet.)

Putting it in slightly more commonplace terms: Two versions of Deep Throat have taken shape in the past 30 years or so. They correspond to two different ways of experiencing the odd, complex relationship between media and historical memory.

On the one hand, there was Deep Throat as a participant in a real historical event -- making the question of his motivation an important factor in making sense of what happened. It was even, perhaps, the key to understanding the "deep politics" of Watergate, the hidden forces behind Richard Nixon's fall. The element of lasting secrecy made it all kind of blurry, but in a fascinating way, like some especially suggestive Rorschach blot.

On the other hand, there was Deep Throat as pure icon -- a reference you could recognize (sort of) even without possessing any clear sense of his role in Watergate. It started out with Hal Holbrook's performance in All the President's Men -- which, in turn, was echoed by "the cigarette-smoking man" on "The X Files," as well as the mysterious source of insider information about the Springfield Republican Party on "The Simpsons." And so Deep Throat (whose pseudonym was itself originally a movie title) becomes a mediatic signifier unmoored from any historical signified. (An allusion to an allusion to a secret thus forgotten.)

Different as they might be, these two versions of Deep Throat aren't mutually exclusive. The discourses can indeed become imbricated (yes!), as in the memorable film Dick, which reveals Deep Throat as a pair of idealistic schoolgirls who guide the cluelessly bumbling Woodward and Bernstein through the mysteries of the Nixon White House.

There is something wonderful about this silly premise: In rewriting the history of Watergate, Dick follows the actual events, yet somehow neutralizes their dire logic by just the slightest shift of emphasis. The deepest secret of an agonizing national crisis turns out to be something absurd.

That perspective is either comically subversive or deeply cynical. Either way, it's been less anticlimactic, somehow, than the revelation of Deep Throat's real identity as the former FBI official Mark Felt. So much for the more elaborate theories about Watergate -- that it was, for example, a "silent coup" by a hard-right anticommunist faction of the U.S. military, upset by the administration's dealings with the Soviets and the Chinese. And Deep Throat's role as emblem of noir-ish intrigue may never recover from the impact of the recent, brightly lit video footage of Mark Felt -- half dazed, half mugging for the camera.

And there have been other disappointments. This week, I had an interesting exchange by e-mail with Bill Gaines, a professor of journalism at the University of Illinois at Urbana-Champaign and two-time winner of the Pulitzer, not counting his two other times as a finalist. His part in the Deep Throat saga came late in the story, and it's caused him a certain amount of grief.

But it was also -- this seems to me obvious -- quite honorable. If anything, it is even more worthy of note now that Bob Woodward is telling his side of the story. (While Carl Bernstein also has a chapter in the book, it was Woodward who had the connection with Felt.)

In 1999, Gaines and his students began an investigation designed to determine the identity of Deep Throat. The project lasted four years. It involved sifting through thousands of pages of primary documents and reading acres of Watergate memoir and analysis -- as well as comparing the original articles by Woodward and Bernstein from The Washington Post to the narrative they provided in their book All the President's Men. Gaines also tracked down earlier versions of the manuscript for that volume -- drafted before Woodward decided to reveal that he had a privileged source of inside information.

Gaines and his students compiled a database they used to determine which of the likely candidates would have actually been in a position to leak the information that Deep Throat provided. In April 2003, they held a press conference at the Watergate complex in Washington, DC, where they revealed ... the wrong guy.

After a period of thinking that Deep Throat must have been Patrick Buchanan (once a speechwriter for Nixon), the researchers concluded that it had actually been Fred Fielding, an attorney who had worked as assistant to John Dean. The original report from the project making the case for Fielding is still available online -- now updated with a note from Gaines saying, "We were wrong."

The aftermath of Felt's revelation, in late May, was predictably unpleasant for Gaines. There were hundreds of e-mail messages, and his phone rang off the hook. "Some snickered as if we had run the wrong way with the football," he told me.

But he added, "My students were extremely loyal and have told anyone who will listen that they were thrilled with being a part of this project even though it failed." Some of those who worked on the project came around to help Gaines with the deluge of correspondence, and otherwise lend moral support.

As mistaken deductions go, the argument offered by Gaines and his students two years ago is pretty rigorous. Its one major error seems to have come at an early stage, with the assumption that Woodward's account of Deep Throat was as exact as discretion would allow. That was in keeping with Woodward's own statements, over the years. "It's okay to leave things out to protect the identity of a source," he told the San Francisco Chronicle in 2002, "but to add something affirmative that isn't true is to publish something you know to be an inaccuracy. I don't believe that's ethical for a reporter."

The problem is that the original account of Deep Throat doesn't line up quite perfectly with what is known about Mark Felt. Some of the discrepancies are small, but puzzling even so. Deep Throat is a chain smoker, while Felt claimed to have given up the demon weed in 1943. "The idea that Felt only smokes in the garage [during his secretive rendezvous with Woodward] is a little hard to swallow," says Gaines. "I cannot picture him buying a pack and throwing the rest away for the drama it will provide." By contrast, Fielding was a smoker.

More substantive, perhaps, are questions about what Deep Throat knew and how he knew it. Gaines and his students noted that statements attributed to Deep Throat in All the President's Men were credited to a White House source in the original newspaper articles by Woodward and Bernstein. (Felt was second in command at the FBI, not someone working directly for the White House, as was Fielding.)

Deep Throat provided authoritative information gleaned from listening to Nixon's secret recordings during a meeting in November 1973. That was several months after Felt left the FBI. And to complicate things still more, no one from the FBI had been at the meeting where the recordings were played.

According to Gaines, that means Felt could only have learned about the contents of the recordings at third hand, at best. Felt was, as Gaines put it in an e-mail note, "so far removed that his comments to Woodward would have to be considered hearsay, and not the kind of thing a reporter could write for fact by quoting an anonymous source."

When I ask Gaines if there is anything he hopes to learn from Bob Woodward's new book, he mentions hoping for some insight into one of the more memorable descriptions of the secret source -- the one about how Deep Throat "knew too much literature too well." In any case, Gaines makes a strong argument that Woodward himself took a certain amount of literary license in transforming Felt into Deep Throat.

"We know from our copy of an earlier manuscript that Woodward changed some direct quotes attributed to Throat," he notes. "They were not major changes, but enough to tell us that he was loose with the quotes. There is information attributed to Throat that Felt would not have had, or that doesnot agree with what we found in FBI files."

As the saying has it, journalists write a first draft of history. One of the ethical questions involves trying to figure out just how much discretion they get in polishing the manuscript. Gaines seems careful not to say anything too forceful on this score -- though he does make clear that he isn't charging Woodward with creating a composite character.

That has long been one of the suspicions about Deep Throat. Even the new revelation hasn't quite dispelled it. Just after Felt went public with his announcement, Jon Wiener, a professor of history at the University of California at Irvine, reviewed some of the grounds for thinking that "several people who provided key information ... were turned into a composite figure for dramatic purposes" by Woodward and Bernstein. (You can find more of Wiener's comments here, at the very end of the article.)

For his part, Gaines says that the Deep Throat investigation isn't quite closed -- although he wishes it were. "I have always wanted to move on to something more important for the class project," he told me, "but the students and the media have caused us to keep going back to the Throat story."

Maybe now they should look into the mystery surrounding Deep Throat's most famous line: his memorable injunction to Woodward, "Follow the money."

It appears in the movie version of All the President's Men, though it can't be found in the book. When asked about it in an interview some years ago, Woodward guessed that it was an embellishment by William Goldman, the screenwriter. But Goldman has insisted that he got the line from Woodward.

Now it's part of the national mythology. But it may never have actually happened. Sometimes I wish the discourses would stop imbricating long enough to get this kind of thing sorted out.

Real Knowledge

During the heyday of American economic and geographical expansion, in the late 19th century, the men who sold real estate occupied a distinct vocational niche. They were slightly less respectable than, say, riverboat gamblers -- but undoubtedly more so than pirates on the open seas. It was a good job for someone who didn’t mind leaving town quickly.

But about 100 years ago, something important began to happen, as Jeffrey M. Hornstein recounts in A Nation of Realtors: A Cultural History of the Twentieth Century American Middle Class, published this spring by Duke University Press.  Some of those engaged in the trade started to understand themselves as professionals.

They created local realty boards and introduced licensing as means by which reputable practitioners could distinguish themselves from grifters. And in time, they were well enough organized to lobby the federal government on housing policy -- favoring developments that encouraged the building of single-family units, rather than public housing. Their efforts, as Hornstein writes, "would effectively create a broad new white middle class haven in the suburbs, while leaving behind the upper class and the poor in cities increasingly polarized by race and wealth."

I picked up A Nation of Realtors expecting a mixture of social history and Glengarry Glen Ross. It's actually something different: a contribution to understanding how certain aspects of middle-class identity took shape -- both among the men (and later, increasingly, women) who identified themselves as Realtors and among their customers. Particularly interesting is the chapter "Applied Realology," which recounts the early efforts of a handful of academics to create a field of study that would then (in turn) bolster the profession’s claims to legitimacy and rigor.

Hornstein recently answered a series of questions about his book -- a brief shift of his attention back to scholarly concerns, since he is now organizing director of Service Employees International Union, Local 36, in Philadelphia.

Q: Before getting to your book, let me ask about your move from historical research to union organizing. What's the story behind that?

A: I was applying to graduate school in my senior year of college and my advisor told me that while he was sure I could handle grad school, he saw me as more of "a politician than a political scientist." I had always been involved in organizing people and was a campus leader. But I also enjoyed academic work, and went on to get two graduate degrees, one in political science from Penn, another in history from the University of Maryland.

While I was doing the history Ph.D. at Maryland, a group of teaching assistants got together and realized that we were an exploited group that could benefit from a union. Helping to form an organizing committee, affiliating with a national union, getting to know hard-boiled organizers (many of whom were also intellectuals), and attempting to persuade my peers that they needed to take control of their own working conditions through collective action captured my imagination and interest much more than research, writing, or teaching.  

After a long intellectual and personal journey, I finally defended my dissertation. The academic job market looked bleak, particularly as a graduate of a non-elite institution. And when I was honest with myself, I realized that my experience forming a graduate employee union engaged me far more than the intellectual work.

Armed with this insight, I put the diss in a box, and two weeks later, I was at the AFL-CIO’s Organizing Institute getting my first taste of what it would be like to organize workers as a vocation. In the dark barroom in the basement of the George Meany Center for Labor Studies, a recruiter from an SEIU local in Ohio approached me and asked me if I’d like to spend the next few years of my life living in Red Roof Inns, trying to help low-wage workers improve their lives. Two weeks later, I landed in Columbus, Ohio, and I was soon hooked.

And I would add this: The supply of talented and committed organizers is far outstripped by the demand. The labor movement’s current crisis is, frankly, a huge opportunity for energetic and idealistic people to make a real difference. Hard work and commitment are really rewarded in the labor movement, and one can move quickly into positions of responsibility. It’s very demanding and often frustrating work, but it’s about as fulfilling a vocation as I could imagine.

Q: You discuss the emergence of realtors as the rise of a new kind of social identity, "the business professional." But I'm left wondering about early local real-estate boards. They sound kind of like lodges or fraternal groups, as much as anything else. In what sense are they comparable to today's professional organizations, as opposed to, say, the Elks or the Jaycees?

A: Indeed, early boards were very much like fraternal organizations. They were all male and clubby, there was often a "board home" that offered a retreat space, and so on. Early real estate board newsletters are rife with the sorts of jokes about women and minorities that were standard fare in the 1910s and 1920s -- jokes that, I argue, helped to police the boundaries of masculinity.

In the early chapters of the book, I provide brief sketches of the workings of the Chicago and Philadelphia real estate boards, as well as a sort of anthropological view of early real estate conventions. My favorite was the 1915 Los Angeles convention, during which the main social event was a drag party. In my view, the conventions, the board meetings, the social events, the publications, all formed a homosocial space in which a particular sort of masculinity was performed, where the conventions of middle-class masculinity were established and reinforced.  

In the early 1920s, the emphasis began to shift from fraternalism to a more technocratic, professional modality. Herbert Nelson took the helm at the National Association of Real Estate Boards in 1923, and he started to make NAREB look much more like a modern professional organization. In some respects he created the mold. He made long-term strategic plans, asserted the necessity for a permanent Realtor presence in Washington, D.C., pushed for standards for licensing, worked with Herbert Hoover’s Commerce Department to promulgate a standard zoning act, and linked up with Professor Richard T. Ely [of the University of Wisconsin at Madison] to help "scientize" the field.

Nelson served as executive director of NAREB for over 30 years. During his tenure, the organization grew, differentiated, specialized, and became a powerful national political actor. In sum, it became a true modern professional association in most ways. Yet as in most other professional organizations prior to the ascendancy of feminism and the major incursion of women into the professions, masculine clubbiness remained an important element in the organizational culture well into the 1970s.

In sum, the story I tell about the complex interdependencies of class, gender, and work identities is largely about the Realtors’ attempts to transform an Elks-like organization into a modern, "professional" business association.

Q: On the one hand, they see what they are doing as a kind of applied social science -- also creating, as you put it, "a professional metanarrative." On the other hand, you note that Ely's Institute for Research in Land Economics was a casualty of the end of the real estate bubble. Doesn't that justify some cynicism about realtors' quest for academic legitimacy?

A: I don’t see the Realtors or the social scientists like Ely in cynical terms at all. In fact, both parties are quite earnest about what they’re doing, in my view. Ely was nothing if not a true believer in the socially transformative power of his research and of social scientific research in general. He managed to persuade a faction of influential Realtors, primarily large-scale developers ("community-builders") such as J.C. Nichols, that research was the key to professionalism, prosperity, and high-quality real estate development.  

Ely’s Institute was not a casualty of the implosion of the 1926 Florida real estate bubble as such. But the real estate collapse and the ensuing Depression made it much harder for the Realtors to make claims to authority based on disinterested science.

It’s not that the grounding of the whole field of Land Economics was problematic -- at least no more so than any other field of social or human science, particularly one that produces knowledge that can be used for commercial purposes.

The academic field was in its infancy in the 1910s and 1920s, and there were intra-disciplinary squabbles between the older, more historical economists like Ely and the younger generation, which was much more model- and mathematics-driven. At the same time, there were sharp divisions among Realtors between those who believed that professionalism required science (and licensing, and zoning, and so on) and those who rejected this idea.  

So, yes, the Elyian attempt at organizing the real estate industry on a purely ‘scientific’ basis, operating primarily in the interest of the social good, was largely a failure. However, the 1920s mark a watershed in that the National Association became a major producer and consumer of social scientific knowledge. Business schools began to offer real estate as a course of study. Textbooks, replete with charts and graphs and economic equations, proliferated. Prominent academics threw their lot in with the Realtors.

In the end, the industry established its own think tank, the Urban Land Institute, the motto of which is “Under All, The Land” -- taken straight from Ely’s work. But the profession itself remained divided over the value of ‘science’ -- the community-builders generally supported efforts to scientize the field, while those on the more speculative end of the profession were generally opposed.

But again, I don’t think that the grounding of the field of land economics is any more questionable than any other subfield of economics, such as finance or accounting.

Q: Your book left me with a sort of chicken-and-egg question. You connect the growth of the profession with certain cultural norms -- the tendency to define oneself as middle-class, the expectation of private home ownership, etc. Didn't those aspirations have really deep roots in American culture, which the Realtors simply appealed to as part of their own legitimization? Or were they more the result of lobbying, advertising, and other activities of the real-estate profession?

A: Absolutely, these tendencies have roots deep in American culture. The term "middle class" was not really used until the late 19th century -- "middling sorts" was the more prevalent term before then. The "classless society" has long been a trope in American culture, the idea that with hard work, perseverance, and a little luck, anyone can "make it" in America, that the boundaries between social positions are fluid, etc.  

But it’s not until the early-to-mid 20th century that homeownership and middle-class identity come to be conflated.  The "American Dream" is redefined from being about political freedom to being about homeownership. At around the same time, debt is redefined as "credit" and "equity."

So, yes, I’d agree to some extent that the Realtors tapped into longstanding cultural norms as part of their efforts at self-legitimization. Like most successful political actors, they harnessed cultural common sense for their own ends -- namely, to make homeownership integral to middle-class identity. Their political work enabled them, in the midst of the Depression, to get the National Housing Act passed as they wrote it -- with provisions that greatly privileged just the sort of single-family, suburban homes leading members of NAREB were intent on building.

The Realtors used the cultural material at hand to make their interests seem to be the interests of the whole society. But, as we know from many fine studies of suburban development, many people and many competing visions of the American landscape were marginalized in the process.

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.
