History

The Broccoli of Higher Ed

We hear these days of the "crisis of the humanities." Majors, jobs, and student interest in these subjects are all in decline. The Boston Globe reported on these worries last year in an article about the new Mandel Center at Brandeis University. The Globe asserted, "At college campuses around the world, the humanities are hurting. Students are flocking to majors more closely linked to their career ambitions. Grant money and philanthropy are flowing to the sciences. And university presidents are worried about the future of subjects once at the heart of a liberal arts education."

Such gloom must be placed in context. Doubts about the humanities have been around at least since Aristophanes wrote The Clouds. The playwright claimed that if a man engaged in the "new" Socratic form of teaching and questioning, he could wind up with big genitals (apparently seen as a negative side effect) due to a loss of self-control. But the Socratic humanities survived, in spite of the execution of their founder, through the schools of his intellectual son and grandson -- the Academy of Plato and the Lyceum of Aristotle.

I don't think that the humanities are really in a crisis, though perhaps they have a chronic illness. Bachelor's degrees in the humanities have held relatively steady since 1994 at roughly 12-13 percent of all majors. Such figures demonstrate that the health of the humanities, as measured by student preferences, is not robust. In contrast, the number of undergraduate business majors is steadily increasing.

So what has been the response of university and college leaders to the ill health of the humanities?

It has been to declare to applicants, students, faculty, and the public that these subjects are important. It has included more investment in the humanities, from new buildings like the Mandel Center at Brandeis to, in some cases, hiring more faculty and publicizing the humanities energetically. Dartmouth College's president, Jim Yong Kim, recently offered the hortatory remark that "Literature and the arts should not only be for kids who go to cotillion balls to make polite conversation at parties."

I couldn't agree more with the idea that the humanities are important. But this type of approach is what I call the "eat it, it's good for you" response to the curricular doldrums of the humanities. That never worked with my children when it came to eating broccoli, and it is even less likely to help increase humanities enrollments nationally today.

The dual-horned dilemma of higher education is the erosion of the number of majors in the humanities on the one hand and the long-feared "closing of the American mind" on the other, produced in part by the growing number of students taking what some regard as easy business majors. Yet these problems can only be solved by harnessing the power of culture, by understanding the ethno-axiological soup from which curriculums evolve and find their sustenance. Jerome Bruner has long urged educators to connect with culture, to recognize that the environment in which we operate is a value-laden behemoth whose course changes usually consume decades, a creature that won't be ignored.

It is also vital that we of the humanities not overplay our hands and claim for ourselves a uniqueness that we do not have. For example, it has become nearly a truism to say that the humanities teach "critical thinking skills." This is often correct of humanities instruction (though certainly not universally so). But critical thinking is unique neither to the humanities nor to the arts and sciences more generally. A good business education, for example, teaches critical thinking in management, marketing, accounting, finance, and other courses. More realistically and humbly, what we can say is that the humanities and sciences provide complementary contexts for reasoning and cultural knowledge that are crucial to functioning at a high level in the enveloping society.

Thus, admitting that critical thinking can also be developed in professional schools, we realize that it is enhanced when the thinker learns to apply analytical skills in history, different languages, philosophy, mathematics, and other contexts. The humanities offer a distinct set of problems that hone thinking skills, even if they are not the only critical thinking game in town. At my institution, Bentley University, and other institutions where most students major in professional fields, for example, English develops vocabulary and clarity of expression while, say, marketing builds on and contributes to these. Science requires empirical verification and consideration of alternatives; accountancy builds on and contributes to those. Science and English make better business students, just as business courses improve thinking in the humanities and sciences.

If, like me, you believe that the humanities do have problems to solve, I hope you agree that they are not going to be solved by lamenting the change in culture and exhorting folks to get back on course. That's like holding your finger up to stop a tidal wave. Thinking like this could mean that new buildings dedicated to the humanities will wind up as mausoleums for the mighty dead rather than as centers of engagement with modern culture and the building of futures in contemporary society.

So what is there to do? How do we harness the power of culture to revive and heal the influence of the humanities on future generations? Remember, Popeye didn't eat his spinach only because it was good for him. He ate his spinach because he believed that it was a vital part of his ability to defend himself from the dangers and vicissitudes of life, personified in Bluto. And because he believed that it would give him a good life, represented by Olive Oyl.

Recently, an alumnus of Bentley told me over dinner, "You need business skills to get a job at our firm. But you need the arts and sciences to advance." Now, that is the kind of skyhook that the friends of the humanities need in order to strengthen their numbers, perception, and impact.

While I was considering the offer to come to Bentley as its next dean of arts and sciences, Brown University and another institution were considering me for professorial positions. Although I felt honored, I did not want to polish my own lamp when I felt that much in the humanities and elsewhere in higher education risks becoming a Ponzi scheme, which Wikipedia defines accurately as an "...operation that pays returns to separate investors, not from any actual profit earned by the organization, but from their own money or money paid by subsequent investors."

I wanted to make my small contribution to solving this problem, so I withdrew from consideration for these appointments to become an administrator and face the issue on the front line. And Bentley sounded like exactly the place to be, based on pioneering efforts to integrate the humanities and sciences into professional education -- such as our innovative liberal studies major, in which business majors complete a series of courses, reflections, and a capstone project emerging from their individual integration of humanities, sciences, and business.

Programs that take in students without proper concern for their future or provision for post-graduate opportunities -- how they can use what they have learned in meaningful work -- need to think about the ethics of their situation. Students no longer come mainly from the leisured classes that were prominent at the founding of higher education. Today they need to find gainful employment in which to apply all the substantive things they learn in college. Majors that give no thought to that small detail seem to assume that since the humanities are good for you, the financial commitment and the apprenticeship between student and teacher are fully justified. But in these cases, the numbers of students benefit the faculty and particular programs arguably more than they benefit the students themselves. This is a Ponzi scheme. Q.E.D.

The cultural zeitgeist requires of education that it be intellectually well-balanced and focused but also useful. Providing all of these and more is not the commercialization of higher education. Rather, the combination of professional education and the humanities and sciences is an opportunity to at once (re-)engage students in the humanities and to realize Dewey's pragmatic goal of transforming education by coupling concrete objectives with abstract ideas, general knowledge, and theory.

I have labeled this call for a closer connection between the humanities and professional education the "Crucial Educational Fusion." Others have recognized this need, as examples in the new Carnegie Foundation for the Advancement of Teaching book Rethinking Undergraduate Business Education: Liberal Learning for the Profession illustrate. This crucial educational fusion is one solution to the lethargy of the humanities -- breaking down academic silos, building the humanities into professional curriculums, and creating a need for the humanities. Enhancing their flavor like cheese on broccoli.

Author: Daniel L. Everett

Daniel L. Everett is dean of arts and sciences at Bentley University.

Ecumenical vs. Evangelical

Although currently secured behind the subscriber paywall at the Journal of American History, David A. Hollinger’s “After Cloven Tongues of Fire: Ecumenical Protestantism and the Modern Encounter with Diversity” nonetheless seems like part of the emerging conversation during the pre-primary phase of the presidential campaign. Hollinger, a professor of history at the University of California at Berkeley, delivered the paper in March as his presidential address to the Organization of American Historians. This year’s OAH convention was held in Houston -- where, as it happens, Rick Perry recently led a national prayer rally. That may not have been providence at work, but the coincidence seems a bit much.

The paper traces what Hollinger calls the “Protestant dialectic” between ecumenical and evangelical strands of the faith since roughly the end of World War II -- a process “within which the two great rivals for control of the symbolic capital of Christianity defined themselves in terms of each other.” Americans whose understanding of Protestantism comes mainly from political and cultural developments of the past 30 years may be forgiven for wondering what this could possibly mean. The evangelical dimension of Protestantism (the witnessing, proselytizing, and missionary-sending side of it) has very nearly defined itself as the public face of the faith. It can get 30,000 people into a stadium to pray on the weekend. Evangelism as such does not in principle imply a specific outlook on secular matters, but for practical purposes it has become almost synonymous with moral conservatism, and usually the political sort as well.

So much for the familiar side of Hollinger’s dialectic. The ecumenicalism that he writes about -- the spirit of de-emphasizing doctrinal conflicts among churches, the better to recognize one another as parts of “the body of Christ” and to join in common work in the world -- is, at this point, a less vigorous presence in public life. But it was once the ethos of an Establishment. “If you were in charge of something big before 1960,” Hollinger writes, “chances are you grew up in a white Protestant milieu. Until the 1970s, moreover, the public face of Protestantism itself remained that of the politically and theologically liberal ecumenists of the National Council of Churches and its pre-1950 predecessor, the Federal Council of Churches.”

A fairly common story is told about the decline of this “so-called Protestant Establishment,” as Hollinger calls it -- and his paper seems to confirm that story, though only up to a point.

“The ecumenists were more institution builders than revivalists,” he writes, “more devoted to creating and maintaining communities than to facilitating a close emotional relationship with the divine, and more frankly concerned with social welfare than with the individual soul…. The ecumenical Protestants of twentieth century America were preoccupied with mobilizing massive constituencies to address social evils. They wanted to reformulate the gospel of the New Testament in terms sufficiently broad to enable people of many cultures and social stations to appreciate its value.”

An invidious way to put this (and those of us who grew up in the evangelical world during the 1970s seldom heard it expressed any other way) is that ecumenical Protestantism became liberal activism in pious disguise. It confused spreading the Gospel with doing social work. And it was prone to undermining moral absolutes, as though Moses had come down from Mount Sinai bearing tablets with the Ten Suggestions. Churches were emptying out as the faithful rejected such all-too-worldly doctrines and instead made their way to the evangelical movement.

Without being so pugnacious about it, Hollinger’s treatment of the ecumenical movement corroborates some of this -- with particular emphasis on how self-critical theologians and clergy became about Christianity’s role in shoring up the inequalities and injustices of American life. The evangelical movement could define itself against such trends. It was not prone to second-guessing its own role in the world, nor to question the idea that the United States was an essentially Christian nation.

In the late 1940s and early ‘50s, the National Association of Evangelicals promoted a Constitutional amendment that began: “This nation devoutly recognizes the authority and law of Jesus Christ, Savior and Ruler of nations, through whom are bestowed the blessings of Almighty God.” (The framers of the Constitution had, through some peculiar oversight, failed to mention God at any point in the document.) A lack of support by ecumenical Protestant clergy meant the amendment didn’t get very far. But the effort itself disproves the idea that evangelicals remained, as Hollinger put it, “politically quiescent until galvanized into political action by the legalization of abortion in 1973 by Roe v. Wade.”

Membership in the ecumenical Protestant denominations (e.g., Methodists, Presbyterians, Episcopalians) began falling in the mid-1960s. But here Hollinger’s interpretation departs from the evangelical tale of Christians fleeing the modernist churches in search of that old-time religion. It was not that “masses of believers switched from the liberal churches to the conservative ones,” he writes, “though some people did just that. The migration to evangelical churches was not large and was actually smaller than the modest migration to Roman Catholicism.”

Rather, the decline of ecumenical churches (alongside the steady growth in numbers and power of the evangelicals) reflected a generational shift, compounded by differences in fertility. Ecumenical couples had fewer children than evangelicals did. And the offspring, in turn, tended not to become members of their parents’ churches, nor to send their own kids to church.

“The evangelical triumph in the numbers game from the 1960s to the early 21st century,” writes Hollinger, “was mostly a matter of birthrates coupled with the greater success of the more tightly boundaried, predominantly southern, evangelical communities in acculturating their children into ancestral religious practices. Evangelicals had more children and kept them.”

There is a good deal of interesting material in the article that I’ve not tried to sum up here – and many aspects of the argument resonate with Hollinger’s earlier work on secularity, cosmopolitanism, and affiliation. It contains an intriguing reference to research which suggests that “young adults of virtually all variety of faith” (including evangelicals) now “talk like classical liberal Protestants.”

What long-term implications that might have, if any, I can’t say, but it seems like something to discuss, especially as political and religious issues get mixed up in debate. That the paper isn’t circulating freely is unfortunate, and Hollinger, as a leader of the Organization of American Historians, ought to do something about that. Mr. President, tear down this paywall!

Author: Scott McLemee

The Balance in History

New analysis suggests a decline in the percentage of historians who focus on Europe, a rise for those focused on Asia -- and diplomatic and military specialties holding their own.

Oral History, Unprotected

U.S. government -- opposing Boston College -- argues against researchers' expectation of confidentiality.

End Large Conferences

I’ll play Marc Antony. I have not come to praise large conferences, but to bury them. It is my opinion that mega humanities conferences are way past their sell-by date. For senior faculty, the only reason to go is to schmooze with old friends; for junior faculty, they are an onerous duty; and for graduate students, they are a rip-off for which professional organizations ought to be collectively ashamed.

First codicil: I speak exclusively of humanities conferences, as they are the only ones I know firsthand. Friends in computing and the sciences tell me that collaborative efforts arise from their conferences. I’m willing to believe them. Maybe it’s a cultural thing. Most humanities people find it so hard to collaborate that their wills stipulate that their notes go with them to the grave.

Second codicil: I have only myself to blame for recent travails. I didn't need to go to my unnamed conference, but I got it into my head that it would be fun. I was wrong. It serves me right for violating my principles.

Five years ago I concluded that humanities conferences were out of touch with the times and vowed to attend only smaller regional meetings with less cachet, but more satisfaction. But I didn’t listen to me. Instead I spent four days and a considerable wad of cash jostling among a throng of over three thousand. I returned home feeling like Ponce de Leon, who sought the Fountain of Youth and found mostly dismal swampland. Sound harsh? See if any of these observations resonate with your own.

Problem One: Outmoded Presentations

We live in the communications age, but the memo apparently never circulated among those studying the liberal arts. For reasons arcane and mysterious, humanities scholars still read papers. That’s tedious enough at a small conference where one might attend six three-paper presentations. At my recent conference, sessions commenced at 8 a.m. and ran past 10 p.m. One could have conceivably attended 30 sessions and heard 90 or more papers, though the only ones with the stamina to attend more than six or seven sessions were either posturing or desperate.

I wanted my four-day sojourn to introduce me to new ideas, concepts, and teaching modules, but the reality of such a grueling schedule was that I was running on fumes by the end of day one. It would have helped if presenters had taken advantage of new technology, but things seldom got flashier than PowerPoint, a program that, alas, seems to encourage more reading. Let me reiterate something I’ve said for years: the death penalty should apply to those who read anything from a PowerPoint slide other than a direct quote. It's an academic conference, for crying out loud; assume your audience is reasonably proficient at reading! Seriously, does anyone need to fly across the country to listen to a paper? Why not do as science conferences have done for years: post papers online and gather to have a serious discussion of those papers?

The mind-numbing tedium of being read to for four days is exacerbated by the fact that many humanities scholars have little idea about the differences between hearing and reading. If you construct a paper that’s so highly nuanced that understanding it rests upon subtle turns of phrase or complicated linguistic shifts, do not look up from your paper with a wan smile indicating you are enamored of your own cleverness; go back to your room and rewrite the damn thing. Audience, clarity, and coherence are pretty much the Big Three for speech and composition, unless one's audience is the International Mindreaders' Society. By the way, is there something wrong with using a map, providing a chart, or summarizing a work that few in the room are likely to have read? And do bother to tell me why your paper matters.

I actually heard several very exciting papers, but most of the offerings were dreadful. Note to young scholars: stop relying on the Internet and check out journals that predate 1995 before you proclaim a “discovery.” And if you really want to stand out, work on your shtick. Guess which papers I remember? Yep -- those in which the presenter did more than read to me.

Critical note to young scholars: Want to turn off everyone in the room? Be the person who doesn’t think that the 20-minute limit applies to you. Nothing says "non-collegial" more clearly.

Problem Two: Expense

Another reason to rethink conferences is that they cost an arm and a leg to attend. I had partial funding from my university because I was presenting -- and no, I bloody well did not read my paper -- but I was still out of pocket for quite a hunk of cash. If you attend a humanities conference and want to stay anywhere near the actual site of the event, plan on $150 per night for lodging in a soulless franchise hotel with windowless conference rooms and quirky technology, $20 per day for Internet access, another $200 for conference fees, roughly $500 for airfare, at least $50 for taxis to and from the airport -- almost no U.S. city has a convenient shuttle service anymore -- and money for whatever you plan on eating.

Budget plenty for the latter if your conference is in what is glibly called a Destination City. That’s shorthand for a theme area marketing itself as unique, though it’s actually a slice of Generica surrounded by shops and restaurants identical to those found in suburban malls in every way except one: captive audiences equal higher prices. (One small example: the Starbucks inside the pedestrian precinct at my hotel charged a buck more per cup than the one on the street 100 yards away.) Do the math and you can see that you can easily drop a few grand on a megaconference. (That’s what some adjuncts are paid per course!)

An immediate cost-saving adjustment would be to confine conferences to airline hub cities such as New York, Chicago, Los Angeles, Atlanta, and Houston. The moment the conference locates to a (not my term) "second-tier" city, allot another few hundred dollars for "connecting flights," a term used by the airline industry because it sounds nicer than saying you’ll spend six hours waiting in a hub, after you’ve sprinted through the airport like Usain Bolt for your next flight, found the gate closed, and retreated to the rebooking counter.

Problem Three: Victimized Grad Students

I'm a parsimonious Scot who resents spending money on boring hotels and lousy food, but I can afford it when I have to. Grad students can’t. A major way in which megaconferences have changed in the past several decades is that there’s considerably less balance between senior scholars, junior colleagues, and graduate students. (Senior scholars used to accompany the latter two in a mentor capacity.) Now there is just a smattering of senior and junior scholars, and they’re often holed up in hotel suites conducting interviews. Whenever they can, search committee members flee the conference and rendezvous with old friends. They might attend a session or two. Unless they have to be there, there aren’t many junior colleagues in attendance at all because they're busy getting material into publication and they can meet presentation expectations at cheaper regional meetings, or save their dough and go to prestigious (-sounding) international gatherings.

So who’s left? Graduate students. Lots of graduate students. So many that conservationists would recommend culling the herd if they were wild mustangs. Grad students have always gone to conferences in hopes of making their mark, attracting attention, and meeting people who can help them advance. That was the way it was done -- 20 years ago. Now network opportunities are slimmer. Whom do they meet? Mostly other grad students, often those massed outside of interview rooms.

Of all the antiquated things about large conferences, the "cattle call" interview is the most perverse. These were barbaric back in the days in which there were jobs; now they’re simply cruel. At least a third of attendees at my conference were grad students from a single discipline: English. As has been discussed many times on this site, most of them shouldn't be in grad school in the first place. How many of the thousand-plus English grad students can realistically hope to get an academic job of any sort?

The Modern Language Association predicts that only 900 English jobs will come open for all of 2011. That’s 900 in all specialties of English, the bulk of which will be in writing and rhetoric, not Austen and Proust. Will a fifth of those at the conference get a job? The odds are long. It's probably more like half of that, and if we're talking about a good job, slice it in half once more. So why ask strapped grad students to attend expensive conferences for 15-minute preliminary interviews? Do a telephone interview, for heaven’s sake; it’s kinder on both grad students and search committees.

As I did as a grad student, many young hopefuls pooled resources and economized where they could, but the sad truth is that the vast majority of attendees spent a small fortune on a gamble whose odds aren't much greater than buying lottery tickets. Are associations playing the role of enabler to grad student delusions? Yes. Here’s another thought: Instead of holding a big conference, sponsor a teleconference. Charge a fee for uploads, but give speakers one-year access to the URL, which they can make available to potential employers. Use the savings to the association to lobby for more tenure-track faculty.

Problem Four: No-Shows

You spend lots of money, you sit through desultory talks, and you head off to the one or two sessions that made you want to attend the conference in the first place. What do you find? It’s been canceled because only one of the presenters showed up, and that paper was combined with several others from sessions that suffered the same fate. Didn’t you see the 3x5 card tacked to the conference bulletin board?

As noted above, I’m in favor of putting large conferences to rest. But if we insist on having them, let’s at least make sure they’re as advertised. O.K., things do happen, but in most cases missing presenters are simply AWOL. I know it smacks of McCarthyism, but I’ve come to support the idea of a data bank of no-shows that employers, conference planners, and deans can check.

Problem Five: Urban Sprawl

What’s the point of a conference that’s so big it’s inaccessible? I walked between two different hotels to attend sessions and pored over a Britannica-sized program to locate them. Conference attendees were housed in four "official" hotels and untold numbers of others. With round-the-clock sessions and decentralization, the few networking opportunities that did exist were logistically difficult. It took me two entire days to find my old friends, let alone new folks I wanted to engage. I met two interesting people at the airport. I never saw them again.

In Praise of Small Conferences

There are other problems I’ll leave for now, including the gnawing suspicion that some big conferences have become sinecures for "insiders" who have become "players" within associations. Let’s just say that there is a serious disconnect between how the big conferences operate and what makes sense in the changing world of academe.

Teleconferences with real-time discussion groups and online forums would be one good starting point for reform; providing more resources for regional and local conferences would be another. Small gatherings have issues of their own -- no-shows, sparsely attended sessions, overreliance on volunteers -- but they compensate by offering intimacy, good value, face-to-face feedback, and easier opportunities to network. It's time to give these the cachet they deserve. The big conference is like a one-size-fits-all t-shirt; it simply doesn’t accessorize most people. I’m done. For real. Unless I get funding for an exotic overseas meeting. (Just kidding!)

Author: Rob Weir

Rob Weir, who writes the "Instant Mentor" column for Inside Higher Ed's Career Advice section, has published six books and numerous articles on social and cultural history, and has been cited for excellence in teaching on numerous occasions during his 20 years in the college classroom.

Conference Session Question

I was pleased to see this session in the conference program, organized around a topic to which I’ve dedicated much of my professional life, and I think you presenters have done a wonderful job to an extent. I think we all know what a labor of love organizing a conference session can be, especially when it is on a topic that is fairly complicated – a topic, perhaps, that only a handful of scholars have truly engaged and perhaps upon which only one or two have done any truly definitive work. The panel organizers might have thought to invite a central figure in this field to anchor the session, someone who has covered much of this ground already and is acknowledged to have done the first and still the best work on this question, but I’m told the organizers wanted fresh (as opposed to what, I don’t know!) voices and they invited some, well, emerging scholars to contribute. I think we’d all agree they did a fine job after a fashion; we hardly missed the usual contributors that often present papers on this topic.

But to return to my question – I promise there’s a question in here! – as I sat, rapt, listening to these fine presentations, I started wondering if the panelists were perhaps giving short shrift to some of the definitive findings on this topic that have proved quite sound and durable for almost two decades; I’m sure everyone in the room can tick off the titles of the groundbreaking publications that helped define this field – and, with all due respect, I began to suspect that some of the presenters were taking a rather … cavalier … direction, given the enduring centrality of those seminal works of scholarship with which all of us are familiar. So as I listened I began to formulate a response — we can’t all help but notice that a panel this good often cries out for a respondent, a prominent scholar to draw all the presentations together under the existing — and still quite valid — paradigm.

Thus my query, which should be prefaced with a reminder that in our discipline conference panels like this one ought to be informed by a thorough understanding of, if not respect for, the earlier work that created the very conditions that allow for the continued study of this issue. I hesitate to say "standing on the shoulders of giants," but I would hazard that some of the panel participants have failed to accommodate, much less cite – yes, I said cite – the key sources, which are as relevant today as they were when first published. At the risk of detracting from all this freshness, I can’t help but note that the papers I heard today can do no more than elaborate upon disciplinary principles already well established — footnotes to Plato and all that. And yet, certain experts went unmentioned. Certain still-relevant and available authorities could have spoken today, had one been invited to this session. My question, at last:

Don't you know who I am?

Author: Daniel J. Ennis

Daniel J. Ennis is a professor of English at Coastal Carolina University.

Introducing Myself

Teresa Mangum offers perspective on the job search -- having launched hers during a previous economic downturn.

Abuse of Power

Very rarely do I wish that a professor would write his or her memoirs. Even saying “very rarely” may overstate the frequency of the wish. But if ever there were an exception to be made, it would be for Athan G. Theoharis – the dean of Freedom of Information Act scholarship.

Just to clarify, he is not actually a dean. According to the back cover of his new book, Abuse of Power: How Cold War Surveillance and Secrecy Policy Shaped the Response to 9/11 (Temple University Press), he is now professor emeritus of history at Marquette University. In the 1970s, on behalf of the Senate body best known as the Church Committee, Theoharis dug around in presidential libraries to find Federal Bureau of Investigation records, and he’s spent decades filing FOIA requests. At Marquette, he has “supervised a stable of masters and doctoral students who wrote about the civil liberties record of the FBI,” as one chronicle of the bureau puts it.

At the very least, someone needs to sit down with Theoharis for a series of in-depth interviews on how he conducted his research, and trained others to continue the work. From passing references in Abuse of Power, it’s clear that the job requires extraordinary patience and tenacity -- but also a systematic understanding of the bureaucratic mind at its sneakiest. What’s that like? Does the frustration ever get to you? J. Edgar Hoover is named in the titles of three books by Theoharis, and central to several others. How do you learn to think like him without becoming a bit paranoid yourself? These questions blur the line between historiography and autobiography, though in a good way.

Abuse of Power, the author’s 20th book, is nowhere near so personal or ruminative as the one we might, with luck, get out of him one day. Subtitle notwithstanding, it has fairly little to do with 9/11, as such. Nor, for that matter, does Theoharis really make an argument about how Cold War policy “shaped the response” to that day’s attacks. He suggests that the Bush administration’s approach to domestic surveillance was an especially gung-ho version of Hoover’s attitude. But precedent is not cause.

There are a couple of ways to understand what Theoharis is actually doing in this book. One is to treat it as a challenge to American citizens of the present day. The other is to consider it a warning to future generations of historians.

Its contemporary challenge seems aimed at the liberal wing of public opinion in particular; for Theoharis makes the rather provocative case that George W. Bush was (at least in one respect) the fulfiller of FDR’s legacy, rather than its dim negation.

In the mid-1930s, faced with the aggressive pursuit of influence by Germany and the Soviet Union, Roosevelt made “a fundamental shift in the role of the Federal Bureau of Investigation … from [being] a law enforcement agency that sought to develop evidence to prosecute violators of federal laws to an intelligence agency that would seek to acquire advance information about the plans and capabilities of suspected spies and saboteurs.”

Rather than propose legislation to that effect, the president “instead opted for secret executive directives, a method that had as its central purpose the foreclosure of a potentially divisive and contentious debate.” And the director of the FBI was hardly going to object if things were done in secret, at least if he were the one doing them. Like the New Deal, “this profound shift was effected not through a well-defined blueprint but through a series of ad hoc responses” -- creating their own complex dynamics.

Just before the U.S. entered World War II, for example, FBI agents had interviewed almost 33,000 members of the American Legion, looking for people willing to infiltrate targeted organizations or monitor “persons of German, Italian, and Communist sympathies.” The program continued and grew throughout the war. Recruitment efforts intensified during the early 1950s despite grumbling by FBI field agents that maintaining contact with the Legionnaires took up a lot of time without producing much of value. But by then, the whole thing counted as a public relations effort for the FBI.

Meanwhile, more serious intelligence-gathering operations developed with scarcely any oversight. They included wiretaps and break-ins, campaigns to infiltrate and disrupt various organizations, and investigations into the private lives of public figures. Much of this is now common knowledge, of course, though only through the work of Theoharis and other researchers. Less widely known (and even more Ashcroft-y) is the program called Custodial Detention that Hoover launched in September 1939.

Renamed the Security Index in 1943, this created a list of candidates for “preventative detention” in case of emergency. There was also “a plan for the suspension of the Writ of Habeas Corpus,” in the words of FBI assistant director D. Milton Ladd. In 1950, conservative members of Congress were able to override President Truman’s veto to pass an internal security act that included its own provisions for rounding up potential subversives. But it defined the pool of detainees more narrowly than the FBI had, and mandated that they receive a hearing within 48 hours of being taken into custody. The bureau ignored the legislation, and by the 1960s was adding civil-rights and antiwar activists to the list. The program was phased out following Hoover’s death in 1972. By then it was called the Administrative Index: a case of blandness as disguise.

In his final chapter, Theoharis quickly reviews the domestic-surveillance operations that emerged in the wake of the 9/11 attacks, or at least the ones now part of the public record. Their resemblance to Cold War-era programs is not a matter of Hoover’s influence. The NSA has resources that old-school gumshoes could never imagine. According to Theoharis, the FBI had about 200 wiretaps going at any one time throughout the entire country, while intercepting (that is to say, opening) thousands of pieces of correspondence. Under the USA PATRIOT Act, that would count as a slow morning.

The point of Abuse of Power is that “a broad consensus that Congress should enact legislative charters for the intelligence agencies rather than defer to the executive branch” existed for only a brief period, roughly the 1970s. Even then, the “consensus” Theoharis invokes was hardly robust. Following 9/11, the old habits kicked in again -- technologically fortified, with all the ingenuity that highly skilled lawyers can bring to rationalizing decisions that have already been made.

As noted, the book also serves as a warning to those who, down the line, try to research this past decade.

Theoharis describes Hoover's techniques for routing and storing information. Agents would not submit material gathered from break-ins, wiretaps, or highly confidential sources in their official reports, but via letters forwarded to him directly with a code on the envelope. Besides the FBI files, he kept auxiliary archives full of especially sensitive documents. That way, if forced to turn over the bureau’s files on a given topic, he could limit the exposure of what intelligence he had gathered -- and, just as importantly, how he gathered it.

Assembling information was not enough; he had to dissemble it as well. The power to do either has grown exponentially in the meantime. Hoover's material was recorded on paper and audiotape. Just learning to find his way into the maze of Hoover’s evasions cost Theoharis much time and effort. The comparable documents covering the past few years will be far less tangible, more heavily encrypted -- stored in sticks, chips, or clouds. And the people creating (and hiding) information have the warning of Hoover's example. Within a few years of his death, the architecture of secrecy began to crumble. Things will be better hidden, and learning to track them down will be harder.

Abuse of power usually implies confidence that the abuser will escape any consequences. In that case, the historian is able to enforce some kind of accountability, however minimal and belated. Will it still be possible to do that in the future? Theoharis doesn’t seem to be prone to speculation; the occasional references to his own career are rather perfunctory, even self-effacing. But I hope he overcomes his reticence and explains how he found his way through the old labyrinth -- and how he sizes up the one on the horizon.

Author: Scott McLemee

Falling Into the Generation Gap

A few weeks ago, sitting over a cup of coffee, a writer in his twenties told me what it had been like to attend a fairly sedate university (I think he used the word "dull") that had a few old-time New Left activists on its faculty.

"If they thought you were interested in anything besides just your career," he said, "if you cared about ideas or issues, they got really excited. They sort of jumped on you."

Now, I expected this to be the prelude to a little tribute to his professors – how they had taken him seriously, opened his mind to an earlier generation’s experience, etc. But no.

"It was like they wanted to finish their youth through you, somehow," he said. "They needed your energy. They needed you to admire them. They were hungry for it. It felt like I had wandered into a crypt full of vampires. After a while, I just wanted to flee."

It was disconcerting to hear. My friend is not a conservative. And in any case, this was not the usual boilerplate about tenured radicals seeking to brainwash their students. He was not complaining about their ideas and outlook. This vivid appraisal of his teachers was not so much ideological as visceral. It tapped into an undercurrent of generational conflict that the endless "culture wars" seldom acknowledge.

You could sum it up neatly by saying that his professors, mostly in their fifties and sixties by now, had been part of the "Baby Boom," while he belonged to "Generation X."

Of course, there was a whole segment of the population that fell between those two big cultural bins -- people born at the end of the 1950s and the start of the 1960s. Our cohort never had a name, which is probably just as well. (For one thing, we’ve never really believed that we are a "we." And besides, the whole idea of a prepackaged identity based on what year you were born seems kind of tacky.)

One effect of living in this no-man’s-land between Boomers and Xers is a tendency to feel both fascinated and repulsed by moments when people really did have a strong sense of belonging to a generation. The ambivalence is confusing. But after a while it seems preferable to nostalgia -- because nostalgia is always rather simple-minded, if not dishonest.

The recent documentary The Weather Underground (a big hit with the young-activist/antiglobalization crowd) expressed doe-eyed sadness that the terrible Amerikan War Machine had forced young idealists to plant bombs. But it somehow never mentioned that group’s enthusiasm for the Charles Manson "family." (Instead of the two-fingered hippie peace sign, Weather members flashed a three-finger salute, in honor of the fork used to carve the word "war" into the stomach of one of the victims.) Robert McNamara and Henry Kissinger have a lot of things to answer for – but that particular bit of insanity is not one of them.

Paul Berman, who was a member of Students for a Democratic Society at Columbia University during the strike of 1968, has been writing about the legacy of the 1960s for a long time. Sometimes he does so in interesting ways, as in parts of his book A Tale of Two Utopias; and sometimes he draws lessons from history that make an otherwise placid soul pull out his hair with irritation. He has tried to sort the positive aspects of the 1960s out from the negative -- claiming all the good for a revitalized liberalism, while treating the rest as symptoms of a lingering totalitarian mindset and/or psychological immaturity.

Whatever the merits of that analysis, it runs into trouble the minute Berman writes about world history -- which he always paints in broad strokes, using bright and simple colors. In his latest book, Terror and Liberalism, he summed up the last 300 years in terms that suggested Europe and the United States had grabbed their colonies in a fit of progress-minded enthusiasm. (Economic exploitation, by Berman’s account, had nothing to do with it, or not much.) Terror and Liberalism is a small book, and easy to throw.

His essay in the new issue of Bookforum is, to my mind, part of the thoughtful, reflective, valuable side of Berman’s work. In other words, I did not lose much hair reading it.

The essay has none of that quality my friend mentioned over coffee – the morbid hunger to feast off the fresh blood of a younger generation’s idealism. Berman has fond recollections of the Columbia strike. But that is not the same as being fond of the mentality that it fostered. "Nothing is more bovine than a student movement," he writes, "with the uneducated leading the anti-educated and mooing all the way."

The foil for Berman’s reflections is the sociologist Daniel Bell, who left Columbia in the wake of the strike. At the time, Bell’s book The End of Ideology was the bête noire of young radicals. (It was the kind of book that made people so furious that they refused to read it – always the sign of the true-believer mentality in full effect.) But it was Bell’s writing on the history of the left in the United States that had the deepest effect on Berman’s own thinking.

Bell noticed, as Berman puts it, "a strange and repeated tendency on the part of the American Left to lose the thread of continuity from one generation to the next, such that each new generation feels impelled to reinvent the entire political tradition."

There is certainly something to this. It applies to Berman himself. After all, Terror and Liberalism is pretty much a jerry-rigged version of the Whig interpretation of history,  updated for duty in the War on Terror. And the memoiristic passages in his Bookforum essay are, in part, a record of his own effort to find "the thread of continuity from one generation to the next."

But something else may be implicit in Bell’s insight about the "strange and repeated tendency" to lose that thread. It is a puzzle for which I have no solution readily at hand. Namely: Why is this tendency limited to the left?

Why is it that young conservatives tend to know who Russell Kirk was, and what Hayek thought, and how Barry Goldwater’s defeat in 1964 prepared the way for Reagan’s victory in 1980? Karl Marx once wrote that "the tradition of all the dead generations weighs like a nightmare on the brain of the living." So how come the conservatives are so well-rested and energetic, while the left has all the bad dreams?

Author: Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Impure Literature

The publication, 100 years ago, of  The Jungle, by Upton Sinclair, in the popular American socialist newspaper Appeal to Reason had an enormous effect -- if not quite the one that its author intended. "I aimed at the public’s heart," Sinclair later said, “and by accident I hit it in the stomach.”

Drawing on interviews with workers in Chicago and his own covert explorations of the city’s meat-processing factories, Sinclair intended the novel to be an expose of brutal working conditions. By the time it appeared as a book the following year, The Jungle’s nauseating revelations were the catalyst for a reform movement culminating in the Pure Food and Drug Act. In portraying the life and struggles of Jurgis Rudkus, a Lithuanian immigrant, Sinclair wanted to write (as he put it), “The Uncle Tom’s Cabin of wage slavery,” thereby ushering in an age of proletarian emancipation. Instead, he obliged the bourgeoisie to regulate itself -- if only to keep from feeling disgust at its breakfast sausages.

In his introduction to a new edition of The Jungle, just published by Bedford/St. Martin’s, Christopher Phelps traces the origins and effects of Sinclair’s novel. Phelps, an associate professor of history at Ohio State University in Mansfield, is currently on a Fulbright fellowship in Poland, where he occupies a distinguished chair in American studies and literature at the University of Lodz. The following is the transcript of an e-mail interview conducted this month.

Q: At one of the major chain bookstores the other day, I noticed at least four editions of The Jungle on the shelf.  Yours wasn’t one of them. Presumably it's just a matter of time. What’s the need, or the added value, of your edition? Some of the versions available are pretty cheap, after all. The book is now in the public domain.

A:  Yes, it’s even available for free online these days, if all you want is the text. This new edition is for readers seeking context. It has a number of unique aspects. I’m pleased about the appendix, a report written by the inspectors President Theodore Roosevelt dispatched to Chicago to investigate Upton Sinclair’s claims about the meatpacking industry. In one workplace, they watch as a pig slides off the line into a latrine, only to be returned to the hook, unwashed, for processing. No other version of The Jungle includes this report, which before now had lapsed into obscurity. The new edition also features an introduction in which I survey the scholarship on the novel and provide findings from my research in Sinclair’s papers held by the Lilly Library at Indiana University. Finally, there are a lot of features aimed at students, including a cartoon, a map, several photographs, a bibliography, a chronology of Sinclair’s life, and a list of questions for discussion. So it doubles as scholarly edition and teaching edition.

Q: Let me ask about teaching the book, then. How does The Jungle go over in the classroom?

A:  Extremely well. Students love it. The challenge of teaching history, especially the survey, is to get students who think history is boring to imagine the past so that it comes alive for them. The Jungle has a compelling story line that captures readers’ attention from its very first scene, a wedding celebration shaded in financial anxiety and doubts about whether Old World cultural traditions can survive in America. From then on, students just want to learn what will befall Jurgis and his family. Along the way, of course, Sinclair injects so much social commentary and description that teachers can easily use students’ interest in the narrative as a point of departure for raising a whole range of issues about the period historians call the Progressive Era.

Q:  As you've said, the new edition includes a government report that appeared in the wake of the novel, confirming the nauseating details. What are the grounds for reading and studying Sinclair's fiction, rather than the government report?

A:  Well, Teddy Roosevelt’s inspectors had the singular mission of determining whether the industry’s slaughtering and processing practices were wholesome. Sinclair, for his part, had many other concerns. What drew him to write about the meatpacking industry in the first place was the crushing of a massive strike of tens of thousands of workers led by the Amalgamated Meat Cutters and Butcher Workmen of North America in 1904. In other words, he wanted to advance the cause of labor by exposing the degradation of work and exploitation of the immigrant poor.

When The Jungle became a bestseller, Sinclair was frustrated that the public furor centered almost exclusively on whether the companies were grinding up rats into sausage or disguising malodorous tinned beef with dyes. These were real concerns, but Sinclair cared most of all about the grinding up of workers. I included this government report, therefore, not only because it confirms Sinclair’s portrait of unsanitary meat processing, but because it exemplifies the constriction of Sinclair’s panorama of concerns to the worries of the middle-class consumer.

It further shows how Sinclair’s socialist proposal of public ownership was set aside in favor of regulatory measures like the Pure Food and Drug Act and Meat Inspection Act of 1906. Of course, that did not surprise Sinclair. He was proud, rightly so, of having been a catalyst for reform. Now, just as the report must be read with this kind of critical eye, so too the novel ought not be taken literally.

Q:  Right. All kinds of problems come from taking any work of literature, even the most intentionally documentary, as giving the reader direct access to history.

A: Nowadays The Jungle is much more likely to be assigned in history courses than in literature courses, and yet it is a work of fiction. You point to a major problem, which we might call the construction of realism. I devote a good deal of attention to literary form and genre in my introduction, because I think they are crucial and should not be shunted aside. I note the influence upon The Jungle of the sentimentalism of Harriet Beecher Stowe, of naturalist and realist writers like William Dean Howells and Frank Norris, and of the popular dime novels of Horatio Alger. Sinclair was writing a novel, not a government report. He fancied himself an American Zola, the Stowe of wage slavery.

A good teacher ought to be able to take into account this status of the text as a work of creative literature while still drawing out its historical value. We might consider Jurgis, for example, as the personification of a class. He receives far more lumps in life than any single worker would in 1906, but the problems he encounters, such as on-the-job injury or the compulsion to make one’s children work, were in fact dilemmas for the working class of the time.

In my introduction, I contrast the novel with what historians now think about immigrant enclaves, the labor process, gender relations, and race. There is no determinate answer to the question of how well The Jungle represented such social realities. Many things it depicted extremely well, others abominably, race being in the latter category. If we keep in mind that realism is literary, fabricated, we can see that Sinclair’s background afforded him a discerning view of many social developments, making him a visionary, even while he was blind in other ways. Those failings are themselves revelatory of phenomena of the period, such as the racism then commonplace among white liberals, socialists, and labor activists. It’s important that we read the novel on all these levels.

Q: Sinclair wrote quite a few other novels, most of them less memorable than The Jungle. Well, OK, to be frank,  what I've heard is that they were, for the most part, awful. Is that an unfair judgment? Was The Jungle a case of the right author handling the right subject at the right time?

A:  That's precisely it, I think. Sinclair was uniquely inspired at the moment of writing The Jungle. I've been reading a lot of his other books, and although some have their moments, they sure can give you a headache. Many of them read like failed attempts to recapture that past moment of glory. He lived to be ninety and cranked out a book for every year of his life, so it's a cautionary tale about allowing prolixity to outpace quality. The book of his that I like best after The Jungle is his 1962 autobiography, a book that is wry and whimsical in a surprising and attractive, even disarming, way.

Author: Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.
