Essay on new approach to defend the value of the humanities

"When the going gets tough, the tough take accounting." With those succinct words in a June 2010 op ed, New York Times columnist David Brooks summed up the conventional wisdom on the current crisis of the humanities. In an age when a higher education is increasingly about moving quickly through a curriculum streamlined to prepare students for a job, the humanities have no practical utility. As Brooks observes, "when the job market worsens, many students figure they can’t indulge in an English or a history major," a fact that explains why the "humanities now play bit roles when prospective students take their college tours. The labs are more glamorous than the libraries."

Pushed into a corner by these dismaying developments, defenders of the humanities -- both traditionalists and revisionists — have lately been pushing back. Traditionalists argue that emphasizing professional skills would betray the humanities' responsibility to honor the great monuments of culture for their own sake. Revisionists, on the other hand, argue that emphasizing the practical skills of analysis and communication that the humanities develop would represent a sellout, making the humanities complicit with dominant social values and ideologies. But though these rival factions agree on little else, both end up concluding that the humanities should resist our culture's increasing fixation on a practical, utilitarian education. Both complain that the purpose of higher education has been reduced to credentialing students for the marketplace.

Martha Nussbaum, for example, while stressing that the humanities foster critical thinking and the ability to sympathetically imagine the predicament of others, insists such skills are, as the title of her 2010 book puts it, "not for profit." In doing so she draws a stark line between the worlds of the humanities and the 21st-century workplace. Likewise, Geoffrey Galt Harpham in The Humanities and the Dream of America, laments the increasing focus on professional skills in the humanities at the expense of reading great books. Stanley Fish takes an even more extreme position, insisting that the humanities "don’t do anything, if by 'do' is meant bring about effects in the world. And if they don’t bring about effects in the world they cannot be justified except in relation to the pleasure they give to those who enjoy them. To the question 'of what use are the humanities?', the only honest answer is none whatsoever." Worse still, Frank Donoghue, in The Last Professors: The Corporate University and the Fate of the Humanities, argues that the humanities will simply disappear in the new corporate, vocation-centered university.

Ironically, these pessimistic assessments are appearing at the very moment when many employers outside academe are recognizing the practical value of humanities training. Fish simply dismisses the argument that "the humanities contribute to the economic health of the state — by producing more well-rounded workers or attracting corporations or delivering some other attenuated benefit — because nobody really buys that argument." But this would come as news to the many heads of philanthropic foundations, nonprofits, and corporate CEOs who have lately been extolling the professional value of workplace skills grounded in the humanities.

We would be the last to argue that traditional ways of valuing the humanities are not important, or that the study of philosophy, literature, and the fine arts does not have value in and of itself apart from the skills it teaches. We also recognize that the interests of the corporate world and the marketplace often clash with the values of the humanities. What the humanities need, in our view, is neither an uncritical surrender to the market nor a disdainful refusal to be sullied by it, but what we might call a critical vocationalism: an attitude receptive to opportunities in the private and public sectors that allow humanities graduates to apply their training in meaningful and satisfying ways. We believe such opportunities do exist.

To be sure, such optimism must be tempered in today’s bleak economy, where hardly any form of education is a sure ticket to a job and where many in the private sector may still look with indifference or even disdain on a humanities degree.  But as David Brooks himself went on to point out in his op-ed: "Studying the humanities improves your ability to read and write. No matter what you do in life, you will have a huge advantage if you can read a paragraph and discern its meaning (a rarer talent than you might suppose). You will have enormous power if you are the person in the office who can write a clear and concise memo."

Brooks’ view is echoed by Edward B. Rust Jr., chairman and CEO of State Farm Insurance Companies, who observes that "at State Farm, our employment exam does not test applicants on their knowledge of finance or the insurance business, but it does require them to demonstrate critical thinking skills" and "the ability to read for information, to communicate and write effectively, and to have an understanding of global integration." And then there is Google, which more than any other company has sung the praises of humanities students and intends to recruit many of them. "We are going through a period of unbelievable growth," reports Google’s Marissa Mayer, "and will be hiring about 6,000 people this year — and probably 4,000-5,000 from the humanities or liberal arts."

This evidence of the professional utility of humanities skills belies Donoghue’s apparent assumption (in The Last Professors) that "the corporate world’s hostility" toward humanistic education remains as intense today as it was a century ago, when industrialists like Andrew Carnegie dismissed such an education as "literally, worthless." Donoghue ignores changes in the global economy, the culture, and the humanities themselves since Carnegie’s day that have given many corporate leaders a dramatically more favorable view of the humanities’ usefulness. Associate Dean Scott Sprenger of Brigham Young University, who oversees a humanities program we will discuss in a moment, quotes the dean of the Rotman School of Management in Toronto, who observes a "tectonic shift for business school leaders," who are now aware that "learning to think critically — how to imaginatively frame questions and consider multiple perspectives — has historically been associated with a liberal arts education, not a business school curriculum."

All of these commentators are right, and the skills they call attention to only begin to identify the range of useful professional competencies with which a humanities education equips 21st-century students. In addition to learning to read carefully and to write concisely, humanities students are trained in fields like rhetoric and composition, literary criticism and critical theory, philosophy, history, and theology to analyze and make arguments in imaginative ways, to confront ambiguity, and to reflect skeptically on received truths, skills that are increasingly sought after in upper management positions in today’s information-based economy. Even more important for operating as global citizens in a transnational marketplace, studying the literary, philosophical, historical, and theological texts of diverse cultures teaches humanities students to put themselves in the shoes of people who see and experience the world very differently from their own accustomed perspectives. Are some corporations still looking for employees who will be well-behaved, compliantly bureaucratized cogs in the wheel? Of course they are. But increasingly, many others are looking for employees who are willing to think outside the box and challenge orthodoxy.

It is true that humanities study, unlike technical training in, say, carpentry or bookkeeping, prepares students not for any specific occupation, but for an unpredictable variety of occupations. But as many before us have rightly pointed out, in an unpredictable marketplace this kind of versatility is actually an advantage. As Associate Dean Sprenger notes, "the usefulness of the humanities" paradoxically "derives precisely from their detachment from any immediate or particular utility. Experts tell us that the industry-specific knowledge of a typical vocational education is exhausted within a few years," if not "by the time students enter the workforce." It is no accident, he observes, "that a large percentage of people running Fortune 500 companies (one study says up to 40 percent) are liberal arts graduates; they advance more rapidly into mid- and senior-level management positions, and their earning power tends to rise more significantly than people with only technical training."

If there is a crisis in the humanities, then, it stems less from their inherent lack of practical utility than from our humanistic disdain for such utility, which too often prevents us from taking advantage of the vocational opportunities presented to us. This lofty disdain for the market has thwarted the success of the few programs that have recognized that humanities graduates have much to offer the worlds of business, technology, arts agencies, and philanthropic foundations.

The most promising of these was a program in the 1990s developed by the Woodrow Wilson National Fellowship Foundation under the leadership of its then-director, Robert Weisbuch. First called "Unleashing the Humanities" and later "The Humanities at Work," the program, according to Weisbuch in an e-mail correspondence with the authors, "had persuaded 40 nonprofits and for-profits to reserve meaningful positions for Ph.D. graduates in the humanities and had placed a large number in well-paying and interesting positions — at places ranging from Verizon to AT Kearney to The Wall Street Journal to the National Parks Service." Unfortunately, Weisbuch reports, only a few humanities graduate programs "enlisted their alumni and the influential corporations and others in their areas of influence to revolutionize the possibilities for employment of humanities doctoral graduates," while most faculty members "continued to expect their graduate students to look for jobs much like their own and to consider any other outcome a failure."

Today, however, some humanities programs that emphasize useful professional applications are prospering. One of these is a new undergraduate major at Brigham Young University called "Humanities +," with the "+" referring to the value-added vocational component gained by students who elect the program. According to an e-mail to the authors from Associate Dean Sprenger, BYU hired a career services specialist tasked with "revolutionizing our humanities advising office along the lines of the Humanities + vision, and the program has developed ties with the university’s colleges of business and management" — a virtually unheard-of step for a humanities program. The program’s students are urged "to minor in a practical subject, professionalize their language skills, and internationalize their profile by doing an overseas internship." The objective, Sprenger says, "is that career thinking and strategizing become second nature to students," while faculty members "see it as in their interest to help students find 'alternative' careers, and are reassured that they can rely on our advising office to be informed and to do the training."

Another notable program that sees its mission as "bringing humanities into the world" beyond academe and that works closely with its university’s office of career placement is the Master of Arts Program in the Humanities (MAPH) at the University of Chicago, which Gerald helped design and direct in the 1990s. According to a recent article by program associate A.J. Aronstein in Tableau, a University of Chicago house journal, one recent MAPH graduate got a job as finance director in Florida for Barack Obama’s 2008 campaign, later served as chief of staff at the International Trade Association, and now works as a political consultant in Washington. Other MAPH graduates have gone on to internships and subsequent positions as museum curators, technical writers, journalists and other media workers, marketing specialists, and policy analysts with investment firms.

The false assumption in both anti-utilitarian defenses of the humanities and pessimistic predictions of their extinction is that we have to choose between a credentialing and a humanizing view of higher education, between vocational utility and high-minded study as an end in itself. This either/or way of thinking about the humanities — either they exist solely for their own sake or they have no justification at all — is a trap that leaves humanists unable to argue for the value of their work in terms of the practical skills it teaches, an argument that inevitably has to be made in the changing marketplace of higher education. In fact, we would argue there is no defense of the humanities that is not ultimately based on the useful skills it teaches.

Evidence is plentiful that stressing the range of expertise humanities graduates have makes intellectual and economic sense. Take, for example, Damon Horowitz, director of engineering at Google. He insisted recently in an article in The Chronicle of Higher Education entitled "From Technologist to Philosopher: Why You Should Quit Your Technology Job and Get a Ph.D. in the Humanities," that "if you are worried about your career ... getting a humanities Ph.D. is not only not a danger to your employability, it is quite the opposite. I believe there is no surer path to leaping dramatically forward in your career than to earn a Ph.D. in the humanities." "You go into the humanities to pursue your intellectual passion," he explains, "and it just so happens, as a byproduct, that you emerge as a desired commodity for industry."

Horowitz, a leading figure in artificial intelligence and the head of a number of tech startups, ought to know. He took a break from his lucrative career to enroll in Stanford’s Ph.D. program in philosophy because he figured out that in order to do his job in technology well he needed to immerse himself in the humanities. "I realized that, while I had set out in AI to build a better thinker, all I had really done was to create a bunch of clever toys." Horowitz came to realize that the questions he was "asking were philosophical questions — about the nature of thought, the structure of language, the grounds of meaning." Returning to the humanities, Horowitz took time out from the world of artificial intelligence to study "radically different approaches to exploring thought and language," such as philosophy, rhetoric, hermeneutics and literary theory. As he studied intelligence from these perspectives he "realized just how limited my technologist view of thought and language was. I learned how the quantifiable, individualistic, ahistorical — that is, computational — view I had of cognition failed to account for whole expanses of cognitive experience (including, say, most of Shakespeare)."

The concrete value of the humanities education Horowitz celebrates is especially well epitomized in the new field of the digital humanities. The emergence of this field calls attention to how old 20th-century divisions between science and the humanities are breaking down and gives those of us committed to defending the practical value of the humanities a tremendous opportunity. The digital humanities represent the cutting-edge intersection of the humanities and computer science, a merging of skills and points of view from two formerly very different fields that is leading to a host of exciting innovations – and opportunities for students who want to enter fields related to everything from writing computer programs to text encoding and text editing, electronic publishing, interface design, and archive construction. Students in the digital humanities are trained to deal with concrete issues related to intellectual property and privacy, and with questions related to public access and methods of text preservation.

Graduates of the digital humanities programs that are now developing all over the country will be first in line for such positions. For example, Paul’s university now has a Digital Humanities M.A. with two converging tracks, one designed for students with a background in computer science and one for students with a background in the humanities. The program website notes that it offers "theoretical, critical, social, and ethical contexts for thinking about the making of new knowledge through digital humanities research and applications, from issues of intellectual property and privacy, to questions of public access and methods of preservation." When we are asked about the practical value of a humanities education, we need to add the digital humanities to the list.

We believe it is time to stop the ritualized lamentation over the crisis in the humanities and get on with the task of making them relevant in the 21st century.  Such lamentation only reveals the inability of many humanists to break free of a 19th-century vision of education that sees the humanities as an escape from the world of business and science. As Cathy Davidson has forcefully argued in her new book, Now You See It, this outmoded way of thinking about the humanities as a realm of high-minded cultivation and pleasure in which students contemplate the meaning of life is a relic of the industrial revolution with its crude dualism of lofty spiritual art vs. mechanized smoking factories, a way of thinking that will serve students poorly in meeting the challenges of the 21st century.

Though we have argued in defense of the practical and vocational utility of a humanities education, our argument should in no way be construed as undercutting the aspirations of those in the humanities who seek an academic career. Indeed, on this score we need to redouble our efforts to increase public and private funding for higher education and to support unionizing efforts by faculty members and adjuncts. But even as we fight these battles to expand the academic job market we would be foolish to turn our backs on alternative forms of employment for humanities graduates when they are out there. In this spirit, we applaud both Modern Language Association President Russell Berman and American Historical Association President Anthony Grafton, who, along with the executive director of the AHA, James Grossman, have recently urged their organizations to acknowledge that advanced training in the humanities can lead to a variety of careers beyond academia and have suggested how graduate programs can be adapted for these kinds of careers.

For ultimately, to take advantage of the vocational potential of humanities study as we propose is not to sell out to the corporate world, but to bring the critical perspective of the humanities into that world. It is a perspective that is sorely needed, especially in corporate and financial sectors that have lately been notoriously challenged in the ethics department, to say the least. Humanities graduates are trained to consider the ethical dimensions of experience, linking the humanities with the sciences as well as with business and looking at both these realms from diverse perspectives. To those who worry that what we urge would blunt the humanities' critical power, we would reply that it would actually figure to increase that power, for power after all is the ability to act in the world.

Paul Jay is professor of English at Loyola University Chicago and the author, most recently, of Global Matters: The Transnational Turn in Literary Studies. Gerald Graff is professor of English and education at the University of Illinois at Chicago and a past president of the Modern Language Association.



'The Managerial Unconscious'

A familiar story about the modern university goes something like this: Once upon a time, the freshman arrived already knowing at least the basic mechanics of writing: what a paragraph was, how punctuation marks worked, the existence of nouns and verbs (and the obligation that they agree in a given sentence), that sort of thing. But the expansion of higher education throughout the 20th century, and especially in its second half, meant that a steadily growing portion of the student body needed basic training in such things.

The job naturally fell to professors of English, even though composition stood in relation to the study of literature roughly as long division did to algebraic topology, over in the math department. Still, it was necessary. Teaching this basic (even remedial) course helped justify offering the more advanced sections in literature. As the demand for writing instruction grew, it ceased being one task among others that the English faculty performed. It became a function planned and administered separately from the courses on literature, and sometimes it even broke off from the department entirely, to do its own thing.

And that is why there is now a writing center on campus, probably in a basement somewhere, largely staffed by graduate students. There are faculty who specialize in composition studies, every single one of whom remembers that Cary Nelson, the president of the American Association of University Professors, once called them “comp droids” pursuing an activity devoid of any real intellectual content. That was more than 10 years ago, and in the original context it was a critique of literary scholars' attitudes. But the comp people still quote it sometimes, with bitterness, as if they've made a slogan of the militant online group Anonymous their own: “We are legion. We do not forgive. We do not forget.”

There are various problems with this narrative, including the fact that happy compositionists do exist. (I have met them.) But the most important is the fable about the golden age when secondary education produced literate students, so that the English faculty could keep its attention focused on higher things. In The Managerial Unconscious in the History of Composition Studies, published by Southern Illinois University Press, Donna Strickland quotes various exasperated statements issuing from Harvard University in the 1890s. Professors were obliged to instruct “bearded men [in] the rudiments of their native tongue,” so that “a large corps of teachers have to be engaged and paid from the College treasury to do that which should have been done before the student presented himself for admission.”

Teaching composition was the manual labor of the mind: “In quantity,” said a committee appointed by the Harvard Board of Overseers in its report from 1892, “this work is calculated to excite dismay; while the performance of it involves not only unremitted industry, but mental drudgery of the most exhausting nature.” And keep in mind that the students in question -- the raw material to be processed in the sweatshops of the Harvard English department – were typically the product of prep schools, in an era when the only distracting form of electronic communication was the telephone.

’Twas ever thus, in other words. But demolishing the belief that basic writing instruction at the college level reflects some recent dysfunction in secondary education (especially public schools) is a fairly minor element in Strickland’s argument.

The author, an assistant professor of English at the University of Missouri at Columbia, reconfigures the history of composition studies, rejecting the commonplace view that the field took shape on the margins of another discipline -- a humble (but all too necessary) pedagogical supplement to literary studies. The Managerial Unconscious is a remarkably compact book, its points made with much concentration. Reading it more than once seems like a good idea. Here is a brief survey, offered with all due trepidation by someone who has been through it just once.

The title might be a good way in. It alludes to Fredric Jameson’s The Political Unconscious (1981), which offered “always historicize” as a slogan for cultural analysis.

Strickland follows this injunction by stressing an important thing about the emergence of university-level composition courses in the U.S. at the close of the 19th century: it coincided with a rapidly growing market for white-collar labor. As companies expanded, their internal structures became more complex. Mechanization and the division of labor in manufacture increased productivity, but coordinating manufacture and distribution required new layers of managerial staff, able to turn out reports, memos, press releases, and the like.

That bearded men at university were unable to write coherently, rather than how prepared they were to compose a theme on Keats, was the real issue. To occupy a slot in the corporate chain of command, they had to be able to put a pen intelligibly to paper. One member of the Harvard committee that scrutinized undergraduate writing in the 1890s was the chairman of the Massachusetts Railroad Commission -- an early expert on what would soon be called systematic or scientific approaches to management. The committee’s stress on the drudgery involved in handling thousands of student compositions per semester echoes the managerial theme that work can and should be organized for greater efficiency.

“Whether the ideas of systematic management were employed consciously or unconsciously,” writes Strickland, “articulating the correct divisions of labor in the teaching of English was clearly the burden of the committee’s report.” To produce enough skilled labor to manage American business, the university itself needed to retool.

So by Strickland’s account, English professors did not shove composition out of literary studies like an unwelcome stepchild. Another dynamic was at work. Writing instruction became a discipline in Foucault’s sense – a way of inculcating both skills and the capacity to perform in a corporate workplace. Can the student rework a paper to the prof’s satisfaction, as a mid-management person might be called upon to revise a handbook? “Writing programs,” says Strickland, “… were made possible not by the devaluing of student writing in the university but by its central function in an institution that depended on writing as a tool for surveillance and assessment.”

The quest for managerial efficiency just happened to reinforce other power relationships: “The teaching of required writing [became], in the process of being divided from the English department in the name of efficiency, sometimes an entry-level position, more frequently in recent decades a position completely outside the tenure track. Because more stable, better paying faculty positions tended to be awarded to men, women often had little choice but to take on low-paying instructorships in composition.”

And by the last third of the 20th century, the consolidation of composition studies as a distinct field (with its own journals, graduate programs, academic organizations, and book series) had an odd effect. In keeping with Strickland’s title, the specialty behaves like one of Freud’s patients -- running away from “the managerial unconscious,” only to find it returning, just ahead. Comp studies established itself as an intellectual discipline. But one career track in it leads to supervising the labor of adjuncts and graduate students, preparing a syllabus that others will follow, and trying to keep the writing center’s costs down and statistics up. Still, thinking of the field as having a managerial component meets resistance, given the “negative connotations for traditional humanist intellectuals,” Strickland writes, “who have tended over the decades to distrust management as, at best, nonintellectual and, at worst, soul-murdering.” Management is where you land after doing a really good job at Pizza Hut for a couple of years.

But if the shoe fits.... "Once organizations of any kind are organized hierarchically," writes Strickland, "with a class of experts structuring and overseeing the work of a group of nonexperts, management happens. Professionalism calls for control and systematization of knowledge, and management is the group of people who reinforce that." Much of the book is devoted to how the evasion of its managerial function has played itself out over the years, even after the Council of Writing Program Administrators was established in the late 1970s. Strickland’s tone is never harsh. But when she writes that “almost from the beginning of the organization, the WPA discourse showed an aversion toward so-called managerial tasks,” somebody’s ox is being gored.

Strickland's argument implies consequences – but only implies them. Greater lucidity about the managerial legacy of composition studies is, she suggests, the prerequisite for creating better working conditions; it will also help make writing instruction a way to develop students' critical intelligence. But just how any of that will happen is left unaddressed. The Managerial Unconscious feels like the first volume of something, rather than the last word. If its implications are hard to read, that is because they remain to be drafted.

Scott McLemee

Tolstoy in the Slaughterhouse


The University of North Carolina at Chapel Hill, like many other colleges, sponsors a "Summer Reading Program" for incoming students. Participating students all read the same book and, in the days before classes begin, meet with faculty members to discuss it in small seminars. Each year the university asks for recommendations, and each year I’ve suggested a book. Actually, I’ve suggested the same book every year: Tolstoy’s late novella, Hadji Murat. Four years on, the choice looks ludicrous, but the first year I suggested it some combination of arrogance and naïveté convinced me it would be picked. The book was supposed to be "intellectually stimulating," "enjoyable," capable of provoking "interesting discussion," and "appropriate for the level of incoming students," and to "address a theme applicable to the students themselves." In my submission proposal, I don't think I even made a case for the first four. Intellectually stimulating? It's by Tolstoy. Enjoyable? Capable of provoking interesting discussion? No problems there. Appropriate to the level of incoming students? I took this as code for "not too long," and, in my experience, Hadji Murat is about the briskest hundred-or-so pages I've ever read.

But what about the last criterion – "a theme applicable to the students themselves"? That, too, was easily met. Hadji Murat is, after all, about a morally suspect empire’s attempt to suppress a guerrilla campaign waged by besieged Muslim Chechens. This alignment of forces was eerily contemporary. It was especially so in 2007-8, a year the fathers and mothers of some students might have spent waging a counterinsurgency campaign against a Muslim enemy in a land not so distant from the Caucasus – Iraq. And if not their parents then perhaps their high-school peers, particularly in a state with a rich military tradition like North Carolina. Hadji Murat is about dying for and against empire. That seemed "applicable" to the students themselves, who could participate in a summer reading program because someone else was waging a counterinsurgency campaign on their behalf.

But I still had an ace in the hole. I had heard enough complaining from colleagues about how poorly students fared with fiction, and I had witnessed quite a lot of it myself. I figured that the selection committee must have experienced much the same, and would fall over itself to find a novel that fit the bill. Here was a chance to teach fresh, unformed minds about fiction’s difficult riches. I honestly thought the book was a lock.

It wasn’t picked. The winner that year was Covering: The Hidden Assault on Our Civil Rights, by Kenji Yoshino. I was disappointed – not because my selection lost (I’m not that competitive), but because this just didn’t seem like a very good choice. The selection committee wrote that "Kenji Yoshino’s book forces readers to confront important issues relating to what we mean by equality and social justice, important themes indeed during a time when many mistakenly believe we live in the 'post-civil rights era.' It is both rigorously put and beautifully rendered. This book offers an excellent introduction to what rigorous critical inquiry is like at the university level. And the central topics treated – identity and self-expression – are central to most 18- and 19-year-olds."

I am sure that this is all true, but I couldn’t help feeling that the students had been done a disservice by being asked to read a work of nonfiction. Nonfiction was what they would be reading for the next four years and, though the students needed to learn about "rigorous critical inquiry" at the university level, all indications were that they needed far more to learn about the rigorous critical inquiry of fiction. Yoshino’s argument, I have no doubt, was subtle. But protecting civil rights is not an argument many students would find either surprising or objectionable. So for all the rhetoric about "confronting important issues," I wonder if a book like this doesn’t confront them in too cozy a manner. I cannot say for sure. But should it really be the first book an undergraduate meets? Just look at the description above, which notes that its central topics are "identity" and "self-expression." The selection committee was right – those are central to most 18-year-olds. So why give them exactly what they already know, in exactly the nonfictional form with which they are most familiar?

That question has nothing to do with Yoshino’s politics, about which I know nothing for certain, though I suspect they’re vaguely liberal. And this is where last year’s report from the National Association of Scholars, which also took issue with summer-reading programs, misses the mark. The report concluded, disapprovingly, that the "preponderance of reading assignments promotes liberal social causes and liberal sensibilities." Only 3 books of the 180 surveyed promoted a "conservative sensibility" and none promoted "conservative political causes." There is something methodologically dodgy about this sort of accounting, and I actually suspect the balance is much more equitable than this lets on. But it only occludes a more serious issue. The NAS’s challenge could be met, I take it, by substituting a conservative defense of civil rights (or limited government or what have you). But why would this be any better, since students would still be denied the intellectual and affective exercise that comes with clambering around the rock terrain of dense, difficult, and distant fiction? The NAS has no interest, so far as I can tell, in fiction as such. (Or just not a very good eye for fiction – it described The Adventures of Huckleberry Finn as a "not very challenging text," which makes me wish that the NAS reread Huck Finn next summer.)


Enough colleges and universities now run a first-year reading program that Princeton University Press has a section of its catalog specifically dedicated to books that fit the bill. I probably should have looked at it before suggesting Hadji Murat. Had I done so, I would have known it never stood a chance.

Here are five representative selections:

Diane Coyle’s The Soulful Science: What Economists Really Do and Why It Matters; George Akerlof and Robert Shiller’s Animal Spirits: How Human Psychology Drives the Economy and Why It Matters for Global Capitalism; Peter Leeson’s The Invisible Hook: The Hidden Economics of Pirates; Joel Waldfogel’s Scroogenomics: Why You Shouldn’t Buy Presents; Lawrence Weinstein and John Adam’s Guesstimation: Solving the World’s Problems on the Back of a Cocktail Napkin.

Also included are books on the Tea Party, UFOs, and an edited volume of Lincoln’s writings on race and slavery. There is not a single novel. One novelist, Amos Oz, is represented, but instead of a novel, the catalog suggests his essays on the Israeli-Palestinian conflict, How to Cure a Fanatic.

The books form a distinct but difficult-to-define family. Begin with the titles. Many of them have that format that publishers are keen on nowadays – a made-up word (Scroogenomics, Superfreakonomics, Guesstimation) or a pun (Souled Out, Cop in the Hood), followed by a colon and, depending on the title, something appropriately sober or comically gee-whiz. More importantly, they are almost all nonfiction, are mostly concerned with contemporary American political and economic life, and, suitably compressed, would not be out of place on the opinion pages of the Sunday New York Times. Most earned warm reviews, and their authors are responsible for some wonderful writing (and some Nobel Prizes). But there remains something distastefully topical about them all: the Tea Party, piracy, the religious right, how soulful economists really are. (For all the NAS’s griping about the absence of "conservative" titles, it might take some comfort in the fact that the "market" is warmly represented.) And, again, not a single work of fiction on the list. Princeton is not primarily a publisher of fictional titles, but, in its extensive back catalog, could it not find one bit of fiction worth suggesting?

I do not know how many universities have turned to this list to stock their first-year reading programs. But a quick search shows what universities have turned to this past year: Elizabeth Kolbert’s Field Notes From a Catastrophe: Man, Nature and Climate Change; William Kamkwamba and Bryan Mealer’s The Boy Who Harnessed the Wind: Creating Currents of Electricity and Hope; Colin Beavan’s No Impact Man; Steven Levitt and Stephen Dubner’s Superfreakonomics; Reichen Lehmkuhl’s Here’s What We’ll Say: Growing Up, Coming Out, and the U.S. Air Force; and Moustafa Bayoumi’s How Does It Feel To Be a Problem: Being Young and Arab in America. This reads like a somewhat less distinguished version of the Princeton list. Again, topicality is the order of the day – climate change, alternative energy, economics-as-the-solution-to-everything. Close behind are stories of identity and self-expression, topics so dear to 18-year-olds.

All of which is difficult to say without seeming to cast aspersions on the quality of these books. That could not be further from my mind. I haven’t read most of them (though my own garden-variety left-wing blog reading makes me think I’ve got the gist of just about every one). I’m sure some are excellent.

But do incoming students really need to be told that the world is facing a climatic Armageddon? Probably not. Do they need to be told that economics is the master discipline, ready to solve the world’s problems? Probably not. Do they need to be told that it is difficult to be gay and that gay rights should be protected? Again, probably not. Of course some students need to hear all of this – the last especially – and I hope that they leave college less bigoted and more humane than they arrived.

But I hope more ardently that they don’t leave thinking that these topics are at the center of a collegiate education. A collegiate education can, of course, be taken up with thoughts of the timely – climate change, the Tea Party, financial markets, and piracy. But there would be something defective about that, precisely because it ignores the way an education must be about the disinterested pursuit of the permanently untimely. And that is what these books, and these first-year reading programs, miss so egregiously. College becomes a kind of intensified continuation of blog- or opinion-page reading. Worse, it becomes training for a life in thrall to the market. (A point forcefully made by Martha Nussbaum’s Not For Profit: Why Democracy Needs the Humanities. It is a cruel irony that her book is also included in Princeton’s first-year catalog.) How else to understand the preponderance of glowing titles about economics? The idea that something utterly, irremediably foreign ought to confront the students seems nowhere in sight. That is part of the reason why I suggested Hadji Murat, and part of the reason, I think, that students have such difficulty with imaginative literature. They simply do not confront much that they have not already encountered.


I didn’t sign up to lead a seminar on Yoshino’s book. My participation in the university’s first-year reading program was limited to my hopeless suggestion of Hadji Murat, my version of a write-in vote for Nader. And I had informally vowed not to participate until the committee selected a novel. I had given up on Hadji Murat. (Maybe it will be selected to commemorate the 25th anniversary of the Sunni Awakening). But then, this year, the committee selected Jonathan Safran Foer’s Eating Animals. It, like Yoshino’s book, violates the strictures that had kept me from participating: nonfiction, topical, and, suitably compressed, something that wouldn’t be out of place in the Atlantic. But I happened to be college friends with Safran Foer, and the bonds of friendship were sufficiently strong that I got my copy of the book, and signed up to lead a seminar.

In the course of reading Eating Animals I reflected upon my earlier reluctance to participate in the program. And I began to doubt that it was as well-grounded as I thought. I am a vegetarian, and I find Safran Foer’s book – despite his disavowals – a resoundingly clear case against eating meat. And I was secretly quite thrilled that a few thousand students had been invited by the university to read this book – secretly thrilled, that is, that a few thousand students would be confronted with a powerful case for stopping the slaughter of billions of animals. “Would it have been better if they were reading Hadji Murat?” I asked myself. I wasn’t so sure. And, during the positively gut-wrenching pages of Safran Foer’s book, pages filled with unspeakable acts of cruelty, I was absolutely certain that I couldn’t care less if no one ever read Hadji Murat again. Twenty vegetarians – or, really, five – were better than two converts to Tolstoy. If I wanted to stop the slaughter of animals for food – a slaughter that Safran Foer shows is more gruesome than I ever imagined – what could be better?

And I thought differently about all those other books I castigated for their overly parochial concerns – with climate change, the Tea Party, civil rights, or gay identity. As strongly as I felt about the slaughter of animals, weren’t these books nominated, and selected, because their supporters believed just as fervently in their causes? I’m certain they were. I was able to see, in a way I had not before, that these books were selected because someone thought the threats to our civil rights, to the environment, or from the Tea Party were as urgent as I found the threat to animal life. My own parochialism had been shown up, and suitably shamed. I was sorry for my arrogance and ignorance.


As it happens, Eating Animals seems not to have made much of an impact on the students. None of the 20 who came to the seminar became a vegetarian. The best I got were vague professions about more ethical eating. "I’ll only eat free-range," said one student. "I’m only eating chicken from now on," said another. These sound like good outcomes, consistent with the aims of a program designed to get students to "think more deeply" about the topic at hand. But they are incoherent things to say after reading Safran Foer’s book, which memorably demolishes the meaningless moniker "free range." And if there were one animal you would not want to eat after reading the book, it would be the chicken. That chicken’s miserably short life was spent in a warehouse with thousands of equally miserable, equally doomed birds. It moved only a few feet in the entirety of its unnatural life, and then only to escape the aggression of its crazed neighbors. Its last moments were spent in a slaughterhouse, caked in excrement and dirt, where its neck was severed by a machine of truly medieval cruelty. How had they missed those pages?

I've spent the last few weeks trying to understand why so many students got the book so wrong. There were all sorts of good reasons. Maybe they read it in the beginning of the summer and had forgotten the details. Or maybe they just didn’t read it. Either of which would have been understandable and, for first-year students who spent the summer on the beach, forgivable. But with one or two exceptions, all professed to have read the book. They just had no idea what Safran Foer had written.

And so the more I thought about the conversation the more difficult it became to resist the conclusion that students simply aren’t very good readers. This is not news. Becoming a good reader, after all, is one of the things that happens in college, not before it. But until this seminar, I had assumed that their difficulties were limited to fiction. I assumed that they all knew how to follow the argument of a piece of nonfiction, especially one as linear as Safran Foer’s. There are some subtle arguments in the book, but the overriding claim simply cannot be missed. And miss it is just what these students had done.

I was disappointed that no one had become a vegetarian – just as, I suppose, the recommenders of Yoshino’s book would have been disappointed had someone left the discussion convinced that our civil rights were safe. But I was more disappointed for the students, that they were so inexperienced in the ways of reading that they were lost even in a book like this. I tried to see their inexperience as a reason for excitement. Just think, I tried to tell myself, how much room they have to grow as readers. But I didn’t really believe this. Sure, they would become marginally better readers. But the kind of reader I wanted them to become – the kind of reader I myself want to be – is the kind of reader one becomes only after years of reading fiction.

Others have made the case for fiction more persuasively than I ever could. And if these first-year reading programs contained a smattering of fiction and nonfiction, there wouldn’t be much reason to gripe. Fiction some years, nonfiction others is a good enough outcome. But the paucity of fiction in these programs is stunning. My university has chosen fiction once (Jhumpa Lahiri’s The Namesake) in 12 years, a ratio that seems fairly typical of colleges nationwide. (It also bravely selected Approaching the Quran in 2003. Though obviously not a work of fiction in the sense I have in mind, it poses interpretive challenges not unlike the best imaginative literature). I was heartened to see Knopf issue its own catalog of first-year titles. Although it mostly resembles Princeton’s in its emphasis on economics and gee-whiz science, it includes nearly 20 novels, some of them, like Joseph O’Neill’s Netherland, supremely beautiful and difficult. Sadly, I couldn’t find a college that chose it.

Tolstoy makes a brief appearance in Safran Foer’s book, which has a laugh at the Russian’s fatuous suggestion that the end of slaughterhouses would mean the end of war. This just goes to show that reading fiction doesn’t inoculate you against bad ideas. It doesn’t make you a more moral person either, many "save-the-humanities" cases notwithstanding. But reading fiction makes you a better reader. Somehow, administrators of these programs seem to have lost sight of that. And so the next time some college thinks about selecting Safran Foer, I’d ask them to think about the famous Russian vegetarian instead. I’ll be sorry to take away the sales from my friend, and sorry that his book won’t be read. But I’m sorrier to see his book read poorly, and, as the author of some wonderfully imaginative fiction himself, perhaps he won’t mind losing to Tolstoy.

Brendan Boyle

Brendan Boyle teaches in the classics department at the University of North Carolina at Chapel Hill.

The Broccoli of Higher Ed

We hear these days of the "crisis of the humanities." The number of majors and jobs in these subjects is dropping, as is student interest. The Boston Globe offered one report on the worries of the humanities in an article last year about the new Mandel Center at Brandeis University. The Globe asserted, "At college campuses around the world, the humanities are hurting. Students are flocking to majors more closely linked to their career ambitions. Grant money and philanthropy are flowing to the sciences. And university presidents are worried about the future of subjects once at the heart of a liberal arts education."

Such gloom must be placed in context. Doubts about the humanities have been around at least since Aristophanes wrote The Clouds. The playwright claimed that if a man engaged in the "new" Socratic form of teaching and questioning, he could wind up with big genitals (apparently seen as a negative side effect) due to a loss of self-control. But the Socratic humanities survived, in spite of the execution of their founder, through the schools of his intellectual son and grandson -- the Academy of Plato and the Lyceum of Aristotle.

I don't think that the humanities are really in a crisis, though perhaps they have a chronic illness. Bachelor's degrees in the humanities have held relatively steady since 1994 at roughly 12-13 percent of all majors. Such figures demonstrate that the health of the humanities is not robust, as measured in terms of student preferences. In contrast, the number of undergraduate business majors is steadily and constantly increasing.

So what has been the response of university and college leaders to the ill health of the humanities?

It has been to declare to applicants, students, faculty, and the public that these subjects are important. It has included more investment in the humanities, from new buildings like the Mandel Center at Brandeis to, in some cases, hiring more faculty and publicizing the humanities energetically. Dartmouth College's president, Jim Yong Kim, recently offered the hortatory remark that "Literature and the arts should not only be for kids who go to cotillion balls to make polite conversation at parties."

I couldn't agree more with the idea that the humanities are important. But this type of approach is what I call the "eat it, it's good for you" response to the curricular doldrums of humanities. That never worked with my children when it came to eating broccoli and it is even less likely to help increase humanities enrollments nationally today.

The dual-horned dilemma of higher education is the erosion of the number of majors in the humanities on the one hand and the long-feared "closing of the American mind" on the other, produced in part by the growing number of students taking what some regard as easy business majors. Yet these problems can only be solved by harnessing the power of culture, by understanding the ethno-axiological soup from which curriculums evolve and find their sustenance. Jerome Bruner has long urged educators to connect with culture, to recognize that the environment in which we operate is a value-laden behemoth whose course changes usually consume decades, a creature that won't be ignored.

It is also vital that we of the humanities not overplay our hands and claim for ourselves a uniqueness that we do not have. For example, it has become nearly a truism to say that the humanities teach "critical thinking skills." This is often correct of humanities instruction (though certainly not universally so). But critical thinking is unique neither to the humanities nor to the arts and sciences more generally. A good business education, for example, teaches critical thinking in management, marketing, accounting, finance, and other courses. More realistically and humbly, what we can say is that the humanities and sciences provide complementary contexts for reasoning and cultural knowledge that are crucial to functioning at a high level in the enveloping society.

Thus, admitting that critical thinking can also be developed in professional schools, we realize that it is enhanced and further developed when the thinker learns to develop analytical skills in history, different languages, philosophy, mathematics, and other contexts. The humanities offer a distinct set of problems that hone thinking skills, even if they are not the only critical thinking game in town. At my institution, Bentley University, and other institutions where most students major in professional fields, for example, English develops vocabulary and clarity of expression while, say, marketing builds on and contributes to these. Science requires empirical verification and consideration of alternatives. Accountancy builds on and contributes to these. Science and English make better business students as business courses improve thinking in the humanities and sciences.

If, like me, you believe that the humanities do have problems to solve, I hope you agree that they are not going to be solved by lamenting the change in culture and exhorting folks to get back on course. That's like holding your finger up to stop a tidal wave. Thinking like this could mean that new buildings dedicated to the humanities will wind up as mausoleums for the mighty dead rather than as centers of engagement with modern culture and the building of futures in contemporary society.

So what is there to do? How do we harness the power of culture to revive and heal the influence of the humanities on future generations? Remember, Popeye didn't eat his spinach only because it was good for him. He ate his spinach because he believed that it was a vital part of his ability to defend himself from the dangers and vicissitudes of life, personified in Bluto. And because he believed that it would give him a good life, represented by Olive Oyl.

Recently, an alumnus of Bentley told me over dinner, "You need business skills to get a job at our firm. But you need the arts and sciences to advance." Now, that is the kind of skyhook that the friends of the humanities need in order to strengthen their numbers, perception, and impact.

While I was considering the offer to come to Bentley as its next dean of arts and sciences, Brown University and another institution were considering me for professorial positions. Although I felt honored, I did not want to polish my own lamp when I felt that much in the humanities and elsewhere in higher education risks becoming a Ponzi scheme, which Wikipedia defines accurately as an "...operation that pays returns to separate investors, not from any actual profit earned by the organization, but from their own money or money paid by subsequent investors."

I wanted to make my small contribution to solving this problem, so I withdrew from consideration for these appointments to become an administrator and face the issue on the front line. And Bentley sounded like exactly the place to be, based on pioneering efforts to integrate the humanities and sciences into professional education -- such as our innovative liberal studies major, in which business majors complete a series of courses, reflections, and a capstone project emerging from their individual integration of humanities, sciences, and business.

Programs that take in students without proper concern for their future or provision for post-graduate opportunities -- how they can use what they have learned in meaningful work -- need to think about the ethics of their situation. Students no longer come mainly from the leisured classes that were prominent at the founding of higher education. Today they need to find gainful employment in which to apply all the substantive things they learn in college. Majors that give no thought to that small detail seem to assume that since the humanities are good for you, the financial commitment and apprenticeship between student and teacher is fully justified. But in these cases, the numbers of students benefit the faculty and particular programs arguably more than they benefit the students themselves. This is a Ponzi scheme. Q.E.D.

The cultural zeitgeist requires of education that it be intellectually well-balanced and focused but also useful. Providing all of these and more is not the commercialization of higher education. Rather, the combination of professional education and the humanities and sciences is an opportunity to at once (re-)engage students in the humanities and to realize Dewey's pragmatic goal of transforming education by coupling concrete objectives with abstract ideas, general knowledge, and theory.

I have labeled this call for a closer connection between the humanities and professional education the "Crucial Educational Fusion." Others have recognized this need, as examples in the new Carnegie Foundation for the Advancement of Teaching book Rethinking Undergraduate Business Education: Liberal Learning for the Profession illustrate. This crucial educational fusion is one solution to the lethargy of the humanities: breaking down academic silos, building the humanities into professional curriculums, and creating a need for the humanities, enhancing their flavor like cheese on broccoli.

Daniel L. Everett

Daniel L. Everett is dean of arts and sciences at Bentley University.

End Large Conferences

I’ll play Marc Antony. I have not come to praise large conferences, but to bury them. It is my opinion that mega humanities conferences are way past their sell-by date. For senior faculty the only reason to go is to schmooze with old friends; for junior faculty they are an onerous duty, and for graduate students they are a rip-off for which professional organizations ought to be collectively ashamed.

First codicil: I speak exclusively of humanities conferences, as they are the only ones I know firsthand. Friends in computing and the sciences tell me that collaborative efforts arise from their conferences. I’m willing to believe them. Maybe it’s a cultural thing. Most humanities people find it so hard to collaborate that their wills stipulate that their notes go with them to the grave.

Second codicil: I have only myself to blame for recent travails. I didn't need to go to my unnamed conference, but I got it into my head that it would be fun. I was wrong. It serves me right for violating my principles.

Five years ago I concluded that humanities conferences were out of touch with the times and vowed to attend only smaller regional meetings with less cachet but more satisfaction. But I didn't listen to me. Instead I spent four days and a considerable wad of cash jostling among a throng of over three thousand. I returned home feeling like Ponce de Leon, who sought the Fountain of Youth and found mostly dismal swampland. Sound harsh? See if any of these observations resonate with your own.

Problem One: Outmoded Presentations

We live in the communications age, but the memo apparently never circulated among those studying the liberal arts. For reasons arcane and mysterious, humanities scholars still read papers. That’s tedious enough at a small conference where one might attend six three-paper presentations. At my recent conference, sessions commenced at 8 a.m. and ran past 10 p.m. One could have conceivably attended 30 sessions and heard 90 or more papers, though the only ones with the stamina to attend more than six or seven sessions were either posturing or desperate.

I wanted my four-day sojourn to introduce me to new ideas, concepts, and teaching modules, but the reality of such a grueling schedule is that I was running on fumes by the end of day one. It would have helped if presenters took advantage of new technology, but things seldom got flashier than PowerPoint, a program that, alas, seems to encourage more reading. Let me reiterate something I’ve said for years: the death penalty should apply to those who read anything from a PowerPoint slide other than a direct quote. It's an academic conference, for crying out loud; assume your audience is reasonably proficient at reading! Seriously, does anyone need to fly across the country to listen to a paper? Why not do as science conferences have done for years: post papers online and gather to have a serious discussion of those papers?

The mind-numbing tedium of being read to for four days is exacerbated by the fact that many humanities scholars have little idea about the differences between hearing and reading. If you construct a paper that’s so highly nuanced that understanding it rests upon subtle turns of phrase or complicated linguistic shifts, do not look up from your paper with a wan smile indicating you are enamored of your own cleverness; go back to your room and rewrite the damn thing. Audience, clarity, and coherence are pretty much the Big Three for speech and composition, unless one's audience is the International Mindreaders' Society. By the way, is there something wrong with using a map, providing a chart, or summarizing a work that few in the room are likely to have read? And do bother to tell me why your paper matters.

I actually heard several very exciting papers, but most of the offerings were dreadful. Note to young scholars: stop relying on the Internet and check out journals that predate 1995 before you proclaim a “discovery.” And if you really want to stand out, work on your shtick. Guess which papers I remember? Yep -- those in which the presenter did more than read to me.

Critical note to young scholars: Want to turn off everyone in the room? Be the person who doesn’t think that the 20-minute limit applies to you. Nothing says "non-collegial" more clearly.

Problem Two: Expense

Another reason to rethink conferences is that they cost an arm and a leg to attend. I had partial funding from my university because I was presenting -- and no, I bloody well did not read my paper -- but I was still out of pocket for quite a hunk of cash. If you attend a humanities conference and want to stay anywhere near the actual site of the event, plan on $150 per night for lodging in a soulless franchise hotel with windowless conference rooms and quirky technology, $20 per day for Internet access, another $200 for conference fees, roughly $500 for airfare, at least $50 for taxis to and from the airport -- almost no U.S. city has a convenient shuttle service anymore -- and money for whatever you plan on eating.

Budget plenty for the latter if your conference is in what is glibly called a Destination City. That’s shorthand for a theme area marketing itself as unique, though it’s actually a slice of Generica surrounded by shops and restaurants identical to those found in suburban malls in every way except one: captive audiences equal higher prices. (One small example: the Starbucks inside the pedestrian precinct at my hotel charged a buck more per cup than the one on the street 100 yards away.) Do the math and you can see that you can easily drop a few grand on a megaconference. (That’s what some adjuncts are paid per course!)

An immediate cost-saving adjustment would be to confine conferences to airline hub cities such as New York, Chicago, Los Angeles, Atlanta, and Houston. The moment the conference is held in a (not my term) "second-tier" city, allot another few hundred dollars for "connecting flights," a term used by the airline industry because it sounds nicer than saying you’ll spend six hours waiting in a hub after you’ve sprinted through the airport like Usain Bolt for your next flight, found the gate closed, and retreated to the rebooking counter.

Problem Three: Victimized Grad Students

I'm a parsimonious Scot who resents spending money on boring hotels and lousy food, but I can afford it when I have to. Grad students can’t. A major way in which megaconferences have changed in the past several decades is that there’s considerably less balance between senior scholars, junior colleagues, and graduate students. (Senior scholars used to accompany the latter two in a mentor capacity.) Now there is just a smattering of senior and junior scholars, and they’re often holed up in hotel suites conducting interviews. Whenever they can, search committee members flee the conference and rendezvous with old friends. They might attend a session or two. Unless they have to be there, there aren’t many junior colleagues in attendance at all because they're busy getting material into publication and they can meet presentation expectations at cheaper regional meetings, or save their dough and go to prestigious (-sounding) international gatherings.

So who’s left? Graduate students. Lots of graduate students. So many that conservationists would recommend culling the herd if they were wild mustangs. Grad students have always gone to conferences in hopes of making their mark, attracting attention, and meeting people who can help them advance. That was the way it was done -- 20 years ago. Now network opportunities are slimmer. Whom do they meet? Mostly other grad students, often those massed outside of interview rooms.

Of all the antiquated things about large conferences, the "cattle call" interview is the most perverse. These were barbaric back in the days in which there were jobs; now they’re simply cruel. At least a third of attendees at my conference were grad students from a single discipline: English. As has been discussed many times on this site, most of them shouldn't be in grad school in the first place. How many of the thousand-plus English grad students can realistically hope to get an academic job of any sort?

The Modern Language Association predicts that only 900 English jobs will come open in all of 2011. That’s 900 across all specialties of English, the bulk of them in writing and rhetoric, not Austen and Proust. Will even a fifth of the English grad students at the conference land a job? The odds are long. It's probably more like half that, and if we're talking about a good job, slice it in half once more. So why ask strapped grad students to attend expensive conferences for 15-minute preliminary interviews? Do a telephone interview, for heaven’s sake; it’s kinder on both grad students and search committees.

As I did as a grad student, many young hopefuls pooled resources and economized where they could, but the sad truth is that the vast majority of attendees spent a small fortune on a gamble whose odds aren't much better than those of buying lottery tickets. Are associations playing the role of enabler to grad student delusions? Yes. Here’s another thought: instead of holding a big conference, sponsor a teleconference. Charge a fee for uploads, but give speakers one-year access to the URL, which they can make available to potential employers. Use the association’s savings to lobby for more tenure-track faculty.

Problem Four: No-Shows

You spend lots of money, you sit through desultory talks, and you head off to the one or two sessions that made you want to attend the conference in the first place. What do you find? They’ve been canceled because only one of the presenters showed up, and that paper was combined with several others from sessions that suffered the same fate. Didn’t you see the 3x5 card tacked to the conference bulletin board?

As noted above, I’m in favor of putting large conferences to rest. But if we insist on having them, let’s at least make sure they’re as advertised. O.K., things do happen, but in most cases missing presenters are simply AWOL. I know it smacks of McCarthyism, but I’ve come to support the idea of a data bank of no-shows that employers, conference planners, and deans can check.

Problem Five: Urban Sprawl

What’s the point of a conference that’s so big it’s inaccessible? I walked between two different hotels to attend sessions and pored over a Britannica-sized program to locate them. Conference attendees were housed in four "official" hotels and untold numbers of others. With round-the-clock sessions and decentralization, the few networking opportunities that did exist were logistically difficult. It took me two entire days to find my old friends, let alone new folks I wanted to engage. I met two interesting people at the airport. I never saw them again.

In Praise of Small Conferences

There are other problems I’ll leave for now, including the gnawing suspicion that some big conferences have become sinecures for "insiders" who have become "players" within associations. Let’s just say that there is a serious disconnect between how the big conferences operate and what makes sense in the changing world of academe.

Teleconferences with real-time discussion groups and online forums would be one good starting point for reform; providing more resources for regional and local conferences would be another. Small gatherings have issues of their own -- no-shows, sparsely attended sessions, overreliance on volunteers -- but they compensate by offering intimacy, good value, face-to-face feedback, and easier opportunities to network. It's time to give these the cachet they deserve. The big conference is like a one-size-fits-all t-shirt; it simply doesn’t flatter most people. I’m done. For real. Unless I get funding for an exotic overseas meeting. (Just kidding!)

Rob Weir

Rob Weir, who writes the "Instant Mentor" column for Inside Higher Ed's Career Advice section, has published six books and numerous articles on social and cultural history, and has been cited for excellence in teaching on numerous occasions during his 20 years in the college classroom.

