Fear of Being Useful
"When the going gets tough, the tough take accounting." With those succinct words in a June 2010 op ed, New York Times columnist David Brooks summed up the conventional wisdom on the current crisis of the humanities. In an age when a higher education is increasingly about moving quickly through a curriculum streamlined to prepare students for a job, the humanities have no practical utility. As Brooks observes, "when the job market worsens, many students figure they can’t indulge in an English or a history major," a fact that explains why the "humanities now play bit roles when prospective students take their college tours. The labs are more glamorous than the libraries."
Pushed into a corner by these dismaying developments, defenders of the humanities — both traditionalists and revisionists — have lately been pushing back. Traditionalists argue that emphasizing professional skills would betray the humanities' responsibility to honor the great monuments of culture for their own sake. Revisionists, on the other hand, argue that emphasizing the practical skills of analysis and communication that the humanities develop would represent a sellout, making the humanities complicit with dominant social values and ideologies. But though these rival factions agree on little else, both end up concluding that the humanities should resist our culture's increasing fixation on a practical, utilitarian education. Both complain that the purpose of higher education has been reduced to credentialing students for the marketplace.
Martha Nussbaum, for example, while stressing that the humanities foster critical thinking and the ability to sympathetically imagine the predicament of others, insists such skills are, as the title of her 2010 book puts it, "not for profit." In doing so she draws a stark line between the worlds of the humanities and the 21st-century workplace. Likewise, Geoffrey Galt Harpham, in The Humanities and the Dream of America, laments the increasing focus on professional skills in the humanities at the expense of reading great books. Stanley Fish takes an even more extreme position, insisting that the humanities "don’t do anything, if by 'do' is meant bring about effects in the world. And if they don’t bring about effects in the world they cannot be justified except in relation to the pleasure they give to those who enjoy them. To the question 'of what use are the humanities?', the only honest answer is none whatsoever." Worse still, Frank Donoghue, in The Last Professors: The Corporate University and the Fate of the Humanities, argues that the humanities will simply disappear in the new corporate, vocation-centered university.
Ironically, these pessimistic assessments are appearing at the very moment when many employers outside academe are recognizing the practical value of humanities training. Fish simply dismisses the argument that "the humanities contribute to the economic health of the state — by producing more well-rounded workers or attracting corporations or delivering some other attenuated benefit — because nobody really buys that argument." But this would come as news to the many heads of philanthropic foundations, nonprofits, and corporate CEOs who have lately been extolling the professional value of workplace skills grounded in the humanities.
We would be the last to argue that traditional ways of valuing the humanities are not important, that studying philosophy, literature, and the fine arts does not have a value in and of itself apart from the skills it teaches. We also recognize that the interests of the corporate world and the marketplace often clash with the values of the humanities. What is needed for the humanities, in our view, is neither an uncritical surrender to the market nor a disdainful refusal to be sullied by it, but what we might call a critical vocationalism: an attitude that is receptive to taking advantage of opportunities in the private and public sectors that enable humanities graduates to apply their training in meaningful and satisfying ways. We believe such opportunities do exist.
To be sure, such optimism must be tempered in today’s bleak economy, where hardly any form of education is a sure ticket to a job and where many in the private sector may still look with indifference or even disdain on a humanities degree. But as David Brooks himself went on to point out in his op-ed: "Studying the humanities improves your ability to read and write. No matter what you do in life, you will have a huge advantage if you can read a paragraph and discern its meaning (a rarer talent than you might suppose). You will have enormous power if you are the person in the office who can write a clear and concise memo."
Brooks’ view is echoed by Edward B. Rust Jr., chairman and CEO of State Farm Insurance Companies, who observes that "at State Farm, our employment exam does not test applicants on their knowledge of finance or the insurance business, but it does require them to demonstrate critical thinking skills" and "the ability to read for information, to communicate and write effectively, and to have an understanding of global integration." And then there is Google, which more than any other company has sung the praises of humanities students and intends to recruit many of them. "We are going through a period of unbelievable growth," reports Google’s Marissa Mayer, "and will be hiring about 6,000 people this year — and probably 4,000-5,000 from the humanities or liberal arts."
This evidence of the professional utility of humanities skills belies Donoghue’s apparent assumption (in The Last Professors) that "the corporate world’s hostility" toward humanistic education remains as intense today as it was a century ago, when industrialists like Andrew Carnegie dismissed such an education as "literally, worthless." Donoghue ignores changes in the global economy, the culture, and the humanities themselves since Carnegie’s day that have given many corporate leaders a dramatically more favorable view of the humanities’ usefulness. Associate Dean Scott Sprenger of Brigham Young University, who oversees a humanities program we will discuss in a moment, quotes the dean of the Rotman School of Management in Toronto, who observes a "tectonic shift for business school leaders," who are now aware that "learning to think critically — how to imaginatively frame questions and consider multiple perspectives — has historically been associated with a liberal arts education, not a business school curriculum."
All of these commentators are right, and the skills they call attention to only begin to identify the range of useful professional competencies with which a humanities education equips 21st-century students. In addition to learning to read carefully and to write concisely, humanities students are trained in fields like rhetoric and composition, literary criticism and critical theory, philosophy, history, and theology to analyze and make arguments in imaginative ways, to confront ambiguity, and to reflect skeptically on received truths — skills that are increasingly sought after in upper-management positions in today’s information-based economy. Even more important for operating as global citizens in a transnational marketplace, studying the literary, philosophical, historical, and theological texts of diverse cultures teaches humanities students to put themselves in the shoes of people who see and experience the world very differently from their own accustomed perspectives. Are some corporations still looking for employees who will be well-behaved, compliantly bureaucratized cogs in the wheel? Of course they are. But increasingly, many others are looking for employees who are willing to think outside the box and challenge orthodoxy.
It is true that humanities study, unlike technical training in, say, carpentry or bookkeeping, prepares students not for any specific occupation, but for an unpredictable variety of occupations. But as many before us have rightly pointed out, in an unpredictable marketplace this kind of versatility is actually an advantage. As Associate Dean Sprenger notes, "the usefulness of the humanities" paradoxically "derives precisely from their detachment from any immediate or particular utility. Experts tell us that the industry-specific knowledge of a typical vocational education is exhausted within a few years," if not "by the time students enter the workforce." It is no accident, he observes, "that a large percentage of people running Fortune 500 companies (one study says up to 40 percent) are liberal arts graduates; they advance more rapidly into mid- and senior-level management positions, and their earning power tends to rise more significantly than people with only technical training."
If there is a crisis in the humanities, then, it stems less from their inherent lack of practical utility than from our humanistic disdain for such utility, which too often prevents us from taking advantage of the vocational opportunities presented to us. This lofty disdain for the market has thwarted the success of the few programs that have recognized that humanities graduates have much to offer the worlds of business, technology, arts agencies, and philanthropic foundations.
The most promising of these was a program in the 1990s developed by the Woodrow Wilson National Fellowship Foundation under the leadership of its then-director, Robert Weisbuch. First called "Unleashing the Humanities" and later "The Humanities at Work," the program, according to Weisbuch in an e-mail correspondence with the authors, "had persuaded 40 nonprofits and for-profits to reserve meaningful positions for Ph.D. graduates in the humanities and had placed a large number in well-paying and interesting positions — at places ranging from Verizon to AT Kearney to The Wall Street Journal to the National Parks Service." Unfortunately, Weisbuch reports, only a few humanities graduate programs "enlisted their alumni and the influential corporations and others in their areas of influence to revolutionize the possibilities for employment of humanities doctoral graduates," while most faculty members "continued to expect their graduate students to look for jobs much like their own and to consider any other outcome a failure."
Today, however, some humanities programs that emphasize useful professional applications are prospering. One of these is a new undergraduate major at Brigham Young University called "Humanities +," with the "+" referring to the value-added vocational component gained by students who elect the program. According to an e-mail to the authors from Associate Dean Sprenger, BYU hired a career services specialist tasked with "revolutionizing our humanities advising office along the lines of the Humanities + vision, and the program has developed ties with the university’s colleges of business and management" — a virtually unheard-of step for a humanities program. The program’s students are urged "to minor in a practical subject, professionalize their language skills, and internationalize their profile by doing an overseas internship." The objective, Sprenger says, "is that career thinking and strategizing become second nature to students," while faculty members "see it as in their interest to help students find 'alternative' careers, and are reassured that they can rely on our advising office to be informed and to do the training."
Another notable program that sees its mission as "bringing humanities into the world" beyond academe and that works closely with its university’s office of career placement is the Master of Arts Program in the Humanities (MAPH) at the University of Chicago, which Gerald helped design and direct in the 1990s. According to a recent article by program associate A.J. Aronstein in Tableau, a University of Chicago house journal, one recent MAPH graduate got a job as finance director in Florida for Barack Obama’s 2008 campaign, later served as chief of staff at the International Trade Association, and now works as a political consultant in Washington. Other MAPH graduates have gone on to internships and subsequent positions as museum curators, technical writers, journalists and other media workers, marketing specialists, and policy analysts with investment firms.
The false assumption in both anti-utilitarian defenses of the humanities and pessimistic predictions of their extinction is that we have to choose between a credentialing and a humanizing view of higher education, between vocational utility and high-minded study as an end in itself. This either/or way of thinking about the humanities — either they exist solely for their own sake or they have no justification at all — is a trap that leaves humanists unable to argue for the value of their work in terms of the practical skills it teaches, an argument that inevitably has to be made in the changing marketplace of higher education. In fact, we would argue there is no defense of the humanities that is not ultimately based on the useful skills they teach.
Evidence is plentiful that stressing the range of expertise humanities graduates have makes intellectual and economic sense. Take, for example, Damon Horowitz, director of engineering at Google. He insisted recently, in an article in The Chronicle of Higher Education entitled "From Technologist to Philosopher: Why You Should Quit Your Technology Job and Get a Ph.D. in the Humanities," that "if you are worried about your career ... getting a humanities Ph.D. is not only not a danger to your employability, it is quite the opposite. I believe there is no surer path to leaping dramatically forward in your career than to earn a Ph.D. in the humanities." "You go into the humanities to pursue your intellectual passion," he explains, "and it just so happens, as a byproduct, that you emerge as a desired commodity for industry."
Horowitz, a leading figure in artificial intelligence and the head of a number of tech startups, ought to know. He took a break from his lucrative career to enroll in Stanford’s Ph.D. program in philosophy because he figured out that in order to do his job in technology well he needed to immerse himself in the humanities. "I realized that, while I had set out in AI to build a better thinker, all I had really done was to create a bunch of clever toys." Horowitz came to realize that the questions he was "asking were philosophical questions — about the nature of thought, the structure of language, the grounds of meaning." Returning to the humanities, Horowitz took time out from the world of artificial intelligence to study "radically different approaches to exploring thought and language," such as philosophy, rhetoric, hermeneutics and literary theory. As he studied intelligence from these perspectives he "realized just how limited my technologist view of thought and language was. I learned how the quantifiable, individualistic, ahistorical — that is, computational — view I had of cognition failed to account for whole expanses of cognitive experience (including, say, most of Shakespeare)."
The concrete value of the humanities education Horowitz celebrates is especially well epitomized in the new field of the digital humanities. The emergence of this field calls attention to how old 20th-century divisions between science and the humanities are breaking down, and it gives those of us committed to defending the practical value of the humanities a tremendous opportunity. The digital humanities represent the cutting-edge intersection of the humanities and computer science, a merging of skills and points of view from two formerly very different fields that is leading to a host of exciting innovations — and opportunities for students who want to enter fields ranging from writing computer programs to text encoding and text editing, electronic publishing, interface design, and archive construction. Students in the digital humanities are trained to deal with concrete issues related to intellectual property and privacy, and with questions related to public access and methods of text preservation.
Graduates of the digital humanities programs that are now developing all over the country will be first in line for such positions. For example, Paul’s university now has a Digital Humanities M.A. with two converging tracks, one designed for students with a background in computer science and one for students with a background in the humanities. The program website notes that it offers "theoretical, critical, social, and ethical contexts for thinking about the making of new knowledge through digital humanities research and applications, from issues of intellectual property and privacy, to questions of public access and methods of preservation." When we are asked about the practical value of a humanities education, we need to add the digital humanities to the list.
We believe it is time to stop the ritualized lamentation over the crisis in the humanities and get on with the task of making them relevant in the 21st century. Such lamentation only reveals the inability of many humanists to break free of a 19th-century vision of education that sees the humanities as an escape from the world of business and science. As Cathy Davidson has forcefully argued in her new book, Now You See It, this outmoded way of thinking about the humanities as a realm of high-minded cultivation and pleasure in which students contemplate the meaning of life is a relic of the industrial revolution with its crude dualism of lofty spiritual art vs. mechanized smoking factories, a way of thinking that will serve students poorly in meeting the challenges of the 21st century.
Though we have argued in defense of the practical and vocational utility of a humanities education, our argument should in no way be construed as undercutting the aspirations of those in the humanities who seek an academic career. Indeed, on this score we need to redouble our efforts to increase public and private funding for higher education and to support unionizing efforts by faculty members and adjuncts. But even as we fight these battles to expand the academic job market we would be foolish to turn our backs on alternative forms of employment for humanities graduates when they are out there. In this spirit, we applaud both Modern Language Association President Russell Berman and American Historical Association President Anthony Grafton, who, along with the executive director of the AHA, James Grossman, have recently urged their organizations to acknowledge that advanced training in the humanities can lead to a variety of careers beyond academia and have suggested how graduate programs can be adapted for these kinds of careers.
For ultimately, to take advantage of the vocational potential of humanities study as we propose is not to sell out to the corporate world, but to bring the critical perspective of the humanities into that world. It is a perspective that is sorely needed, especially in corporate and financial sectors that have lately been notoriously challenged in the ethics department, to say the least. Humanities graduates are trained to consider the ethical dimensions of experience, linking the humanities with the sciences as well as with business and looking at both these realms from diverse perspectives. To those who worry that what we urge would blunt the humanities' critical power, we would reply that it would actually figure to increase that power, for power after all is the ability to act in the world.
Paul Jay is professor of English at Loyola University Chicago and the author, most recently, of Global Matters: The Transnational Turn in Literary Studies. Gerald Graff is professor of English and education at the University of Illinois at Chicago and a past president of the Modern Language Association.