Ask anyone professing the humanities today and you come to understand that a medieval dimness looms. If this is the end-times for the ice sheets at our poles — and it is — many of us also understand that the melt can be found closer to home, in the elimination of language and classics departments, for instance, and in the philistinism represented by governors such as Rick Scott of Florida and Patrick McCrory of North Carolina, who apparently see in the humanities a waste of time and taxpayer subsidies. In the name of efficiency and job creation, according to their logic, taxpayers can no longer afford to support bleary-eyed poets, Latin history radicals, and brie-nibbling Francophiles.
That there is a general and widespread acceptance in the United States that what is good for corporate America is good for the country is perhaps inarguable, and this is why men like Governors Scott and McCrory are dangerous. They merely invoke a longstanding and not-so-ugly stereotype: the pointy-headed humanist whose work, if you can call it that, is irrelevant. Among the many easy targets, English departments and their ilk are convenient and mostly defenseless. Few will rise to rush the barricades with us, least of all the hard-headed realists who understand the difficulties of running a business, which is what the university is, anyway.
I wish, therefore, to propose a solution that will save money, save the humanities, and perhaps make the world a better place: Close the business schools.
The Market Argument
We are told that something called “the market” is responsible for the great disparities in pay between humanities professors and business professors. To a humanist, however, this market is the great mystifier; we find no evidence of an “invisible hand” that magically allocates resources within the university. The market argument for pay differentials between business professors and historians (average pay in 2014 for full professors at all institutions: $123,233 and $86,636, respectively, a difference of almost 30 percent; the averages at research institutions are $160,705 and $102,981, a difference of 36 percent), for instance, fails to convince that a market is operating. This is because administrators and trustees who set salaries based upon what the market can bear, or what it calls for, or what it demands, are actually subsidizing those of us who are manifestly out of the market.
Your average finance professor, for instance, is not a part of this market; indeed, she is a member of the artificial market created by colleges and universities themselves, the same institutions that tout the importance of critical thinking and of creating the well-rounded individual whose liberal arts study will ostensibly make her into a productive member of our democracy. But the administrators who buy the argument that the market allocates upward of 20, 30, or 40 percent more for the business professor than it does for her colleague in the humanities have failed to be the example they tout: they are not thinking.
The higher education market for business professors and legal scholars, for instance, is one in which the professor is paid as if she took her services and sold them on what is commonly called the market. Which is where she, and her talents, manifestly are not. She is here, in the building next to ours, teaching our students and doing the same work we are. If my daughter cuts our lawn, she does not get paid as if she were cutting the neighbor’s lawn.
The business professor has sacrificed the blandishments of the other market for those of the university, where she can work softer hours, have her December/January vacation, go to London during the summer on a fellowship or university grant, and generally live something approaching the good life — which is what being employed by a college or university allows the lucky who earn tenure. She avoids the other market — eschews the long hours in the office, the demands of travel, the oppressive corporate state — so that she can pick up her kids from school on occasion, sleep in on a Saturday, and turn off her smartphone. She may be part of a machine, but it is a university machine, and as machines go she could do worse. This “market” is better than the other one.
But does she bring more value to the university? Does she generate more student hours? These are questions that administrators and business professors do not ask. Why? Because they wouldn’t like the answers. They would find that she is an expensive acquisition. Unless she is one of the Wharton superstars and appears on CNN Money and is quoted in The Wall Street Journal, there’s a good chance that the university isn’t getting its money’s worth.
The Moral Argument
There is another argument for bidding our business professor adieu. She is ostensibly training the next crop of financiers and M.B.A.s whose machinations have arguably had no salutary effects on this democracy. I understand that I am casting a wide net here, grouping the good with the bad, blaming the recent implosion of the world economy on business schools. One could, perhaps, lay equal blame on the mathematicians and quantitative analysts who created the derivative algorithms and mortgage packages that even the M.B.A.s themselves don’t understand, though there’s a good chance that business school graduates hired these alpha number crunchers.
Our investment bankers and their ilk will have to take the fall because, well, they should have known better. If only because, at bottom, they are responsible — with their easy cash and credit, their drive-through mortgages, and, worst of all, their betting against the very system they knew was hopelessly constructed. And they were trained at our universities, many of them, probably at our best universities, the Harvards and Princetons and Dartmouths, where — it is increasingly apparent — the brightest students go to learn how to destroy the world.
I am not arguing that students shouldn’t take classes in accounting, marketing, and economics. An understanding of these subjects holds value. They are honorable subjects often horribly applied. In the wrong hands they become tools less of enlightenment and liberation than ruthless self-interest. And when you have groups of like-minded economic pirates banding together in the name of self-interest, they form a corporation, that is, a person. That person, it is now apparent, cannot be relied upon to do the right thing; that person cannot be held accountable.
It’s not as if this is news. Over 150 years ago, Charles Dickens saw this problem, and he wrote A Christmas Carol to address it. The hero of Dickens’s novella is Jacob Marley, who returns from the grave to warn his tightfisted partner Ebenezer Scrooge that he might want to change his ways. When Scrooge tells Marley that he was always a “good man of business,” Marley brings down the thunder: “Mankind was my business. The common welfare was my business; charity, mercy, forbearance, and benevolence, were, all, my business. The dealings of my trade were but a drop of water in the comprehensive ocean of my business!”
In closing the business schools, may the former professors of finance bring to the market a more human side (or, apropos of Dickens, a more ghostly side). Whether or not they do, though, closing the business schools is a necessary first step in righting the social and economic injustices perpetuated not by capitalism but by those who have used it to rend the very social fabric that nourishes them. By planting the seeds of corporate and financial tyranny, our business schools, operating as so many of them do in collusion with a too-big-to-fail mentality, have become the enemy of democracy. They must be closed, since, as Jacob Marley reminds us, we all live in the business world.
II. Save the Humanities
Closing the business schools will allow us to turn our attention more fully to the state of the humanities and their apparent demise. The 2013 report released by the American Academy of Arts and Sciences asserts that “the humanities and social sciences are not merely elective, nor are they elite or elitist. They go beyond the immediate and instrumental to help us understand the past and the future.” As if that’s going to sell.
In the wake of the academy’s report, The New York Times dutifully ran three columns on the humanities — by David Brooks, Verlyn Klinkenborg, and Stanley Fish — which dove into the wreck and surveyed the damage in fairly predictable ways (excepting Fish, whose unpredictability is predictable). Brooks remembers when they used to teach Seneca and Catullus, and Klinkenborg looks back on the good old days when everyone treasured literature and literary study. Those days are gone, he argues, because “the humanities often do a bad job of teaching the humanities,” and because “writing well used to be a fundamental principle of the humanities,” though it apparently is not anymore. Why writing well isn’t a fundamental principle of life is perhaps a better question.
We might therefore ask: Aside from the typical obeisance to something called “critical thinking,” what are the humanities supposed to do?
I propose that one of the beauties of the liberal arts degree is that it is meant to do nothing. I would like to think, therefore, that the typical humanities major reads because she is interested in knowledge for purposes outside of the pervasive instrumentalism now fouling higher education. She does not read philosophy because she wants, necessarily, to become a philosopher; she does not read poetry to become a poet, though she may dream of it; she does not study art history, usually, to become an art historian, though she may one day take this road.
She may be in the minority, but she studies these subjects because of the pleasure they give her. Reading literature, or studying philosophy, or viewing art, or watching films — and thinking about them — are pleasurable things. What a delight to subsidize something that gives her immediate and future joy instead of spending capital on a course of study that might someday allow her to make more money so that she can do the things she wants to do at some distant time. Henry David Thoreau said it best: “This spending of the best part of one's life earning money in order to enjoy a questionable liberty during the least valuable part of it reminds me of the Englishman who went to India to make a fortune first, in order that he might return to England and live the life of a poet. He should have gone up garret at once.” If you want to be a poet, be done with it.
Does she suffer for this pleasure?
It is an unfortunate fact of our political and cultural economy that she probably does. Her parents wonder helplessly what she is up to and they threaten to cut off her tuition unless she comes to her senses. The governor and legislature of her state tell her that she is wasting her time and that she is unemployable. She goes to her advisers, who, if they are in the humanities, tell her that the companies her parents revere love to hire our kind, that we know how to think critically and write clearly and solve problems.
And it isn’t that they are lying, exactly (except to themselves). They simply aren’t telling her the whole truth: that she will almost surely never have the kind of financial success that her peers in business or engineering or medicine will have; that she will have enormous regrets barely ameliorated by the thought that she carries the fire; that the digital humanities will not save her, either, though they may help make her life slightly more interesting.
It is with this problem in mind that I argue for a vision of the university as a place where the humanities are more than tolerated, where they are celebrated as intrinsic to something other than vocationalism, as a place in which the ideology that inheres to the industrial model in all things can and ought to be dismantled and its various parts put back together into something resembling a university and not a factory floor.
Instead of making the case that the humanities give students the skills to “succeed in a rapidly changing world,” I want to invoke the wisdom of Walt Whitman, one of the great philosophers of seeming inactivity, who wrote: “I lean and loafe at my ease observing a spear of summer grass.”
What does it mean to loafe? Whitman is reclining and relaxing, but he is also active: he “invites” his soul and “observes” the world around him. This conjunction of observation and contemplation with an invitation to the soul is the key here; using our time, energy, and intellectual faculties to attend to our world is the root of successful living. A world of contemplative loafers is one that can potentially make clear-eyed moral and ethical judgments of the sort that we need, judgments that deny the conflation of economic value with other notions of value.
Whitman would rather hang out with the men who brought in the catch than listen to the disputations of science or catch the fish himself: “You should have been with us that day round the chowder-kettle.” While I am not necessarily advocating a life of sloth, I’m not arguing against it, either. I respect the art of study for its own sake and revere the thinker who does nothing worthwhile, if by worthwhile we mean something like growing the economy. Making a living rather than living is the sign of desperation.
William Major is professor of English at Hillyer College of the University of Hartford. He is author of Grounded Vision: New Agrarianism and the Academy (University of Alabama Press, 2011).
The Modern Language Association report on the Ph.D. in languages and literatures has already succeeded in sparking a lively debate. Some commentators have welcomed the report’s general findings, while others have taken issue with its specific recommendations. Beyond these differences, a broad consensus has emerged that the current situation is unsustainable, and this recognition is key to moving forward. The relationship of doctoral education to the deteriorating conditions of the academic workforce demands transformative changes. If doctoral study is to thrive in more than a handful of elite institutions, the profession as a whole and at multiple levels must adopt a reform agenda. The urgency of change was the premise of the task force report, and change is what the critics of the report also demand.
The agreement goes even further. Like its critics, the members of the MLA task force that produced the report point to the persistently weak job market and the importance of advocating an increase in tenure-track positions. The report similarly criticizes the casualization of academic labor and the poor working conditions of most contingent faculty members. This advocacy stands explicitly in the tradition of the MLA, which in recent decades has analyzed and criticized these developments while providing resources such as the Committee on Contingent Labor’s two recent projects: “Professional Employment Practices for Non-Tenure-Track Faculty Members” and a 2013 special issue of the ADE and ADFL Bulletin on contingent labor. I know my colleagues on the MLA Executive Council are committed to pursuing activism in this area and leading the scholarly association network toward collective action.
Yet the report does more than call for advocacy. It also calls for change within graduate programs, and this is where the consensus breaks down. Some critics of the report have staked out the position that the MLA should focus primarily on job market and working conditions issues and not on the academic programs in which our members teach and study. While the task force report underscores the importance of the labor question, it also recommends that the MLA engage the profession in considering internal reforms to serve graduate students more effectively. The difference between what the report says and what its critics argue is particularly clear on three points.
First, some complain that the report does not call for deep cuts in admissions to doctoral programs. Only such cuts, they argue, could address the weak job market in which there are more qualified candidates than tenure-track positions. In contrast, the task force report insists on maintaining accessibility to doctoral programs: Qualified students with an intellectual dedication to the fields of language and literature should have the opportunity to pursue advanced study. Access to higher education is a hallmark of a democratic society, and it is the precondition of diversity in our fields. Still, if critics want to call for the closing of programs, which programs, one might ask, should be eliminated? How will closings not end up disadvantaging public institutions, where the majority of first-generation college students study? Calls to shutter departments rather than reform them most likely will play into the hands of university budget-cutters. The scope of the humanities in higher education in the United States already faces significant reduction. We should be fighting for the humanities rather than closing off advanced study, the key to their sustained presence in colleges and universities.
The labor market critics are proposing what amounts to a guild protectionism: by reducing access to doctoral education, the limited pool of degree holders will be guaranteed abundant and better jobs. This strikes me as a gross miscalculation that will only end with diminished opportunities for all students. In contrast, the MLA report envisions humanities education with a potential for growth in response to the expanded intellectual scope of our fields as well as to society’s changing needs in classrooms and beyond.
A second flashpoint of dispute is time to degree. The report recommends that departments design programs that can be completed in five years and provide sufficient financial support for students to do so. Some commentators have viewed this time frame as an assault on quality. The point, however, is that currently around half of doctoral students take more than a decade to complete the Ph.D., which represents an enormous investment in time with limited prospects for return on the academic job market. Furthermore, there are no legitimate grounds for median time to degree in humanities fields to be significantly longer than in doctoral programs in the natural and social sciences. There would be nothing wrong with humanities scholars adopting potentially more effective educational practices from these other fields. We language and literature faculty members need to develop and share new ideas for mentoring students, and departments need to ask how programs might be designed more effectively. As the appendix of the report demonstrates, some departments are already leading the way through reform initiatives that integrate the intellectual demands of humanistic study with a prudent rethinking of program structure.
A third point involves career outcomes for graduate students. It should be obvious to all faculty members that, as the MLA's 14 studies of Ph.D. placement make evident, at best only about half of new modern language doctorate recipients find tenure-track positions in the same year that they complete their degrees, and under 40 percent when academic job opportunities contract, as they did in the 1990s and have again since 2008. (Figure 2 in "Our PhD Employment Problem, Part 1" shows the summary findings for all 14 surveys, 1978 to 2010.)
Departments have an obligation to make this clear to applicants, yet such candor alone does not absolve departments of the responsibility of providing students with opportunities for professional development that can serve them well on multiple career paths. That is why the report points to possibilities such as curriculums designed to develop transferable skills, enhanced engagement with technology, and more effective use of other resources found throughout the university. The report also underscores the importance of pointing students to diverse career possibilities, not by singling out students for job-market tracking, but instead by engaging the graduate student community in exploring a widened career arc.
Doctoral recipients have a rich set of skills — in communication, research, and leadership — and those who do not find a tenure line should not have to settle for poorly compensated contingent positions. Those who view a job as a college professor as the only legitimate outcome of doctoral education fall into that labor market trap. That’s why the report emphasizes the importance of discussing the broad range of career paths and providing students with the resources they need to expand their employment horizons.
Nonetheless, doctoral study should not be viewed exclusively as the pursuit of a career-oriented credential. It must also involve an intellectual passion for languages and literatures, as defined by our evolving disciplines. This is the foundation of successful graduate study. The qualities cultivated through intensive, long-term research and thoughtful, extended writing are not just transferable to other careers; they are valuable in their own right, just as it would be valuable to our culture as a whole — and particularly to the future of the academy — to have highly educated professionals who appreciate scholarly thoughtfulness and humanistic perspectives working throughout society.
It has been pointed out in the public discussion of the report that many of its recommendations are not new. The academic community has been talking about these issues for a long time, and change is already under way. Time to degree is coming down, and some departments are initiating salutary program modifications. The profession has reached a tipping point, and the time has come for broad-based change. Doctoral study in the humanities fields contributes to the quality of society, it answers individual students’ desire for intellectual depth, and it is a vital piece in the intellectual diversity of universities. If we want to preserve it, we need to reform it.
Russell A. Berman is professor of comparative literature and German studies at Stanford University. He led the task force that produced the MLA report.
“I’m flexing my muscles. Come see me flex my muscles!”
The digital humanities (DH) is a proud discipline. Its members will be the first to tell you when they have done something impressive. Lately, that pride has started to wear on our non-digital colleagues, who have quietly begun pushing back by setting aside applications that look a little too digital and rejecting high-profile journal submissions from digital scholars. I can’t prove it, but as an early-career scholar, I can feel it. They don’t like us.
But maybe it’s not them. Maybe it’s us.
For the past decade we’ve been living in the age of digital hubris, and we can therefore hardly blame people for getting sick of us. Did you hear about the size of Soandso’s newest grant? Or did you read the latest on Whatshisname’s research in The New York Times? Did you know more people read that popular DH student’s blog post yesterday than have ever read your book?
In April, one of the most successful DH projects of all time turned 11 years old. The Old Bailey Online (OBO) first appeared in 2003, and brought 127 million words of transcribed criminal trial records to the Internet. In the years since its launch, 309 publications have cited the OBO, the project has helped thousands of genealogists piece together their family histories, and it has even inspired a television series. As far as academic projects go, few have as much to be proud of as the OBO.
So 11 years on, what do the project leaders regret? Their hubris.
Professor Tim Hitchcock, one of the principal investigators, wrote on his blog that his enthusiasm for the project’s potential had “simply raised the ire of a group of historians ... who felt their own expertise was somehow threatened.”
It’s not just their expertise that humanists see as being under threat. Traditional humanities monographs are becoming economically unfeasible. Nonetheless, the perceived slow death of the book hasn’t been enough for many scholars in the digital humanities. They want everyone to know their stance: good riddance! Why bother with a publisher and a two-year turnaround when the Internet is free and immediate?
Research budgets everywhere are being slashed, yet there always seems to be a million here or a hundred thousand there to get the next big DH project off the ground. That means less money for traditional research. A few years ago a historian colleague of mine assured me that she “does her own research” and didn’t need to waste grant money hiring someone to do it for her. She was a real researcher — or so implied the cold fury in her intonation.
Who cares what other people think, we might say. On their own these non-digital colleagues won’t be able to put a stop to the private funding, or national-level competitions targeted at DH projects, or the pressure from humanities departments to attract grant funding. However, as long as they continue to sit on hiring committees and adjudication panels and to act as peer reviewers, they still hold a number of keys to the academic world. A dose of digital humility may be in our collective best interest, not least for the sake of those of us just starting our careers.
DH is inherently interdisciplinary. My “core” discipline is history. As a (recent) graduate student, that meant my scholarship and job applications typically went through panels of historians. When the application was for something digital or not explicitly disciplinary, I (with all humility) did quite well. But if I had to convince a group of anonymous historians that my work was worthy, I seemed destined for the “no” pile.
Times are tough. I can accept that there are other great candidates out there who may have been better for the job, or more worthy of the scholarship. But it’s not just me. Most of my colleagues in Britain were self-funded during their Ph.D.s, or supported their studies as part-time developers and project managers. I know of none with the golden-ticket scholarships that have long been a measure of the top students in the humanities. I’m grateful I can support myself in other ways. But it’s difficult to ignore the feeling that young scholars are being kept on the other side of the gates by an establishment that’s decided those DH people get enough already.
I’ve seen it in traditional publishing venues as well. One of the most influential digital history articles ever written has been repeatedly rejected by traditional historical journals and is still without a home. And yet I am quite confident that, if you are a DH scholar, you would recognize the visualization on which the article is based.
Some scholars are getting cunning in an effort to sidestep this digital backlash. A colleague of mine working at the corner of DH and history obfuscated his digital connections in an effort to get hired by a history department. It worked. I followed suit and immediately found my prospects improved when applying for funding.
Is it a conspiracy? No. DHers have many allies in the halls of power. But we all have room for more friends, and the wave of early-career scholars in the field can ill-afford to have people on their selection committees and peer review panels viewing them as a threat, or as arrogant, just because of their field of study.
We can dig in for another decade of covert gatekeeping, or we can move into a new phase of digital humility and mend the divide that has grown between us. I’m sure everyone would agree that DH’s proper place is alongside traditional humanists, supplementing their techniques with new ways of looking at old problems rather than eradicating them. Non-digital scholars add depth to our breadth, and focus to our vision.
So I’d like to encourage DH to join me in a decade of digital humility, in which we remind our colleagues that we’re all here because we love the humanities. We appreciate their work, even if we don’t always say so. And we’d like to be on the same team.
That doesn’t mean we need to stop flexing our muscles. We’ve worked hard on them. But, when we are flexing, maybe we can let people notice on their own.
Adam Crymble is a lecturer in digital history at the University of Hertfordshire, in the United Kingdom.