Latest literary fad combines Shakespeare with Seuss and Twitter

What if Dr. Seuss rewrote Shakespeare and did it on Twitter?

Essay on the role of the dictionary

Sometimes I get a little fancy in the final comment of a student paper. Usually my comments are pretty direct: two or three things I like about the paper, two or three things I think need revision, and two or three remarks about style or correctness. But once in a while, out of boredom or inspiration, I grasp for a simile or a metaphor. Recently I found myself writing, "Successfully rebutting counter-arguments is not unlike slaying a hydra."

I started with great confidence, but suddenly I wasn’t so sure I knew what a hydra was: a multiheaded creature? Yes. But how many heads? And can I use the word generically, or do I have to capitalize it? Would “slaying the Hydra” be the correct expression?

Since I have no Internet connectivity at home, never have, and don’t miss it, I grabbed my Webster’s Seventh New Collegiate Dictionary from 1965 — the kind of dictionary you can get for free at the dump or from a curbside box of discarded books — and looked up hydra. On my way to hydra, however, I got hung up on horse, startled by a picture of a horse busily covered with numbers. I knew a horse has a face, a forehead, a mouth. A nose, ears, nostrils, a neck.  A mane, hooves, a tail.

Pressed for more parts, I might have guessed that a horse had a lower jaw, a forelock (which I would have described as a tuft of hair between the ears), cheeks, ribs, a breast, haunches, buttocks, knees, a belly.

I don’t think I would have guessed flank, loin, thighs, and shoulders, words I associate with other animals, humans, or cuts of meat.  I know I wouldn’t have guessed forearm or elbow.

What I’d thought of as an animal with a head, a mane, a tail, hooves, and a body has 36 separate parts, it seems, all enumerated in a simple design on page 401 of my dictionary. Had I not forgotten the precise definition of a hydra, I might never have learned that a horse also has a poll, withers, a croup, a gaskin, a stifle, fetlocks, coronets, pasterns, and cannons. (The withers are the ridge between a horse’s shoulder bones.)

Hoof is defined and illustrated on the page opposite the horse, an alphabetical coincidence.  That picture too caught my eye, now that I was in an equine frame of mind.  For the moment, I wanted to learn everything I could about the horse. The unshod hoof, it turns out, has a wall with four parts — the toe, the sidewalls, quarters, and buttresses — a white line, bars, a sole, and a frog, behind which lie the bulbs.

Eventually I returned to my original search. A Hydra with a capital H is a nine-headed monster of Greek mythology whose power lies in its regenerative abilities: if one head is cut off, two will grow in its place unless the wound is cauterized. With a lowercase h, the word stands for a multifarious evil that cannot be overcome by a single effort. After all this dictionary work, I’m not sure hydra is the word I want.

I've been thinking about dictionaries lately. The writing center at Smith College, where I work, is transitioning from paper schedules to an online appointment system, and yesterday we spent part of the morning moving furniture around trying to create room for a new computer station dedicated to scheduling. One of my younger colleagues suggested getting rid of the dictionary stand, which, he said, "nobody uses." I bristled. It’s a beautiful thing, the dictionary, an oversize third edition of the American Heritage Dictionary, just a hair over 2,000 pages. For more than a dozen years it’s resided in a cozy nook on a well-lit lectern below a framed poster publicizing the 1994 Annual Katharine Asher Engel Lecture by Murray Kiteley, then Sophia Smith Professor of Philosophy. The poster was chosen as much for its elegance as for the lecture’s title: "Parts of Speech, Parts of the World: A Match Made in Heaven? Or Just Athens?"

For years I had an office across from the dictionary and never used it myself, preferring the handiness of my taped-up 1958 American College Dictionary by Random House. The American Heritage is too massive. It takes me too long to find a word, and I get easily distracted by illustrations and unusual words. I continue to find my college dictionary completely adequate for my purposes. I’ve never needed a word that I couldn’t find in it.

Another colleague within earshot spoke up for the American Heritage, claiming he used it once in a while. "Maybe," I thought. More likely, he didn’t want to contemplate the loss of the big dictionary while he still mourned the loss of the blue paper schedules. The dictionary stayed: words, that’s what a writing center is about, and the dictionary is where they live.

I cannot remember the last time I saw one of my students using a paper dictionary, much less carrying one around, not even an international student. Have today’s students ever instinctively pulled out a paper dictionary and used it to look up a word or check its spelling? Is a paper dictionary as quaint as a typewriter? Have things changed that much? I wonder. Is it partly my fault? It’s been many years, after all, since I’ve listed "a college dictionary" among the required texts for my writing course.

I doubt my students use dictionaries much, of whatever kind. You have to care about words to reach for the dictionary, and I don’t think they care very much about words. At their age, I probably didn’t either, though I think I did care more about right and wrong. I was embarrassed when I used the wrong word or misspelled a word. I still remember the embarrassment of spelling sophisticated with an f in a college paper, something a modern spell checker doesn’t allow. But it does allow "discreet categories" for "discrete categories," another unforgettably embarrassing error — this one in graduate school!

My students appear cheerfully to accept whatever the spell checker suggests, or whatever word sounds like the one they want, especially if they’re in roughly the same semantic domain. They are positively proud to confess that they’re bad spellers — who among them isn’t? — and really don’t seem to care much that they have used the wrong word. Words don’t appear to be things you choose anymore. They’re things that pop up: in autocorrect, in spell checkers, in synonym menus. They are not things you ponder over; they are things you click, or, worse, things your laptop decides to click for you.

When I meet with a student about her paper, we always work with a paper copy. Even so, more often than not I still have to remind her to take a pencil so she can annotate her draft as we discuss it. Toward the end of our meetings, we talk about word choice and the exchange often goes like this:

"Is this the word you want?"

"I think so."

"I think here you might have meant to say blah."

"Oh, yeah, that’s right" and out comes the pencil — scratch this, scribble that, lest it affect her final grade. No consideration, no embarrassment. I used to pull out the dictionary "to inculcate good habits," but no more. In the presence of today’s students, pulling out a dictionary feels as remote as pulling out a typewriter or playing a record.

Sometimes the situation is not so clear-cut. The student might, for example, write a word like security in a context where it makes a bit of sense, but after some gentle prodding and, yes, a few pointed suggestions, she might decide that what she really means is privacy. Out comes the pencil again. Scratch "security," scribble "privacy." What she really means, I think, is safety, but I let it go. If I push too hard, she’ll stop thinking I'm being helpful and begin to think I have a problem: "What a nitpicker! The man’s obsessed with words!" I imagine her complaining to her friends. "But it matters! It matters!" goes the imaginary dialogue. "What precisely were the opponents of the ERA arguing, that it would violate security, invade privacy, or threaten safety?"

I have used the online Webster's on occasion, of course, and recognize the advantages of online dictionaries: They can be kept up-to-date more easily, they can give us access to more words than a standard portable dictionary, they can be accessed anywhere at any time, they take up no shelf space, etc. I'm not prejudiced against online reference tools. In fact, unlike many of my colleagues, I'm a great fan of online encyclopedias and a lover of Wikipedia. Online dictionaries leave me cold, though. They should fill me with awe the way Wikipedia sometimes does, but they don't. I marvel at the invention of the dictionary every time I look up a word in my paper copy; at the brilliant evolutionary step of such a book; at the effort of generations of scholars, professionals and lay people that led to such a comprehensive compendium of words; at how much information — and not just word meanings — it puts at my fingertips; at how much I still have to learn; and at how much my education could still be enhanced if I read my college dictionary cover to cover.

I think of The Autobiography of Malcolm X, in which the author makes a powerful statement about the dictionary as a pedagogical tool. Frustrated with his inarticulateness in writing while in prison and his inability to take charge of a conversation like his fellow inmate Bimbi, Malcolm X came to the conclusion that what he needed was "to get hold of a dictionary — to study, to learn some words." The experience was a revelation: "I’d never realized so many words existed!" He started at the beginning and read on, learning not just words but also history — about people, places, and events. "Actually the dictionary is like a miniature encyclopedia," he noted. The dictionary was the start of his "homemade education."

Online, all I get is a quick definition of the word I want, and I’m done. On paper I get the definition plus something akin to a small education along the way. The experience is not unlike that of slaying the Hydra: for every word I look up, I see two others whose meanings I don’t know. If I were Hercules I could put an end to the battle once and for all, but I’m not, and I’m glad I’m not. The battle is far too delicious. But how to convince my students?

Julio Alves is the director of the Jacobson Center for Writing, Teaching and Learning at Smith College.

MLA session on first-year common reading programs

At MLA, literature professors consider the non-literary values behind first-year reading programs -- and how such programs play out in the classroom.

MLA considers radical changes in the dissertation

MLA leaders encourage radical changes in Ph.D. programs. Among ideas: ending norm of producing a "proto-book," embracing digital formats, shaming committee members into doing work, and halving 9-year average for doctorate completion.

Essay urges reforms for doctoral education in humanities

Not all doctorate recipients will become faculty members, but all future faculty will come out of graduate programs. Do these programs serve the needs of graduate students well?

In light of the rate of educational debt carried by humanities doctoral recipients, twice that of their peers in sciences or engineering; in light of the lengthy time to degree in the humanities, reaching more than nine years; and in light of the dearth of opportunities on the job market, the system needs to be changed significantly. I want to begin to sketch out an agenda for reform.

The major problem on all of our minds is the job market, the lack of sufficient tenure-track openings for recent doctorate recipients. One response I have heard is the call to reduce the flow of new applicants for jobs by limiting access to advanced study in the humanities. If we prevent some students from pursuing graduate study — so the argument goes — we will protect the job market for others. I disagree.  

Let us not lose sight of the fact that the number of new Ph.D.s has already declined significantly, down about 10 percent from a recent peak in the 1990s. Because that drop hardly matches the 32 percent decline in job listings since 2007-08, the problem is not too many scholars: it is too few tenure-track positions. I fear that any call to reduce doctoral programs will end up limiting accessibility and diversity, while playing into the hands of budget-cutters. U.S. education needs more teaching in our fields, not less, and therefore more teaching positions, the real shovel-ready jobs.

Instead of asking that you lock your doors behind the last class of admitted students, I appeal to those of you involved in the structure of doctoral programs to consider how to keep them open by making them more affordable and therefore more accessible. Can we redesign graduate student learning in the face of our changed circumstances?

Reform has to go to the core structures of our programs. Let me share two pertinent experiences at Stanford.

Thanks to a seed grant from the Teagle Foundation, I was able to experiment with a program for collaborative faculty-graduate student teaching. In our umbrella grouping of the language departments, we set up small teams — one faculty member and two graduate students from each language — to develop and deliver undergraduate courses, against the backdrop of a common reading group on current scholarship on student learning and other issues in higher education. The graduate students developed their profiles as teachers of undergraduate liberal arts. Teaching experience is only going to grow more significant as a criterion in hiring, and we should, in our departments, explore how to transform our programs to prepare students better as future humanities teachers of undergraduates. I encourage all departments to experiment with new modalities of collaborative graduate student-faculty teaching arrangements that are precisely not traditional TA arrangements.

Support from Stanford's Center for Teaching and Learning has led to an ad hoc project on "Assessing Graduate Education," a twice-a-quarter discussion group to which all faculty and graduate students have been invited. German studies graduate student Stacy Hartman organized an excellent survey of best practices, which has become the center of a vigorous discussion. My point now is not to dwell on the particular issues — teaching opportunities, examination sequencing, quality of advising, professionalization opportunities, etc. — but to showcase the potential in every department of a structured public discussion forum on the character of doctoral training. I advise all doctoral programs to initiate similar discussions, not limited to members of departmental standing committees but open to all faculty and graduate students. What works in our programs; what could be better?  

At nine years (according to the Survey of Earned Doctorates), time to degree in our fields is excessive. We should try to cut that in half. I call on all departments with doctoral programs to scrutinize the hurdles in the prescribed trajectories: are there unnecessary impediments to student progress? Is the sequencing of examinations still useful for students?

Accelerating progress to completion will, moreover, depend on better curriculum planning and course articulation, as former MLA President Gerald Graff emphasized in his convention address three years ago. We should plan course offerings with reference to student learning needs. Curricular and extracurricular professionalization opportunities could take into account the multiple career tracks that doctorate recipients in fact pursue — this means the real diversity of hiring institutions, the working conditions of faculty at different kinds of institutions, non-teaching careers in the academy as well as non-academic positions. Can we prepare students better for all of these outcomes? Finally, we have to reinvent the conclusion of doctoral study.  As last year's President Sidonie Smith reminds us, the dissertation, as a proto-book, need not remain the exclusive model for the capstone project. This piece is crucial to the reform agenda.
 

Russell A. Berman is professor of comparative literature and German studies at Stanford University. This essay is an excerpt from his presidential address at the 2012 meeting of the Modern Language Association.

Essay on new approach to defend the value of the humanities

"When the going gets tough, the tough take accounting." With those succinct words in a June 2010 op ed, New York Times columnist David Brooks summed up the conventional wisdom on the current crisis of the humanities. In an age when a higher education is increasingly about moving quickly through a curriculum streamlined to prepare students for a job, the humanities have no practical utility. As Brooks observes, "when the job market worsens, many students figure they can’t indulge in an English or a history major," a fact that explains why the "humanities now play bit roles when prospective students take their college tours. The labs are more glamorous than the libraries."

Pushed into a corner by these dismaying developments, defenders of the humanities — both traditionalists and revisionists — have lately been pushing back. Traditionalists argue that emphasizing professional skills would betray the humanities' responsibility to honor the great monuments of culture for their own sake. Revisionists, on the other hand, argue that emphasizing the practical skills of analysis and communication that the humanities develop would represent a sellout, making the humanities complicit with dominant social values and ideologies. But though these rival factions agree on little else, both end up concluding that the humanities should resist our culture's increasing fixation on a practical, utilitarian education. Both complain that the purpose of higher education has been reduced to credentialing students for the marketplace.

Martha Nussbaum, for example, while stressing that the humanities foster critical thinking and the ability to sympathetically imagine the predicament of others, insists such skills are, as the title of her 2010 book puts it, "not for profit." In doing so she draws a stark line between the worlds of the humanities and the 21st-century workplace. Likewise, Geoffrey Galt Harpham, in The Humanities and the Dream of America, laments the increasing focus on professional skills in the humanities at the expense of reading great books. Stanley Fish takes an even more extreme position, insisting that the humanities "don’t do anything, if by 'do' is meant bring about effects in the world. And if they don’t bring about effects in the world they cannot be justified except in relation to the pleasure they give to those who enjoy them. To the question 'of what use are the humanities?', the only honest answer is none whatsoever." Worse still, Frank Donoghue, in The Last Professors: The Corporate University and the Fate of the Humanities, argues that the humanities will simply disappear in the new corporate, vocation-centered university.

Ironically, these pessimistic assessments are appearing at the very moment when many employers outside academe are recognizing the practical value of humanities training. Fish simply dismisses the argument that "the humanities contribute to the economic health of the state — by producing more well-rounded workers or attracting corporations or delivering some other attenuated benefit — because nobody really buys that argument." But this would come as news to the many heads of philanthropic foundations and nonprofits, and to the corporate CEOs, who have lately been extolling the professional value of workplace skills grounded in the humanities.

We would be the last to argue that traditional ways of valuing the humanities are not important, that studying philosophy, literature, and the fine arts does not have a value in and of itself apart from the skills it teaches. We also recognize that the interests of the corporate world and the marketplace often clash with the values of the humanities. What is needed for the humanities, in our view, is neither an uncritical surrender to the market nor a disdainful refusal to be sullied by it, but what we might call a critical vocationalism, an attitude that is receptive to taking advantage of opportunities in the private and public sectors that enable humanities graduates to apply their training in meaningful and satisfying ways. We believe such opportunities do exist.

To be sure, such optimism must be tempered in today’s bleak economy, where hardly any form of education is a sure ticket to a job and where many in the private sector may still look with indifference or even disdain on a humanities degree.  But as David Brooks himself went on to point out in his op-ed: "Studying the humanities improves your ability to read and write. No matter what you do in life, you will have a huge advantage if you can read a paragraph and discern its meaning (a rarer talent than you might suppose). You will have enormous power if you are the person in the office who can write a clear and concise memo."

Brooks’ view is echoed by Edward B. Rust Jr., chairman and CEO of State Farm Insurance Companies, who observes that "at State Farm, our employment exam does not test applicants on their knowledge of finance or the insurance business, but it does require them to demonstrate critical thinking skills" and "the ability to read for information, to communicate and write effectively, and to have an understanding of global integration." And then there is Google, which more than any other company has sung the praises of humanities students and intends to recruit many of them. "We are going through a period of unbelievable growth," reports Google’s Marissa Mayer, "and will be hiring about 6,000 people this year — and probably 4,000-5,000 from the humanities or liberal arts."

This evidence of the professional utility of humanities skills belies Donoghue’s apparent assumption (in The Last Professors) that "the corporate world’s hostility" toward humanistic education remains as intense today as it was a century ago, when industrialists like Andrew Carnegie dismissed such an education as "literally, worthless." Donoghue ignores changes in the global economy, the culture, and the humanities themselves since Carnegie’s day that have given many corporate leaders a dramatically more favorable view of the humanities’ usefulness. Associate Dean Scott Sprenger of Brigham Young University, who oversees a humanities program we will discuss in a moment, quotes the dean of the Rotman School of Management in Toronto, who observes a "tectonic shift for business school leaders," who are now aware that "learning to think critically — how to imaginatively frame questions and consider multiple perspectives — has historically been associated with a liberal arts education, not a business school curriculum."

All of these commentators are right, and the skills they call attention to only begin to identify the range of useful professional competencies with which a humanities education equips 21st-century students. In addition to learning to read carefully and to write concisely, humanities students are trained in fields like rhetoric and composition, literary criticism and critical theory, philosophy, history, and theology to analyze and make arguments in imaginative ways, to confront ambiguity, and to reflect skeptically on received truths, skills that are increasingly sought after in upper management positions in today’s information-based economy. Even more important for operating as global citizens in a transnational marketplace, studying the literary, philosophical, historical, and theological texts of diverse cultures teaches humanities students to put themselves in the shoes of people who see and experience the world very differently from their own accustomed perspectives. Are some corporations still looking for employees who will be well-behaved, compliantly bureaucratized cogs in the wheel? Of course they are. But increasingly, many others are looking for employees who are willing to think outside the box and challenge orthodoxy.

It is true that humanities study, unlike technical training in, say, carpentry or bookkeeping, prepares students not for any specific occupation, but for an unpredictable variety of occupations. But as many before us have rightly pointed out, in an unpredictable marketplace this kind of versatility is actually an advantage. As Associate Dean Sprenger notes, "the usefulness of the humanities" paradoxically "derives precisely from their detachment from any immediate or particular utility. Experts tell us that the industry-specific knowledge of a typical vocational education is exhausted within a few years," if not "by the time students enter the workforce." It is no accident, he observes, "that a large percentage of people running Fortune 500 companies (one study says up to 40 percent) are liberal arts graduates; they advance more rapidly into mid- and senior-level management positions, and their earning power tends to rise more significantly than people with only technical training."

If there is a crisis in the humanities, then, it stems less from their inherent lack of practical utility than from our humanistic disdain for such utility, which too often prevents us from taking advantage of the vocational opportunities presented to us. This lofty disdain for the market has thwarted the success of the few programs that have recognized that humanities graduates have much to offer the worlds of business, technology, arts agencies, and philanthropic foundations.

The most promising of these was a program in the 1990s developed by the Woodrow Wilson National Fellowship Foundation under the leadership of its then-director, Robert Weisbuch. First called "Unleashing the Humanities" and later "The Humanities at Work," the program, according to Weisbuch in an e-mail correspondence with the authors, "had persuaded 40 nonprofits and for-profits to reserve meaningful positions for Ph.D. graduates in the humanities and had placed a large number in well-paying and interesting positions — at places ranging from Verizon to AT Kearney to The Wall Street Journal to the National Parks Service." Unfortunately, Weisbuch reports, only a few humanities graduate programs "enlisted their alumni and the influential corporations and others in their areas of influence to revolutionize the possibilities for employment of humanities doctoral graduates," while most faculty members "continued to expect their graduate students to look for jobs much like their own and to consider any other outcome a failure."

Today, however, some humanities programs that emphasize useful professional applications are prospering. One of these is a new undergraduate major at Brigham Young University called "Humanities +," with the "+" referring to the value-added vocational component gained by students who elect the program. According to an e-mail to the authors from Associate Dean Sprenger, BYU hired a career services specialist tasked with "revolutionizing our humanities advising office along the lines of the Humanities + vision, and the program has developed ties with the university’s colleges of business and management" — a virtually unheard-of step for a humanities program. The program’s students are urged "to minor in a practical subject, professionalize their language skills, and internationalize their profile by doing an overseas internship." The objective, Sprenger says, "is that career thinking and strategizing become second nature to students," while faculty members "see it as in their interest to help students find 'alternative' careers, and are reassured that they can rely on our advising office to be informed and to do the training."

Another notable program that sees its mission as "bringing humanities into the world" beyond academe and that works closely with its university’s office of career placement is the Master of Arts Program in the Humanities (MAPH) at the University of Chicago, which Gerald helped design and direct in the 1990s. According to a recent article by program associate A.J. Aronstein in Tableau, a University of Chicago house journal, one recent MAPH graduate got a job as finance director in Florida for Barack Obama’s 2008 campaign, later served as chief of staff at the International Trade Association, and now works as a political consultant in Washington. Other MAPH graduates have gone on to internships and subsequent positions as museum curators, technical writers, journalists and other media workers, marketing specialists, and policy analysts with investment firms.

The false assumption in both anti-utilitarian defenses of the humanities and pessimistic predictions of their extinction is that we have to choose between a credentialing and a humanizing view of higher education, between vocational utility and high-minded study as an end in itself. This either/or way of thinking about the humanities — either they exist solely for their own sake or they have no justification at all — is a trap that leaves humanists unable to argue for the value of their work in terms of the practical skills it teaches, an argument that inevitably has to be made in the changing marketplace of higher education. In fact, we would argue there is no defense of the humanities that is not ultimately based on the useful skills they teach.

Evidence is plentiful that stressing the range of expertise humanities graduates have makes intellectual and economic sense. Take, for example, Damon Horowitz, director of engineering at Google. He insisted recently in an article in The Chronicle of Higher Education entitled "From Technologist to Philosopher: Why You Should Quit Your Technology Job and Get a Ph.D. in the Humanities," that "if you are worried about your career ... getting a humanities Ph.D. is not only not a danger to your employability, it is quite the opposite. I believe there is no surer path to leaping dramatically forward in your career than to earn a Ph.D. in the humanities." "You go into the humanities to pursue your intellectual passion," he explains, "and it just so happens, as a byproduct, that you emerge as a desired commodity for industry."

Horowitz, a leading figure in artificial intelligence and the head of a number of tech startups, ought to know. He took a break from his lucrative career to enroll in Stanford’s Ph.D. program in philosophy because he figured out that in order to do his job in technology well he needed to immerse himself in the humanities. "I realized that, while I had set out in AI to build a better thinker, all I had really done was to create a bunch of clever toys." Horowitz came to realize that the questions he was "asking were philosophical questions — about the nature of thought, the structure of language, the grounds of meaning." Returning to the humanities, Horowitz took time out from the world of artificial intelligence to study "radically different approaches to exploring thought and language," such as philosophy, rhetoric, hermeneutics and literary theory. As he studied intelligence from these perspectives he "realized just how limited my technologist view of thought and language was. I learned how the quantifiable, individualistic, ahistorical — that is, computational — view I had of cognition failed to account for whole expanses of cognitive experience (including, say, most of Shakespeare)."

The concrete value of the humanities education Horowitz celebrates is especially well epitomized in the new field of the digital humanities. The emergence of this field calls attention to how old 20th-century divisions between science and the humanities are breaking down and gives those of us committed to defending the practical value of the humanities a tremendous opportunity. The digital humanities represent the cutting-edge intersection of the humanities and computer science, the merging of skills and points of view from two formerly very different fields that are leading to a host of exciting innovations — and opportunities for students who want to enter fields related to everything from writing computer programs to text encoding and text editing, electronic publishing, interface design, and archive construction. Students in the digital humanities are trained to deal with concrete issues related to intellectual property and privacy, and with questions related to public access and methods of text preservation.

Graduates of the digital humanities programs that are now developing all over the country will be first in line for such positions. For example, Paul’s university now has a Digital Humanities M.A. with two converging tracks, one designed for students with a background in computer science and one for students with a background in the humanities. The program website notes that it offers "theoretical, critical, social, and ethical contexts for thinking about the making of new knowledge through digital humanities research and applications, from issues of intellectual property and privacy, to questions of public access and methods of preservation." When we are asked about the practical value of a humanities education, we need to add the digital humanities to the list.

We believe it is time to stop the ritualized lamentation over the crisis in the humanities and get on with the task of making them relevant in the 21st century.  Such lamentation only reveals the inability of many humanists to break free of a 19th-century vision of education that sees the humanities as an escape from the world of business and science. As Cathy Davidson has forcefully argued in her new book, Now You See It, this outmoded way of thinking about the humanities as a realm of high-minded cultivation and pleasure in which students contemplate the meaning of life is a relic of the industrial revolution with its crude dualism of lofty spiritual art vs. mechanized smoking factories, a way of thinking that will serve students poorly in meeting the challenges of the 21st century.

Though we have argued in defense of the practical and vocational utility of a humanities education, our argument should in no way be construed as undercutting the aspirations of those in the humanities who seek an academic career. Indeed, on this score we need to redouble our efforts to increase public and private funding for higher education and to support unionizing efforts by faculty members and adjuncts. But even as we fight these battles to expand the academic job market we would be foolish to turn our backs on alternative forms of employment for humanities graduates when they are out there. In this spirit, we applaud both Modern Language Association President Russell Berman and American Historical Association President Anthony Grafton, who, along with the executive director of the AHA, James Grossman, have recently urged their organizations to acknowledge that advanced training in the humanities can lead to a variety of careers beyond academia and have suggested how graduate programs can be adapted for these kinds of careers.

For ultimately, to take advantage of the vocational potential of humanities study as we propose is not to sell out to the corporate world, but to bring the critical perspective of the humanities into that world. It is a perspective that is sorely needed, especially in corporate and financial sectors that have lately been notoriously challenged in the ethics department, to say the least. Humanities graduates are trained to consider the ethical dimensions of experience, linking the humanities with the sciences as well as with business and looking at both these realms from diverse perspectives. To those who worry that what we urge would blunt the humanities' critical power, we would reply that it would actually figure to increase that power, for power after all is the ability to act in the world.

Paul Jay is professor of English at Loyola University Chicago and the author, most recently, of Global Matters: The Transnational Turn in Literary Studies. Gerald Graff is professor of English and education at the University of Illinois at Chicago and a past president of the Modern Language Association.

 

Report finds literary research an inefficient use of university money

A new paper by an English professor argues that literary research, much of which is rarely cited, is not an efficient use of university resources.
