Essay on the atmosphere at writing programs conference

I recently spent four days at the AWP Carnival at the Chicago Hilton; there were, according to various reports, anywhere from 9,300 to 10,000 in attendance, and I saw most of those attendees standing ahead of me in line at Starbucks or waiting for a seat at Kitty O’Shea’s Pub. This was the annual convention of the Association of Writers and Writing Programs, where “writers, editors, and publishers come together.” And like most carnivals, it dealt in dreams.

There were 450 panels to choose from — all holding the promise of some magical connection, some dim and dimly borrowed light. This last was sometimes the literal case: a session on writing for radio involved the audience sitting in the dark and listening to the panelists’ favorite segments. Their advice: storytelling is key (well, yes) and audience members should feel free to look up any of the panelists online.

Interestingly, the session audiences’ biggest applause seemed to be reserved not for résumé line items involving publishing coups (such as one, two, or even three memoirs -- that particular author deserved a round of applause for the sheer stamina involved not only in the life she lived but also in her determination to write -- and write -- about it) but for announcements by panelists regarding tenure. At one session, a mystery writer announced that her recent MFA in playwriting had led to a tenure-track appointment; at another, the crowd literally went wild when a poet panelist announced that she had just received tenure. The irony went largely unnoticed: she was part of a panel promising to reveal what sort of work outside academia could bring MFA graduates, if not fame and fortune, then at least enough money to pay off their loans. As for that session, the lead presenter was absent, and so the others valiantly soldiered on. It turned out that for these panelists, at least, “outside academia” meant working on the edges of academia. The advice included:

  • Hold creative writing salons in your home.
  • Be fortunate enough to have a thesis adviser who is selected to be Poet Laureate; then work as an intern for him/her.
  • Go back to school! Specifically, go back to school for an MLS degree. (Libraries are among the first to be hit in recessions. A master's in library science will only qualify graduates to attend future sessions entitled “What to Do with Your Library Degree.”)

No one mentioned going back to school for classes in business or info tech or community planning. No one mentioned that you can be an accountant (or a health care worker or a plumber) and still write. The single poet most responsible for changing poetry in the 20th century was a doctor who made house calls. But there was no recognition of William Carlos Williams or of any other physician writer. Nor did anyone mention Wallace Stevens, who combined a career in life insurance with a life of poetry. No one mentioned the missing panelist, who has admirably combined a life of business and poetry and who served as Chairman of the National Endowment for the Arts. No one mentioned that there are, in fact, plenty of paying writing jobs available. Or that a one-time prize of $1,000 or a free trip to a writers’ conference isn’t enough, in the long run, to sustain a life. Or that one might apply imagination and creativity to finding or creating a job. Yes, poetry is the news that stays new. But you can do something else and still write poetry. And someone should have told you that before you started your MFA program.

Of the 9,300 to 10,000 attendees, one-third, according to AWP executive director David Fenza, were graduate students. Of these 3,000+ individuals, a handful seemed to be interested in nonfiction (or at least the memoir category of nonfiction) or playwriting (playwriting! Why not, at least, screenplay writing?); a number were engaged in fiction writing, but the vast majority were poets. The final (and recently tenured) panelist suggested volunteer work and offered a twofold rationale: that volunteer work might lead to (academic) connections and that poets already receive nothing for their work, so why not consider doing more work for nothing? This line received the most laughter that I heard in two days, and was far more amusing, albeit in a grim existential sort of way, than the ones I heard at a session titled “How to Tell a Joke.”

Of course, if you’re a poet or a jokester, you didn’t even have to buy a conference pass; you could skip the panels and just cruise the hotel lobby. Or go straight to the bars. Or you could, on the last day of the conference, hang out for free at the midway, the literally underground portion of the event — the book fair with its more than 550 exhibitors’ booths located in the basement of the Hilton. Here a few big-name academic publishers (whose displays featured textbooks about writing for teachers of writing) and venerable publishing houses shared space with many more small presses, small literary magazines, several individuals selling their single works, and reps for MFA programs. The atmosphere, like that of any other carnival, was crowded and noisy, with hawkers pushing their wares and onlookers seeking the lucky chance. Most attendees that I observed followed a similar pattern: an attentive perusal of each table upon first arriving, which gave way, by the fifth row, to a sort of quick jog down the middle of the aisles.

There were some striking moments. Donovan Hohn, author of Moby-Duck: The True Story of 28,800 Bath Toys Lost at Sea..., delivered one of the best conference presentations that I have ever heard. Derek Alger and his panel of writers talking about memoir writing were funny and frank. Esmeralda Santiago and Jesmyn Ward read and spoke powerfully and beautifully.

The two most interesting people that I met during my time in Chicago were Margaret Atwood, the famous Canadian author who delivered the keynote address, and Cindy, the cab driver who drove me to and from the hotel. “Met,” in the case of Atwood, is a slight exaggeration; along with 139 other devotees, I had won a lottery for the book signing. By the time I approached her at the signing table, she looked so exhausted that I contemplated jumping the velvet guide rope and running away. As the woman waiting next to me on the line said, “My God, do you think we’re killing her?”

Atwood’s speech, listed in the program for an hour-and-a-half slot, ran about 25 minutes. This meant, if I added up the registration fee, the plane fare, the hotel bill, the bar bill, and Cindy’s rides to and from the airport, that I had actually paid about $75.00 per minute to sit in her presence. But it was, after all, Atwood, and it was worth it to see her and to hear her — wryly brilliant as ever — deliver a speech that began with her remarking that when she started writing, there were no organizations like AWP — it was just her, writing and then tearing up drafts and then writing again.

As for Cindy, she’s been driving a cab for 18 years, or nearly all of her adult life. She’s looking, however, to get out of the business, and so she’s going back to school next year. Someday, she told me, she’s going to write about her life as a cab driver. In the meantime, she’s signed up for a community-college program -- in radiology.

Carolyn Foster Segal left a full-time tenured position in Dec. 2011. She currently works as an adjunct at Muhlenberg College and as a book-group facilitator for the Pennsylvania Humanities Council. She has had over 25 other jobs, including waitressing, sitting as an artists’ model, and working on the assembly line in a pickle factory.

Column on Twitter and scholarly citation

Intellectual Affairs

The Modern Language Association has now issued its official, authoritative, and precisely calibrated guidelines for citing tweets – a matter left unaddressed in the seventh edition of the MLA Handbook (2009). The blogs have been -- you probably see this one coming -- all a-Twitter. The announcement was unexpected, eliciting comments that range from “this is really exciting to me and i don’t know why” to "holy moly i hate the world read a damn book." (Expressions of an old-school humanistic sensibility are all the more poignant sans punctuation.) Somewhere in between, there’s this: "when academia and the internet collide, i am almost always amused."

Yet the real surprise here is that anyone is surprised. The MLA is getting into the game fairly late. The American Psychological Association has had a format for citing both Twitter and Facebook since 2009. Last summer, the American Medical Association announced its citation style after carefully considering "whether Twitter constituted a standard citable source or was more in the realm of ‘personal communications’ (such as e-mail),” finally deciding that tweets are public discourse rather than private expression.

The AMA Style Insider noted that a standard format for Twitter references should “help avoid citations sounding like a cacophony of Angry Birds.”

How long had the possibility of an MLA citation format been under consideration? Was it a response to MLA members needing and demanding a way to bibliograph tweets, or rather an effort to anticipate future needs? Rosemary Feal, the organization’s executive director, was the obvious person to ask.

"The release of the tweet citation style,” she said by e-mail, “came in response to repeated requests from teachers, students, and scholars (most of them received, perhaps unsurprisingly, over Twitter). We debated the particulars on staff for some weeks. We're certain that the format we've announced is just a first step; user needs will change over time, as will technologies.”

Having exact, authoritatively formulated rules is clearly an urgent, even an anxiety-inducing matter for the MLA’s constituency. “Every time people asked me on Twitter about citing tweets,” Feal said, “I told them MLA style was flexible. Just adapt the format.” And as a matter of fact, the current MLA Handbook does have a format for citing blog entries – which would seem to apply, given that Twitter is a microblog.

“But because people wanted something very specific,” Feal said, “I asked staff to think about it…. Our hope is to remain nimble enough to respond to circumstances as they develop.” In that case, it might be time to start brainstorming how to cite Facebook exchanges, which can certainly be recondite enough, if the right people are involved. At least the Twitter citation format will be part of the eighth edition of the MLA Handbook -- though Feal indicated it would take at least another year to finish it.

Directing scholarly attention to the incessant flow of 140-character Twitter texts can yield far more substantial results than you might imagine, as explained in this column almost two years ago. Often this involves gathering tweets by the thousands and squeezing them hard, via software, to extract raw data, like so much juice from a vat of grapes. Add the yeast of statistical methodology, and it then ferments into the fine wine of an analogy that’s already gone on far too long.

So let’s try that again. Social scientists have ways of charting trends and finding correlations in tweets en masse. Fair enough. But recent work by Kaitlin L. Costello and Jason Priem points in a different direction: toward Twitter’s role in the more narrowly channeled discussions taking place within scholarly networks.

Costello and Priem, who are graduate students in information and library science at the University of North Carolina at Chapel Hill, have been gathering and analyzing information about academics who tweet. Their findings suggest that Twitter has become a distinct and useful -- if exceedingly concentrated -- mode of serious intellectual exchange.

In one study, they examined the departmental web pages at five universities in the United States and Britain, compiling “a list of all the scholars (defined as full-time faculty, postdocs, and doctoral students) at each one, yielding a sample of 8,826.” Through a process of elimination, they were able to generate a pool of 230 scholars with active Twitter accounts. Out of the initial sample, then, they found one scholar in 40 using Twitter – not a lot, although it’s definitely an underestimation. Some in the pool were removed because Costello and Priem could not establish a link between faculty listing and Twitter profile beyond any doubt. (In the case of people with extremely common names, they didn’t even try.)

The most striking finding is that the scholars who used Twitter were almost indistinguishable from those who didn’t. Status as faculty or nonfaculty made no difference. Natural scientists, social scientists, and humanists were represented among the Twitterati at rates nearly identical to their share of the non-tweeting academic population. Scholars in the formal sciences (math, logic, comp sci, etc.) proved less likely to use Twitter than their colleagues – though only slightly.

A large majority of tweets by academics, about 60 percent, were of a non-scholarly nature. A given tweet by a faculty member was about twice as likely to have some scholarly relevance as one by a nonfaculty person. While the share of traffic devoted to strictly scholarly matters is not enormous, its importance shouldn’t be underestimated – especially since a significant portion of it involves the exchange of links to new publications.

In an earlier study (archived here) Costello and Priem conducted interviews with 28 scholars – seven scientists, seven humanists, and 14 social scientists – and harvested more than 46,000 of their tweets. For each subject, they created a set of the 100 most recent tweets containing links that were still active. (A few didn’t reach the 100 mark, but their data was still useful.)

Six percent of the tweets containing hyperlinks fell into the category of what Priem and Costello call “Twitter citations” of peer-reviewed scholarly articles available online. One of their subjects compared linking to a scholarly article via Twitter to citing it in a classroom or seminar setting: “It’s about pointing people in the direction of things they would find interesting, rather than using it as evidence for something.”

At the same time, tweeting plays a role in disseminating new work in particular: 39 percent of the links were to articles less than a week old -- with 15 percent being to things published the same day.

The researchers divided citation tweets into two categories of roughly equal size: direct links to an article, and links to blog entries or other intermediary pages that discuss an article (usually with a link to it). Not surprisingly, 56 percent of direct links led to open-access sources. About three-quarters of the indirect links went to material behind a paywall. “As long as intermediary webpages provide even an abstract-level description,” write Costello and Priem, “our participants often viewed them as equivalent.”

One scholar told them: “I don’t have time to look at everything. But I trust [the people I follow] and they trust me to contribute to the conversation of what to pay attention to. So yes, Twitter definitely helps filter the literature.” Another said, “It’s like I have a stream of lit review going.”

At this level, Twitter, or rather its users, create a quasi-public arena for the distribution of scholarship – and, to some degree, even for its evaluation. Costello and Priem suggest that harvesting and analyzing these citations could yield “faster, broader, and more nuanced metrics of scholarly communication to supplement traditional citation analysis,” as well as strengthening “real-time article recommendation engines.”
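For readers curious what that kind of harvesting and sorting might look like in practice, here is a minimal Python sketch, not the researchers' actual code or criteria: it assumes the tweets have already been collected into simple records with their expanded links, and the domain lists and field names (urls, text) are purely illustrative assumptions.

    # Hypothetical sketch: sort already-harvested tweets into "Twitter citations"
    # that link directly to an article versus ones that link to an intermediary
    # page (a blog post, say) discussing an article. Domain lists are illustrative.
    from urllib.parse import urlparse

    DIRECT_ARTICLE_DOMAINS = {"arxiv.org", "doi.org", "plos.org", "jstor.org"}
    INTERMEDIARY_DOMAINS = {"wordpress.com", "blogspot.com", "scienceblogs.com"}

    def classify_citation(tweet):
        """Return 'direct', 'indirect', or None for one harvested tweet record."""
        for url in tweet.get("urls", []):
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if any(host.endswith(d) for d in DIRECT_ARTICLE_DOMAINS):
                return "direct"
            if any(host.endswith(d) for d in INTERMEDIARY_DOMAINS):
                return "indirect"
        return None  # no recognizable scholarly link

    if __name__ == "__main__":
        sample = [
            {"text": "New paper is up!", "urls": ["https://arxiv.org/abs/0000.00000"]},
            {"text": "Nice write-up of that study", "urls": ["https://example.blogspot.com/post"]},
            {"text": "Lunch was great", "urls": []},
        ]
        counts = {"direct": 0, "indirect": 0, None: 0}
        for t in sample:
            counts[classify_citation(t)] += 1
        print(counts)  # {'direct': 1, 'indirect': 1, None: 1}

Counting the two categories over thousands of such records is, in rough outline, how one gets from a stream of tweets to the sort of metrics described above.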

At the MLA convention in January 2011, Amanda French gave a talk that summed up, in its title, a major implication of Priem and Costello’s work: “Your Twitter Followers and Facebook Friends Won’t Read Your Peer-Reviewed Article If They Have to Pay For It, and Neither Will Strangers.” This is true. And its obvious corollary – that open-access and scholarly tweeting can magnify an article’s impact considerably – is demonstrated by Melissa Terras, co-director of the Centre for Digital Humanities at University College London.

On October 16, she made one of her papers available through the UCL online repository. Two people downloaded it. She tweeted and blogged about it on a Friday, whereupon it was downloaded 140 times in short order; she tweeted about it again on Monday, with the same effect. “I have no idea what happened on the 24th October,” she writes. “Someone must have linked to it? Posted it on a blog? Then there were a further 80 downloads. Then the traditional long tail, then it all goes quiet.”

In all, more than 800 people added the article to their to-read collections in a couple of months – which, for a two-year-old paper called "Digital Curiosities: Resource Creation Via Amateur Digitisation," from the journal Literary and Linguistic Computing, is not bad at all.

That may be another reason why citation formats for Twitter are necessary. One day, and it might be soon, an intellectual historian narrating the development of a theory or argument may have to discuss someone’s extremely influential tweet. Stranger things have happened.


Latest literary fad combines Shakespeare with Seuss and Twitter


What if Dr. Seuss rewrote Shakespeare and did it on Twitter?

Essay on the role of the dictionary

Sometimes I get a little fancy in the final comment of a student paper. Usually my comments are pretty direct: two or three things I like about the paper, two or three things I think need revision, and two or three remarks about style or correctness. But once in a while, out of boredom or inspiration, I grasp for a simile or a metaphor. Recently I found myself writing, “Successfully rebutting counter-arguments is not unlike slaying a hydra.”

I started with great confidence, but suddenly I wasn’t so sure I knew what a hydra is: a multiheaded creature? Yes.  But how many heads?  And can I use the word generically or do I have to capitalize it?  Would “slaying the Hydra” be the correct expression?

Since I have no Internet connectivity at home, never have, and don’t miss it, I grabbed my Webster’s Seventh New Collegiate Dictionary from 1965 — the kind of dictionary you can get for free at the dump or from a curbside box of discarded books — and looked up hydra. On my way to hydra, however, I got hung up on horse, startled by a picture of a horse busily covered with numbers. I knew a horse has a face, a forehead, a mouth. A nose, ears, nostrils, a neck.  A mane, hooves, a tail.

Pressed for more parts, I might have guessed that a horse had a lower jaw, a forelock (which I would have described as a tuft of hair between the ears), cheeks, ribs, a breast, haunches, buttocks, knees, a belly.

I don’t think I would have guessed flank, loin, thighs, and shoulders, words I associate with other animals, humans, or cuts of meat.  I know I wouldn’t have guessed forearm or elbow.

What I’d thought of as an animal with a head, a mane, a tail, hooves, and a body has 36 separate parts, it seems, all enumerated in a simple design on page 401 of my dictionary. Had I not forgotten the precise definition of a hydra, I might never have learned that a horse also has a poll, withers, a croup, a gaskin, a stifle, fetlocks, coronets, pasterns, and cannons. (The withers are the ridge between a horse’s shoulder bones.)

Hoof is defined and illustrated on the page opposite the horse, an alphabetical coincidence.  That picture too caught my eye, now that I was in an equine frame of mind.  For the moment, I wanted to learn everything I could about the horse. The unshod hoof, it turns out, has a wall with four parts — the toe, the sidewalls, quarters, and buttresses — a white line, bars, a sole, and a frog, behind which lie the bulbs.

Eventually I returned to my original search. A Hydra with a capital H is a nine-headed monster of Greek mythology whose power lies in its regenerative abilities: if one head is cut off, two will grow in its place unless the wound is cauterized. With a lower case h, the word stands for a multifarious evil that cannot be overcome by a single effort. After all this dictionary work, I’m not sure hydra is the word I want.

I've been thinking about dictionaries lately. The writing center at Smith College, where I work, is transitioning from paper schedules to an online appointment system, and yesterday we spent part of the morning moving furniture around trying to create room for a new computer station dedicated to scheduling. One of my younger colleagues suggested getting rid of the dictionary stand, which, he said, "nobody uses." I bristled. It’s a beautiful thing, the dictionary, an oversize third edition of the American Heritage Dictionary, just a hair over 2,000 pages. For more than a dozen years it’s resided in a cozy nook on a well-lit lectern below a framed poster publicizing the 1994 Annual Katharine Asher Engel Lecture by Murray Kiteley, then Sophia Smith Professor of Philosophy. The poster was chosen as much for its elegance as for the lecture’s title: "Parts of Speech, Parts of the World: A Match Made in Heaven? Or Just Athens?"

For years I had an office across from the dictionary and never used it myself, preferring the handiness of my taped-up 1958 American College Dictionary by Random House. The American Heritage is too massive. It takes me too long to find a word and I get easily distracted: by illustrations and unusual words. I continue to find my college dictionary completely adequate for my purposes. I’ve never needed a word that I couldn’t find in it.  

Another colleague within earshot spoke up for the American Heritage, claiming he used it once in a while. "Maybe," I thought. More likely, he didn’t want to contemplate the loss of the big dictionary while he still mourned the loss of the blue paper schedules. The dictionary stayed: words, that’s what a writing center is about, and the dictionary is where they live.

I cannot remember the last time I saw one of my students using a paper dictionary, much less carrying one around, not even an international student. Have today’s students ever instinctively pulled out a paper dictionary and used it to look up a word or check its spelling? Is a paper dictionary as quaint as a typewriter? Have things changed that much? I wonder. Is it partly my fault? It’s been many years, after all, since I’ve listed "a college dictionary" among the required texts for my writing course.

I doubt my students use dictionaries much, of whatever kind. You have to care about words to reach for the dictionary, and I don’t think they care very much about words. At their age, I probably didn’t either, though I think I did care more about right and wrong. I was embarrassed when I used the wrong word or misspelled a word. I still remember the embarrassment of spelling sophisticated with an f in a college paper, something a modern spell checker doesn’t allow. But it does allow "discreet categories" for "discrete categories," another unforgettably embarrassing error — this one in graduate school!

My students appear cheerfully to accept whatever the spell checker suggests, or whatever word sounds like the one they want, especially if they’re in roughly the same semantic domain.  They are positively proud to confess that they’re bad spellers — who among them isn’t? — and really don’t seem to care much that they have used the wrong word.  Words don’t appear to be things you choose anymore. They’re things that pop up:  in autocorrect, in spell checkers, in synonym menus. They are not things you ponder over, they are things you click, or worse, your laptop decides to click for you.

When I meet with a student about her paper, we always work with a paper copy. Even so, more often than not I still have to remind her to take a pencil so she can annotate her draft as we discuss it. Toward the end of our meetings, we talk about word choice and the exchange often goes like this:

"Is this the word you want?"

"I think so."

"I think here you might have meant to say blah."

"Oh, yeah, that’s right" and out comes the pencil — scratch this, scribble that, lest it affect her final grade. No consideration, no embarrassment. I used to pull out the dictionary "to inculcate good habits," but no more. In the presence of today’s students, pulling out a dictionary feels as remote as pulling out a typewriter or playing a record.

Sometimes the situation is not so clear-cut. The student might, for example, write a word like security in a context where it makes a bit of sense, but after some gentle prodding and, yes, a few pointed suggestions, she might decide that what she really means is privacy. Out comes the pencil again. Scratch "security," scribble "privacy." What she really means, I think, is safety, but I let it go. If I push too hard, she’ll stop thinking I'm being helpful and begin to think I have a problem: "What a nitpicker! The man’s obsessed with words!" I imagine her complaining to her friends. "But it matters! It matters!" goes the imaginary dialogue. "What precisely were the opponents of the ERA arguing, that it would violate security, invade privacy, or threaten safety?"

I have used the online Webster's on occasion, of course, and recognize the advantages of online dictionaries: They can be kept up-to-date more easily, they can give us access to more words than a standard portable dictionary, they can be accessed anywhere at any time, they take up no shelf space, etc. I'm not prejudiced against online reference tools. In fact, unlike many of my colleagues, I'm a great fan of online encyclopedias and a lover of Wikipedia. Online dictionaries leave me cold, though. They should fill me with awe the way Wikipedia sometimes does, but they don't. I marvel at the invention of the dictionary every time I look up a word in my paper copy; at the brilliant evolutionary step of such a book; at the effort of generations of scholars, professionals and lay people that led to such a comprehensive compendium of words; at how much information — and not just word meanings — it puts at my fingertips; at how much I still have to learn; and at how much my education could still be enhanced if I read my college dictionary cover to cover.

I think of The Autobiography of Malcolm X, in which the author makes a powerful statement about the dictionary as a pedagogical tool. Frustrated with his inarticulateness in writing while in prison and his inability to take charge of a conversation like his fellow inmate Bimbi, Malcolm X came to the conclusion that what he needed was "to get hold of a dictionary — to study, to learn some words." The experience was a revelation: "I’d never realized so many words existed!" He started at the beginning and read on, learning not just words but also history — about people, places, and events. "Actually the dictionary is like a miniature encyclopedia," he noted. The dictionary was the start of his "homemade education."

Online, all I get is a quick definition of the word I want, and I’m done. On paper I get the definition plus something akin to a small education along the way. The experience is not unlike that of slaying the Hydra: For every word I look up, I see two others whose meaning I don’t know. If I were Hercules I could put an end to the battle once and for all, but I’m not, and glad I’m not. The battle is far too delicious. But how to convince my students?

Julio Alves is the director of the Jacobson Center for Writing, Teaching and Learning at Smith College.

MLA session on first-year common reading programs


At MLA, literature professors consider the non-literary values behind first-year reading programs -- and how such programs play out in the classroom.

MLA considers radical changes in the dissertation


MLA leaders encourage radical changes in Ph.D. programs. Among ideas: ending norm of producing a "proto-book," embracing digital formats, shaming committee members into doing work, and halving 9-year average for doctorate completion.

Essay urges reforms for doctoral education in humanities

Not all doctorate recipients will become faculty members, but all future faculty will come out of graduate programs. Do these programs serve the needs of graduate students well?

In light of the level of educational debt carried by humanities doctoral recipients, twice that of their peers in sciences or engineering; in light of the lengthy time to degree in the humanities, reaching more than nine years; and in light of the dearth of opportunities on the job market, the system needs to be changed significantly. I want to begin to sketch out an agenda for reform.

The major problem on all of our minds is the job market, the lack of sufficient tenure-track openings for recent doctorate recipients. One response I have heard is the call to reduce the flow of new applicants for jobs by limiting access to advanced study in the humanities. If we prevent some students from pursuing graduate study — so the argument goes — we will protect the job market for others. I disagree.  

Let us not lose sight of the fact that the number of new Ph.D.s has already declined significantly, down about 10 percent from a recent peak in the 1990s. Because that drop hardly matches the 32 percent decline in job listings since 2007-08, the problem is not too many scholars: it is too few tenure-track positions. I fear that any call to reduce doctoral programs will end up limiting accessibility and diversity, while playing into the hands of budget-cutters. U.S. education needs more teaching in our fields, not less, and therefore more teaching positions, the real shovel-ready jobs.

Instead of asking that you lock your doors behind the last class of admitted students,  I appeal to those of you involved in the structure of doctoral programs to consider how to keep them open by making them  more affordable and therefore more accessible. Can we redesign graduate student learning in the face of our changed circumstances?

Reform has to go to the core structures of our programs. Let me share two pertinent experiences at Stanford.

Thanks to a seed grant from the Teagle Foundation, I was able to experiment with a program for collaborative faculty-graduate student teaching. In our umbrella grouping of the language departments, we set up small teams — one faculty member and two graduate students from each language — to develop and deliver undergraduate courses, against the backdrop of a common reading group on current scholarship on student learning and other issues in higher education. The graduate students developed their profiles as teachers of undergraduate liberal arts. Teaching experience is only going to grow more significant as a criterion in hiring, and we should, in our departments, explore how to transform our programs to prepare students better as future humanities teachers of undergraduates. I encourage all departments to experiment with new modalities of collaborative graduate student-faculty teaching arrangements that are precisely not traditional TA arrangements.

Support from Stanford's Center for Teaching and Learning has led to an ad hoc project on "Assessing Graduate Education," a twice-a-quarter discussion group to which all faculty and graduate students have been invited. German studies graduate student Stacy Hartman organized an excellent survey of best practices, which has become the center of a vigorous discussion. My point now is not to dwell on the particular issues — teaching opportunities, examination sequencing, quality of advising, professionalization opportunities, etc. — but to showcase the potential in every department of a structured public discussion forum on the character of doctoral training. I advise all doctoral programs to initiate similar discussions, not limited to members of departmental standing committees but open to all faculty and graduate students. What works in our programs; what could be better?  

At nine years (according to the Survey of Earned Doctorates), time to degree in our fields is excessive. We should try to cut that in half. I call on all departments with doctoral programs to scrutinize the hurdles in the prescribed trajectories: are there unnecessary impediments to student progress? Is the sequencing of examinations still useful for students?

Accelerating progress to completion will, moreover, depend on better curriculum planning and course articulation, as former MLA President Gerald Graff emphasized in his convention address three years ago. We should plan course offerings with reference to student learning needs. Curricular and extracurricular professionalization opportunities could take into account the multiple career tracks that doctorate recipients in fact pursue — this means the real diversity of hiring institutions, the working conditions of faculty at different kinds of institutions, non-teaching careers in the academy as well as non-academic positions. Can we prepare students better for all of these outcomes? Finally, we have to reinvent the conclusion of doctoral study.  As last year's President Sidonie Smith reminds us, the dissertation, as a proto-book, need not remain the exclusive model for the capstone project. This piece is crucial to the reform agenda.

Russell A. Berman is professor of comparative literature and German studies at Stanford University. This essay is an excerpt from his presidential address at the 2012 meeting of the Modern Language Association.

Essay on new approach to defend the value of the humanities

"When the going gets tough, the tough take accounting." With those succinct words in a June 2010 op ed, New York Times columnist David Brooks summed up the conventional wisdom on the current crisis of the humanities. In an age when a higher education is increasingly about moving quickly through a curriculum streamlined to prepare students for a job, the humanities have no practical utility. As Brooks observes, "when the job market worsens, many students figure they can’t indulge in an English or a history major," a fact that explains why the "humanities now play bit roles when prospective students take their college tours. The labs are more glamorous than the libraries."

Pushed into a corner by these dismaying developments, defenders of the humanities -- both traditionalists and revisionists — have lately been pushing back. Traditionalists argue that emphasizing professional skills would betray the humanities' responsibility to honor the great monuments of culture for their own sake. Revisionists, on the other hand, argue that emphasizing the practical skills of analysis and communication that the humanities develop would represent a sellout, making the humanities complicit with dominant social values and ideologies. But though these rival factions agree on little else, both end up concluding that the humanities should resist our culture's increasing fixation on a practical, utilitarian education. Both complain that the purpose of higher education has been reduced to credentialing students for the marketplace.

Martha Nussbaum, for example, while stressing that the humanities foster critical thinking and the ability to sympathetically imagine the predicament of others, insists such skills are, as the title of her 2010 book puts it, "not for profit." In doing so she draws a stark line between the worlds of the humanities and the 21st-century workplace. Likewise, Geoffrey Galt Harpham, in The Humanities and the Dream of America, laments the increasing focus on professional skills in the humanities at the expense of reading great books. Stanley Fish takes an even more extreme position, insisting that the humanities "don’t do anything, if by 'do' is meant bring about effects in the world. And if they don’t bring about effects in the world they cannot be justified except in relation to the pleasure they give to those who enjoy them. To the question 'of what use are the humanities?', the only honest answer is none whatsoever." Worse still, Frank Donoghue, in The Last Professors: The Corporate University and the Fate of the Humanities, argues that the humanities will simply disappear in the new corporate, vocation-centered university.

Ironically, these pessimistic assessments are appearing at the very moment when many employers outside academe are recognizing the practical value of humanities training. Fish simply dismisses the argument that "the humanities contribute to the economic health of the state — by producing more well-rounded workers or attracting corporations or delivering some other attenuated benefit — because nobody really buys that argument." But this would come as news to the many heads of philanthropic foundations, nonprofits, and corporate CEOs who have lately been extolling the professional value of workplace skills grounded in the humanities.

We would be the last to argue that traditional ways of valuing the humanities are not important, that the study of philosophy, literature, and the fine arts does not have a value in and of itself apart from the skills it teaches. We also recognize that the interests of the corporate world and the marketplace often clash with the values of the humanities. What is needed for the humanities in our view is neither an uncritical surrender to the market nor a disdainful refusal to be sullied by it, but what we might call a critical vocationalism, an attitude that is receptive to taking advantage of opportunities in the private and public sectors for humanities graduates that enable those graduates to apply their training in meaningful and satisfying ways. We believe such opportunities do exist.

To be sure, such optimism must be tempered in today’s bleak economy, where hardly any form of education is a sure ticket to a job and where many in the private sector may still look with indifference or even disdain on a humanities degree.  But as David Brooks himself went on to point out in his op-ed: "Studying the humanities improves your ability to read and write. No matter what you do in life, you will have a huge advantage if you can read a paragraph and discern its meaning (a rarer talent than you might suppose). You will have enormous power if you are the person in the office who can write a clear and concise memo."

Brooks’ view is echoed by Edward B. Rust Jr., chairman and CEO of State Farm Insurance Companies, who observes that "at State Farm, our employment exam does not test applicants on their knowledge of finance or the insurance business, but it does require them to demonstrate critical thinking skills" and "the ability to read for information, to communicate and write effectively, and to have an understanding of global integration." And then there is Google, which more than any other company has sung the praises of humanities students and intends to recruit many of them. "We are going through a period of unbelievable growth," reports Google’s Marissa Mayer, "and will be hiring about 6,000 people this year — and probably 4,000-5,000 from the humanities or liberal arts."

This evidence of the professional utility of humanities skills belies Donoghue’s apparent assumption (in The Last Professors) that "the corporate world’s hostility" toward humanistic education remains as intense today as it was a century ago, when industrialists like Andrew Carnegie dismissed such an education as "literally, worthless." Donoghue ignores changes in the global economy, the culture, and the humanities themselves since Carnegie’s day that have given many corporate leaders a dramatically more favorable view of the humanities’ usefulness. Associate Dean Scott Sprenger of Brigham Young University, who oversees a humanities program we will discuss in a moment, quotes the dean of the Rotman School of Management in Toronto, who observes a "tectonic shift for business school leaders," now aware that "learning to think critically — how to imaginatively frame questions and consider multiple perspectives — has historically been associated with a liberal arts education, not a business school curriculum."

All of these commentators are right, and the skills they call attention to only begin to identify the range of useful professional competencies with which a humanities education equips 21st-century students. In addition to learning to read carefully and to write concisely, humanities students are trained in fields like rhetoric and composition, literary criticism and critical theory, philosophy, history, and theology to analyze and make arguments in imaginative ways, to confront ambiguity, and to reflect skeptically about received truths, skills that are increasingly sought for in upper management positions in today’s information-based economy. Even more important for operating as global citizens in a transnational marketplace, studying the literary, philosophical, historical, and theological texts of diverse cultures teaches humanities students to put themselves in the shoes of people who see and experience the world very differently from their own accustomed perspectives. Are some corporations still looking for employees who will be well-behaved, compliantly bureaucratized cogs in the wheel? Of course they are. But increasingly, many others are looking for employees who are willing to think outside the box and challenge orthodoxy.

It is true that humanities study, unlike technical training in, say, carpentry or bookkeeping, prepares students not for any specific occupation, but for an unpredictable variety of occupations. But as many before us have rightly pointed out, in an unpredictable marketplace this kind of versatility is actually an advantage. As Associate Dean Sprenger notes, "the usefulness of the humanities" paradoxically "derives precisely from their detachment from any immediate or particular utility. Experts tell us that the industry-specific knowledge of a typical vocational education is exhausted within a few years," if not "by the time students enter the workforce." It is no accident, he observes, "that a large percentage of people running Fortune 500 companies (one study says up to 40 percent) are liberal arts graduates; they advance more rapidly into mid- and senior-level management positions, and their earning power tends to rise more significantly than people with only technical training."

If there is a crisis in the humanities, then, it stems less from their inherent lack of practical utility than from our humanistic disdain for such utility, which too often prevents us from taking advantage of the vocational opportunities presented to us. This lofty disdain for the market has thwarted the success of the few programs that have recognized that humanities graduates have much to offer the worlds of business, technology, arts agencies, and philanthropic foundations.

The most promising of these was a program in the 1990s developed by the Woodrow Wilson National Fellowship Foundation under the leadership of its then-director, Robert Weisbuch. First called "Unleashing the Humanities" and later "The Humanities at Work," the program, according to Weisbuch in an e-mail correspondence with the authors, "had persuaded 40 nonprofits and for-profits to reserve meaningful positions for Ph.D. graduates in the humanities and had placed a large number in well-paying and interesting positions — at places ranging from Verizon to AT Kearney to The Wall Street Journal to the National Parks Service." Unfortunately, Weisbuch reports, only a few humanities graduate programs "enlisted their alumni and the influential corporations and others in their areas of influence to revolutionize the possibilities for employment of humanities doctoral graduates," while most faculty members "continued to expect their graduate students to look for jobs much like their own and to consider any other outcome a failure."

Today, however, some humanities programs that emphasize useful professional applications are prospering. One of these is a new undergraduate major at Brigham Young University called "Humanities +," with the "+" referring to the value-added vocational component gained by students who elect the program. According to an e-mail to the authors from Associate Dean Sprenger, BYU hired a career services specialist tasked with "revolutionizing our humanities advising office along the lines of the Humanities + vision, and the program has developed ties with the university’s colleges of business and management" — a virtually unheard-of step for a humanities program. The program’s students are urged "to minor in a practical subject, professionalize their language skills, and internationalize their profile by doing an overseas internship." The objective, Sprenger says, "is that career thinking and strategizing become second nature to students," while faculty members "see it as in their interest to help students find 'alternative' careers, and are reassured that they can rely on our advising office to be informed and to do the training."

Another notable program that sees its mission as "bringing humanities into the world" beyond academe and that works closely with its university’s office of career placement is the Master of Arts Program in the Humanities (MAPH) at the University of Chicago, which Gerald helped design and direct in the 1990s. According to a recent article by program associate A.J. Aronstein in Tableau, a University of Chicago house journal, one recent MAPH graduate got a job as finance director in Florida for Barack Obama’s 2008 campaign, later served as chief of staff at the International Trade Association, and now works as a political consultant in Washington. Other MAPH graduates have gone on to internships and subsequent positions as museum curators, technical writers, journalists and other media workers, marketing specialists, and policy analysts with investment firms.

The false assumption in both anti-utilitarian defenses of the humanities and pessimistic predictions of their extinction is that we have to choose between a credentialing and a humanizing view of higher education, between vocational utility and high-minded study as an end in itself. This either/or way of thinking about the humanities — either they exist solely for their own sake or they have no justification at all – is a trap that leaves humanists unable to argue for the value of their work in terms of the practical skills it teaches, an argument that inevitably has to be made in the changing marketplace of higher education. In fact, we would argue there is no defense of the humanities that is not ultimately based on the useful skills they teach.

Evidence is plentiful that stressing the range of expertise humanities graduates have makes intellectual and economic sense. Take, for example, Damon Horowitz, director of engineering at Google. He insisted recently, in an article in The Chronicle of Higher Education entitled "From Technologist to Philosopher: Why You Should Quit Your Technology Job and Get a Ph.D. in the Humanities," that "if you are worried about your career ... getting a humanities Ph.D. is not only not a danger to your employability, it is quite the opposite. I believe there is no surer path to leaping dramatically forward in your career than to earn a Ph.D. in the humanities." "You go into the humanities to pursue your intellectual passion," he explains, "and it just so happens, as a byproduct, that you emerge as a desired commodity for industry."

Horowitz, a leading figure in artificial intelligence and the head of a number of tech startups, ought to know. He took a break from his lucrative career to enroll in Stanford’s Ph.D. program in philosophy because he figured out that in order to do his job in technology well he needed to immerse himself in the humanities. "I realized that, while I had set out in AI to build a better thinker, all I had really done was to create a bunch of clever toys." Horowitz came to realize that the questions he was "asking were philosophical questions — about the nature of thought, the structure of language, the grounds of meaning." Returning to the humanities, Horowitz took time out from the world of artificial intelligence to study "radically different approaches to exploring thought and language," such as philosophy, rhetoric, hermeneutics and literary theory. As he studied intelligence from these perspectives he "realized just how limited my technologist view of thought and language was. I learned how the quantifiable, individualistic, ahistorical — that is, computational — view I had of cognition failed to account for whole expanses of cognitive experience (including, say, most of Shakespeare)."

The concrete value of the humanities education Horowitz celebrates is especially well epitomized in the new field of the digital humanities. The emergence of this field calls attention to how old 20th-century divisions between science and the humanities are breaking down and gives those of us committed to defending the practical value of the humanities a tremendous opportunity. The digital humanities represent the cutting-edge intersection of the humanities and computer science, the merging of skills and points of view from two formerly very different fields that are leading to a host of exciting innovations – and opportunities for students who want to enter fields related to everything from writing computer programs to text encoding and text editing, electronic publishing, interface design, and archive construction. Students in the digital humanities are trained to deal with concrete issues related to intellectual property and privacy, and with questions related to public access and methods of text preservation.

Graduates of the digital humanities programs that are now developing all over the country will be first in line for such positions. For example, Paul’s university now has a Digital Humanities M.A. with two converging tracks, one designed for students with a background in computer science and one for students with a background in the humanities. The program website notes that it offers "theoretical, critical, social, and ethical contexts for thinking about the making of new knowledge through digital humanities research and applications, from issues of intellectual property and privacy, to questions of public access and methods of preservation." When we are asked about the practical value of a humanities education, we need to add the digital humanities to the list.

We believe it is time to stop the ritualized lamentation over the crisis in the humanities and get on with the task of making them relevant in the 21st century.  Such lamentation only reveals the inability of many humanists to break free of a 19th-century vision of education that sees the humanities as an escape from the world of business and science. As Cathy Davidson has forcefully argued in her new book, Now You See It, this outmoded way of thinking about the humanities as a realm of high-minded cultivation and pleasure in which students contemplate the meaning of life is a relic of the industrial revolution with its crude dualism of lofty spiritual art vs. mechanized smoking factories, a way of thinking that will serve students poorly in meeting the challenges of the 21st century.

Though we have argued in defense of the practical and vocational utility of a humanities education, our argument should in no way be construed as undercutting the aspirations of those in the humanities who seek an academic career. Indeed, on this score we need to redouble our efforts to increase public and private funding for higher education and to support unionizing efforts by faculty members and adjuncts. But even as we fight these battles to expand the academic job market we would be foolish to turn our backs on alternative forms of employment for humanities graduates when they are out there. In this spirit, we applaud both Modern Language Association President Russell Berman and American Historical Association President Anthony Grafton, who, along with the executive director of the AHA, James Grossman, have recently urged their organizations to acknowledge that advanced training in the humanities can lead to a variety of careers beyond academia and have suggested how graduate programs can be adapted for these kinds of careers.

For ultimately, to take advantage of the vocational potential of humanities study as we propose is not to sell out to the corporate world, but to bring the critical perspective of the humanities into that world. It is a perspective that is sorely needed, especially in corporate and financial sectors that have lately been notoriously challenged in the ethics department, to say the least. Humanities graduates are trained to consider the ethical dimensions of experience, linking the humanities with the sciences as well as with business and looking at both these realms from diverse perspectives. To those who worry that what we urge would blunt the humanities' critical power, we would reply that it would actually figure to increase that power, for power after all is the ability to act in the world.

Paul Jay is professor of English at Loyola University Chicago and the author, most recently, of Global Matters: The Transnational Turn in Literary Studies. Gerald Graff is professor of English and education at the University of Illinois at Chicago and a past president of the Modern Language Association.


