Afterglow From the MLA

I felt the need to get away, even as the pile of student papers I had to grade slowly dwindled. With final grades submitted, I still felt the impulse. I resisted as well as I could, but something within nagged me.

I considered a spiritual retreat, one to recharge and rest after a busy, even frenzied, semester. I had worked at three campuses, two writing centers, one community center. I did freelance writing. I’m not a workaholic, just a teacher trying to make ends meet. These days, it’s getting harder.

Catholic, Buddhist, ecumenical -- the retreat path did not matter. But calling and surfing for such a place, I found it was too late. Everything was filled. There was only one possibility -- in the twisting hills of Arkansas. I would have to bring my own food and get transportation from a distant airport. I appreciated the offer but felt too tired. Maybe in spring...

Then I had another thought. A whim. Just ninety minutes away, if I could get a direct flight… Could I?

I did.

I joined the Modern Language Association after 30 years in academia and flew to Philadelphia for the 2009 conference, tantalized by conference titles I had only read about before and noticing more than a few that dealt with the ups and downs of academia -- ups and downs I not only know but that are etched on my heart. Student assistant, secretary, graduate assistant, writer/editor, teacher…

Although I was just beginning to recite poems when some of the long-term veterans joined, I’ve chalked up my flight miles in the classroom. If I had a banner across my chest like the Girl Scouts used to wear, I’d have badges for adjuncting at up to four institutions at a time, loving words, and being midwife, doula, mother to students in the classroom. A former boss called me a composition worker. Some people think people in my line of work are exploited. I call myself a professional muse.

Maybe going to a professional conference does not seem like a big deal to some. For some, it’s draining. For others, routine. For still others, a dreaded initiation or the key to a job.

I remember sitting behind my desk as a secretary in an English department in the early 1980s, hearing that people interviewed at MLA.

Over winter holidays? I thought. How strange.

“How did you like the meat market?” said a friend, hearing I had been there.

Actually, I didn’t even pack anything formal to wear. I went just to learn. Without expectation, I found myself transported back to a joy I have not felt since my undergraduate years.

“Yes, undergrad is a carefree time,” a colleague said upon patiently listening to my post-conference euphoria.

Actually, for me the undergraduate years were also full of care. But in tough times, it’s literature, art, music, drama that give me hope, words, perspective.

“You know, those conference titles are often obscure, even ridiculed,” said another friend.

Well, I loved the sessions. Translation and Kafka. Awesome Yiddish. When will I be near Yiddish scholars again? Why study literature? Packed. Langston Hughes. Well worth the trip. Hurston screening. Couldn’t squeeze in. And others…

I’m old enough to feel like a mother to some of the presenters. And I’ve been to other conferences; I have one foot in English, one in counseling, and one in journalism. How is this possible with two feet? I keep shifting my stance, my focus, my efforts. In a world thought to be increasingly interdisciplinary, perhaps I can create a new dance. MLA, for me, was an imaginative leap. I am glad I took it.

Books, stories, and poems have added meaning to my life since I was a little girl. I was imaginative, as kids are – maybe beyond imaginative into the quirky. I “became” Cinderella and Snow White, responding not to my own name, but to the name of the character of the week. I learned French in an innovative, public elementary school and my parents spoke German. Whitman and Golding were among my beacons in junior high, with words I couldn’t utter but could understand. I devoured “the classics” my much older sisters brought home from their demanding high school. A sonnet by Shakespeare and a poem by Millay provided solace through very dark times as a teenager.

My heart further opened to and through the humanities as an undergraduate English major, even with a foot in psychology and another in other interests. I have not changed that much.

The humanities gave me some range to explore, and I majored in English for several reasons. Philosophy had beckoned, but one day I asked a question in a philosophy class and was told “that’s a question for an English class.”

In English class a few months later, a question I asked about what I now know was the teacher’s formalist analysis of The Scarlet Letter yielded an even harsher response.

“Do you think the unexamined life is worth living?”

Teachers have bad days.

That teacher, like most of my mentors from school, is deceased now. With the strange quirks of fate, right before I began graduate school (in English), my path crossed his. “Of course I remember you,” he said. “You were the best student I ever had.”

I negotiated my way through the canon in graduate school in English, in a world before composition and rhetoric, while devising my own intuitions about the teaching of writing, teaching writing, and beginning a career as a writer and editor.

I considered comparative literature studies in graduate school but thought that English was more -- I almost can’t type it out -- practical.

Fate landed me in a hotel room in the Loews in Philadelphia, where many of the modern language sessions were held. Just riding the elevator was fun. People entered and exited, speaking many languages.

Across the street was the Marriott and the Pennsylvania Convention Center, where many English sessions were held. I jaywalked with abandon, with absolute certainty that here on this side or there on that side was where I needed to be.

This jaywalking is a metaphor for my life; as the daughter of immigrants who struggled with English, I sometimes struggle with words, too. Why else would I strive to become a writer?

In my home town, I don’t jaywalk. But what is travel to a professional conference if not an expansion of boundaries?

I befriended three women by chance, each with a Ph.D. and each following a different, intriguing, winding career path.

One had been a high school teacher for 15 years and had also taught on Indian reservations and in China.

Another, formerly on the tenure track, had been derailed but keeps up an energetic life of writing and teaching.

A third, originally from China, turned out to be a presenter.

As is my wont, I asked questions of everyone I met, whether scrunched in a shuttle or riding an elevator. Mainly I asked, “Are you enjoying your sessions?” “Did you get what you came for?” The response to one of my questions, from a man in a uniform covered by an overcoat, was a gentle, “I work here.”

Enough of my questions. MLA for me was an immersion experience, a cross-cultural journey. The academic paper sessions I attended were mind-stretching. Translation was an echoing theme, and what could be more apropos as academe struggles to define and express itself in difficult economic times? The sessions on the state of affairs in academe reassured me that I am not alone. And the session on writing teachers who write reassured me that I am on a valid path.

I also learned, among other things, that some people perceive rifts in the MLA. Other languages over there, English over here. Full-time issues there, part-time here. Writing here, literature there.

“I don’t know if I’ll come back,” one new friend said. “Some of this feels elitist.”

If so, that is a shame. What more powerful bridge between human differences than the humanities?

I had the good fortune of encountering people, at random, who attended sessions I wanted to make but couldn’t. On two-year colleges. On analyzing “The Moose” by Elizabeth Bishop. On standings of academic journals. Even my missteps seemed well-orchestrated.

I ate energy bars, instant oatmeal, salmon at a French restaurant, a side of mashed potatoes for a meal, a meal in Chinatown courtesy of a spontaneous Philadelphia friend. I am too shy for cash bars, so I drank cups of tea and coffee in my room.

I’ll be paying off the trip for a while. But it was worth it.

In the three decades -- which seem like three days -- that I have spent in academia, I have been a student assistant in a college of education, a secretary in an English department, a graduate assistant, a publications writer, a liaison with the news media, an adjunct lecturer in three departments at one school, and a teacher without walls (adjunct) at four other schools.

In this economy, I won’t be retiring or stopping learning any time soon.

When I told my teenage son I planned to go to the conference, I asked him if he knew what MLA is.

“Those are the people that make the rules I use when I have to write a paper.”

When I have taught documentation in the classroom, MLA or APA, depending on the course, I typically have pointed out that scholars in these groups are not strictly documentation experts, but explorers, researchers, lovers of learning.

Finally, I have decided to count myself among them, even if all I did was sign up.

One new friend was, to my surprise, a presenter. She approached me as I indulged a habit honed in a pre-ecological, pre-Internet era -- seeking out fliers -- and offered that we share hotel costs on the last day. Why not, I thought.

I had a room with two beds. I rushed across the street again to clear off my avalanche of paper. I had the joy of listening to part of her paper the night before and, by sheer chance, of attending her session. I got a stunning view of Philadelphia from the 33rd floor.

Returning to Cleveland was, of course, a descent. And the stacks of papers are piling up again.… It’s just a few weeks after, and I’m still walking around in post-conference delirium.

Few believe me when I say there is hope for the humanities. There has to be. The more difficult the times in my life, the more I have needed books, art, music, drama. I have seen the value of humanities study for students of all ages, at colleges private and public, large and small. My memoir students, some in their eighth decade of life, still turn to the written and electronic word for solace, support, and inspiration.

What did I leave behind? My wide-tooth comb and fliers I could not stuff in my carry-on. That’s all right. It’s hat weather in Cleveland -- and there’s always the Internet.

Maria Shine Stewart

Maria Shine Stewart teaches and writes in South Euclid, Ohio.

Gerrymandering the Canon

In a recent New York Review article on Byron, Harold Bloom makes the following passing remark: “In the two centuries since Byron died in Greece [...] only Shakespeare has been translated and read more, first on the Continent and then worldwide.” Bloom does not cite any statistics, and one cannot help but wonder: Really? More than Homer and Dante, or, among the moderns, more than Sartre and Thomas Mann? Of course, what Bloom really means is that Byron was translated and read more than any other English writer, and he may well be correct on that count. Yet this omission is telling, as it highlights an unfortunate tendency (recently diagnosed by David Damrosch) among certain English professors to equate literature in general with literature written in English. This disciplinary bias, less prejudice than habit, can distort their scholarship – the authors that they admire tend to be far more catholic in their reading. But this pattern also raises a larger academic question: Why do we still partition the literary canon according to nationalist traditions? Is this really the most intellectually satisfying and authentic approach to literary studies?

For an example of how disciplinary blinders can affect scholars as well-read as Bloom, we need only turn back to his article, where we find Byron described as “the eternal archetype of the celebrity, the Napoleon of the realms of rhyme... the still unique celebrity of the modern world.” What such hyperbole masks is the fact that the model for such literary celebrity is in reality to be located in another author, who unfortunately did not have the good sense to be born in England. Indeed, anyone familiar with the inordinate fame of Jean-Jacques Rousseau knows that he was the first genuine literary celebrity, lionized and sought out across Europe, much to his growing despair and paranoia (as a brilliant study by the historian Antoine Lilti details). Byron himself was smitten by Rousseau, touring the Lac Léman with his friend Shelley to visit the sites from Julie, ou la nouvelle Héloïse. Rousseau may not have provided his public with the same devilish scandals as the naughty Lord, but his Confessions, with their admission of a fondness for spankings and exhibitionism, were sultry enough.

Bloom is certainly no provincial, and his own published version of The Western Canon includes German, Spanish, French, and Italian works – although this canon, too, is heavily tilted toward English authors. But can this be avoided? No doubt French scholars would produce a version of the canon equally tilted toward the French, just as scholars from other nations would privilege their own authors. To an extent, this literary patriotism is normal and understandable: every culture values its heritage, and will expend more energy and resources promoting it.

From the viewpoint of literary history, however, such patriotism is also intellectually wrongheaded. To be sure, writers are often marked most strongly by their compatriots: one must read Dante to understand Boccaccio, Corneille to understand Racine, or, as Bloom would have us believe, Whitman to understand T. S. Eliot. But such a vertical reading of literature (which Bloom himself mapped out in The Anxiety of Influence) overlooks the equally – sometimes far more – important horizontal ties that connect authors across national borders. T. S. Eliot may have been “hopelessly evasive about Whitman while endlessly revising him in [his] own major poems,” yet by Eliot’s own admission, the French school of symbolist poetry had a far greater impact on his work. Some of Eliot’s first published poems, in fact, were written in French. Conversely, the French novelist Claude Simon may have endlessly revised Proust, but his own major novels – such as La route des Flandres and L’herbe – owe far more to William Faulkner. Such examples could be multiplied ad infinitum: they are, in fact, the stuff that literary history is made of.

To this criticism, English professors have a ready-made answer: Go study comparative literature! But they have only half a point. Comp lit programs are designed to give students a great deal of flexibility: their degrees may impose quotas for the number of courses taken in foreign language departments, but rarely, if ever, do comp lit programs build curricular requirements around literary history. Yet that is precisely the point: Students wishing to study English Romanticism ought to have more than Wikipedia-level knowledge about German Idealist philosophy and Romantic poetry; students interested in the 18th-century English novel should be familiar with the Spanish picaresque tradition; and so on and so forth. Comp lit alone cannot break down the walls of literary protectionism.

The fact that we even have comp lit departments reveals our ingrained belief that “comparing” literary works or traditions is merely optional. Despite Bloom’s own defense of a “Western canon,” such a thing no longer exists for most academics. This is not because the feminists, post-colonialists, or post-modernists managed to deconstruct it, but rather because our institutions for literary studies have gerrymandered the canon, department by department. Is it not shocking that students can major in English at many colleges without ever having read a single book written in a foreign language? Even in translation? (Consider, by contrast, that history majors, even those who wish to study only the American Revolution, are routinely required to take courses on Asian, African, and/or European history, in many different time periods, to boot.) Given that English is the natural home for literary-minded students who are not proficient in another language, it is depressing that they can graduate from college with the implicit assumption that literature is the prerogative of the English-speaking peoples, a habeas corpus of the arts.

But wait a minute: how dare I criticize English curriculums for not including foreign works, when the major granted by my own department, French, is not exactly brimming with German, Russian, or Arabic texts, either? To the extent that French (or any other foreign language) is a literature major, this point is well taken. But there are differences, too. First, it is far more likely that our students will have read and studied English literature at some point in high school and college. They will thus already have had some exposure, at least, to another national canon. Second, and more importantly, a French, Spanish, or Chinese major is more than a literature major: it is to no small degree a foreign language major, meaning that the students must master an entire other set of linguistic skills. Finally, language departments are increasingly headed toward area studies. German departments routinely offer classes on Marx, Nietzsche, and Freud, none of whom are technically literary authors. Foreign language departments are sometimes the only places in a university where once-important scholarly traditions can still be studied: Lévi-Strauss’s Tristes tropiques probably features on reading exam lists more often in French than in anthropology departments. A model for such an interdisciplinary department already exists in Classics.

I do not wish to suggest that English professors are to blame for the Anglicization of literature in American universities: they reside, after all, in English departments, and can hardly be expected to teach courses on Russian writers. The larger problem is institutional, as well as methodological. But it bears emphasizing that this problem does not only affect undergraduates, and can lead to serious provincialism in the realm of research, as well. An English doctoral student who works on the Enlightenment once openly confessed to me that she had not read a single French text from that period. No Montesquieu, no Voltaire, no Rousseau, no Diderot, rien. Sadly, this tendency does not seem restricted to graduate students, either.

Literary scholars are not blind to this problem: a decade ago, Franco Moretti challenged his colleagues to study “world literature” rather than local, national, or comparative literatures. He also outlined the obvious difficulty: “I work on West European narrative between 1790 and 1930, and already feel like a charlatan outside of Britain or France. World literature?” While the study of world literature presents an opportunity for innovative methodologies (some of which were surveyed in a recent issue of New Literary History), students already struggling to master a single national literary history will no doubt find such global ambitions overwhelming.

What, then, is to be done? Rearranging the academic order of knowledge can be a revolutionary undertaking, in which ideals get trampled in administrative terror. And prescribing a dose of world literature may ultimately be too strong a medicine for the malady that ails literary studies, particularly at the undergraduate level. In fact, a number of smaller measures might improve matters considerably. To begin with, literature professors could make a greater effort to incorporate works from other national literatures in their courses. Where the funds are available, professors from neighboring literature departments could team-teach such hybrid reading lists. Second, language and literature majors could also require that a number of courses be taken in two or three other literature departments. A model for this arrangement already exists at Stanford, where the English department recently launched an “English Literature and Foreign Language Literature” major, which includes “a coherent program of four courses in the foreign literature, read in the original.” To fulfill this last condition, of course, colleges would have to become more serious about their foreign-language requirements. Finally, literature students would be better served if colleges and universities offered a literature major, as is notably the case at Yale, UC San Diego, and UC Santa Cruz. Within this field of study, students could specialize in a particular period, genre, author, or even language, all the while taking into account the larger international or even global context.

Will such measures suffice to pull down the iron curtain dividing the literary past? Unless they manage to infiltrate the scholarly mindset of national-literature professors, probably not. Then again, as many of us know firsthand, teaching often does transform (or at least inform) our research interests. A case could of course be made for more radical measures, such as the fusion of English and foreign language departments into a single “Literature Department,” as exists at UC San Diego. But enacting this sort of bureaucratic coup carries a steep intellectual (not to mention political) price. It would be unfortunate, for instance, to inhibit foreign literature departments from developing their area-studies breadth, and from building bridges with philosophy, history, anthropology, sociology, religious studies, political science, and international relations. English departments, moreover, are developing in similar, centrifugal directions: in addition to teaching their own majors, English departments contribute more widely to the instruction of writing (including creative writing), and have their own ties with Linguistics and Communications departments. This existing segmentation of the university may appear messy, but has the benefit of preventing new walls from being erected, this time between neighboring disciplines.

Dan Edelstein

Dan Edelstein is assistant professor of French at Stanford University.

Secret Language

Umberto Eco writes somewhere that a sign may be defined as anything that can be used to tell a lie. This remark sounds cynical. But it’s really just the most extreme formulation of something implied by events in the Garden of Eden.

As you may recall, God gives Adam the power of speech and tells him to name “every living creature.” And so Adam does – not excluding the creature he calls “Woman.” But the next time we find Adam using language in Genesis, it is not exactly for its denotative properties. Having palpably annoyed the Almighty, he blames everything on the woman, hoping thereby to avoid the consequences of his own actions. (Like that’s going to work.)

Leaving aside the gender studies implication, this is interesting as a fable about communication itself. The power to symbolize gives rise, in very short order, to the ability to conceal.

Such was not really the point of the story, as I recall it from Sunday school anyway, but it came to mind while reading Barry J. Blake’s Secret Language: Codes, Tricks, Spies, Thieves, and Witchcraft, just published by Oxford University Press. The subtitle is both accurate and cryptic, and provides only a hint of just how intriguing and diverting a book this is. I suppose you could read it straight through, but Secret Language feels much more like a volume for dipping into when the mood strikes or the time permits. The author is an emeritus professor of linguistics at La Trobe University in Australia; and his book qualifies as a learned miscellany, rather than a monograph.

Blake catalogs a variety of techniques used to conceal or disguise meaning; or to limit its circulation, or to draw out the hidden powers of language itself. They include cryptography, Kabbalah, crossword puzzles, riddles, anagrams, magic spells, slang, and the private languages that emerge within tightly knit groups.

Part of the fascination of the book comes from noticing how often these modes of concealment resemble one another, or bleed together.

The ability to store information by writing it down (in effect, concealing it from the eyes of the illiterate) was once rare enough to make it a close kin of sorcery. And so the properties of the written word were themselves virtually occultic. The power of an amulet might come from a tiny scroll inside, running it like a little Pentium processor; and the text on that scroll could be baffling unless you knew to look for the anagram spelled out by the first letter in each word.

Or words might be written backwards – whether to conceal their meaning or to reverse their magical properties. Reversal of words could also have a more purely commercial application, in the case of “back slang,” used among food vendors in London since the early 19th century. “It is essentially a system of enciphering words,” Blake explains, “by taking the written forms and pronouncing them backwards.” Thus “fish” becomes “shif,” “old” turns into “delo,” and “no good” is “on dog.” So a couple of merchants could talk candidly (“The shif is delo and on dog”) without customers being any the wiser.

Any given occupation will tend to generate its own jargon, mostly for the sake of convenience rather than to keep outsiders in the dark. But in some cases, a self-consciously “professionalized” diction amounts to a form of concealment – if only of total vacuity.

Blake gives a fine example of almost meaning-free writing from a memo by a school administrator: “Care is taken to avoid creating new categories of high staff turnover schools in regional areas not within defined categories of remoteness in determination of hard-to-staff schools developed under the total recruitment strategy.”

Well, one would hope.

Secret Language is dense with examples and generous with explanations, but light on general ideas (philosophical, sociological, or otherwise) designed to subsume the varieties of linguistic concealment.

That’s OK. There are theoretical works aplenty meditating on Power and Language and the Secret. It is good to have a book that gives you something to ponder without being ponderous itself. And it can be recommended in particular to anyone disposed to find language itself, as such, a source of pleasure.

The author quotes a fairly conservative British lexicographer, H.W. Fowler, who in the 1920s offered a surprising defense of slang. At one level, a piece of slang is like any other variety of “secret language”: it keeps access to meaning restricted to initiates. But it is also – again, like any other form of discursive concealment – an expression of inventiveness. It is, in Fowler’s words, “the diction that results from the favorite game among the young and lively of playing with words and renaming things and actions; some invent new words, or mutilate or misapply the old, for the pleasure of novelty, and others catch up on such words for the pleasure of being in fashion.”

Or to put it a different way, it is evidence that we’re never quite done naming the world.

Scott McLemee

Open Letter to SUNY Albany

The following letter to George M. Philip, the president of the State University of New York at Albany, prompted by the proposed elimination there of programs in French, Italian, Russian, classics and theater, was originally a blog post at Genome Biology and is reprinted here with permission of the author.

Dear President Philip,

Probably the last thing you need at this moment is someone else from outside your university complaining about your decision. If you want to argue that I can't really understand all aspects of the situation, never having been associated with SUNY Albany, I wouldn't disagree. But I cannot let something like this go by without weighing in. I hope, when I'm through, you will at least understand why.

On October 1st, you announced that the departments of French, Italian, Classics, Russian and Theater Arts were being eliminated. You gave several reasons for your decision, including that there are comparatively few students enrolled in these degree programs. Of course, your decision was also, perhaps chiefly, a cost-cutting measure -- in fact, you stated that this decision might not have been necessary had the state legislature passed a bill that would have allowed your university to set its own tuition rates. Finally, you asserted that the humanities were a drain on the institution financially, as opposed to the sciences, which bring in money in the form of grants and contracts.

Let's examine these and your other reasons in detail, because I think if one does, it becomes clear that the facts on which they are based have some important aspects that are not covered in your statement. First, the matter of enrollment. I'm sure that relatively few students take classes in these subjects nowadays, just as you say. There wouldn't have been many in my day, either, if universities hadn't required students to take a distribution of courses in many different parts of the academy -- humanities, social sciences, the fine arts, the physical and natural sciences -- and to attain minimal proficiency in at least one foreign language. You see, the reason that humanities classes have low enrollment is not because students these days are clamoring for more relevant courses; it's because administrators like you, and spineless faculty, have stopped setting distribution requirements and started allowing students to choose their own academic programs -- something I feel is a complete abrogation of the duty of university faculty as teachers and mentors. You could fix the enrollment problem tomorrow by instituting a mandatory core curriculum that included a wide range of courses.

Young people haven't, for the most part, yet attained the wisdom to have that kind of freedom without making poor decisions. In fact, without wisdom, it's hard for most people. That idea is thrashed out better than anywhere else, I think, in Dostoyevsky's parable of the Grand Inquisitor, which is told in Chapter Five of his great novel, The Brothers Karamazov. In the parable, Christ comes back to earth in Seville at the time of the Spanish Inquisition. He performs several miracles but is arrested by Inquisition leaders and sentenced to be burned at the stake. The Grand Inquisitor visits Him in his cell to tell Him that the Church no longer needs Him. The main portion of the text is the Inquisitor explaining why. The Inquisitor says that Jesus rejected the three temptations of Satan in the desert in favor of freedom, but he believes that Jesus has misjudged human nature. The Inquisitor says that the vast majority of humanity cannot handle freedom. In giving humans the freedom to choose, Christ has doomed humanity to a life of suffering.

That single chapter in a much longer book is one of the great works of modern literature. You would find a lot in it to think about. I'm sure your Russian faculty would love to talk with you about it -- if only you had a Russian department, which now, of course, you don't.

Then there's the question of whether the state legislature's inaction gave you no other choice. I'm sure the budgetary problems you have to deal with are serious. They certainly are at Brandeis University, where I work. And we, too, faced critical strategic decisions because our income was no longer enough to meet our expenses. But we eschewed your draconian -- and authoritarian -- solution, and a team of faculty, with input from all parts of the university, came up with a plan to do more with fewer resources. I'm not saying that all the specifics of our solution would fit your institution, but the process sure would have. You did call a town meeting, but it was to discuss your plan, not to let the university craft its own. And you called that meeting for Friday afternoon on October 1st, when few of your students or faculty would be around to attend. In your defense, you called the timing "unfortunate," but pleaded that there was a "limited availability of appropriate large venue options." I find that rather surprising. If the president of Brandeis needed a lecture hall on short notice, he would get one. I guess you don't have much clout at your university.

It seems to me that the way you went about it couldn't have been more likely to alienate just about everybody on campus. In your position, I would have done everything possible to avoid that. I wouldn't want to end up in the 9th Bolgia (ditch of stone) of the 8th Circle of the Inferno, where the great 14th century Italian poet Dante Alighieri put the sowers of discord. There, as they struggle in that pit for all eternity, a demon continually hacks their limbs apart, just as in life they divided others.

The Inferno is the first book of Dante's Divine Comedy, one of the great works of the human imagination. There's so much to learn from it about human weakness and folly. The faculty in your Italian department would be delighted to introduce you to its many wonders -- if only you had an Italian department, which now, of course, you don't.

And do you really think even those faculty and administrators who may applaud your tough-minded stance (partly, I'm sure, in relief that they didn't get the axe themselves) are still going to be on your side in the future? I'm reminded of the fable by Aesop of the Travelers and the Bear: two men were walking together through the woods, when a bear rushed out at them. One of the travelers happened to be in front, and he grabbed the branch of a tree, climbed up, and hid himself in the leaves. The other, being too far behind, threw himself flat down on the ground, with his face in the dust. The bear came up to him, put his muzzle close to the man's ear, and sniffed and sniffed. But at last with a growl the bear slouched off, for bears will not touch dead meat. Then the fellow in the tree came down to his companion, and, laughing, said "What was it that the bear whispered to you?" "He told me," said the other man, "never to trust a friend who deserts you in a pinch."

I first learned that fable, and its valuable lesson for life, in a freshman classics course. Aesop is credited with literally hundreds of fables, most of which are equally enjoyable -- and enlightening. Your classics faculty would gladly tell you about them, if only you had a classics department, which now, of course, you don't.

As for the argument that the humanities don't pay their own way, well, I guess that's true, but it seems to me that there's a fallacy in assuming that a university should be run like a business. I'm not saying it shouldn't be managed prudently, but the notion that every part of it needs to be self-supporting is simply at variance with what a university is all about. You seem to value entrepreneurial programs and practical subjects that might generate intellectual property more than you do "old-fashioned" courses of study. But universities aren't just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment.

There is good reason for it: what seems to be archaic today can become vital in the future. I'll give you two examples of that. The first is the science of virology, which in the 1970s was dying out because people felt that infectious diseases were no longer a serious health problem in the developed world, and other subjects, such as molecular biology, were much sexier. Then, in the early 1990s, a little problem called AIDS became the world's No. 1 health concern. The virus that causes AIDS was first isolated and characterized at the National Institutes of Health in the United States and the Institut Pasteur in France, because these were among the few institutions that still had thriving virology programs.

My second example you will probably be more familiar with. Middle Eastern studies, including the study of foreign languages such as Arabic and Persian, was hardly a hot subject on most campuses in the 1990s. Then came September 11, 2001. Suddenly we realized that we needed a lot more people who understood something about that part of the world, especially its Muslim culture. Those universities that had preserved their Middle Eastern studies departments, even in the face of declining enrollment, suddenly became very important places. Those that hadn't -- well, I'm sure you get the picture.

I know one of your arguments is that not every place should try to do everything. Let other institutions have great programs in classics or theater arts, you say; we will focus on preparing students for jobs in the real world. Well, I hope I've just shown you that the real world is pretty fickle about what it wants. The best way for people to be prepared for the inevitable shock of change is to be as broadly educated as possible, because today's backwater is often tomorrow's hot field. And interdisciplinary research, which is all the rage these days, is only possible if people aren't too narrowly trained. If none of that convinces you, then I'm willing to let you turn your institution into a place that focuses on the practical, but only if you stop calling it a university and yourself the president of one. You see, the word "university" derives from the Latin universitas, meaning "the whole." You can't be a university without having a thriving humanities program. You will need to call SUNY Albany a trade school, or perhaps a vocational college, but not a university. Not anymore.

I utterly refuse to believe that you had no alternative. It's your job as president to find ways of solving problems that do not require the amputation of healthy limbs. Voltaire said that no problem can withstand the assault of sustained thinking. Voltaire, whose real name was François-Marie Arouet, had a lot of pithy, witty and brilliant things to say (my favorite is "God is a comedian playing to an audience that is afraid to laugh"). Much of what he wrote would be very useful to you. I'm sure the faculty in your French department would be happy to introduce you to his writings, if only you had a French department, which now, of course, you don't.

I guess I shouldn't be surprised that you have trouble understanding the importance of maintaining programs in unglamorous or even seemingly "dead" subjects. From your biography, you don't actually have a Ph.D. or other advanced degree, and have never really taught or done research at a university. Perhaps my own background will interest you. I started out as a classics major. I'm now professor of biochemistry and chemistry. Of all the courses I took in college and graduate school, the ones that have benefited me the most in my career as a scientist are the courses in classics, art history, sociology, and English literature. These courses didn't just give me a much better appreciation for my own culture; they taught me how to think, to analyze, and to write clearly. None of my science courses did any of that.

One of the things I do now is write a monthly column on science and society. I've done it for over 10 years, and I'm pleased to say some people seem to like it. If I've been fortunate enough to come up with a few insightful observations, I can assure you they are entirely due to my background in the humanities and my love of the arts.

One of the things I've written about is the way genomics is changing the world we live in. Our ability to manipulate the human genome is going to pose some very difficult questions for humanity in the next few decades, including the question of just what it means to be human. That isn't a question for science alone; it's a question that must be answered with input from every sphere of human thought, including -- especially including -- the humanities and arts. Science unleavened by the human heart and the human spirit is sterile, cold, and self-absorbed. It's also unimaginative: some of my best ideas as a scientist have come from thinking and reading about things that have, superficially, nothing to do with science. If I'm right that what it means to be human is going to be one of the central issues of our time, then universities that are best equipped to deal with it, in all its many facets, will be the most important institutions of higher learning in the future. You've just ensured that yours won't be one of them.

Some of your defenders have asserted that this is all a brilliant ploy on your part -- a master political move designed to shock the legislature and force them to give SUNY Albany enough resources to keep these departments open. That would be Machiavellian (another notable Italian writer, but then, you don't have any Italian faculty to tell you about him), certainly, but I doubt that you're that clever. If you were, you would have held that town meeting when the whole university could have been present, at a place where the press would be all over it. That's how you force the hand of a bunch of politicians. You proclaim your action on the steps of the state capitol. You don't try to sneak it through in the dead of night, when your institution has its back turned.

No, I think you were simply trying to balance your budget at the expense of what you believe to be weak, outdated and powerless departments. I think you will find, in time, that you made a Faustian bargain. Faust is the title character in a play by Johann Wolfgang von Goethe. It was written around 1800 but still attracts the largest audiences of any play in Germany whenever it's performed. Faust is the story of a scholar who makes a deal with the devil. The devil promises him anything he wants as long as he lives. In return, the devil will get -- well, I'm sure you can guess how these sorts of deals usually go. If only you had a theater department, which now, of course, you don't, you could ask them to perform the play so you could see what happens. It's awfully relevant to your situation. You see, Goethe believed that it profits a man nothing to give up his soul for the whole world. That's the whole world, President Philip, not just a balanced budget. Although, I guess, to be fair, you haven't given up your soul. Just the soul of your institution.


Disrespectfully yours,

Gregory A. Petsko


Gregory A. Petsko is the Gyula and Katica Tauber Professor of Biochemistry and Chemistry and chair of biochemistry at Brandeis University.

As Others See Us

A genome biologist, Gregory Petsko, has gone to bat for the humanities, in an open letter to the State University of New York at Albany president who recently (and underhandedly) announced significant cuts. (For those who haven’t been paying attention: the departments of theater, Italian, Russian, classics, and French at SUNY-Albany are all going to be eliminated).

If you are in academia, and Petsko’s missive (which appeared on this site Monday) hasn’t appeared on your Facebook wall, it will soon. And here’s the passage that everyone seizes on, evidence that Petsko understands us and has our back (that is, we in the humanities): "The real world is pretty fickle about what it wants. The best way for people to be prepared for the inevitable shock of change is to be as broadly educated as possible, because today's backwater is often tomorrow's hot field. And interdisciplinary research, which is all the rage these days, is only possible if people aren't too narrowly trained."

He's right. And if scientists want to speak up for the humanities, I’m all for it. But Petsko understands us differently than we understand ourselves. Why fund the humanities, even if they don’t bring in grant money or produce patents? Petsko points out that "universities aren't just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment."

How many of us willingly embrace that interpretation of what we do? "My interest is not merely antiquarian...." is how we frame the justification for our cutting-edge research. Even as we express our dismay when crucial texts go out of print, any sacred flame that we were tending was blown out when the canon wars were fought to a draw. Why should we resurrect it? Because, says Petsko, "what seems to be archaic today can become vital in the future." His examples are virology and Middle Eastern studies. Mine is 18th-century literature — and with all the imaginative vigor at my disposal, I have trouble discerning the variation on the AIDS scare or 9/11 that would revive interest in my field. That’s OK, though: Petsko has other reasons why the humanities matter:

"Our ability to manipulate the human genome is going to pose some very difficult questions for humanity in the next few decades, including the question of just what it means to be human. That isn't a question for science alone; it's a question that must be answered with input from every sphere of human thought, including -- especially including -- the humanities and arts... If I'm right that what it means to be human is going to be one of the central issues of our time, then universities that are best equipped to deal with it, in all its many facets, will be the most important institutions of higher learning in the future."

Well, that would be great. I have no confidence, though, that we in the humanities are positioned to take advantage of this dawning world, even if our departments escape SUNY-style cost-cutting. How many of us can meaningfully apply what we do to "the question of just what it means to be human" without cringing, or adopting an ironic pose, or immediately distancing ourselves from that very question? How many of us see our real purpose as teaching students to draw the kinds of connections between literature and life that Petsko uses to such clever effect in his diatribe?

Petsko is not necessarily right in his perception of what the humanities are good for, nor are professionals in the humanities necessarily wrong to pursue another vision of what our fields are about. But there is a profound disconnect between how we see ourselves (and how our work is valued and remunerated in the university and how we organize our professional lives to respond to those expectations) and how others see us. If we're going to take comfort in the affirmations of Petsko and those outside of the humanities whom he speaks for, perhaps we need to take seriously how he understands what we do. Perhaps the future is asking something of us that we are not providing — or perhaps we need to do a better job of explaining why anyone other than us should care about what we do.

Kirstin Wilcox

Kirstin Wilcox is senior lecturer in English at the University of Illinois at Urbana-Champaign.

Foreign Language for Foreign Policy?

These are troubled times for language programs in the United States, which have been battered by irresponsible cutbacks at all levels. Despite the chatter about globalization and multilateralism that has dominated public discourse in recent years, leaders in government and policy circles continue to live in a bubble of their own making, imagining that we can be global while refusing to learn the languages or learn about the cultures of the rest of the world. So it was surely encouraging that Richard Haass, president of the Council on Foreign Relations and a fixture of the foreign policy establishment, agreed to deliver the keynote address at the American Council on the Teaching of Foreign Languages Annual Convention in Boston on November 19.

Haass is a distinguished author, Oberlin- and Oxford-educated, and an influential voice in American debates. The good news is that in his talk, "Language as a Gateway to Global Communities," Haass expressed strong support for increased foreign language learning opportunities. He recognized the important work that language instructors undertake as well as the crucial connection between language and culture: language learning is not just technical mastery of grammar but rather, in his words, a "gateway" to a thorough understanding of other societies. We in the language learning community should take heed and be sure to build curriculums that provide systematic introductions to those histories, political systems, and ways of life. The Modern Language Association has made curricular recommendations along these lines in the report "Foreign Languages and Higher Education," which ACTFL President Eileen Glisan praised in her remarks that preceded the keynote address.

Haass claims that in an era of tight budgets, we need convincing arguments to rally support for languages. Of course that's true, but -- and this is the bad news -- despite his support for language as a gateway to other cultures, he countenances only a narrowly instrumental defense of foreign language learning, limited to two rationales: national security and the global economy. At the risk of schematizing his account too severely, this means: more Arabic for national security and more Mandarin, Hindi, and, en passant, Korean for the economy. It appears that in his view the only compelling arguments for language-learning involve equipping individual Americans to be better vehicles of national interest as defined by Washington. In fact, at a revealing moment in the talk, Haass boiled his own position down to a neat choice: Fallujah or Firenze. We need more Arabic to do better in Fallujah, i.e., so we could have been more effective in the Iraq War (or could be in the next one?), and we need less Italian because Italy (to his mind) is a place that is only about culture.

In this argument, Italian — like other European languages — is a luxury. There was no mention of French as a global language, with its crucial presence in Africa and North America. Haass even seems to regard Spanish as just one more European language, except perhaps that it might be useful to manage instability in Mexico. Such arguments that reduce language learning to foreign policy objectives get too simple too quickly. And they run the risk of destroying the same foreign language learning agenda they claim to defend. Language learning in Haass's view ultimately becomes just a boot camp for our students to be better soldiers, more efficient in carrying out the projects of the foreign policy establishment. That program stands in stark contrast to a vision of language learning as part of an education of citizens who can think for themselves.

Haass’s account deserves attention: he is influential and thoughtful, and he is by no means alone in reducing the rationale for foreign language learning solely to national foreign policy needs. Yet why should all local educational decisions be subject to Washington approval? Moreover, given the poor track record of foreign policy leaders in anticipating national needs, why should we suddenly treat their analyses as the touchstone for curricular planning? And, finally, the contribution of language learning to student intellectual growth is too large, complex and dynamic to be squeezed onto the menu of skill sets the government imagines it might need in the future.

Yet even on his own instrumental terms, Haass seemed to get it wrong. If language learning were primarily about plugging into large economies more successfully, then we should be offering more Japanese and German (still two very big economies after all), but they barely showed up on his map.

The much more important issue involves getting beyond instrumental thinking altogether, at least in the educational sphere. Second language acquisition is a key component of education because it builds student ability in language as such. Students who do well in a second language do better in their first language. With the core language skills — abilities to speak and to listen, to read and to write — come higher-order capacities: to interpret and understand, to recognize cultural difference, and, yes, to appreciate traditions, including one’s own. Language learning is not just an instrumental skill, any more than one's writing ability is merely about learning to type on a keyboard. On the contrary, through language we become better thinkers, and that’s what education is about, at least outside Washington.

Russell A. Berman

Russell A. Berman is vice president of the Modern Language Association and professor of comparative literature and German studies at Stanford University.


I was a graduate student in the 1980s, during the heyday of the so-called “culture wars” and the curricular attacks on "Western civilization." Those days were punctuated by some Stanford students chanting slogans like "Hey hey, ho ho, Western Civ has got to go," and by fiery debates about Allan Bloom’s book The Closing of the American Mind, which appeared in 1987, toward the end of my years in graduate school. Back then the battle lines seemed clear: conservatives were for Western civilization courses and the traditional literary canon, while liberals and progressives were against those things and for a new, more liberating approach to education.

In retrospect I find that decade and its arguments increasingly difficult to comprehend, even though I experienced them firsthand. I ask myself: What on earth were we thinking? Exactly why was it considered progressive in the 1980s to get rid of courses like Western civilization (courses that frequently included both progressives and conservatives on their reading lists)? And why did supporting a traditional liberal arts education automatically make one a conservative — especially if such an education included philosophers like Jean-Jacques Rousseau and Karl Marx?

A quarter of a century later, with the humanities in crisis across the country and students and parents demanding ever more pragmatic, ever more job-oriented kinds of education, the curricular debates of the 1980s over courses about Western civilization and the canon seem as if they had happened on another planet, with completely different preconceptions and assumptions than the ones that prevail today. We now live in a radically different world, one in which most students are not forced to take courses like Western civilization or, most of the time, in foreign languages or cultures, or even the supposedly more progressive courses that were designed to replace them. And whereas as late as the 1980s English was the most popular major at many colleges and universities, by far the most popular undergraduate major in the country now is business.

The battle between self-identified conservatives and progressives in the 1980s seems increasingly like rearranging the deck chairs on the Titanic. While humanists were busy arguing amongst themselves, American college students and their families were turning in ever-increasing numbers away from the humanities and toward seemingly more pragmatic, more vocational concerns.

And who can really blame them? If humanists themselves could not even agree on the basic value, structure, and content of a liberal arts education — if some saw the tradition of Western civilization as one of oppression and tyranny, while others defended and validated it; if some argued that a humanistic education ought to be devoted to the voices of those previously excluded from "civilized" discussion, such as people of color and women, while others argued that such changes constituted a betrayal of the liberal arts — is it any wonder that students and their families began turning away from the humanities?

After all, economics and business professors did not fight about the basic structure of business or economics majors, even though there were differences between Keynesian and Friedmanite economists, for instance, over monetary policy. And physics professors did not engage in fundamental debates about physics curriculums — which should one teach, quantum mechanics or relativity? — in spite of Einstein’s problems with quantum mechanics ("God does not play dice with the universe"). In the 1980s the humanities as a whole seemed to be the only field where even experts were unable to agree on what constituted the appropriate object of study.

If I go to a doctor’s office and witness doctors and nurses fighting about whether or not I should take a particular medication, I’m likely to go elsewhere for my health care needs. I think something analogous happened to the humanities in the 1980s, and it is continuing to happen today, although by now the humanities are so diminished institutionally that these changes no longer have the overall significance they had in the 1980s. In the 1980s the humanities still constituted the core of most major universities; by now, at most universities, even major ones, the humanities are relatively marginal, far surpassed, in institutional strength, by business, medical, and law schools.

One of the core functions of the humanities for centuries was the passing down of a tradition from one generation to the next. The idea behind Western civilization courses was supposed to be that students needed them in order to understand the origins and development of their own culture. In the 1980s three developments worked against that idea. The first was an educational establishment that was no longer content simply to pass knowledge down from one generation to the next, and that wanted to create new knowledge. The second development, which dovetailed with the first, was the emergence of new approaches to the humanities that examined structures of oppression and domination in traditions previously viewed as unimpeachable. One could examine women's history, for instance, or non-Western cultures. The third development, which dovetailed with the first and second, was the increasing demand for “relevance” in higher education, with "relevance" being understood as present-oriented and pragmatic, i.e. job-related.

The conflation of these three developments led to the widespread perception — and not just among self-proclaimed progressives — that anything traditional or old was also, almost by definition, conservative, fuddy-duddy, and impractical. In essence those three developments have now long since triumphed, and the educational world of today is largely the result of that triumph.

Unfortunately, however, traditions that are not passed on from one generation to the next die. If an entire generation grows up largely unexposed to a particular tradition, then that tradition can in essence be said to be dead, because it is no longer capable of reproducing itself. It does not matter whether the tradition in question is imagined as the Western tradition, the Christian tradition, or the Marxist tradition (and of course both Christianity and Marxism are part of the Western tradition). Traditions are like languages: if they are not passed on, they die. Most traditions, of course, have good and bad elements in them (some might argue for Christianity, some for Marxism, relatively few for both), and what dies when a tradition dies is therefore often both good and bad, no matter what one’s perspective. But what also dies with a tradition is any possibility of self-critique from within the tradition (in the sense that Marxism, for instance, constituted a self-critique from within the Western tradition), since a tradition’s self-critique presupposes the existence of the tradition. Therefore the death of a tradition is not just the death of the oppression and tyranny that might be associated with the tradition, but also the death of progressive and liberating impulses within the tradition.

We all know, of course, that nature abhors a vacuum, and for that reason when a tradition dies, what fills in the vacuum where the tradition used to be is whatever is strongest in the surrounding culture. In our culture we know quite well what that is: the belief in money, in business, in economics, and in popular culture. That is our real religion, and it has largely triumphed over any tradition, either progressive or tyrannical. It is no more a coincidence that business is the most popular major in the United States today than it was that theology was one of the major fields of the 1700s.

As a result of the triumph of relevance and pragmatism over tradition, the ivy-covered walls of academia, which once seemed so separated from what is often called the “real world,” now offer very little protection from it. In fact the so-called "real world" almost entirely dominates the supposedly unreal world of academia. It may have once been true that academia offered at least a temporary sanctuary for American students on their way to being productive, hard-working contributors to a booming economy; now, however, academia offers very little refuge to students on their way into a shaky, shell-shocked economy where even the seemingly rock-solid belief in the “free market” has been thrown into question. In 1987 Allan Bloom wrote: "Education is not sermonizing to children against their instincts and pleasures, but providing a natural continuity between what they feel and what they can and should be. But this is a lost art. Now we have come to exactly the opposite point." Over two decades later, it seems to me that Bloom was right, and that indeed we have come “to exactly the opposite point.” Unfortunately now, neither self-styled conservatives nor self-styled progressives are likely to want to defend a vision of education that even in Bloom’s view was long gone. And sadder still is the fact that few of our students will even realize what has been lost.

And so I think we owe an apology to our students. We humanists inherited a tradition more or less intact, with all its strengths and weaknesses, but it appears highly likely that we will not be able or willing to pass it on to them. That is a signal failure, and it is one for which we will pay dearly. No doubt there is lots of blame to go around, but instead of looking around for people to blame, it would be more constructive to save what we can and pass it along to the next generation. They are waiting, and we have a responsibility.

Stephen Brockmann

Stephen Brockmann is professor of German at Carnegie Mellon University and president of the German Studies Association.

Translating Success

One of the most diverse foreign language programs in the country is at a community college in New Jersey.

Mapping Mandarin, Mohave and Miao-Mien

MLA's revamped language map offers precise look at who speaks what in every nook and cranny in the U.S.

Florida Pushes Foreign Languages

Chancellor seeks greater focus on Chinese and Portuguese -- through online instruction -- to prepare students for global economy.

