These are troubled times for language programs in the United States, which have been battered by irresponsible cutbacks at all levels. Despite the chatter about globalization and multilateralism that has dominated public discourse in recent years, leaders in government and policy circles continue to live in a bubble of their own making, imagining that we can be global while refusing to learn the languages or learn about the cultures of the rest of the world. So it was surely encouraging that Richard Haass, president of the Council on Foreign Relations and a fixture of the foreign policy establishment, agreed to deliver the keynote address at the American Council on the Teaching of Foreign Languages Annual Convention in Boston on November 19.
Haass is a distinguished author, Oberlin- and Oxford-educated, and an influential voice in American debates. The good news is that in his talk, "Language as a Gateway to Global Communities," Haass expressed strong support for increased foreign language learning opportunities. He recognized the important work that language instructors undertake as well as the crucial connection between language and culture: language learning is not just technical mastery of grammar but rather, in his words, a "gateway" to a thorough understanding of other societies. We in the language learning community should take heed and be sure to build curriculums that provide systematic introductions to those histories, political systems, and ways of life. The Modern Language Association has made curricular recommendations along these lines in the report "Foreign Languages and Higher Education," which ACTFL President Eileen Glisan praised in her remarks that preceded the keynote address.
Haass claims that in an era of tight budgets, we need convincing arguments to rally support for languages. Of course that's true, but -- and this is the bad news -- despite his support for language as a gateway to other cultures, he countenances only a narrowly instrumental defense of foreign language learning, limited to two rationales: national security and the global economy. At the risk of schematizing his account too severely, this means: more Arabic for national security and more Mandarin, Hindi, and, en passant, Korean for the economy. It appears that in his view the only compelling arguments for language learning involve equipping individual Americans to be better vehicles of national interest as defined by Washington. In fact, at a revealing moment in the talk, Haass boiled his own position down to a neat choice: Fallujah or Firenze. We need more Arabic to do better in Fallujah, i.e., so we could have been more effective in the Iraq War (or could be in the next one?), and we need less Italian because Italy (to his mind) is a place that is only about culture.
In this argument, Italian — like other European languages — is a luxury. There was no mention of French as a global language, with its crucial presence in Africa and North America. Haass even seems to regard Spanish as just one more European language, except perhaps that it might be useful to manage instability in Mexico. Such arguments that reduce language learning to foreign policy objectives get too simple too quickly. And they run the risk of destroying the same foreign language learning agenda they claim to defend. Language learning in Haass's view ultimately becomes just a boot camp for our students to be better soldiers, more efficient in carrying out the projects of the foreign policy establishment. That program stands in stark contrast to a vision of language learning as part of an education of citizens who can think for themselves.
Haass’s account deserves attention: he is influential and thoughtful, and he is by no means alone in reducing the rationale for foreign language learning solely to national foreign policy needs. Yet why should all local educational decisions be subject to Washington approval? Moreover, given the poor track record of foreign policy leaders in anticipating national needs, why should we suddenly treat their analyses as the touchstone for curricular planning? And, finally, the contribution of language learning to student intellectual growth is too large, too complex, and too dynamic to be squeezed onto the menu of skill sets the government imagines it might need in the future.
Yet even on his own instrumental terms, Haass seemed to get it wrong. If language learning were primarily about plugging into large economies more successfully, then we should be offering more Japanese and German (still two very big economies after all), but they barely showed up on his map.
The much more important issue involves getting beyond instrumental thinking altogether, at least in the educational sphere. Second language acquisition is a key component of education because it builds student ability in language as such. Students who do well in a second language do better in their first language. With the core language skills — abilities to speak and to listen, to read and to write — come higher-order capacities: to interpret and understand, to recognize cultural difference, and, yes, to appreciate traditions, including one’s own. Language learning is not just an instrumental skill, any more than one's writing ability is merely about learning to type on a keyboard. On the contrary, through language we become better thinkers, and that’s what education is about, at least outside Washington.
Russell A. Berman
Russell A. Berman is vice president of the Modern Language Association and professor of comparative literature and German studies at Stanford University.
I was a graduate student in the 1980s, during the heyday of the so-called “culture wars” and the curricular attacks on "Western civilization." Those days were punctuated by some Stanford students chanting slogans like "Hey hey, ho ho, Western Civ has got to go," and by fiery debates about Allan Bloom’s book The Closing of the American Mind, which appeared in 1987, toward the end of my years in graduate school. Back then the battle lines seemed clear: conservatives were for Western civilization courses and the traditional literary canon, while liberals and progressives were against those things and for a new, more liberating approach to education.
In retrospect I find that decade and its arguments increasingly difficult to comprehend, even though I experienced them firsthand. I ask myself: What on earth were we thinking? Exactly why was it considered progressive in the 1980s to get rid of courses like Western civilization (courses that frequently included both progressives and conservatives on their reading lists)? And why did supporting a traditional liberal arts education automatically make one a conservative — especially if such an education included philosophers like Jean-Jacques Rousseau and Karl Marx?
A quarter of a century later, with the humanities in crisis across the country and students and parents demanding ever more pragmatic, ever more job-oriented kinds of education, the curricular debates of the 1980s over courses about Western civilization and the canon seem as if they had happened on another planet, with completely different preconceptions and assumptions than the ones that prevail today. We now live in a radically different world, one in which most students are not forced to take courses like Western civilization or, most of the time, in foreign languages or cultures, or even the supposedly more progressive courses that were designed to replace them. And whereas as late as the 1980s English was the most popular major at many colleges and universities, by far the most popular undergraduate major in the country now is business.
The battle between self-identified conservatives and progressives in the 1980s seems increasingly like rearranging the deck chairs on the Titanic. While humanists were busy arguing among themselves, American college students and their families were turning in ever-increasing numbers away from the humanities and toward seemingly more pragmatic, more vocational concerns.
And who can really blame them? If humanists themselves could not even agree on the basic value, structure, and content of a liberal arts education — if some saw the tradition of Western civilization as one of oppression and tyranny, while others defended and validated it; if some argued that a humanistic education ought to be devoted to the voices of those previously excluded from "civilized" discussion, such as people of color and women, while others argued that such changes constituted a betrayal of the liberal arts — is it any wonder that students and their families began turning away from the humanities?
After all, economics and business professors did not fight about the basic structure of business or economics majors, even though there were differences between Keynesian and Friedmanite economists, for instance, over monetary policy. And physics professors did not engage in fundamental debates about physics curriculums — which should one teach, quantum mechanics or relativity? — in spite of Einstein’s problems with quantum mechanics ("God does not play dice with the universe"). In the 1980s the humanities as a whole seemed to be the only field where even experts were unable to agree on what constituted the appropriate object of study.
If I go to a doctor’s office and witness doctors and nurses fighting about whether or not I should take a particular medication, I’m likely to go elsewhere for my health care needs. I think something analogous happened to the humanities in the 1980s, and it is continuing to happen today, although by now the humanities are so diminished institutionally that these changes no longer have the overall significance they had in the 1980s. In the 1980s the humanities still constituted the core of most major universities; by now, at most universities, even major ones, the humanities are relatively marginal, far surpassed in institutional strength by business, medical, and law schools.
One of the core functions of the humanities for centuries was the passing down of a tradition from one generation to the next. The idea behind Western civilization courses was supposed to be that students needed them in order to understand the origins and development of their own culture. In the 1980s three developments worked against that idea. The first was an educational establishment that was no longer content simply to pass knowledge down from one generation to the next, and that wanted to create new knowledge. The second development, which dovetailed with the first, was the emergence of new approaches to the humanities that examined structures of oppression and domination in traditions previously viewed as unimpeachable. One could examine women's history, for instance, or non-Western cultures. The third development, which dovetailed with the first and second, was the increasing demand for “relevance” in higher education, with "relevance" being understood as present-oriented and pragmatic, i.e., job-related.
The conflation of these three developments led to the widespread perception — and not just among self-proclaimed progressives — that anything traditional or old was also, almost by definition, conservative, fuddy-duddy, and impractical. In essence those three developments have now long since triumphed, and the educational world of today is largely the result of that triumph.
Unfortunately, however, traditions that are not passed on from one generation to the next die. If an entire generation grows up largely unexposed to a particular tradition, then that tradition can in essence be said to be dead, because it is no longer capable of reproducing itself. It does not matter whether the tradition in question is imagined as the Western tradition, the Christian tradition, or the Marxist tradition (and of course both Christianity and Marxism are part of the Western tradition). Traditions are like languages: if they are not passed on, they die. Most traditions, of course, have good and bad elements in them (some might argue for Christianity, some for Marxism, relatively few for both), and what dies when a tradition dies is therefore often both good and bad, no matter what one’s perspective. But what also dies with a tradition is any possibility of self-critique from within the tradition (in the sense that Marxism, for instance, constituted a self-critique from within the Western tradition), since a tradition’s self-critique presupposes the existence of the tradition. Therefore the death of a tradition is not just the death of the oppression and tyranny that might be associated with the tradition, but also the death of progressive and liberating impulses within the tradition.
We all know, of course, that nature abhors a vacuum, and for that reason when a tradition dies, what fills in the vacuum where the tradition used to be is whatever is strongest in the surrounding culture. In our culture we know quite well what that is: the belief in money, in business, in economics, and in popular culture. That is our real religion, and it has largely triumphed over any tradition, either progressive or tyrannical. It is no more a coincidence that business is the most popular major in the United States today than it was that theology was one of the major fields of the 1700s.
As a result of the triumph of relevance and pragmatism over tradition, the ivy-covered walls of academia, which once seemed so separated from what is often called the “real world,” now offer very little protection from it. In fact the so-called "real world" almost entirely dominates the supposedly unreal world of academia. It may have once been true that academia offered at least a temporary sanctuary for American students on their way to being productive, hard-working contributors to a booming economy; now, however, academia offers very little refuge to students on their way into a shaky, shell-shocked economy where even the seemingly rock-solid belief in the “free market” has been thrown into question. In 1987 Allan Bloom wrote: "Education is not sermonizing to children against their instincts and pleasures, but providing a natural continuity between what they feel and what they can and should be. But this is a lost art. Now we have come to exactly the opposite point." Over two decades later, it seems to me that Bloom was right, and that indeed we have come “to exactly the opposite point.” Unfortunately now, neither self-styled conservatives nor self-styled progressives are likely to want to defend a vision of education that even in Bloom’s view was long gone. And sadder still is the fact that few of our students will even realize what has been lost.
And so I think we owe an apology to our students. We humanists inherited a tradition more or less intact, with all its strengths and weaknesses, but it appears highly likely that we will not be able or willing to pass it on to them. That is a signal failure, and it is one for which we will pay dearly. No doubt there is lots of blame to go around, but instead of looking around for people to blame, it would be more constructive to save what we can and pass it along to the next generation. They are waiting, and we have a responsibility.
Stephen Brockmann is professor of German at Carnegie Mellon University and president of the German Studies Association.
Vanessa Fonseca, now a graduate teaching assistant in the University of New Mexico’s Sabine Ulibarrí Spanish as a Heritage Language program, said it took her all of two minutes to figure out that a non-heritage Spanish class she stumbled into as an undergraduate was not for her.
Reynaldo Pol, a coordinator of English as a second language courses for adults in a suburban Atlanta county, knows firsthand what issues language instructors in his corner of the world face. When he decided it was time to go back to school, Pol, a Cuban by birth who grew up in Puerto Rico and received his bachelor’s degree at Georgia’s Piedmont College, decided he wanted to look more broadly, beyond borders.
Foreign language study in the United States has had many a “Sputnik moment,” as H. Jay Siskin, a French instructor at Cabrillo College, in California, put it -- that is, a moment that reveals an economic or military weakness and has been used as a call to arms to strengthen, among other things, language education.