Reporting on the Senate's confirmation of Theodore Mitchell as the U.S. Department of Education's chief higher education official, Inside Higher Ed quoted a statement from Secretary of Education Arne Duncan: “He will lead us through this important time in higher education as we continue to work toward the President’s goal to produce the best-educated, most competitive workforce in the world by 2020.” While this brief remark is hardly a major policy statement, its tone and focus are typical of the way Secretary Duncan, President Obama, and many others in politics these days talk about higher education.
This typical rhetoric, in Duncan’s statement and beyond, makes a good point, but it doesn't say enough. To explain why, I will take a leaf from Thucydides. In History of the Peloponnesian War, he explained that his apparent verbatim accounts of speeches by other figures really articulated what he thought they should have said. With due respect for Secretary Duncan and President Obama, here is what the Secretary of Education should have said, on behalf of the President's aims, on the confirmation of a new Under Secretary of Education in charge of higher education affairs:
He will lead us through this important time in higher education as we continue to work toward the president’s goals for higher education in making America a more productive economy, a more just society, a more flourishing democracy, and a richer environment for what the Founders called, in the Declaration of Independence, "the pursuit of happiness," and in the Preamble to the Constitution, "the general welfare."
A part of that economic goal is to produce the best-educated, most competitive workforce in the world by 2020. Another part is to ensure that higher education extends broadly the opportunity to develop the ingenuity and creativity that will drive American innovation in the years ahead.
That means working to ensure that higher education regains its function as an engine of socioeconomic advancement, both for the individual and for society as a whole. This means resisting the increasing stratification of curriculums and opportunities, making sure that the advantages of arts and sciences education are extended as far throughout higher education as possible. This is both prudent, to cultivate the nation's human capital, and just, to mitigate the disadvantages of less-privileged starting points.
Everyone knows that democracy depends on America's capacity to maintain a deliberative electorate, capable of making well-informed choices in a political system they understand and in which they actively participate. It is a responsibility of higher education to enhance this investment in America by helping maintain that electorate. It is a responsibility of government to promote that role.
Finally, when the Founders embraced such goals as "the pursuit of happiness" and securing "the general welfare" of the people, they acknowledged that the well-being of individuals and of society as a whole -- difficult as these concepts are to define -- are legitimate objects of government interest. Higher education has crucial responsibilities of exploration and discovery in this broad field of human well-being. It is here that the perennial American question concerning the scope and limits of government itself is to be explored, and handed on for inquiry to succeeding generations of Americans.
So on the appointment of a new Under Secretary with responsibilities toward higher education, we celebrate the many contributions of higher education to American flourishing: its role in contributing to a vibrant economy, certainly; and also its role in sustaining and advancing the broad aims of justice and improvement to which the country has always been committed.
That would have been good to hear from Secretary Duncan, and would be good to hear in any of the administration's speeches about higher education. None of us who are committed to this broader vision of higher education can ever, I emphasize, lose sight of its role in propelling the economy forward. But we cannot permit the purposes of higher education in America to be narrowed solely into the goal of workforce production. More is at stake: access to opportunity, cultivation of ingenuity and innovation, and broad contributions to the future of the country. Phi Beta Kappa joins many voices in advocacy of that vision. We invite Theodore Mitchell, Secretary Duncan, and President Obama to join, as well.
John Churchill is secretary of the Phi Beta Kappa Society.
Supporters of the American Studies Association’s call for a boycott of Israeli universities are distorting what the boycott is – and how it will affect academe. The "institutional boycott" is likely to function as a political test in a hidden form. It violates principles of academic freedom. And in practice, it has been, and is likely to continue to be, a campaign for the exclusion of individual scholars who work in Israel from the global academic community. It’s time to look with more care at the boycott and what it’s really about.
What the ASA Resolution Says
The ASA resolution reaffirms, in a general and abstract way, its support for the principle of academic freedom. It then says that it will “honor the call of Palestinian civil society for a boycott of Israeli academic institutions.” It goes on to offer guarantees that it will support the academic freedom of scholars who speak about Israel and who support the boycott; the implication here is that this refers to scholars who are opponents of Israel or of Israeli policy. The resolution does not specifically mention the academic freedom of individual Israeli scholars or students, nor does it mention protection for people to speak out against the boycott, nor does it say anything about the academic freedom of people to collaborate with Israeli colleagues.
What the ASA names "the call of Palestinian civil society for a boycott" is the Palestinian Campaign for the Academic and Cultural Boycott of Israel (PACBI) "Call for Academic and Cultural Boycott of Israel." The PACBI call explicitly says that the "vast majority of Israeli intellectuals and academics," that is to say individuals, have contributed to, or have been "complicit in through their silence," the Israeli human rights abuses which are the reasons given for boycott. There would be no sense in making this claim if no sanctions against individuals were envisaged. The PACBI guidelines state that "virtually all" Israeli academic institutions are guilty in the same way.
These claims about the collective guilt of Israeli academics and institutions are strongly contested empirically. Opponents of the boycott argue that Israeli academe is pluralistic and diverse and contains many individuals who explicitly oppose anti-Arab racism, Islamophobia and the military and civilian occupations of the West Bank. These claims about the guilt of Israeli academe are also contested by those who hold that the principle of collective guilt is a violation of the norms of the global academic community and of natural justice. Opponents of the boycott argue that academics and institutions should be judged by the content of their work and by the nature of their academic norms and practices, not by the state in which they are employed.
The PACBI guidelines go on to specify what is meant by the "institutional" boycott. "[T]hese institutions, all their activities, and all the events they sponsor or support must be boycotted." And "[E]vents and projects involving individuals explicitly representing these complicit institutions should be boycotted." The guidelines then offer an exemption for some other classes of individual as follows: "Mere institutional affiliation to the Israeli academy is therefore not a sufficient condition for applying the boycott."
A Political Test by Another Name
Refusing to collaborate with academics on the basis of their nationality is, prima facie, a violation of the norms of academic freedom and of the principle of the universality of science. It seems to punish scholars not for something related to their work, nor for something that they have done wrong, but because of who they are.
In 2002 Mona Baker, an academic in Britain, fired two Israelis from the editorial boards of academic journals that she owned and edited. Gideon Toury and Miriam Shlesinger are both well-respected internationally as scholars and also as public opponents of Israeli human rights abuses, but nevertheless they were "boycotted." The boycott campaign sought a more sophisticated formulation which did not appear to target individuals just for being Israeli.
In 2003, the formulation of the "institutional boycott" was put into action with a resolution to the Association of University Teachers (AUT), an academic trade union in Britain, that members should "sever any academic links they may have with official Israeli institutions, including universities." Yet in the same year, Andrew Wilkie, an Oxford academic, rejected an Israeli who applied to do a Ph.D. with him, giving as a reason that the applicant had served in the Israeli armed forces. The boycott campaign in the UK supported Andrew Wilkie against criticism which focused on his boycott of an individual who had no affiliation of any kind to an Israeli academic institution. If the principle were accepted that anybody who had served in the Israeli armed forces was to be boycotted, then virtually every Israeli Jew would be thus targeted.
In 2006 the boycott campaign took a new tack, offering an exemption from the boycott to Israelis who could demonstrate their political cleanliness. The other British academic union, NATFHE, called for a boycott of Israeli scholars who failed to "publicly dissociate themselves" from "Israel’s apartheid policies." The political test opened the campaign up to a charge of McCarthyism: the implementation of a boycott on this basis would require some kind of machinery to be set up to judge who was allowed an exemption and who was not. The assertion that Israel is "apartheid" is emotionally charged and strongly contested. While it is possible for such analogies to be employed carefully and legitimately, it is also possible for such analogies to function as statements of loyalty to the Palestinians. They sometimes function as short cuts to the boycott conclusion, and as ways of demonizing Israel, Israelis, and those who are accused of speaking on their behalf. In practice, the boycott campaign attempts to construct supporters of the boycott as friends of Palestine and opponents of the boycott as enemies of Palestine.
It is reasonable to assume that under the influence of the campaign for an "institutional boycott," much boycotting of individuals goes on silently and privately. It is also reasonable to assume that Israeli scholars may come to fear submitting papers to journals or conferences if they think they may be boycotted, explicitly or not; this would lead to a "self-boycott" effect. There are anecdotal examples of the kinds of things which are likely to happen under the surface even of an institutional boycott. An Israeli colleague contacted a British academic in 2008, saying that he was in town and would like to meet for a coffee to discuss common research interests. The Israeli was told that the British colleague would be happy to meet, but he would first have to disavow Israeli apartheid.
The PACBI call, endorsed by the ASA, says that Israeli institutions are guilty, Israeli intellectuals are guilty, and Israeli academics who explicitly represent their institutions should be boycotted, but that an affiliation in itself is not grounds for boycott. The danger is that Israelis will be asked not to disavow Israel politically, but to disavow their university "institutionally," as a precondition for recognition as legitimate members of the academic community. Israelis may be told that they are welcome to submit an article to a journal or to attend a seminar or a conference as an individual: e.g., David Hirsh is acceptable; David Hirsh, Tel Aviv University, is not. Some Israelis will, as a matter of principle, refuse to appear only as an individual; others may be required by the institution which pays their salary, or by the institution which funds their research, not to disavow.
An "Institutional Boycott" Still Violates Principles of Academic Freedom
Academic institutions themselves, in Israel as anywhere else, are fundamentally communities of scholars; they protect scholars, they make it possible for scholars to research and to teach, and they defend the academic freedom of scholars. The premise of the "institutional boycott" is that in Israel, universities are bad but scholars are (possibly, exceptionally) good, that universities are organs of the state while individual scholars are employees who may be (possibly, exceptionally) not guilty of supporting Israeli "apartheid" or some similar formulation.
There are two fundamental elements that are contested by opponents of the boycott in the "institutional boycott" rhetoric. First, it is argued, academic institutions are a necessary part of the structure of academic freedom. If there were no universities, scholars would band together and invent them, in order to create a framework within which they could function as professional researchers and teachers, and within which they could collectively defend their academic freedom.
Second, opponents of the boycott argue that Israeli academic institutions are not materially different from academic institutions in other free countries: they are not segregated by race, religion or gender; they have relative autonomy from the state; and they defend academic freedom and freedom of criticism, not least against government and political pressure. There are of course threats to academic freedom in Israel, as there are in the U.S. and elsewhere, but the record of Israeli institutions in defending their scholars from political interference is a good one. Neve Gordon, for example, still has tenure at Ben Gurion University, in spite of calling for a boycott of his own institution; Ilan Pappe left Haifa voluntarily, having been protected by his institution even as he traveled the world denouncing it and Israel in general as genocidal, Nazi, and worthy of boycott.
Jon Pike has argued that the very business of academia does not lend itself to a clear distinction between individuals and institutions. For example, the boycott campaign has proposed that while Israelis may submit papers as individuals, they would be boycotted if they submitted them from their institutions. He points out that "papers that ‘issue from Israeli institutions' or are 'submitted from Israeli institutions' are worried over, written by, formatted by, referenced by, checked by, posted off by individual Israeli academics. Scientists, theorists, and researchers do their thinking, write it up and send it off to journals. It seems to me that Israeli academics can’t plausibly be so different from the rest of us that they have discovered some wonderful way of writing papers without the intervention of a human, individual, writer."
Boycotting academic institutions means refusing to collaborate with Israeli academics, at least under some circumstances if not others; and then we are likely to see the reintroduction of some form of "disavowal" test.
The Boycott Is an Exclusion of Jewish Scholars Who Work in Israel
In 2011 the University of Johannesburg decided, under pressure from the boycott campaign, to cut the institutional links it had with Ben Gurion University for the study of irrigation techniques in arid agriculture. Logically the cutting of links should have meant the end of the research with the Israeli scholars being boycotted as explicit representatives of their university. What in fact happened was that the boycotters had their public political victory and then the two universities quietly renegotiated their links under the radar, with the knowledge of the boycott campaign, and the research into agriculture continued. The boycott campaign portrayed this as an institutional boycott that didn’t harm scientific co-operation or Israeli individuals. The risks are that such pragmatism (and hypocrisy) will not always be the outcome and that the official position of "cutting links" will actually be implemented; in any case, the University of Johannesburg solution encourages a rhetoric of stigmatization against Israeli academics, even if it quietly neglects to act on it.
Another risk is that the targeting of Israelis by the "institutional boycott," or the targeting of those who are likely to refuse to disavow their institutional affiliations, is likely to impact Jews disproportionately. The risk here is that the institutional boycott has the potential to become, in its actual implementation, an exclusion of Jewish Israelis, although there will of course be exemptions for some "good Jews": anti-Zionist Jewish Israelis or Israeli Jewish supporters of the boycott campaign. The result would be a policy which harms Israeli Jews more than anybody else. Further, among scholars who insist on "breaking the institutional boycott" or on arguing against it in America, Jews are likely to be disproportionately represented. If consequences follow from these activities, which some boycotters will regard as scabbing, they will fall most heavily on American Jewish academics. Under any accepted practice of equal-opportunities impact assessment, the policy of "institutional boycott" would cross the red lines which would normally constitute warnings of institutional racism.
The reality of the "institutional boycott" is that somebody will be in charge of judging who should be boycotted and who should be exempt. Even the official positions of ASA and PACBI are confusing and contradictory; they say there will be no boycott of individuals but they nevertheless make claims which offer justification for a boycott of individuals. But there is the added danger that some people implementing the boycott locally are likely not to have even the political sophistication of the official boycott campaign. There is a risk that there will still be boycotts of individuals (Mona Baker), political tests (NATFHE), breaking of scientific links (University of Johannesburg) and silent individual boycotts.
Even if nobody intends this, it is foreseeable that in practice the effects of a boycott may include exclusions, opprobrium, and stigma against Jewish Israeli academics who do not pass, or who refuse to submit to, one version or another of a test of their ideological purity; similar treatment may be visited upon those non-Israeli academics who insist on working with Israeli colleagues. There is a clear risk that an "institutional boycott," if actually implemented, would function as such a test.
PACBI is the "Palestinian Campaign for the Academic and Cultural Boycott of Israel." What it hopes to achieve is stated in its name. It hopes to institute an "academic boycott of Israel." The small print concerning the distinction between institutions and individuals is contradictory, unclear and small. It is likely that some people will continue to understand the term "academic boycott of Israel," in a common sense way, to mean a boycott of Israeli academics.
David Hirsh is lecturer in sociology at Goldsmiths, the University of London. He is founding editor of Engage, a network and website that opposes boycotts of Israel and anti-Semitism.
In all those years I was pursuing a Ph.D. in religious studies, the question of what my profession really stood for rarely came up in conversation with fellow academics, save for occasional moments when the position of the humanities in higher education came under criticism in public discourse. When such moments passed, it was again simply assumed that anyone entering a doctoral program in the humanities knowingly signed on to a traditional career of specialized research and teaching.
But the closer I got to receiving that doctorate, the less certain I became that this was a meaningful goal. I was surrounded by undergraduates who were rich, well-meaning, and largely apathetic to what I learned and taught. I saw my teachers and peers struggle against the tide of general indifference aimed at our discipline and succumb to unhappiness or cynicism. It was heartbreaking.
Fearing that I no longer knew why I studied religion or the humanities at large, I left sunny California for a teaching job at the Asian University for Women, in Chittagong, Bangladesh. My new students came from 12 different countries, and many of them had been brought up in deeply religious households, representing nearly all traditions practiced throughout Asia. They, however, knew about religion only what they had heard from priests, monks, or imams, and did not understand what it meant to study religion from an academic point of view. And that so many of them came from disadvantaged backgrounds convinced me that this position would give me a sense of purpose.
I arrived in Bangladesh prepared to teach an introductory course on the history of Asian religions. But what was meant to be a straightforward comparison of religious traditions around the region quickly slipped from my control and morphed into a terrible mess. I remember an early lesson: When I suggested during a class on religious pilgrimage that a visit to a Muslim saint’s shrine had the potential to constitute worship, it incited a near-riot.
Several Muslim students immediately protested that I was suggesting heresy, citing a Quranic injunction that only Allah should be revered. What I had intended was to point out how similar tension existed in Buddhism over circumambulation of a stupa — an earthen mound containing the relics of an eminent religious figure — since that act could be seen as both remembrance of the deceased’s worthy deeds and veneration of the person. But instead of provoking a thoughtful discussion, my idea of comparative religious studies seemed only to strike students as blasphemous.
Even more memorable, and comical in hindsight, was being urged by the same Muslims in my class to choose one version of Islam among all its sectarian and national variations and declare it the best. Whereas Palestinians pointed to the "bad Arabic" used in the signage of one local site as evidence of Islam’s degeneration in South Asia, a Pakistani would present Afghans as misguided believers because — she claimed — they probably never read the entire Quran. While Bangladeshis counseled me to ignore Pakistanis from the minority Ismaili sect, who claim that God is accessible through all religions, Bangladeshis themselves were ridiculed by other students for not knowing whether they were Sunni or Shi’a, the two main branches of Islam. In the midst of all this, I thought my call to accept these various manifestations of Islam as intriguing theological propositions had gone unheeded.
With my early enthusiasm and amusement depleted, I was ready to declare neutral instruction of religion in Bangladesh impossible. But over the course of the semester I could discern one positive effect of our classroom exercise: students’ increasing skepticism toward received wisdom. In becoming comfortable with challenging my explanations and debating competing religious ideas, students came to perceive any view of religion as more an argument than an indisputable fact. They no longer accepted a truth claim at face value but analyzed its underlying logic in order to evaluate the merit of the argument. They expressed confidence in the notion that a religion could be understood in multiple ways. And all the more remarkable was their implicit decision over time to position themselves as rational thinkers and to define their religions for themselves.
An illustrative encounter took place at the shrine of the city’s most prominent Muslim saint. I, being a man, was the only one among our group to be allowed into the space. My students, the keeper of the door said, could be "impure" — menstruating — and were forbidden to enter. Instead of backing down as the local custom expected, the students ganged up on the sole guard and began a lengthy exposition on the meaning of female impurity in Islam. First they argued that a woman was impure only when she was menstruating and not at other times; they then invoked Allah as the sole witness to their cyclical impurity, a fact the guard could not be privy to and thus should not be able to use against them; and finally they made the case that if other Muslim countries left it up to individual women to decide whether to visit a mosque, it was not up to a Bangladeshi guard to create a different rule concerning entry. Besieged by a half-dozen self-styled female theologians of Islam, the man cowered, and withdrew his ban.
I was incredibly, indescribably proud of them.
Equally poignant was coming face to face with a student who asked me to interpret the will of Allah. Emanating the kind of glow only the truly faithful seem to possess, she sat herself down in my office, fixed the hijab around her round alabaster face, and quietly but measuredly confessed her crime: She had taken to praying at a Hindu temple because most local mosques did not have space for women, and she was both puzzled and elated that even in a non-Islamic space she could still sense the same divine presence she had been familiar with all her life as Allah. She asked for my guidance in resolving her crisis of faith. If other Muslims knew about her routine excursions to a Hindu temple, she would be branded an apostate, but did I think that her instinct was right, and that perhaps it was possible for Allah to communicate his existence through a temple belonging to another religion?
In the privacy of my office, I felt honored by her question. I had lectured on that very topic just before this meeting, arguing that sacred space was not the monopoly of any one religion, but could be seen as a construct contingent upon the presence of several key characteristics. This simple idea, which scholars often take for granted, had struck her as a novel but convincing explanation for her visceral experience of the Islamic divine inside a Hindu holy space. Though she had come asking for my approval of her newly found conviction, it was clear that she did not need anyone’s blessing to claim redemption. Humanistic learning had already provided her with a framework under which her religious experience could be made meaningful and righteous, regardless of what others might say.
And thanks to her and other students, I could at last define my own discipline with confidence I had until then lacked: The humanities is not just about disseminating facts or teaching interpretive skills or making a living; it is about taking a very public stance that above the specifics of widely divergent human ideas exist more important, universally applicable ideals of truth and freedom. In acknowledging this I was supremely grateful for the rare privilege I enjoyed as a teacher, having heard friends and colleagues elsewhere bemoan the difficulty of finding a meaningful career as humanists in a world constantly questioning the value of our discipline. I was humbled to be able to see, by moving to Bangladesh, that humanistic learning was not as dispensable as many charge.
But before I could fully savor the discovery that what I did actually mattered, my faith in the humanities was again put to a test when a major scandal befell my institution. I knew that as a member of this community I had to critique what was happening after all my posturing before students about the importance of seeking truth. If I remained silent, it would amount to a betrayal of my students and a discredit to my recent conclusion that humanistic endeavor is meant to make us not only better thinkers, but also more empowered and virtuous human beings.
So it was all the more crushing to be told to say nothing by the people in my very profession, whose purpose I thought I had finally ascertained. In private chats my friends and mentors in academe saw only the urgent need for me to extricate myself for the sake of my career, but had little to say about how to address the situation. Several of my colleagues on the faculty, though wonderful as individuals, demurred from taking a stance for fear of being targeted by the administration for retribution or losing the professional and financial benefits they enjoyed. And the worst blow, more so than the scandal itself, was consulting the one man I respected more than anybody else, a brilliant tenured scholar who chairs his own department at a research university in North America, and receiving this one-liner:
"My advice would be to leave it alone."
It was simultaneously flummoxing and devastating to hear a humanist say that when called to think about the real-life implications of our discipline, we should resort to inaction. And soon it enraged me that the same people who decry the dismantling of traditional academe under market pressure and changing attitudes toward higher education could be so indifferent, thereby silently but surely contributing to the collapse of humanists’ already tenuous legitimacy as public intellectuals.
While my kind did nothing of consequence, it was the students — the same students whom I had once dismissed as incapable of intellectual growth — who tried to speak up at the risk of jeopardizing the only educational opportunity they had. They approached the governing boards, the administration, and the faculty to hold an official dialogue. They considered staging a street protest. And finally, they gave up and succumbed to cynicism about higher education and the world, seeing many of their professors do nothing to live by the principles taught in class, and recognizing the humanities as exquisitely crafted words utterly devoid of substance.
As my feeling about my discipline shifted from profound grief to ecstatic revelation to acute disappointment, I was able to recall a sentiment expressed by one of my professors, who himself might not remember it after all these years. Once upon a time we sat sipping espresso on a verdant lawn not far from the main library, and he mused that he never understood why young people no longer seemed to feel outrage at the sight of injustice. He is a product of a generation that once rampaged across campuses and braved oppression by the Man. On first hearing his indictment, I was embarrassed to have failed the moral standard established by the older generation of scholars like him. But now I see that it is not just young people but much of our discipline, both young and old, that at present suffers from moral inertia. With only a few exceptions, humanists I know do not consider enactment of virtue to be their primary professional objective, whether because of the more important business of knowledge production or material exigencies of life. And I can only conclude, with no small amount of sadness, that most humanists are not, nor do they care to be, exemplary human beings.
Maybe I should move on, as did a friend and former academic who believes that the only people we can trust to stand on principle are "holy men, artists, poets, and hobos," because yes, it is true that humanists should not be confused with saints. But the humanities will always appear irrelevant as long as its practitioners refrain from demonstrating a tangible link between what they preach and how they behave. In light of the current academic penchant for blaming others for undoing the humanities, it must be said that humanists as a collective should look at themselves first, and feel shame that there is so much they can say — need to say — about the world, but that they say so little at their own expense.
After a year and a half in Bangladesh, I do not doubt any longer that the humanities matters, but now I know that the discipline’s raison d’être dies at the hands of those humanists who do not deserve their name.
Se-Woong Koo earned his Ph.D. in religious studies from Stanford University in 2011. He currently serves as Rice Family Foundation Visiting Fellow and Lecturer at Yale University.
Let me first say that my reply has nothing to do with the merits of either the resolution or your position. It has rather to do with your tactic and the presumptions upon which your open letter operates. I feel somewhat passionate about this because Asian-American studies is a field with which I have been long associated and for which I have immense respect.
I'll make this brief. So, all those who attended the Association for Asian American Studies meeting in Seattle happened to, without public debate, vote in favor of a resolution. So what? Can't people unanimously feel passionate about and committed to one point of view? If it were a resolution in favor of stricter background checks for gun purchases, would you be as moralistic, and as publicly so?
Instead, you trot out some cherry-picked quotes from some leftists and chastise us for not taking these into account, calling into question our thoughtfulness and indeed our personal ethics. But how do you know that many if not all of us did not in fact, on our own or with friends, family, and colleagues, conscientiously think through our positions? We were alerted well in advance of the resolution, after all, and we are, after all, academics, so maybe we did our homework.
I am sorry you are dismayed at the result, but your inference does not work.
But more importantly, I really do not see why you chose to use a mainstream journal of the academy to launch your public chastisement and browbeating.
Oh, I think I do. Your "position" lost, so you decided not only to use Inside Higher Ed to offer it to a wider and presumably more sympathetic audience, but also to use the opportunity it afforded to lambaste an entire organization for reputed past sins of a similar nature.
Your letter moves out from a critique of a single vote to a broad indictment of many fine scholars and teachers, indeed all of those in the field, impugning their moral character simply because their judgment did not coincide with your own. Surely we can all stand to learn from one another, but your mode of persuasion is, to my mind, entirely counterproductive.
Louise Hewlett Nixon Professor
Professor of Comparative Literature and, by courtesy, English
Director of Asian American Studies
David Palumbo-Liu is the Louise Hewlett Nixon Professor at Stanford University.
Some months ago I started asking friends, colleagues from my teaching days, researchers in higher education, faculty members of various ages and ranks, deans, provosts and presidents, and focus groups of students: “What’s the status of the Big Questions on your campus?” Quite deliberately I avoided defining “Big Questions,” but I gave as examples such questions as “Who am I? Where do I come from? What am I going to do with my life? What are my values? Is there such a thing as evil? What does it mean to be human? How can I understand suffering and death? What obligations do I have to other people? What does it mean to be a citizen in a democracy? What makes work, or a life, meaningful and satisfying?” In other words, I wanted to know what was happening to questions of meaning and value that traditionally have been close to the heart of a liberal education.
Some of what I found puzzled me. People pointed out quite properly that some Big Questions were alive and well in academia today. These included some questions about the origin of the universe, the emergence of life, the nature of consciousness, and others that have been raised by the scientific breakthroughs of the past few decades.
In the humanities and related social sciences the situation was rather different. Some friends reminded me that not all big questions were in eclipse. Over the past generation faculty members have paid great attention to questions of racial, ethnic, gender, and sexual identity. Curricular structures, professional patterns, and the like continue to be transformed by this set of questions. Professors, as well as students, care about these questions, and as a result they write, teach, and learn about them with passion.
But there was wide agreement that other big questions, the ones about meaning, value, moral and civic responsibility, were in eclipse. To be sure, some individual faculty members addressed them, and when they did, students responded powerfully. In fact, in a recent Teagle-sponsored meeting on a related topic, participants kept using words such as “hungry,” “thirsty,” and “parched” to describe students’ eagerness to find ways in the curriculum, or outside it, to address these questions. But the old curricular structures that put these questions front and center have over the years often faded or been dismantled, including core curricula, great books programs, surveys “from Plato to NATO,” and general education requirements of various sorts. Only rarely have new structures emerged to replace them.
I am puzzled why. To be sure, these Big Questions are hot potatoes. Sensitivities are high. And faculty members always have the excuse that they have other, more pressing things to do. Over two years ago, in an article entitled “Aim Low,” Stanley Fish attacked some of the gurus of higher education (notably, Ernest Boyer) and their insistence that college education should “go beyond the developing of intellectual and technical skills and … mastery of a scholarly domain. It should include the competence to act in the world and the judgment to do so wisely” (Chronicle of Higher Education, May 16, 2003). Fish hasn’t been the only one to point out that calls to “fashion” moral and civic-minded citizens, or to “go beyond” academic competency, assume that students now routinely achieve such mastery of intellectual and scholarly skills. We all know that’s far from the case.
Minimalist approaches -- ones that limit teaching to what another friend calls “sectoral knowledge” -- are alluring. But if you are committed to a liberal education, it’s hard just to aim low and leave it at that. The fact that American university students need to develop basic competencies provides an excuse, not a reason, for avoiding the Big Questions. Students also need to be challenged, provoked, and helped to explore the issues they will inevitably face as citizens and as individuals. Why have we been so reluctant to develop the structures, in the curriculum or beyond it, that provide students with the intellectual tools they need to grapple thoughtfully over the course of a lifetime with these questions?
I see four possible reasons:
1. Faculty members are scared away by the straw man Stanley Fish and others have set up. Despite accusations of liberal bias and “brainwashing” no faculty member I know wants to “mold,” “fashion” or “proselytize” students. But that’s not what exploring the Big Questions is all about. Along with all the paraphernalia college students bring with them these days are Big Questions, often poorly formulated and approached with no clue that anyone in the history of humankind has ever had anything useful to say about any of them. There’s no need to answer those questions for students, or to try to fashion them into noble people or virtuous citizens for the republic. There is, however, every reason to help students develop the vocabularies, the metaphors, the exempla, the historical perspective, the patterns of analysis and argument that let them over time answer them for themselves.
2. A second possible reason is that faculty are put off by the feeling they are not “experts” in these matters. In a culture that quite properly values professional expertise, forays beyond one’s field of competence are understandably suspect. But one does not have to be a moral philosopher to raise the Big Questions and show some of the ways smart people in the past have struggled with them. I won’t pontificate about other fields, but in my own field -- classics and ancient history -- the Big Questions come bubbling up between the floor boards of any text I have ever taught. I don’t have to be a specialist in philosophy or political science to see that Thucydides has something to say about power and morality, or the Odyssey about being a father and a husband. A classicist’s job, as I see it, is to challenge students to think about what’s implicit in a text, help them make it explicit and use that understanding to think with.
3. Or is it that engaging with these “Big Questions,” or anything resembling them, is the third rail of a professional career? Senior colleagues don’t encourage it; professional journals don’t publish it; deans don’t reward it; and a half dozen disgruntled students might sink your tenure case with their teaching evaluations. You learn early on in an academic career not to touch the third rail. If this is right, do we need to rewire the whole reward system of academia?
4. Or, is a former student of mine, now teaching at a fine women’s college, correct when she says that on her campus “It tends to be that … those who talk about morality and the big questions come from such an entrenched far right position … that the rest of us … run for cover.”
Some of the above? All of the above? None of the above? You tell me, but let’s not shrug our shoulders and walk away from the topic until we’ve dealt with one more issue: What happens if, for whatever reason, faculty members run for the hills when the Big Questions, including the ones about morality and civic responsibility, arise? Is this not to lose focus on what matters most in an education intended to last for a lifetime? In running away, do we not then leave the field to ideologues and others we cannot trust, and create a vacuum that may be filled by proselytizers, propagandists, or the unspoken but powerful manipulations of consumer culture? Does this not sever one of the roots that has over the centuries kept liberal education alive and flourishing? But, most serious of all, will we at each Commencement say farewell to another class of students knowing that for all they have learned, they are ill equipped to lead an examined life? And if we do, can we claim to be surprised and without responsibility if a few decades later these same graduates abuse the positions of power and trust in our corporate and civic life to which they have ascended?
W. Robert Connor
W. Robert Connor is president of the Teagle Foundation, which is dedicated to strengthening liberal education. More on the foundation's “Big Questions” project may be found on its Web site. This essay is based on remarks Connor recently made at a meeting of the Middle Atlantic Chapters of Phi Beta Kappa, at the University of Pennsylvania.