Whether or not the humanities are truly in crisis, the current debates around them have a certain gun-to-the-head quality. “This is why you -- student, parent, Republican senator -- shouldn’t pull the trigger,” their promoters plead. “We deserve to live; we’re good productive citizens; we, too, contribute to the economy, national security, democracy, etc.” Most of these reasons are perfectly accurate. But it is nonetheless surprising that, in the face of what is depicted as an existential crisis, most believers shy away from existential claims (with some exceptions). And by not defending the humanities on their own turf, we risk alienating the very people on whose support the long-term survival of our disciplines depends: students.
One reason why our defenses can have a desperate ring to them is that we’re not used to justifying ourselves. Most humanists hold the value of the objects they study to be self-evident. The student who falls in love with Kant, Flaubert, or ancient Egypt does not need to provide an explanation for why she would like to devote years of her life to such studies. To paraphrase Max Weber, scholarship in the humanities is a vocation, a “calling” in the clerical sense. It chooses you, you don’t choose it. The problem with this kind of spiritual passion is that it is difficult to describe. To paraphrase another 20th-century giant, Jimi Hendrix, it’s more about the experience.
It’s not surprising, then, that when we humanists feel (or imagine) the budget axe tickling the hairs on the backs of our necks, we don’t have a ready-made apologia with which to woo or wow our would-be executioners. And because a calling is hard to explain, we turn instead to more straightforward, utilitarian defenses -- “but employers say they like English majors!” -- which, while true, don’t capture the authentic spirit that moves the humanities student.
There is of course sound logic to this approach. Government and state funding is a zero-sum game, and politicians are more likely to be receptive to practical arguments than to existential propositions. But in the long run, it takes more than state and university budgets to maintain the health of the humanities. It also takes students. And by constantly putting our most productive foot forward, we may unintentionally end up selling ourselves short (disclosure: I, too, have sinned). The fundamental reason why students should devote hours of their weeks to novels, philosophy, art, music, or history is not so that they can hone their communication skills or refine their critical thinking. It is because the humanities offer students a profound sense of existential purpose.
The real challenge that we face today, then, lies in explaining to a perplexed, but not necessarily hostile audience -- and perhaps even to ourselves -- why it is that the study of literature, anthropology, art history, or classics can be so meaningful, and why this existential rationale is no less important than other, more utilitarian ones. This line of argument stands in opposition to proclamations of the humanities’ uselessness: to declare that the humanities are of existential value is to affirm that they are very useful indeed.
So how might we go about defining this existential value? A good place to start would be with existentialism itself. A premise of existentialist philosophy is that we live in a world without inherent meaning. For atheists, this is often understood as the human condition following the death of God. But as Jean-Paul Sartre pointed out in “Existentialism is a Humanism,” even believers must recognize that they ultimately are the ones responsible for the production of meaning (in fact, many early existentialists were Christians). Abraham had to decide for himself whether the angel who commanded him to halt his sacrifice was genuinely a divine messenger. In Sartre's memorable formulation, man is “condemned to be free”; we have no choice but to choose. While it may feel as though a humanities vocation is a calling, you still have to decide to answer the call.
The realization that meaning isn’t something we receive from the outside, from others, but that it always must come from within us, from our conscious, deliberative choices, does not make us crave it any less. We are, existentialists insist, creatures of purpose, a thesis that psychological research has also confirmed.
Now what does this have to do with the humanities? It’s not that obvious, after all, how reading Madame Bovary, the Critique of Pure Reason, or The Book of the Dead can fill your life with purpose. At the same time, we also know that some people do find it deeply meaningful to peruse these works, and even to dedicate their careers to studying them.
What is it, then, that lovers of literature -- to consider but them for the moment -- find so existentially rewarding about reading? In a recent book, my colleague Joshua Landy argues that one of the more satisfying features of literature is that it creates the illusion of a meaningful world. “The poem forms a magic circle from within which all contingency is banished,” he writes apropos of Mallarmé’s celebrated sonnet en -yx. The order we discover in literary works may be magical, but it isn’t metaphysical; it comes from the sense that “everything is exactly what and where it has to be.” Art offers a reprieve from a universe governed by chance; what were merely sordid newspaper clippings can become, when transported into artful narratives, The Red and the Black or Madame Bovary. Landy suggests that fictions produce these illusions through a process of “overdetermination”: the ending of Anna Karenina, for instance, is foreshadowed by its beginning, when Anna witnesses a woman throwing herself under a train.
If art offered only illusions of necessity, it would hardly satisfy existential longing. Pretending that everything happens for a reason is precisely what the existentialists castigated as “bad faith.” Yet there’s an obvious difference between enjoying a novel and, say, believing in Providence. We don’t inhabit fictional worlds, we only pay them visits. No lover of literature actually believes her life is as determined as that of a literary heroine (even Emma Bovary wasn’t psychotic). So why does the semblance of an orderly universe enchant us so?
Well-ordered, fictional worlds attract us, it seems, because we, too, aspire to live lives from which contingency is kept at bay. Beauty, wrote Stendhal, is “only a promise of happiness.” As Alexander Nehamas suggested, in his book of this title, the beautiful work of art provides us with a tantalizing pleasure; beauty engages us in its pursuit. But what do we pursue? “To find something beautiful is inseparable from the need to understand what makes it so,” he writes. Behind the beautiful object -- sonnet, style, or sculpture -- we reach for the idea of order itself. The promise of happiness made by art is a promise of purpose.
But a promise of purpose is still a bird in the bush: it can disappear when you put down the book, or leave the concert hall. For the philosopher Immanuel Kant, art only provides us with an empty sense of purpose; or as he put it, in his distinctively Kantian way, "purposiveness without purpose" (it’s even better in German).
It’s true that few existential crises have been resolved by a trip to the museum or the download of a new album. But Kant may have underestimated how the sense of artistic purpose can also seep into our own lives. For instance, as Plato and every teenager know well, instrumental music can give voice to inexpressible feelings without the help of language. These emotional frameworks can convey a potent sense of purpose. When my youngest daughter spent six weeks in the neonatal ICU with a life-threatening condition, my mind kept replaying the second movement of Beethoven’s seventh symphony to tame my fears. Its somber, resolute progress, punctuated by brief moments of respite, helped to keep my vacillating emotions under control. As in films, sometimes it is the soundtrack that gives meaning to our actions.
The promise of order found in beautiful works of art, then, can inspire us to find purpose in our own lives. The illusion of a world where everything is in its place helps us view reality in a different light. This process is particularly clear -- indeed, almost trivial -- in those humanistic disciplines that do not deal primarily with aesthetic objects, such as philosophy. We aren't attracted to the worldviews of Plato, Kant, or Sartre purely for the elegance of their formal structure. If we’re swayed by their philosophies, it’s because they allow us to discover hitherto unnoticed patterns in our lives. Sometimes, when you read philosophy, it seems as though the whole world has snapped into place. This is not an experience reserved for professional philosophers, either: at the conclusion of a philosophy course that my colleagues Debra Satz and Rob Reich offer to recovering female addicts, one student declared, “I feel like a butterfly drawn from a cocoon.”
So where art initially appeals to us through intimations of otherworldly beauty, a more prolonged engagement with the humanities can produce a sense of order in the here and now. One could even say that Plato got things the wrong way around: first we’re attracted by an ideal universe, and then we’re led to discover that our own reality is not as absurd as it once seemed. And while particularly evident with philosophy, this sensation of finally making sense of the world, and of your own place in it, can come from many quarters of the humanities. In a delightful interview (originally conducted in French), Justice Stephen Breyer recently exclaimed, “It’s all there in Proust — all mankind!” Other readers have had similar responses to Dante, Shakespeare, Tolstoy, and many more.
But exploring the humanities is not like a trip to the mall: you don't set off to find an off-the-rack outfit to wear. Proust can change your life, but if you only saw the world through his novel, it would be a rather impoverished life. Worse, it would be inauthentic: no author, no matter how great, can tell you what the meaning of your life is. That is something we must cobble together for ourselves, from the bits and pieces of literature, philosophy, religion, history, and art that particularly resonate in us. “These fragments I have shored against my ruins,” T.S. Eliot wrote at the end of The Waste Land. No poem offers a better illustration of this cultural bricolage: Shakespeare answers Dante, and the Upanishads disclose what the Book of Revelation had suppressed.
So here we find an existential rationale for a liberal education. To be sure, the humanities do not figure alone in this endeavor: psychology, biology, and physics can contribute to our perception of ourselves in relation to the world, as can economics, sociology, and political science. But the more a discipline tends toward scientific precision, the more it privileges a small number of accepted, canonical explanations of those aspects of reality it aims to describe. If 20 biology professors lectured on Darwin’s theory of evolution, chances are they’d have a lot in common. But if 20 French professors lectured on Proust’s Recherche, chances are they’d be quite different. The same could be said, perhaps to a lesser extent, for 20 lectures on Plato’s Republic. The kinds of objects that the humanities focus on are generally irreducible to a single explanation. This is why they provide such good fodder for hungry minds: there are so many ways a poem, a painting, or a philosophy book can stick with you.
In his diatribe against the way the humanities have been taught since the '60s, Allan Bloom harrumphed, “On the portal of the humanities is written in many ways and many tongues, ‘There is no truth -- at least here.’ ” But the point of a liberal education is not to read great works in order to discover The Truth. Its point is to give students the chance to fashion purposeful lives for themselves. This is why authors such as Freud, whose truth-value is doubted by many, can still be a source of meaning for others. Conversely, this is also why humanities professors, many of whom are rightfully concerned about the truth-value of certain questions or interpretations, do not always teach the kinds of classes where students can serendipitously discover existential purpose.
There are more than existential reasons to study the humanities. Some are intellectual: history, for instance, responds to our profound curiosity about the past. Some are practical. To celebrate one is not to deny others. The biggest difficulty with defending the humanities is the embarrassment of riches: because humanists are like foxes and learn many different things, it is hard to explain them to the hedgehogs of the world, who want to know what One Big Thing we do well. The danger is that, in compressing our message so it gets heard, we leave out precisely the part that naturally appeals to our future students. Yes, students and parents are worried about employment prospects. But what parent doesn’t also want their child to lead a meaningful life? We are betraying our students if, as a society, we do not tell them that purpose is what ultimately makes a life well-lived.
Dan Edelstein is a professor of French and (by courtesy) history at Stanford University. He directs the Stanford Summer Humanities Institute.
In recent weeks a number of Modern Language Association members have talked with me about MLA Resolution 2014-1 to be voted on in Chicago on Saturday by the organization’s Delegate Assembly at the MLA’s annual meeting. The resolution "urges the U.S. Department of State to contest Israel’s arbitrary denials of entry to Gaza and the West Bank by U.S. academics who have been invited to teach, confer, or do research at Palestinian universities.” Several people expressed doubt that any counter-evidence could be presented to question the conclusions advanced by the background paper distributed by the resolution’s proponents. They then typically advanced to the next stage of the discussion, wondering what arguments could possibly be raised to defeat the resolution. The background paper sounds reasonable, even factual, if you aren’t well informed or up-to-date about conditions in Israel and the occupied territories. The people I talked with concluded it was an open-and-shut case.
Until now, MLA members have been in the same situation as the American Studies Association members who voted on a boycott resolution in December: They have only been presented with one side of the case. But a group of MLA members have now put together a detailed document exposing factual errors, contested claims, and misleading conclusions in the background paper available to MLA members on the association’s website. Like the resolution’s proponents, they have drawn on material gathered by non-government organizations with an interest in the subject. Rather than an objective report, the pro-resolution background paper is now revealed to be essentially the prosecution’s case. The document prepared by the resolution’s opponents amounts to the case for the defense.
The case for the defense rebuts both arguments and examples put forward by proponents of the resolution. It shows that many international scholars work and teach in the West Bank. It demonstrates why visa denials may not be “arbitrary.” It shows how the documents supporting the resolution are flawed and unreliable, including some that are now out of date. And it shows how Israeli visa policies are comparable to visa policies elsewhere. There are fundamental disagreements of fact between the two sides.
The members of the MLA’s Delegate Assembly have thus become triers of the facts, acting to evaluate what is fundamentally a set of evidence-based issues: what are the conditions at Palestinian universities? Are faculty members from other countries who wish to do so able to teach there? Are Palestinian faculty members able to engage in professional travel? What Israeli security concerns that affect access are or are not valid? What travel rules should an existentially threatened country in a state of perpetual war feel justified in enforcing? Does Israel have the right to exclude foreign faculty who advocate violence?
It is fair to say that MLA members are not necessarily well-informed about the first questions and are not professionally equipped to answer the last three. They would ideally have to listen to weeks of expert testimony and questioning before voting on the resolution. Instead they will hear an afternoon’s debate by English and foreign language professors. If the resolution passes, it will then be subjected to a vote by the association’s 30,000 members.
The MLA is to be applauded for requiring a democratic vote by its members before a resolution is formally adopted by the organization as a whole. Unfortunately, neither the Delegate Assembly nor the MLA’s 30,000 members have been equipped to be triers of the facts. Indeed MLA’s members are not required to read the documents supporting or contesting the resolution. Nor will they even be able to sit in judgment and hear arguments. They would be free to vote on the basis of their prior convictions, much as many of the ASA’s members surely did. Many ASA members no doubt voted approval simply because they were angry at Israel. They took the only organizational opportunity they had to express their disapproval of Israeli policy. The efficacy or advisability of academic boycotts aside, they registered their general convictions. Indeed there is no guarantee that members of the Delegate Assembly will read the two sets of background documents before voting.
Unfortunately, the context and basis for voting on the MLA resolution are worse still. Whether or not you support academic boycotts is fundamentally a matter of principle. Principle alone can guide a vote. But the MLA resolution is fundamentally fact-based. The process the MLA uses is not adequate to the task of establishing the facts. It is fatally flawed, or at least it will be if the Delegate Assembly approves the resolution.
Before the American Association of University Professors censures a college or university administration, it reviews documents submitted by both faculty members and administrators, tasks staff to prepare a review of relevant issues and key questions needing answers, and selects a team of faculty knowledgeable about academic freedom and shared governance to visit the campus in question to interview interested parties. The AAUP then drafts a full report reaching consensus on the facts. The AAUP also shares the draft report with administrators and faculty members on the campus and requests comments. The revised report is published for comment. The organization’s 39-member National Council reviews the report and votes on whether to recommend a vote for censure to the annual meeting. This is the kind of process required to decide a fact-based case in a responsible and professional manner.
But the MLA is not merely contemplating censuring a university. It is basically censuring a country for its policies. When did the MLA conduct site visits to Israel, Gaza, and the West Bank? When did the MLA give Israelis an opportunity to respond, a procedure the MLA’s rules would seem to require? Where is the consensus report evaluating arguments pro and con and giving MLA members a disinterested basis on which to vote? If the Delegate Assembly votes to approve the resolution after this flawed process proceeds, it will have undermined the credibility of the organization and gone a long way toward transforming it from a scholarly to a political one. It does not augur well for the group’s future as a widely endorsed advocacy vehicle for the humanities.
On the other hand, the Delegate Assembly has an opportunity to reject the resolution. Set beside one another, the two sets of documents make it clear that a good deal more objective evidence would be needed to prove the prosecution case. To follow through on the jury trial analogy: when the documents for and against the resolution are compared, the DA at the very least must conclude there is “reasonable doubt” the resolution is justified.
That is not to say that Israel should not take the risk of loosening the security restrictions under which Palestinian universities operate. That would be one component of a plan for jettisoning control of the West Bank, something Israel may have to do unilaterally if negotiations continue to fail. But it is to say that MLA’s ill-informed resolution and inadequate procedures have no role to play in the process. In an era of continuing adjunct abuse and politicians declaring the humanities of no economic use, the MLA should concentrate instead on saving a profession endangered in its own country.
Cary Nelson served as national president of the American Association of University Professors from 2006 to 2012. He teaches at the University of Illinois at Urbana-Champaign.
Supporters of the American Studies Association’s call for a boycott of Israeli universities are distorting what the boycott is – and how it will affect academe. The "institutional boycott" is likely to function as a political test in a hidden form. It violates principles of academic freedom. And in practice, it has been, and is likely to continue to be, a campaign for the exclusion of individual scholars who work in Israel from the global academic community. It’s time to look with more care at the boycott and what it’s really about.
What the ASA Resolution Says
The ASA resolution reaffirms, in a general and abstract way, its support for the principle of academic freedom. It then says that it will “honor the call of Palestinian civil society for a boycott of Israeli academic institutions.” It goes on to offer guarantees that it will support the academic freedom of scholars who speak about Israel and who support the boycott; the implication here is that this refers to scholars who are opponents of Israel or of Israeli policy. The resolution does not specifically mention the academic freedom of individual Israeli scholars or students, nor does it mention protection for people to speak out against the boycott, nor does it say anything about the academic freedom of people to collaborate with Israeli colleagues.
What the ASA names "the call of Palestinian civil society for a boycott" is the Palestinian Campaign for the Academic and Cultural Boycott of Israel (PACBI) "Call for Academic and Cultural Boycott of Israel." The PACBI call explicitly says that the "vast majority of Israeli intellectuals and academics," that is to say individuals, have contributed to, or have been "complicit in through their silence," the Israeli human rights abuses which are the reasons given for boycott. There would be no sense in making this claim if no sanctions against individuals were envisaged. The PACBI guidelines state that "virtually all" Israeli academic institutions are guilty in the same way.
These claims about the collective guilt of Israeli academics and institutions are strongly contested empirically. Opponents of the boycott argue that Israeli academe is pluralistic and diverse and contains many individuals who explicitly oppose anti-Arab racism, Islamophobia and the military and civilian occupations of the West Bank. These claims about the guilt of Israeli academe are also contested by those who hold that the principle of collective guilt is a violation of the norms of the global academic community and of natural justice. Opponents of the boycott argue that academics and institutions should be judged by the content of their work and by the nature of their academic norms and practices, not by the state in which they are employed.
The PACBI guidelines go on to specify what is meant by the "institutional" boycott. "[T]hese institutions, all their activities, and all the events they sponsor or support must be boycotted." And "[e]vents and projects involving individuals explicitly representing these complicit institutions should be boycotted." The guidelines then offer an exemption for some other classes of individual as follows: "Mere institutional affiliation to the Israeli academy is therefore not a sufficient condition for applying the boycott."
A Political Test by Another Name
Refusing to collaborate with academics on the basis of their nationality is, prima facie, a violation of the norms of academic freedom and of the principle of the universality of science. It seems to punish scholars not for something related to their work, nor for something that they have done wrong, but because of who they are.
In 2002 Mona Baker, an academic in Britain, fired two Israelis from the editorial boards of academic journals that she owned and edited. Gideon Toury and Miriam Shlesinger are both well-respected internationally as scholars and also as public opponents of Israeli human rights abuses, but nevertheless they were "boycotted." The boycott campaign sought a more sophisticated formulation which did not appear to target individuals just for being Israeli.
In 2003, the formulation of the "institutional boycott" was put into action with a resolution to the Association of University Teachers (AUT), an academic trade union in Britain, that members should "sever any academic links they may have with official Israeli institutions, including universities." Yet in the same year, Andrew Wilkie, an Oxford academic, rejected an Israeli who applied to do a Ph.D. with him, giving as a reason that he had served in the Israeli armed forces. The boycott campaign in the UK supported Andrew Wilkie against criticism which focused on his boycott of an individual who had no affiliation of any kind to an Israeli academic institution. If the principle was accepted that anybody who had been in the Israeli armed forces was to be boycotted, then virtually every Israeli Jew would be thus targeted.
In 2006 the boycott campaign took a new tack, offering an exemption from the boycott to Israelis who could demonstrate their political cleanliness. The other British academic union, NATFHE, called for a boycott of Israeli scholars who failed to "publicly dissociate themselves" from "Israel’s apartheid policies." The political test opened the campaign up to a charge of McCarthyism: the implementation of a boycott on this basis would require some kind of machinery to be set up to judge who was allowed an exemption and who was not. The assertion that Israel is "apartheid" is emotionally charged and strongly contested. While it is possible for such analogies to be employed carefully and legitimately, it is also possible for such analogies to function as statements of loyalty to the Palestinians. They sometimes function as short cuts to the boycott conclusion, and as ways of demonizing Israel, Israelis, and those who are accused of speaking on their behalf. In practice, the boycott campaign attempts to construct supporters of the boycott as friends of Palestine and opponents of the boycott as enemies of Palestine.
It is reasonable to assume that under the influence of the campaign for an "institutional boycott," much boycotting of individuals goes on silently and privately. It is also reasonable to assume that Israeli scholars may come to fear submitting papers to journals or conferences if they think they may be boycotted, explicitly or not; this would lead to a "self-boycott" effect. There are anecdotal examples of the kinds of things which are likely to happen under the surface even of an institutional boycott. An Israeli colleague contacted a British academic in 2008, saying that he was in town and would like to meet for a coffee to discuss common research interests. The Israeli was told that the British colleague would be happy to meet, but he would first have to disavow Israeli apartheid.
The PACBI call, endorsed by the ASA, says that Israeli institutions are guilty, Israeli intellectuals are guilty, and Israeli academics who explicitly represent their institutions should be boycotted, but that an affiliation in itself is not grounds for boycott. The danger is that Israelis will be asked not to disavow Israel politically, but to disavow their university "institutionally," as a precondition for recognition as legitimate members of the academic community. Israelis may be told that they are welcome to submit an article to a journal or to attend a seminar or a conference as an individual: e.g., David Hirsh is acceptable, but David Hirsh, Tel Aviv University, is not. Some Israelis will, as a matter of principle, refuse to appear only as an individual; others may be required by the institution which pays their salary, or by the institution which funds their research, not to disavow.
An "Institutional Boycott" Still Violates Principles of Academic Freedom
Academic institutions themselves, in Israel as anywhere else, are fundamentally communities of scholars; they protect scholars, they make it possible for scholars to research and to teach, and they defend the academic freedom of scholars. The premise of the "institutional boycott" is that in Israel, universities are bad but scholars are (possibly, exceptionally) good, that universities are organs of the state while individual scholars are employees who may be (possibly, exceptionally) not guilty of supporting Israeli "apartheid" or some similar formulation.
There are two fundamental elements that are contested by opponents of the boycott in the "institutional boycott" rhetoric. First, it is argued, academic institutions are a necessary part of the structure of academic freedom. If there were no universities, scholars would band together and invent them, in order to create a framework within which they could function as professional researchers and teachers, and within which they could collectively defend their academic freedom.
Second, opponents of the boycott argue that Israeli academic institutions are not materially different from academic institutions in other free countries: they are not segregated by race, religion or gender, they have relative autonomy from the state, they defend academic freedom and freedom of criticism, not least against government and political pressure. There are of course threats to academic freedom in Israel, as there are in the U.S. and elsewhere, but the record of Israeli institutions is a good one in defending their scholars from political interference. Neve Gordon, for example, still has tenure at Ben Gurion University, in spite of calling for a boycott of his own institution; Ilan Pappe left Haifa voluntarily after having been protected by his institution even after traveling the world denouncing his institution and Israel in general as genocidal, Nazi and worthy of boycott.
Jon Pike argued that the very business of academia does not open itself up to a clear distinction between individuals and institutions. For example, the boycott campaign has proposed that while Israelis may submit papers as individuals, they would be boycotted if they submitted them from their institutions. He points out that "papers that ‘issue from Israeli institutions' or are 'submitted from Israeli institutions' are worried over, written by, formatted by, referenced by, checked by, posted off by individual Israeli academics. Scientists, theorists, and researchers do their thinking, write it up and send it off to journals. It seems to me that Israeli academics can’t plausibly be so different from the rest of us that they have discovered some wonderful way of writing papers without the intervention of a human, individual, writer."
Boycotting academic institutions means refusing to collaborate with Israeli academics, at least under some circumstances if not others; and then we are likely to see the reintroduction of some form of "disavowal" test.
The Boycott Is an Exclusion of Jewish Scholars Who Work in Israel
In 2011 the University of Johannesburg decided, under pressure from the boycott campaign, to cut the institutional links it had with Ben Gurion University for the study of irrigation techniques in arid agriculture. Logically the cutting of links should have meant the end of the research with the Israeli scholars being boycotted as explicit representatives of their university. What in fact happened was that the boycotters had their public political victory and then the two universities quietly renegotiated their links under the radar, with the knowledge of the boycott campaign, and the research into agriculture continued. The boycott campaign portrayed this as an institutional boycott that didn’t harm scientific co-operation or Israeli individuals. The risks are that such pragmatism (and hypocrisy) will not always be the outcome and that the official position of "cutting links" will actually be implemented; in any case, the University of Johannesburg solution encourages a rhetoric of stigmatization against Israeli academics, even if it quietly neglects to act on it.
Another risk is that the targeting of Israelis by the "institutional boycott," or the targeting of those who are likely to refuse to disavow their institutional affiliations, is likely to impact Jews disproportionately. The risk here is that the institutional boycott has the potential to become, in its actual implementation, an exclusion of Jewish Israelis, although there will of course be exemptions for some "good Jews": anti-Zionist Jewish Israelis or Israeli Jewish supporters of the boycott campaign. The result would be a policy which harms Israeli Jews more than anybody else. Further, among scholars who insist on "breaking the institutional boycott" or on arguing against it in America, Jews are likely to be disproportionately represented. If there are consequences which follow from these activities, which some boycotters will regard as scabbing, they will fall most heavily on American Jewish academics. Under any accepted practice of equal-opportunities impact assessment, the policy of "institutional boycott" would cross the red lines which would normally constitute warnings of institutional racism.
The reality of the "institutional boycott" is that somebody will be in charge of judging who should be boycotted and who should be exempt. Even the official positions of ASA and PACBI are confusing and contradictory; they say there will be no boycott of individuals but they nevertheless make claims which offer justification for a boycott of individuals. But there is the added danger that some people implementing the boycott locally are likely not to have even the political sophistication of the official boycott campaign. There is a risk that there will still be boycotts of individuals (Mona Baker), political tests (NATFHE), breaking of scientific links (University of Johannesburg) and silent individual boycotts.
Even if nobody intends this, it is foreseeable that in practice the effects of a boycott may include exclusions, opprobrium, and stigma against Jewish Israeli academics who do not pass, or who refuse to submit to, one version or another of a test of their ideological purity; similar treatment may be visited upon those non-Israeli academics who insist on working with Israeli colleagues. There is a clear risk that an ‘institutional boycott’, if actually implemented, would function as such a test.
PACBI is the "Palestinian Campaign for the Academic and Cultural Boycott of Israel." What it hopes to achieve is stated in its name. It hopes to institute an "academic boycott of Israel." The small print concerning the distinction between institutions and individuals is contradictory, unclear and small. It is likely that some people will continue to understand the term "academic boycott of Israel," in a common sense way, to mean a boycott of Israeli academics.
David Hirsh is lecturer in sociology at Goldsmiths, University of London. He is founding editor of Engage, a network and website that opposes boycotts of Israel and anti-Semitism.
What the social-democratic left has always objected to is not the liberal aspiration to universal rights and freedoms, but rather the way that classical liberalism generally ignored the unequal economic and social conditions of access to those freedoms. The liberal’s abstract universalism affirmed everyone’s equal rights without giving everyone the real means of realizing these formally universal rights. The rich and the poor may have an equal formal right to be elected to political office, for instance, but the poor were effectively excluded from office when it did not pay a full-time salary.
For this reason generations of social democrats have insisted that all citizens must be guaranteed access to the institutional resources they need to make effective use of their civil and political rights. The British sociologist T. H. Marshall referred to those guarantees as the social component of citizenship, and he argued that only when this social component began to be incorporated into citizenship did equal citizenship start to impose modifications on the substantive inequalities of the capitalist class system. Today, when neoliberalism is ascendant and the welfare state is in tatters, it is more important than ever to remember the social-democratic critique of formal equality and abstract universalism.
Like other freedoms, academic freedom cannot be practiced effectively without the means of realizing it. At one time, those means were largely in the hands of academics themselves. As the German sociologist Max Weber put it, “The old-time lecturer and university professor worked with the books and the technical resources which they procured or made for themselves.” Like the artisan, the peasant smallholder, or the member of a liberal profession, the scholar was not separated from his means of production. But that time is long past. As Weber understood well, this “pre-capitalist” mode of scholarship had already disappeared a century ago, when he wrote those words.
The modern academic, he pointed out, did not own the means to conduct scientific or humanistic research or to communicate his or her findings any more than the modern proletarian owns the means of production, the modern soldier owns the means of warfare, or the modern civil servant owns the means of administration. Like those other figures in a capitalistic and bureaucratized society, the individual academic depends on means that are not his or her own. Specifically, she relies on academic institutions and the resources they provide — access to books, journals, laboratories, equipment, materials, research and travel funds, etc. — to participate in the intellectual and communicative exchanges that are the lifeblood of her profession. Unless she is independently wealthy, she depends on an academic institution for her very livelihood.
What, then, is an academic boycott of Israel in relation to these facts? The boycott recently endorsed by the American Studies Association, its supporters emphasize, is aimed only at Israeli academic institutions and not at individual scholars. Consequently, Judith Butler explained in the pages of The Nation in December 2013, “any Israeli, Jewish or not, is free to come to a conference, to submit his or her work to a journal and to enter into any form of scholarly exchange. The only request that is being made is that no institutional funding from Israeli institutions be used for the purposes of those activities.”
Butler argues that such a request does not infringe upon the Israeli scholar’s academic freedom because that scholar can pay from her “own personal funds” or ask others to pay for her. Personal funds presumably come from the salary paid to the Israeli scholar by her institution, but for Butler money apparently ceases to be institutional once it changes hands. One wonders why this same reasoning doesn’t apply to conference or travel funds furnished by an Israeli university.
One also wonders how many ASA members are willing to raise their own dues or earmark a portion of their current dues to pay for the participation of Israeli colleagues in the activities of their organization. Furthermore, one wonders why Butler, who has raised concerns about new forms of effective censorship exercised by private donors, does not have similar concerns about the donors who might pay for Israeli colleagues. But the most serious problem with Butler’s proposal is that it imposes special costs and burdens on Israeli scholars, creating substantive inequalities that undermine the formally equal and universal freedoms that she is eager to affirm for everyone in the abstract.
While scholars of other nationalities may use the resources of their institutions, Israeli scholars must make do with their own private means or rely upon charity; they enjoy equal academic freedom in the same way that the rich and the poor are equally free to hold an unpaid office. For the generously paid academic aristocracy at elite institutions, using one’s own personal funds may only be an “inconvenience” (Butler’s word) rather than a hardship. However, not all academics have personal resources in such abundance, and those with fewer personal resources are more dependent on institutional funding.
Because “academic freedom can only be exercised when the material conditions for exercising those rights are secured,” Butler has argued, the academic freedom of Palestinians is vitiated by the conditions of Israeli military occupation. She is indeed right, but the remedy for military occupation is a negotiated peace, not an effort to deprive Israelis of the material conditions for their academic freedom. Butler seems not to understand how her point militates against her own demand that Israeli scholars become luftmenschen. The distinction between an institutional and an individual boycott only makes sense in a world of abstract universalism, where Israeli scholars are entitled to academic freedom in a formal sense without equal access to the institutional means and resources they need to realize it in practice. The great irony of the campaign to boycott Israeli academics is that its proponents consider it a litmus test of left-wing politics when in fact they fail to apply consistently one of the left’s most important insights.
Chad Alan Goldberg is professor of sociology at the University of Wisconsin at Madison. He is a member of the American Federation of Teachers, the American Association of University Professors and the Jewish Labor Committee.
I want to begin with a quotation from Tzvetan Todorov's Facing the Extreme: Moral Life in the Concentration Camps, because, of all the many things that might be said in opposition to the American Studies Association boycott of Israeli institutions of higher education, the one I want to focus on is the association's lack of moral courage, which, in this case, includes its failure to have learned the lessons of the association's extraordinary and ethical achievements in previous generations.
This is Todorov: "to denounce slavery constitutes a moral act only at those times when such denunciation is not simply a matter of course and thus involves some personal risk. There is nothing moral in speaking out against slavery today; all it proves is that I'm in step with my society's ideology or else don't want to find myself on the wrong side of the barricades. Something very similar can be said about condemnations of racism, although that would not have been the case in 1936 in Germany."
I would ask the question of the ASA: Who, in their audience of addressees, do they imagine is NOT opposed to the idea of occupation? And who, again in their target audience, is NOT concerned with the rights of Palestinians? Not even the politically right-wing academics in Israel are pro-occupation or against Palestinians as a matter of moral belief or commitment, as were, say, slaveholders in the American South or anti-Semites in fascist Europe. The issue for them, for all of us here, is one that the boycott does not even recognize, let alone address: how do these two entities, Israel and Palestine, find a way to exist side by side?
To be sure Israeli Jews like myself are likely to be more sensitive to the potential extermination of the Jewish population in Israel than individuals outside of Israel. I confess that bias. But the possibilities of the destruction of the State of Israel and the deaths of its citizens are no fantasies of a deluded imagination. Read the Arab press, unless, of course, the boycotters would prefer to remain ignorant of the issues. What is required in Israel is a political solution that produces a Palestinian state and secures the existence of Israel. If any one of the boycotters has a solution that does that, we in Israel would love to hear it.
The generation of Americanists who opposed the 1940s and '50s idea of American exceptionalism and who opened the field of American studies to new voices (many of which are now prominent in the field) took bold stands, not only in terms of attacking the American hegemony of the time and transforming the American literary and historical narrative, but also in terms of the political actions they took: not just opposing segregation and racism, the Vietnam War, sexism, and many other less-than-enviable aspects of the American polity in their writings, but also teaching at historically black colleges, producing programs of African American and minority studies, introducing feminism into the curriculum, and supporting the women who would teach those courses. Critics such as Paul Lauter, Leslie Fiedler, Stanley Elkins, Emory Elliott, and Sacvan Bercovitch spoke out. They took risks. Many of them were first-generation college-educated; many were Jews.
One of the boycott advocates, Cynthia Franklin, as quoted in Inside Higher Ed, speaks of the "culture of fear" in speaking out in relation to Israel and Palestine, specifically the fear of "reprisals," such as "not getting tenure or ... jobs." Since neither Israeli institutions of higher learning nor the State of Israel could possibly be the source of such reprisals, I can only imagine that Franklin fears other Americans. Wouldn't it make more sense to address these fellow Americans? If Franklin is right about the threat of reprisals, it would certainly take more moral courage, which apparently the boycotters lack. The president of the association, Curtis Marez, also seems to know very little about what the field of American studies has stood for in the United States. As quoted in New York Magazine, he doesn't "dispute that many nations, including many of Israel's neighbors, are generally judged to have human rights records that are worse than Israel's [but] 'one has to start somewhere'" – start somewhere to do what, exactly?
America, he may have forgotten, is no longer, actually it never was, the City on the Hill. It took decades and many academic arguments to break the American fantasy of itself as a land of equal opportunity for all and to acknowledge racism and sexism and genderism in American culture. These are still not eradicated, whatever the contemporary hegemony of Americanists believes. And there are still other American ills to deal with. To invoke Emerson's words in "Self-Reliance," voiced "to the angry bigot [who] assumes this bountiful cause of Abolition, and comes to me with his latest news from Barbadoes": "Go love thy infant, love thy woodchopper, be good-natured and modest: have that grace; and never varnish your hard, uncharitable ambition with this incredible tenderness for black folk a thousand miles off. Thy love afar is spite at home."
One defense of the boycott has been that, given this allegedly tremendous repression of the conversation in the United States by forces unnamed, and because of the necessity for exceptionalist Americanists to broadcast their hegemonic, moral message to the world, the boycott at least opens up the topic of Israel and Palestine for conversation. Five thousand academics belong to the ASA and not one of them could think of a single other way to open up this conversation? Centerpiecing a work of Arab-American fiction (say, for example, Mohja Kahf's The Girl in the Tangerine Scarf, Susan Muaddi Darraj's The Inheritance of Exile, or Laila Halaby's West of the Jordan) at the yearly conference might have been a start, in keeping with the association's disciplinary definition as well, though that might have complicated matters for the activists, since, lo and behold, not only is Israel not the only oppressor in these texts but the United States is not exactly a bastion of easy integration. Convening a panel of Israeli and Palestinian Americanists (some of them my former students) might also have been an option – if, of course, what the association wanted was change rather than domination and power.
American Americanists do not need to bring to the attention of Israeli academics the difficulty of getting an education under conditions of occupation or discrimination. I don't even dare bring up ancient history like European (not to mention American) quotas against Jews at the university, since this is not, we are told, a Jewish issue at all (though, who, in truth, are those Americans that the Americanists so fear?). I am talking about life in Palestine, pre-Israel, when Jews were Palestinians. I don't know if a Mandate, as in the British rule over the region from the end of World War I until the birth of Israel, is the same as an occupation, but under the pre-Israel Mandate, both travel within Palestine and Jewish entry into Palestine were severely restricted. Nor were uprisings against Jews (there were no Israelis then) uncommon. Yet 25 years before the declaration of the State of Israel, the Hebrew University was founded, and it flourished. And when, in violation of the truce in 1949, Israelis were forcibly denied access to that university, on Mount Scopus, they studied in a building in Rehavia, until they built a new campus in Givat Ram. After the 1967 war, they returned – note my word: returned – to Mount Scopus once again.
In his memoir, Little Did I Know, Stanley Cavell asks the question that all of us – Israelis, Palestinians, Americans – must ask in the global world we inhabit. He is discussing the return of his good friend, philosopher Kurt Fischer, to the Austria that had made of him a refugee, first in Shanghai, then in the United States. Fischer knows full well that he will now dwell among those very people who had ejected him, and that he is going to have to accept the human situation they now share. This is Cavell: "It takes an extreme case of oppression, which tore him from his home in his adolescence, to be posing the question every decently situated human being, after adolescence, either asks himself in an unjust world, or coarsens himself to avoid asking: Where is one now; how is one living with, hence counting upon, injustice?"
I suggest that the pro-boycotters of the American Studies Association ask themselves how they are now living with and hence counting upon injustice in order to preserve their own hegemonic authority and power and their utterly absurd sense of themselves as exceptional. As Jonathan Chait points out in his New York piece, if, as Curtis Marez admits, Israel isn't the worst offender in the neighborhood, then wouldn't it make sense to start with those who are the worst offenders? In the absence of doing that, the boycotters cannot, in good conscience, claim that their boycott is anything more than power politics at its worst. Painfully for an Americanist like myself, it defeats everything that the ASA has stood for over the many years of its existence.
Emily Budick is the Ann and Joseph Edelman Chair of American Studies and chair of English at the Hebrew University of Jerusalem.
Originally published by Encyclopedia Britannica in 1952, Great Books of the Western World offered a selection of core texts representing the highest achievements of European and North American culture. That was the ambition. But today the set is perhaps best remembered as a peculiar episode in the history of furniture.
Many an American living room displayed its 54 volumes -- “monuments of unageing intellect,” to borrow a phrase from Yeats. (The poet himself, alas, did not make the grade as Great.) When it first appeared, the set cost $249.50, the equivalent of about $2,200 today. It was a shrewd investment in cultural capital, or at least it could be, since the dividends came only from reading the books. Mortimer Adler – the philosopher and cultural impresario who envisioned the series in the early 1940s and led it through publication and beyond, into a host of spinoff projects – saw the Great Books authors as engaged in a Great Conversation across the centuries, enriching the meaning of each work and making it “endlessly rereadable.”
Adler's vision must have sounded enticing when explained by the Britannica salesman during a house call. Also enticing: the package deal, with Bible and specially designed bookcase, all for $10 down and $10 per month. But with some texts the accent was on endless more than rereadable (the fruits of ancient biological and medical research, for example, are dry and stony) and it is a good bet that many Great Books remained all but untouched by human hands.
Well, that’s one way to tell the Great Books story: High culture meets commodity fetishism amidst Cold War anxiety over the state of American education. But Tim Lacy gives a far more generous and considerably more complex analysis of the phenomenon in The Dream of a Democratic Culture: Mortimer J. Adler and the Great Books Idea, just published by Palgrave Macmillan. The book provides many unflattering details about how Adler’s pedagogical ambitions were packaged and marketed, including practices shady enough to have drawn Federal Trade Commission censure in the 1970s. (These included bogus contests, luring people into "advertising research analysis surveys" that turned into sales presentations, and misleading "bundling" of additional Great Books-related products without making clear the additional expense.) At the same time, it makes clear that Adler had more in mind than providing a codified and “branded” set of masterpieces that the reader should passively absorb (or trudge through, as the case may be).
The Dream of a Democratic Culture started life as a dissertation at Loyola University in Chicago, where Lacy is currently an academic adviser at the university’s Stritch School of Medicine. In its final pages, he describes the life-changing impact on him, some 20 years ago, of studying Adler’s How to Read a Book (1940), a longtime bestseller. He owns and is reading his way through the Great Books set, and his study reflects close attention to Adler’s own writings and the various supplementary Great Books projects. But in analyzing the life and work of “the Great Bookie,” as one of Adler’s friends dubbed him, Lacy is never merely celebratory. In the final dozen years or so before his death in 2001, Adler became one of the more splenetic culture warriors – saying, for example, that the reason no black authors appeared in the expanded 1990 edition of the Great Books was because they “didn’t write any good books.”
Other such late pronouncements have been all too memorable -- but Lacy, without excusing them, makes a case that they ought not to be treated as Adler’s definitive statements. On the contrary, they seem to betray principles expressed earlier in his career. Lacy stops short of diagnosing the aging philosopher’s bigoted remarks as evidence of declining mental powers, though it is surely a tempting explanation. Then again, working at a medical school would probably leave a non-doctor chary about that sort of thing.
I found The Dream of a Democratic Culture absorbing and was glad to be able to interview the author about it by email; the transcript follows. Between questions, I looked around a used-books website to see what the market in secondhand copies of Great Books of the Western World is like. One listing for the original 1952 edition is especially appealing, and not just because of its price (under $250, in today’s currency). “The whole set is in very good condition,” the bookseller writes, “i.e., not read at all.”
Q: How did your personal encounter with the Great Books turn into a scholarly project?
A: I started my graduate studies in history, at Loyola University Chicago, during the 1997-98 academic year. My initial plan was to work on U.S. cultural history, zooming in on either urban environmental history or intellectual history in an urban context. I was going to earn an M.A. and then see about my possibilities for a Ph.D. program.
By the end of 1998 the only thing that had become clear to me was that I was confused. I had accumulated some debt and a little bit of coursework, but I needed a break to rethink my options. I took a leave of absence for the 1999 calendar year. During that period I decided three things: (1) I wanted to stay at Loyola for my Ph.D. work; (2) Environmental history was not going to work for me there; (3) Cultural and intellectual history would work for me, but I would need to choose my M.A. thesis carefully to make it work for doctoral studies.
Alongside this intense re-education in the discipline of history I had maintained, all through the 1997 to 1999 period, my reading of the Britannica's Great Books set. I had also accumulated more books on Adler, including his two autobiographies, during stress-relief forays into Chicago's most excellent used bookstore scene. Given Adler's Chicago connections, one almost always saw two or three of his works in the philosophy sections of these stores.
During a cold December day in 1999, while sitting in a Rogers Park coffee shop near Loyola, this all came together in a sudden caffeine-laced epiphany: Why not propose the Great Books themselves as the big project for my graduate study? I sat on the idea for a few days, both thinking about all the directions I could take for research and pounding myself on the head for not having thought of the project sooner. I knew at this point that Adler hadn't been studied much, and I had a sense that this could be a career's worth of work.
The project was going to bring together professional and personal interests in a way that I had not imagined possible when thinking about graduate school.
Q: Did you meet any resistance to working on Adler and the Great Books? They aren’t exactly held in the highest academic esteem.
A: The first resistance came late in graduate school, and after, when I began sending papers, based on my work, out to journals for potential publication. There I ran into some surprising resistance, in two ways. First, I noticed a strong reluctance toward acknowledging Adler's contributions to American intellectual life. As is evident in my work and in the writings of others (notably Joan Shelley Rubin and Lawrence Levine, and more recently Alex Beam), Adler had made a number of enemies in the academy, especially in philosophy. But I had expected some resistance there. I know Adler was brusque, and had written negatively about the increasing specialization of the academy (especially in philosophy but also in the social sciences) over the course of the 20th century.
The second line of resistance, which was somewhat more surprising, came because I took a revisionist, positive outlook on the real and potential contributions of the great books idea. Of course this resistance linked back to Adler, who late in his life -- in concert with conservative culture warriors -- declared that the canon was set and not revisable. Some of the biggest promoters of the great books idea had, ironically, made it unpalatable to a great number of intellectuals. I hadn't anticipated the fact that Adler and the Great Books were so tightly intertwined, synonymous even, in the minds of many academics.
Q: Selecting a core set of texts was only part of Adler's pedagogical program. Your account shows that it encompassed a range of forms of instruction, in various venues (on television and in newspapers as well as in classrooms and people’s homes). The teaching was, or is, pitched at people of diverse age groups, social backgrounds, and so on -- with an understanding that there are numerous ways of engaging with the material. Would you say something about that?
A: The great books idea in education -- whether higher, secondary, or even primary -- was seen by its promoters as intellectually romantic, adventurous even. It involved adults and younger students tackling primary texts instead of textbooks. As conceived by Adler and Hutchins, the great books idea focused people on lively discussion rather than boring Ben Stein-style droning lectures, or PowerPoints, or uninspiring, lowest-common-denominator student-led group work.
One can of course pick up bits of E.D. Hirsch-style "cultural literacy" (e.g., important places, names, dates, references, and trivia) through reading great books, or even acquire deeper notes of cultural capital as described in John Guillory's excellent but complex work, Cultural Capital: The Problem of Literary Canon Formation (1993). But the deepest goal of Adler's model of close reading was to lead everyday people into the high stakes world of ideas. This was no mere transaction in a "marketplace of ideas," but a full-fledged dialogue wherein one brought all her or his intellectual tools to the workbench.
Adler, Hutchins, John Erskine, Jacques Barzun, and Clifton Fadiman prided themselves on being good discussion leaders, but most promoters also believed that this kind of leadership could be passed to others. Indeed, the Great Books Foundation trained (and still trains) people to lead seminars in a way that would've pleased Erskine and Adler. Education credentials matter to institutions, but the Foundation was willing to train people off the street to lead great books reading groups.
This points to the fact that the excellent books by famous authors promoted by the great books movement, and the romance inherent in the world of ideas, mattered more than the personality or skill of any one discussion moderator. All could access an engagement with excellence, and that excellence could manifest in texts from a diverse array of authors.
Q: It seems like the tragedy of Adler is that he had this generous, capacious notion that could be called the Great Books as a sort of shorthand – but what he's remembered for is just the most tangible and commodified element of it. A victim of his own commercial success?
A: Your take on the tragedy of Adler is pretty much mine. Given his lifelong association with the great books project, his late-life failings almost guaranteed that the larger great books idea would be lost in the mess of both his temporary racism and promotion of Britannica's cultural commodity. The idea came to be seen as a mere byproduct of his promotional ability. The more admirable, important, and flexible project of close readings, critical thinking, and good citizenship devolved into a sad Culture Wars spectacle of sniping about race, class, and gender. This is why I tried, in my "Coda and Conclusion" to end on a more upbeat note by discussing the excellent work of Earl Shorris and my own positive adventures with great books and Adler's work.
Q: Was it obvious to you from the start that writing about Adler would entail a sort of prehistory of the culture wars, or did that realization come later?
A: At first I thought I would be exploring Adler's early work on the great books during my graduate studies. I saw myself intensely studying the 1920s-1950s period. Indeed, that's all I covered for my master's project, which was completed in 2002.
However, I began to see the Culture Wars more clearly as I began to think in more detail about the dissertation. It was right around this time that I wrote a short, exploratory paper on Adler's 1980s-era Paideia Project. When I mapped Paideia in relation to "A Nation at Risk" and William Bennett, I began to see that my project would have to cover Bloom, the Stanford Affair, and the 1990 release of the second edition of Britannica's set. Around the same time I also wrote a paper on Adler's late 1960s books. When I noticed the correlation between his reactions to "The Sixties" and those of conservative culture warriors, it was plain to me that I would have to explore Adler as the culture warrior.
So even though I never set out to write about the Culture Wars, I got excited when I realized how little had been done on the topic, and that the historiography was thin. My focus would limit my exploration (unlike Andrew Hartman's forthcoming study), but I was pleased to know that I might be hanging around with a vanguard of scholars doing recent history on the Culture Wars.
Q: While Adler’s response to the upheaval of the 1960s was not enthusiastic, he was also quite contemptuous of Allan Bloom’s The Closing of the American Mind. How aware of Bloom's book and its aftermath were you when you bought and started reading the Great Books?
A: Honestly, I had little knowledge of Allan Bloom or his ubiquitous The Closing of the American Mind until the mid-1990s. This requires a little background explanation. I started college in 1989 and finished in 1994. As a small-town Midwestern teenager and late-1980s high schooler, I was something of a rube when I started college. I was only vaguely aware, in 1989, that there was even a culture war going on out there (except in relation to HIV and AIDS).
I'm ashamed to admit, now, how unaware I was of the cultural scene generally. Moreover, I was insulated from some of it, and its intensity, during my early college years when it was at its height because I began college as an engineering student. Not only was my area of study far outside the humanities, but the intensity of coursework in engineering also sheltered me from all news beyond sports (my news reading outlet at the time). Even when I began to see that engineering wasn't for me, around 1992, my (then) vocational view of college caused me to move to chemistry rather than a humanities subject.
My own rudimentary philosophy of education kept me from thinking more about the Culture Wars until my last few years as a college student. It was then that I first heard about Bloom and his book. Even so, I only read passages in it, through the work of others, until I bought a copy of the book around 2000. I didn't read The Closing of the American Mind, word-for-word, until around 2003-04 while dissertating.
Q: There was no love lost between Adler and Bloom – you make that clear!
A: In my book you can see that Adler really wanted it known that he believed Leo Strauss and all his disciples, especially Bloom, were elitists. Adler believed that the knowledge (philosophy, history, theology, psychology, etc.) contained in great books was accessible to all. While scholarship and the knowledge of elites could add to what one gained from reading great books, there was a great deal in those works that was accessible to the common man and hence could help make better citizens.
So while Adler was sort of a comic-book character, you might say he was a clown for democratic citizenship -- a deceptively smart clown champion for democratizing knowledge and for raising the bar on intelligent discourse. This analogy is faulty, however, because of the intensity and seriousness with which he approached his intellectual endeavors. He loved debate with those who were sincerely engaged in his favorite topics (political philosophy, education, common sense philosophy, etc.).
I see only advantages in the fact that I was not personally or consistently engaged in the culture wars of the late 1980s and early 1990s. It has given me an objective distance, emotionally and intellectually, that I never believed possible for someone working on a topic that had occurred in her/his lifetime. Even though I started graduate school as something of a cultural and religious conservative (this is another story), I never felt invested in making my developing story into something that affirmed my beliefs about religion, culture, and America in general.
A belief that tradition and history had something to offer people today led me to the great books, but that did not confine me to a specific belief about what great books could, or should, offer people today. I was into great books for the intellectual challenge and personal development as a thinker, not for what great books could tell me about today's political, social, cultural, and intellectual scene.
Q: You defend Adler and the Great Books without being defensive, and I take it that you hope your book might help undo some of the damage to the reputation of each -- damage done by Adler himself, arguably, as much as by those who denounced him. But is that really possible, at this late a date? Won’t it take a generation or two? Or is there something about Adler's work that can be revived sooner, or even now?
A: Thank you very much for the compliment in your distinction about defending and being defensive. I did indeed seek to revise the way in which Adler is covered in the historiography. Because most other accounts about him have been, in the main, mocking and condescending, any revisionary project like mine would necessarily have to be more positive -- to inhabit his projects and work, which could result in something that might appear defensive. I think my mentor, Lewis Erenberg, and others will confirm that I did not always strike the right tone in my early work. It was a phase I had to work through to arrive at a mature, professional take on the whole of Adler's life and the Great Books Movement.
As for salvaging Adler's work as a whole, I don't know if that's possible. Some of it is dated and highly contextual. But there is much worth reviewing and studying in his corpus. My historical biography, focused on the great books in the United States, makes some headway in that area.
Some of Adler's other thinking about great books on the international scene will make it into a manuscript, on which I'm currently working, about the transnational history of the great books idea. If all goes well (fingers crossed), that piece will be paired with another by a philosopher and published as "The Great Books Controversy" in a series edited by Jonathan Zimmerman and Randall Curren.
I think a larger book on Adler's work in philosophy is needed, especially his work in his own Institute for Philosophical Research. I don't know if my current professional situation will give me the time and resources to accomplish much more on Adler. And even if my work situation evolves, I do have interests in other historical areas (anti-intellectualism, Chicago's intellectual history, a Jacques Maritain-in-America project). Finally, I also need to keep up my hobby of reading more great books!
Over the last year there has been a steady stream of articles about the “crisis in the humanities,” fostering a sense that students are stampeding from liberal education toward more vocationally oriented studies. In fact, the decline in humanities enrollments, as some have pointed out, is wildly overstated, and much of that decline occurred in the 1970s and 1980s. Still, the press is filled with tales about parents riding herd on their offspring lest they be attracted to literature or history rather than to courses that teach them to develop new apps for the next, smarter phone.
America has long been ambivalent about learning for its own sake, at times investing heavily in free inquiry and lifelong learning, and at other times worrying that we need more specialized training to be economically competitive. A century ago these worries were intense, and then, as now, pundits talked about a flight from the humanities toward the hard sciences.
Liberal education was a core American value in the first half of the 20th century, but a value under enormous pressure from demographic expansion and the development of more consistent public schooling. The increase in the population considering postsecondary education was dramatic. In 1910 only 9 percent of students received a high school diploma; by 1940 it was 50 percent. For the great majority of those who went on to college, that education would be primarily vocational, whether in agriculture, business, or the mechanical arts. But even vocationally oriented programs usually included a liberal curriculum -- a curriculum that would provide an educational base on which one could continue to learn -- rather than just skills for the next job. Still, there were some then (as now) who worried that the lower classes were getting “too much education.”
Within the academy, between the World Wars, the sciences assumed greater and greater importance. Discoveries in physics, chemistry, and biology did not seem to depend on the moral, political, or cultural education of the researchers – specialization seemed to trump broad humanistic learning. These discoveries had a powerful impact on industry, the military, and health care; they created jobs! Specialized scientific research at universities produced tangible results, and its methodologies – especially rigorous experimentation – could be exported to transform private industry and the public sphere. Science was seen to be racing into the future, and some questioned whether the traditional ideas of liberal learning were merely archaic vestiges of a mode of education that should be left behind.
In reaction to this ascendancy of the sciences, many literature departments reimagined themselves as realms of value and heightened subjectivity, as opposed to so-called value-free, objective work. These “new humanists” of the 1920s portrayed the study of literature as an antidote to the spiritual vacuum left by hyperspecialization. They saw the study of literature as leading to a greater appreciation of cultural significance and a personal search for meaning, and these notions quickly spilled over into other areas of humanistic study. Historians and philosophers emphasized the synthetic dimensions of their endeavors, pointing out how they were able to bring ideas and facts together to help students create meaning. And arts instruction was reimagined as part of the development of a student’s ability to explore great works that expressed the highest values of a civilization. Artists were brought to campuses to inspire students rather than to teach them the nuances of their craft. During this interwar period a liberal education surely included the sciences, but many educators insisted that it not be reduced to them. The critical development of values and meaning was a core function of education.
Thus, despite the pressures of social change and of the compelling results of specialized scientific research, there remained strong support for the notion that liberal education and learning for its own sake were essential for an educated citizenry. And rather than restrict a nonvocational education to established elites, many saw this broad teaching as a vehicle for ensuring commonality in a country of immigrants. Free inquiry would model basic democratic values, and young people would be socialized to American civil society by learning to think for themselves.
By the 1930s, an era in which ideological indoctrination and fanaticism were recognized as antithetical to American civil society, liberal education was acclaimed as key to the development of free citizens. Totalitarian regimes embraced technological development, but they could not tolerate the free discussion that led to a critical appraisal of civic values. Here is the president of Harvard, James Bryant Conant, speaking to undergraduates just two years after Hitler had come to power in Germany:
To my mind, one of the most important aspects of a college education is that it provides a vigorous stimulus to independent thinking.... The desire to know more about the different sides of a question, a craving to understand something of the opinions of other peoples and other times mark the educated man. Education should not put the mind in a straitjacket of conventional formulas but should provide it with the nourishment on which it may unceasingly expand and grow. Think for yourselves! Absorb knowledge wherever possible and listen to the opinions of those more experienced than yourself, but don’t let any one do your thinking for you.
This was the 1930s version of liberal learning, and in it you can hear echoes of Thomas Jefferson’s idea of autonomy and Ralph Waldo Emerson’s thoughts on self-reliance.
In the interwar period the emphasis on science did not, in fact, lead to a rejection of broad humanistic education. Science was a facet of this education. Today, we must not let our embrace of STEM fields undermine our well-founded faith in the capacity of the humanities to help us resist “the straitjackets of conventional formulas.” Our independence, our freedom, has depended on not letting anyone else do our thinking for us. And that has demanded learning for its own sake; it has demanded a liberal education. It still does.
Michael Roth is president of Wesleyan University. His new book, Beyond the University: Why Liberal Education Matters, will be published next year by Yale University Press. His Twitter handle is @mroth78.
In all those years I was pursuing a Ph.D. in religious studies, the question of what my profession really stood for rarely came up in conversation with fellow academics, save for occasional moments when the position of the humanities in higher education came under criticism in public discourse. When such moments passed, it was again simply assumed that anyone entering a doctoral program in the humanities knowingly signed on to a traditional career of specialized research and teaching.
But the closer I got to receiving that doctorate, the less certain I became that this was a meaningful goal. I was surrounded by undergraduates who were rich, well-meaning, and largely apathetic to what I learned and taught. I saw my teachers and peers struggle against the tide of general indifference aimed at our discipline and succumb to unhappiness or cynicism. It was heartbreaking.
Fearing that I no longer knew why I studied religion or the humanities at large, I left sunny California for a teaching job at the Asian University for Women, in Chittagong, Bangladesh. My new students came from 12 different countries, and many of them had been brought up in deeply religious households, representing nearly all traditions practiced throughout Asia. They, however, knew about religion only what they had heard from priests, monks, or imams, and did not understand what it meant to study religion from an academic point of view. And that so many of them came from disadvantaged backgrounds convinced me that this position would give me a sense of purpose.
I arrived in Bangladesh prepared to teach an introductory course on the history of Asian religions. But what was meant to be a straightforward comparison of religious traditions around the region quickly slipped from my control and morphed into a terrible mess. I remember an early lesson: When I suggested during a class on religious pilgrimage that a visit to a Muslim saint’s shrine had the potential to constitute worship, it incited a near-riot.
Several Muslim students immediately protested that I was suggesting heresy, citing a Quranic injunction that only Allah should be revered. What I had intended was to point out how similar tension existed in Buddhism over circumambulation of a stupa — an earthen mound containing the relics of an eminent religious figure — since that act could be seen as both remembrance of the deceased’s worthy deeds and veneration of the person. But instead of provoking a thoughtful discussion, my idea of comparative religious studies seemed only to strike students as blasphemous.
Even more memorable, and comical in hindsight, was being urged by the same Muslims in my class to choose one version of Islam among all its sectarian and national variations and declare it the best. Whereas Palestinians pointed to the "bad Arabic" used in the signage of one local site as evidence of Islam’s degeneration in South Asia, a Pakistani presented Afghans as misguided believers because — she claimed — they had probably never read the entire Quran. While Bangladeshis counseled me to ignore Pakistanis from the minority Ismaili sect who claim that God is accessible through all religions, Bangladeshis themselves were ridiculed by other students for not knowing whether they were Sunni or Shi’a, the two main branches of Islam. In the midst of all this, my call to accept these various manifestations of Islam as intriguing theological propositions went unheeded.
With my early enthusiasm and amusement depleted, I was ready to declare neutral instruction of religion in Bangladesh impossible. But over the course of the semester I could discern one positive effect of our classroom exercise: students’ increasing skepticism toward received wisdom. In becoming comfortable with challenging my explanations and debating competing religious ideas, students came to perceive any view of religion as more an argument than an indisputable fact. Instead of accepting a truth claim at face value, they analyzed its underlying logic to evaluate the merit of the argument. They expressed confidence in the notion that a religion could be understood in multiple ways. And all the more remarkable was their implicit decision, over time, to position themselves as rational thinkers and to define their religions for themselves.
An illustrative encounter took place at the shrine of the city’s most prominent Muslim saint. I, being a man, was the only one among our group to be allowed into the space. My students, the keeper of the door said, could be "impure" — menstruating — and were forbidden to enter. Instead of backing down as the local custom expected, the students ganged up on the sole guard and began a lengthy exposition on the meaning of female impurity in Islam. First they argued that a woman was impure only when she was menstruating and not at other times; they then invoked Allah as the sole witness to their cyclical impurity, a fact the guard could not be privy to and thus should not be able to use against them; and finally they made the case that if other Muslim countries left it up to individual women to decide whether to visit a mosque, it was not up to a Bangladeshi guard to create a different rule concerning entry. Besieged by a half-dozen self-styled female theologians of Islam, the man cowered, and withdrew his ban.
I was incredibly, indescribably proud of them.
Equally poignant was coming face to face with a student who asked me to interpret the will of Allah. Emanating the kind of glow only the truly faithful seem to possess, she sat herself down in my office, fixed the hijab around her round alabaster face, and quietly but measuredly confessed her crime: She had taken to praying at a Hindu temple because most local mosques did not have space for women, and she was both puzzled and elated that even in a non-Islamic space she could still sense the same divine presence she had been familiar with all her life as Allah. She asked for my guidance in resolving her crisis of faith. If other Muslims knew about her routine excursions to a Hindu temple, she would be branded an apostate, but did I think that her instinct was right, and that perhaps it was possible for Allah to communicate his existence through a temple belonging to another religion?
In the privacy of my office, I felt honored by her question. I had lectured on that very topic just before this meeting, arguing that sacred space was not the monopoly of any one religion, but could be seen as a construct contingent upon the presence of several key characteristics. This simple idea, which scholars often take for granted, had struck her as a novel but convincing explanation for her visceral experience of the Islamic divine inside a Hindu holy space. Though she had come asking for my approval of her newly found conviction, it was clear that she did not need anyone’s blessing to claim redemption. Humanistic learning had already provided her with a framework under which her religious experience could be made meaningful and righteous, regardless of what others might say.
And thanks to her and other students, I could at last define my own discipline with confidence I had until then lacked: The humanities is not just about disseminating facts or teaching interpretive skills or making a living; it is about taking a very public stance that above the specifics of widely divergent human ideas exist more important, universally applicable ideals of truth and freedom. In acknowledging this I was supremely grateful for the rare privilege I enjoyed as a teacher, having heard friends and colleagues elsewhere bemoan the difficulty of finding a meaningful career as humanists in a world constantly questioning the value of our discipline. I was humbled to be able to see, by moving to Bangladesh, that humanistic learning was not as dispensable as many charge.
But before I could fully savor the discovery that what I did actually mattered, my faith in the humanities was again put to a test when a major scandal befell my institution. I knew that as a member of this community I had to critique what was happening after all my posturing before students about the importance of seeking truth. If I remained silent, it would amount to a betrayal of my students and a discredit to my recent conclusion that humanistic endeavor is meant to make us not only better thinkers, but also more empowered and virtuous human beings.
So it was all the more crushing to be told to say nothing by the people in my very profession, whose purpose I thought I had finally ascertained. In private chats my friends and mentors in academe saw only the urgent need for me to extricate myself for the sake of my career, but had little to say about how to address the situation. Several of my colleagues on the faculty, though wonderful as individuals, demurred from taking a stance for fear of being targeted by the administration for retribution or losing the professional and financial benefits they enjoyed. And the worst blow, more so than the scandal itself, was consulting the one man I respected more than anybody else, a brilliant tenured scholar who chairs his own department at a research university in North America, and receiving this one-liner:
"My advice would be to leave it alone."
It was simultaneously flummoxing and devastating to hear a humanist say that when called to think about the real-life implications of our discipline, we should resort to inaction. And soon it enraged me that the same people who decry the dismantling of traditional academe under market pressure and changing attitudes toward higher education could be so indifferent, thereby silently but surely contributing to the collapse of humanists’ already tenuous legitimacy as public intellectuals.
While my kind did nothing of consequence, it was the students — the same students whom I had once dismissed as incapable of intellectual growth — who tried to speak up at the risk of jeopardizing the only educational opportunity they had. They approached the governing boards, the administration, and the faculty to hold an official dialogue. They considered staging a street protest. And finally, they gave up and succumbed to cynicism about higher education and the world, seeing many of their professors do nothing to live by the principles taught in class, and recognizing the humanities as exquisitely crafted words utterly devoid of substance.
As my feelings about my discipline shifted from profound grief to ecstatic revelation to acute disappointment, I was able to recall a sentiment expressed by one of my professors, who himself might not remember it after all these years. Once upon a time we sat sipping espresso on a verdant lawn not far from the main library, and he mused that he never understood why young people no longer seemed to feel outrage at the sight of injustice. He is a product of a generation that once rampaged across campuses and braved oppression by the Man. On first hearing his indictment, I was embarrassed to have failed the moral standard established by the older generation of scholars like him. But now I see that it is not just young people but much of our discipline, both young and old, that at present suffers from moral inertia. With only a few exceptions, humanists I know do not consider enactment of virtue to be their primary professional objective, whether because of the more important business of knowledge production or material exigencies of life. And I can only conclude, with no small amount of sadness, that most humanists are not, nor do they care to be, exemplary human beings.
Maybe I should move on, as did a friend and former academic who believes that the only people we can trust to stand on principle are "holy men, artists, poets, and hobos," because yes, it is true that humanists should not be confused with saints. But the humanities will always appear irrelevant as long as its practitioners refrain from demonstrating a tangible link between what they preach and how they behave. In light of the current academic penchant for blaming others for undoing the humanities, it must be said that humanists as a collective should look at themselves first, and feel shame that there is so much they can say — need to say — about the world, but that they say so little at their own expense.
After a year and a half in Bangladesh, I do not doubt any longer that the humanities matters, but now I know that the discipline’s raison d’être dies at the hands of those humanists who do not deserve their name.
Se-Woong Koo earned his Ph.D. in religious studies from Stanford University in 2011. He currently serves as Rice Family Foundation Visiting Fellow and Lecturer at Yale University.
Because of my experience as former CEO of the Seagram Corporation, young business students and aspiring entrepreneurs often seek my advice on the best way to navigate the complex and daunting world of business. As college students begin to think about selecting their majors, they may be influenced by the many reports coming out this time of year that tell them which majors provide the highest post-college earning potential. Last month, PayScale released its 2013-2014 report, lauding math, science and business courses as the most profitable college majors.
My advice, however, is simple, but well-considered: Get a liberal arts degree. In my experience, a liberal arts degree is the most important factor in forming individuals into interesting and interested people who can determine their own paths through the future.
All of the decisions young business leaders will be asked to make -- based on facts and figures, needs and wants, numbers and speculation -- will require one common skill: the ability to evaluate raw information, be it from people or a spreadsheet, and make reasoned and critical decisions. The ability to think clearly and critically -- to understand what people mean rather than what they say -- cannot be monetized, and in life should not be undervalued. Of all the people who have worked for me over the years, the ones who stood out most were those able to see beyond the facts and figures before them and understand what they meant in a larger context.
Since the financial crisis of 2008, there has been a decline in liberal arts disciplines and a rise in pragmatically oriented majors. Over the same period, employment among college graduates rose by 9 percent, while employment among high school graduates fell by 9 percent. What this demonstrates, to my mind, is that the workplace of the future will require not only specialized skills and educated minds, but adaptable ones.
That adaptability is where a liberal arts degree comes in. There is nothing that makes the mind more elastic and expandable than discovering how the world works. Developing and rewarding curiosity will be where innovation finds its future. Steve Jobs, the founder of Apple, attributed his company’s success in 2011 to being a place where “technology married with liberal arts, married with the humanities … yields us the results that makes our heart sing.”
Is that reflected in our current thinking about education as a return on investment? Chemistry-for-the-non-scientist classes abound in universities, but why not poetry for business students? As our society becomes increasingly technologically focused and we build better, faster and more remarkable machines, where can technology not replicate human thinking? In being creative, nuanced and understanding of human needs, wants and desires. Think about the things you love most in your life and you will likely see that you value them because of how they make you feel, think and understand the world around you.
That does not mean forsaking practical knowledge or financial security, but in our haste to make everyone technically capable, we risk losing sight of creating well-rounded individuals who know how to do more than write computer programs.
We must push ourselves as a society to make math and science education innovative and engaging, and to value teachers and education. In doing so, we will ensure that America continues to innovate and lead, and provides more job and economic opportunities for everyone. We must remember, however, that what is seen as cutting-edge practical or technological knowledge at the moment is ever-evolving. What is seen as the most innovative thinking today will likely be seen as passé in ten years. Critical to remaining adaptable to those changes is having developed a mind with a life beyond work -- one that can track the course of human progress because it has learned how much we have changed in the past.
I also believe that business leaders ought to be doing more to encourage students to take a second look at the liberal arts degree. In order to move the conversation beyond rhetoric it is important that students see the merits of having a liberal arts degree, in both the hiring process and in the public statements of today’s business leaders.
In my own life, after studying history at Williams College and McGill University, I spent my entire career in business, and was fortunate to experience success. Essential to my success, however, was the fact that I was engaged in the larger world around me as a curious person who wanted to learn. I did not rely only on business perspectives. In fact, it was a drive to understand and enjoy life -- and be connected to something larger than myself in my love of reading, learning, and in my case, studying and learning about Judaism -- that allows me, at 84, to see my life as fully rounded.
Curiosity and openness to new ways of thinking -- developed in learning about the world around you, exercised in critically analyzing situations, and nurtured every time we encounter a new book or engage with the abstract through art, music or theater -- ensure future success more than any other quality. Learn, read, question, think. In developing the ability to exercise those traits, you will be successful not only in business, but in the business of life.
Edgar M. Bronfman was chief executive officer of the Seagram Company Ltd. and is president of the Samuel Bronfman Foundation, which seeks to inspire a renaissance of Jewish life.