Responsible academics have long attempted to discredit the positivistic data generated by IQ tests, variously demonstrating that such instruments favor certain socioeconomic groups under the guise of objectivity, reduce the many types of intelligence into a single rating, and imply a stable position for qualities that are far more variable, even volatile. The resulting bell curves, some scholars have demonstrated, may function as handcuffs for groups that don’t tend to do well. Yet analogs to the oversimplified and unyielding judgments of ability generated by those IQ tests are alive and well in the academy itself today. Too often, in situations ranging from a tenure decision to our expressed or internalized responses to a student paper, we impose firm and final rankings on academic aptitude rather than making a nuanced or provisional evaluation.
Can we generalize about situations ranging from marking a sophomore’s paper in the privacy of one’s office to participating in a meeting on a tenure decision? Clearly issues, stakes, and political implications may differ. The recurrence of certain problems and practices in situations across that spectrum, however, permits — even encourages — certain broad generalizations. At the same time, since some of these issues are field-specific, I am addressing the humanities, and particularly my own discipline, literary and cultural studies. And since the issue of how racial and gendered prejudices can contaminate judgments on intellect has been discussed extensively elsewhere, this essay devotes comparatively less attention to those issues.
Obviously, many types of judgment are necessary and valuable in such fields and in our universities as a whole; I have repeatedly — though by no means invariably — been impressed with the dedication, expertise, and care colleagues have brought to these responsibilities. And I am not now nor have I ever been a member of the parties opposing tenure, not least because I do not think that move would resolve the disgraceful reliance on adjuncts. But we need to acknowledge and negotiate the problems attending the way we evaluate academic ability.
One such problem is premature judgment. For example, deciding on the basis of a single paper that someone is not likely to be a good student throughout the semester or throughout his or her career is problematic for many reasons. In general the teacher should try to suspend that judgment, or, if it must be made, both bracket it with caveats and gradually buttress or modify it with additional evidence. As the literary historian Avrom Fleishman effectively argues in The Condition of English: Literary Studies in a Changing Culture, evaluations that may be appropriate for a particular example of or even a body of work all too often slide into more definitive overall judgments on the person creating it. Often a firm evaluation of the quality of the work at hand may well be entirely sound; a prognostication of future work, feasible though risky; and a judgment on immutable qualities of mind, deleterious.
The issue Fleishman identifies is especially risky when judgments are made on whether something or someone is “smart.” As Jeffrey Williams persuasively demonstrated in the minnesota review, the replacement of “solid” with “smart” as a term of praise marks an increasing delight in the startling or counterintuitive argument. The ability to generate such points in a single piece of work may indeed demonstrate the intelligence of its author from some perspectives. But again, doing so leaves open the question of whether those abilities will be sustained and whether they are in themselves adequate predictors of strong scholarship or criticism.
Moreover, should one privilege one version of intelligence over others? The emphasis on multiple types of intelligence in the work of the cognitive scientist Howard Gardner is an important caveat to making judgments of intellectual ability.
I vividly remember that after one of my early IQ tests I heard that I had puzzled teachers because I had done very well elsewhere but missed an apparently simple question. I still remember struggling with it: given a picture of a doll and gloves in three different sizes, we were asked in so many words which gloves would fit “this little doll.” I knew that one set of gloves looked right for the doll, but hearing the word “little” made me erroneously decide that the gloves that were best described as “little” were the correct answer. This mistake prefigured both the unusual verbal skills and indifferent visual and spatial abilities that have characterized my cognitive performances to this day — but since it was simply counted as an error, it also demonstrates the problems of measuring intelligence as a monolithic category.
Problems in the concept of “smart” as well as in other criteria for professional judgments are crystallized by the lecture-style presentation that is so important in hiring at many institutions. What are we measuring, and how effectively? Teaching abilities, some would assert. But such presentations at best reveal only a few of the many skills involved in effective teaching and in fact often serve as an excuse for not assessing other skills, especially at the sort of institution that gives only lip service to the importance of undergraduate education. Are we judging research through these presentations? Yes, and up to a point fair enough. But we risk devoting undue weight to impressions generated by job talks: a careful and protracted assessment of written material is typically both more time consuming (sometimes unfeasibly so) and more valuable.
Yet even faculty members who have reviewed that material sometimes allow their prior judgments on it to be subsumed or virtually forgotten, giving undue weight to the lecture that should instead be evaluated in close conjunction with earlier reading. What all that suggests is that often we are above all judging perceived smartness — or the performance of it — through job talks, and even judging if the candidate displays (flaunts?) precisely the putative markers of smartness we have ourselves, or to which we may aspire. The Q&A, itself unduly weighted in many decisions, also reflects performance and polish — and at its worst invites judgments based on whether one approves of the answer to one’s own question.
Even if we do decide that smartness, in its customary sense of rapidly producing a startling insight, is the sine qua non for and best measure of academic ability, or if we assign that role to other dimensions of intelligence, we certainly risk not measuring those qualities accurately, whether in job talks or many other situations. As noted above, the academy has recognized although not invariably curtailed the impact of racial, ethnic, and gendered stereotypes on judgments of academic ability, but many other prejudices may come into play as well. One of the top graduate students I ever taught told me that she had worked sedulously to discard her Southern accent, correctly perceiving that listeners in other regions might be less likely to take her seriously.
For all the consciousness of class and social status in literary and cultural criticism, in our own personnel decisions we too often interpret as signs of mental prowess mannerisms and behaviors that may well result instead from upper-middle-class breeding. Both verbal facility and refined social assurance, frequently though of course not invariably encouraged more in families from the more elite socioeconomic groups, may convey an impression of smartness. (Notice that “smart” is the very term used for elegant clothing.)
More broadly, some members of the profession will be less likely to identify intelligence in someone with an unpolished social manner — though on the other hand others are more likely to expect smartness there. (Another race in which I have a horse, though one emphatically not ready to be put out to pasture: aren’t colleagues more likely to describe people their own age, rather than significantly older, through these and related positive epithets?) As these instances suggest, both judgments on “smartness” as well as other monolithic overall evaluations may screen other, less savory evaluations, whether or not the person making them is aware of that.
Moreover, as the attacks on IQ tests also revealed, intelligence is far from the “ever-fixèd mark” that Shakespeare associates with love in one of his sonnets (116.5). Pressures of all types may temporarily block its components, notably memory; shortly after my father’s unexpected death, I repeatedly had trouble remembering the number for my ATM card, which I readily recalled before and after that event. People in the humanities may well grow and develop in many ways, not only at the stages of their undergraduate and graduate work but often considerably later in their careers. Often switching to a more congenial specialty or critical methodology produces such growth; its predecessor, less compatible with the interests and abilities of the person in question, may well have been encouraged or even dictated by a mentor or the perceived direction of the field. For such reasons, many people who composed an indifferent first or even second book do much better work later on; those who evaluate them throughout their careers on the basis of their early work, followed by a cursory familiarity with later writing or none at all, risk making unfair judgments.
Even if we do calibrate our scales to arrive at more accurate measures for academic aptitude and abilities, those categories may downplay one characteristic necessary for success: the drive that encourages intense and sustained work. Indeed, certain conceptions of intelligence dismiss that type of work as plodding, instead celebrating, explicitly or implicitly, a concept related to the Renaissance belief in sprezzatura: according to this model, the truly gifted will, as it were, rapidly and effortlessly turn out impressive academic work with their left hand, the right hand perhaps holding a crystal glass of, say, Meursault or another premier French Burgundy (reminding us again of the implicit role of class in some judgments). But in fact, as anyone who has followed the careers of graduate students over the years knows, the difference between a strong career and a disappointed and disappointing one typically involves not only talent and a sadly large and ever-increasing component of sheer luck but also sustained effort. The recently publicized work by Angela Duckworth, a psychologist at the University of Pennsylvania, has demonstrated the effectiveness of what she terms “grit,” a conclusion that may serve variously to reinforce and to temper judgments made on other grounds.
The prices paid for the mistakes chronicled above are all too evident. Even if the teacher attempts to be tactful, both undergraduate and graduate students sense judgments; whether or not their perceptions are completely correct, thinking one has been classified as second-rate can too readily become a self-fulfilling prophecy. Above all, when the pie is as small as it is in the academy today, we must work to distribute it as fairly and judiciously as possible.
How, then, can we avoid such errors, given that academic judgments are so often necessary and even desirable? We need to remain vigilant about the likelihood of mistakes, remembering, for example, that just as opponents of straw votes point out that they tend to solidify what should be tentative positions, so the same danger shadows preliminary judgments on a student or colleague. We need to examine why we ourselves may be tempted into deceived and deceiving judgments. In particular, might we find it hard to challenge standards and procedures of judgment that have aided our own professional advancement?
Heather Dubrow is the John D. Boyd SJ Chair in the Poetic Imagination at Fordham University and taught previously at several other institutions. Among her publications are six single-authored monographs, a co-edited collection of essays, an edition of As You Like It, and a volume of her own poetry.
The article about Spring-Serenity Duvall, a communications professor who banned students from emailing her and lived to blog about it, caught my eye on the same day my own inboxes at two colleges spilled over with bewildered messages from students. Some had been told to purchase the wrong edition of our course text, resulting in their plodding through a chapter on meta-commentary instead of one on contributing meaningfully to group discussions; more simply hadn’t received their textbooks and didn’t know when they would; still others, I suspected, were so besieged by first-week information overload that they needed reassurance from a human who had seemed friendly enough on the first day of class.
When I announced to my Critical Reading and Writing classes the next morning that we wouldn’t cover the assigned reading so we could instead talk about “a professor who doesn’t allow students to email her,” many likely assumed I was using this hook as a launching pad for my own ban. Several — the ones who had dared type a few words or even sentences to me at quiet, unobtrusive hours of the night — looked somewhat repentant. We were going to read this article together, I told them, and in addition to identifying its purpose, audience, context, and noteworthy rhetorical moves, they would be invited to interject their opinions.
“I had a strong reaction when I read this,” I admitted, “and I expect you might as well.”
Turns out, the students generally endorsed Duvall’s policy more than I did. One young man remarked that he initially opposed the idea but began to see its merits as we dug further into the reasoning. Both classes and I settled unanimously on a valuable lesson that could be learned from the spirit of such a ban: Students should try to find the answers themselves, several pointed out, before they bother the professor, who they all (charitably) agreed would be busy with other matters. Others said it would be useful to practice reading course documents more carefully and researching answers on their own or with their peers.
As we identified potential audiences for an article championing such a ban, some responses were obvious, such as fellow professors with hectic schedules. Other responses were disconcerting. More than one student claimed their parents were a perhaps-unintended audience. Parents who foot the bill for this whole venture might be interested (disgruntled?) to discover a brick wall separating their children from the people who are paid to teach them important things.
I have no doubt the email embargo worked miracles for Duvall’s time management. Just because I find student correspondence one of the least complicated demands of the teaching profession doesn’t mean I should impose my preferences on others. And since 47 glowing course evaluations suggest that Duvall’s students not only didn’t feel cheated, but actually thought her in-person-or-by-phone-only rule made her more accessible, I won’t belabor my somewhat obvious challenge that such a policy could deter students — those, perhaps, who are at risk of doing poorly and therefore need the most encouragement — from asking questions down the line or even approaching their future professors.
But isn’t there something to be said for letting young adults — especially those enrolled in a communications course — navigate the delicate rules of student-professor etiquette on their own? For letting them fail at it even? Suppose you email about a problem your professor deems trifling. The two worst consequences are (a) no response or (b) a snippy response. In my own college days, I sent emails that at the time seemed vital but that I now recognize as self-absorbed and/or irritatingly Type A. After a few terse one-liners from professors I admired, I became a less zealous emailer.
There need not be an official ban committed in writing on a syllabus for professors to ignore or even confront messages that are petty or unprofessional. Furthermore, today’s students are attending college in the first place so they can land a job that might one day allow them to emerge from — or even to buoy — this faltering economy. Employers prize communication and collaboration skills more highly than ever, and it’s hard to imagine the 21st-century workplace functioning without people who can competently email.
Do we really want to graduate a generation of students who can’t decide for themselves what warrants pressing the send button? Or, to take this issue to its logical extreme, who think their employers should drop everything to schedule in-person conferences for matters that can be handled in one pithy sentence? If our wading through a bunch of syllabus emails can contribute to a larger discourse about the importance of good professional writing, then maybe we are — in the eyes of the public — one step closer to earning our keep as educators.
Danielle DeRise is an adjunct professor of English, literature, and writing at Piedmont Virginia Community College and James Madison University.
The topic of “civility,” including its place among the professional responsibilities of faculty members, has come to the fore recently, as it does from time to time when there is some especially hot-button, polarizing issue in academe. This time, the context is the Israeli/Palestinian conflict.
Much can be said – and has been said – about this particular context, including the observation that there is a large body of writing by academics and public intellectuals, grounded in research, evidence, and analysis, that is highly critical of Israel’s treatment of the Palestinians. While this may not be an invariable guarantee of accuracy, such writings have a claim both to serious attention and to the protections of academic freedom for their authors. Such efforts and contributions are not well-served when faculty colleagues indulge in loose invective in what are now relatively public channels of communication – channels that are, in these times, open to students.
The same might be said of the more serious and substantive efforts of those who also engage in intemperate blogging and tweeting themselves. To take a not-quite-the-same-but-perhaps-close-enough-for-comparison case from the profession of journalism: would Paul Krugman’s work be more effective or less effective if he were sending out personally abusive tweets about Angela Merkel on the side? And, if he were, would The New York Times be just as proud and happy to have him on the roster? And would the newspaper’s editors and board be in as strong a position to defend him against outside pressure from those who do not share his economic, political, and moral views – that is, would they be in as strong a position to defend freedom of the press, an important responsibility of editors and boards of newspapers?
The American Association of University Professors has been understandably wary of the use of “civility,” particularly in matters of appointments and tenure. Like its close associate “collegiality,” it is open to multiple interpretations and may be put to various ignoble purposes. It may simply be thrown around too loosely for its message to be clear. Invoking it in the context of current affairs in the Middle East may seem ironic to some, downright offensive to others.
When we invoke “civility” in the context of higher education the focus must be on maintaining the kind of light-to-heat ratio that serves our role as teachers and scholars. What it certainly does not mean is sweeping difficult, controversial, painful issues under the rug. Let us also bear in mind that when we or our colleagues may be found wanting in the exercise of our professional responsibilities, this can be addressed in ways well short of the to-be-or-not-to-be of hiring, firing, and tenure.
Some critics of the civility standard propose that it can only be useful if operationalized and thus able to pass muster in terms of specificity. This, however, requires us to face the fact that formal codes and procedures are no substitute for shared norms about appropriate, responsible, civilized behavior.
We have seen this in the misbegotten attempt to address prejudice, ignorance, and general nastiness through formal speech codes that have the additional disadvantage of falling afoul of our legal systems.
We have seen it in the attempt of the University of Virginia’s board to pass a formal policy preventing trustees from speaking out of school, i.e., publicly criticizing decisions made by their fellow board members – yet another occasion for the university to receive negative attention in the higher education press.
Societies have various means of social control at their disposal. There is what the Quakers call “eldering,” whereby a respected, usually senior, member of the community offers guidance. Less kindly approaches include withholding important social rewards: positions of authority, honors, invitations. At the social extreme, there is ostracism.
This is not intended as a to-do list, but rather as examples of how a community expresses its attachment to what it sees as core values. Surely, we can manage to promote such values without sacrificing the forms of individualism and eccentricity necessary to academe.
And, also surely, it would seem desirable for such standards to be maintained first and foremost by faculty members themselves, rather than by administrative action from above (or, as some may see it, below). “Corporate” has become a term of abuse in academe, but we might go back to its more general reference to a group of individuals acting as a body, a community. It may well be that the corporate exercise of professional responsibility on the part of the faculty is key not only to preserving the tenure system but to making our colleges and universities the kinds of places where we truly want to work and live.
Judith Shapiro is professor of anthropology emerita at Bryn Mawr College and Barnard College.
If your college or university is anything like mine – seeking significantly increased resources to enable all the research, student aid, and facilities development that we would like to support – then perhaps you’ve been watching this summer’s social media phenom of the ALS ice bucket challenge with a sense of envy.
I share in the general pleasure that a worthy charity has enormously increased its finances, which may speed up a cure for a terrible disease. On the tally board of life, this profuse bucketing outbreak goes on the plus side for those of us who’d like to believe that people are basically good and inclined to help others in need.
And I also see the cavils: that this movement is a “slacktivist” fad, an easy and lazy manifestation of commitment; that amyotrophic lateral sclerosis is relatively rare, and perhaps less deserving of funding than more prevalent maladies like malaria, diabetes, and Alzheimer’s; and that research funding should be determined by the rational standards of peer review rather than clickbait.
But my own foremost (and self-centered) response to this orgy of charitable energy is: If only I’d thought of it first. We might have a half-dozen new endowed chairs in our department and teaching-free dissertation fellowships for every one of our graduate students. Zadie Smith and Thomas Pynchon would be the featured speakers in our English department lecture series. (Granted, Pynchon’s not very visible on the lecture circuit, but wait until he sees our offer!)
Is our cause sufficiently worthy? Of course it is, and it’s pointless to argue whether higher education or ALS is more deserving: apples and oranges. The suffering of an ALS victim is terrible. The plight of people who cannot maximize their talents, too, is terrible. At my university, where over half our students qualify for Pell Grants and a third are first-generation college students, I see firsthand every day how profoundly meaningful a college education is for those who are marginally able to achieve it, and how fundamentally valuable it would be to extend that margin as much as possible.
So what can we do to connect with the public, to promote our worthy cause, and to set off a chain reaction that will bring along hordes of people jumping onto our bandwagon?
In the meta-analysis of the ice bucket challenge, many have commented on the arbitrariness of charitable giving and of catching the public’s eyes and hearts. But still, is there something we in academe can learn from this? Is this sort of philanthropic enterprise replicable?
Where can I sign up my department to raise millions of dollars? I suppose I’d include the humanities at large – or even more magnanimously, I’ll extend the invitation to academe generally. (Participants from every campus could sport their university T-shirts to identify the recipient of each donation.)
Nearly as important as the cash, it would be extremely rewarding to find ourselves in the thick of a snowballing social movement, like the ALS campaign, that raises national consciousness and unleashes a contagious enthusiasm about what we do in higher education and how deserving our mission is of support.
Probably the appeal of the ice bucket campaign was lucky and unpredictable; if anyone knew exactly what makes a multimillion-hit meme, I imagine there would be consultants charging multimillion-dollar fees to produce them. (Perhaps there actually are such consultants, though I’m not aware of them.) Is it the snazzy visuals of the unexpected? The counterintuitive willingness to ruin an outfit and suffer – however momentarily – what I imagine would be a very unpleasant experience? (I haven’t taken this challenge myself, though I strongly suspect that I will be invited to do so any minute now.)
Honestly, I don’t have any bright ideas about how exactly to stage an academic iteration: a pie in the face? Banana peel pratfalls? Blind man’s bluff into a vat of tomato sauce?
Perhaps we in the academy should aspire to something more dignified, but maybe, presuming that the success of ALS merits attention as a “best practice” ripe for our own adaptation, what draws massive crowds of participants is precisely the unexpected contrast between the seriousness of the problem and the oddly undignified escapism of the momentary “challenge.”
Some kind of slapstick gesture seems necessary: something physical and messy and shocking, involving a very intimately personal – bodily – engagement.
As silly as it is, the ALS Association’s challenge represents a wonderful manifestation of human ambition and determination: curing a debilitating disease seems undoable until it’s doable. With enough resolve, and enough money to throw at the problem, and enough human intelligence (which mainly takes the form, I will note, of academic research), it can be done.
The same goes for a university education. Our scholarship, our teaching, and our community partnerships all contribute to the creation of a better society as measured by myriad qualitative and quantitative metrics. The notion of millions of citizens taking the time and energy to help us out by doing something that affirms our value would vitally reinvigorate our campuses after years of retrenched government funding and skyrocketing student debt. If our campaign were as successful as the ice bucket challenge – and why not dare to dream big? – we could actually mitigate those twin financial catastrophes that have lately taken such a toll on higher education.
I’ve done the hard part here in announcing this challenge to launch our challenge. Now somebody please just send me the YouTube link when you’ve figured out the specifics.
Randy Malamud is Regents’ Professor and chair of the English department at Georgia State University.
As campus efforts to support boycotts of Israeli universities intensify this year — and everyone expects them to in the wake of events in Gaza — the most challenging and controversial question about the movement that sponsors the boycott agenda looms over all of us: Are there anti-Semitic dimensions to the Boycott, Divestment, and Sanctions (BDS) movement?
BDS advocates have long countered the anti-Semitic label by protesting that critics of Israeli government policy do not deserve accusations that they are anti-Semitic. In fact BDS opponents themselves often reject the claim that every critic of Israel, or even every supporter of BDS, is anti-Semitic. Israelis themselves are relentless critics of the government in power, and many of the Jewish state’s strong supporters there and abroad condemn the occupation of the West Bank and urge curtailment of settlement construction or withdrawal from most existing settlements. BDS assertions that they are condemned simply because they are policy critics distract us from the more complex and troubling ways that the movement enhances anti-Semitic aims.
Ever since Lawrence Summers asserted that the divestment movement proposals were “anti-Semitic in their effect, if not in their intent,” we have had a model to use in examining the prejudicial implications of BDS in a more thoughtful way. That does not mean that every divestment proposal is anti-Semitic, but it does help us see why people who advocate the elimination of Israel as a Jewish state are promoting a goal that has anti-Semitic effects.
Arguments that Jews have no ancient connection to the land, that Israelites and Hebrews never existed — positions that some academic BDS advocates promote — also have an anti-Semitic component. The demand that the citizens of Israel give up their right to political self-determination and the unsupportable assertion that the Israeli government is an exceptionally egregious human rights violator are also consciously or unconsciously underwritten by the long-term history of anti-Semitism and the history of efforts to isolate and “other” the Jewish people.
I realize that people will dispute these conclusions, but they nonetheless offer examples of a more serious basis for debating the issue I am urging all of us to address. Doing so also requires that we confront the policies vigorously promoted by virtually all of the BDS movement’s major spokespeople, whether or not the movement officially endorses them. These include advocacy by Omar Barghouti and others of a one-state “solution” encompassing Israel, Gaza, and the West Bank in which Jews would become a minority. That demand is typically accompanied by the call for the Palestinian diaspora’s “right of return” to this new state, a plan that would further marginalize the Jewish population. Both positions are put forward in Barghouti’s Boycott, Divestment, Sanctions, Judith Butler’s Parting Ways, and other books.
The confidence with which some BDS advocates assure us Jews could live peacefully and safely and have full religious freedom in an Arab-dominated state is so contradicted by regional history, culture, and politics that one has to consider the possibility that they really do not care about the fate of Israeli Jews. Naivety alone does not seem to account for so thorough a denial of reality. The real perils Jews could face in an Arab-dominated state undercut the rather pious claims about the movement’s dedication to nonviolence that are part of its founding principles. Once again, highly likely violent effects call into question the status of nonviolent intent. Equally worrisome are those BDS supporters who ally themselves with Hamas, despite the organization’s ferociously anti-Semitic and genocidal charter. One might well wonder why those in the West who would ordinarily oppose a group that vilifies gays — and has an appalling view of women — would overlook these facts because of Hamas’s stance toward Israeli Jews.
While the BDS movement undoubtedly gathers some conscious anti-Semites into its fold, the way in which it more broadly assigns the traditional pariah status of Jews to the Israeli state is equally troubling. Debates about BDS resolutions and petitions often invoke the standard tropes anti-Semitism has deployed, notably that BDS opponents are organized and funded by an international Jewish lobby, an accusation that surfaced during the Modern Language Association’s discussion of its 2014 resolution condemning Israeli visa policies. That both NGOs and foreign governments fund BDS activity is rarely mentioned.
Talking about such matters can also lead people to ask themselves whether their hostility to Israel is a vehicle for unconscious resentment toward Jews. Only individual self-reflection, not academic debate, can answer that question. Certainly when BDS advocates spread anti-Jewish stereotypes and myths they owe it to themselves to examine their hearts more rigorously. The fact that a number of Jewish academics support the BDS movement does not rule out the possibility that any given supporter harbors an anti-Semitic bias, though the movement likes to claim that it does.
Helen Fein’s 1987 definition (in The Persisting Question) of anti-Semitism as “a persisting latent structure of hostile beliefs towards Jews as a collectivity” is a good place to begin in thinking about the role of anti-Semitism in BDS passion. Indeed it is not unreasonable to feel that such psychological motives underlie the exceptional level of hostility displayed in some BDS forums, among the most extreme being the Electronic Intifada and Mondoweiss websites. Moreover, there are statements that have anti-Semitic content and anti-Semitic effects — that can be adopted and used by willing anti-Semites — no matter what their original authors think they intended. Such inherent hostility does need to be examined in the academy.
That hostility is often focused on the most demonized term in the BDS lexicon: Zionism. The historical movement and the concept have had many definitions over the years, though in the current political climate simply believing that a Jewish state has a historically, internationally, and morally justified right to exist in Palestine is enough to have one condemned as a Zionist. It often doesn’t help if you want Israel to withdraw from the occupied territories. You are still a blind Zionist ideologue.
Absolute opposition to Israel’s existence increases anti-Semitism’s cultural and political reach and impact. Indeed, if anti-Semitism is a fundamental condition of possibility for unqualified opposition to the Jewish state, then anti-Zionism is anti-Semitism’s moral salvation, its perfect disguise, its route to legitimation.
There is a disturbing bait-and-switch element to BDS’s recruitment strategies. The movement recruits students with a call for justice for Palestinians — justice that a two-state solution could provide — then draws them into one-state advocacy, a goal with devastating consequences for Israeli Jews. It justifies its one-state advocacy by demonizing the State of Israel with hyperbolic and irrational accusations.
Meanwhile, the rise of anti-Semitism in Europe and elsewhere puts the lie to the confidence that Jews do not need a homeland whose future and right to self-defense they control. Indeed it strengthens the opposite argument.
The same bait-and-switch effect attaches to the risk that recruiting anti-Zionists, while giving them collective support and making them more fervid, will turn them into anti-Semites. Beginning with the 2006 Journal of Conflict Resolution essay “Anti-Israel Sentiment Predicts Anti-Semitism in Europe” by Edward Kaplan and Charles Small and continuing through to Alvin Rosenfeld’s 2013 collection Resurgent Antisemitism, research has suggested that the more extreme one’s position on Israel, the more likely one is to harbor classic anti-Semitic beliefs. And those who want to abolish the Jewish state show higher rates of belief in Jewish conspiracies and other anti-Semitic delusions. All these patterns intersect uncomfortably with the BDS movement.
There are two peoples in Palestine who deserve justice and deserve homelands. Demonizing one of them, as BDS does, will not promote peace and not lead to a Palestinian state. Rage and hatred may be personally gratifying to some, but they get in the way of a political solution. Indeed they can block the willingness to compromise that is fundamental to any negotiating process. This suggests that anti-Semitism has consequences that those unconsciously succumbing to its influence need to confront. For anti-Semitism tragically offers nothing tangible to the very Palestinians BDS claims to champion.
Cary Nelson is a professor of English at the University of Illinois at Urbana-Champaign. He is the co-editor, with Gabriel Noah Brahm, of The Case Against Academic Boycotts of Israel, scheduled for October distribution by Wayne State University Press.