Bryan Cranston’s recitation of “Ozymandias” in last year’s memorable video clip for the final season of Breaking Bad may have elided some of the finer points of Shelley's poem. But it did the job it was meant to do -- evoking the swagger of a grandiose ego, as well as time’s shattering disregard for even the most awe-inspiring claim to fame, whether by an ancient emperor or meth kingpin of the American Southwest.
But time has, in a way, been generous to the figure Shelley calls Ozymandias, who was not a purely fictional character, like Walter White, but rather the pharaoh Ramses II, also called User-maat-re Setep-en-re. (The poet knew of him through a less exact, albeit more euphonious, transcription of the name.) He ruled about one generation before the period that Eric H. Cline, a professor of classics and archeology at George Washington University, recounts in 1177 B.C.: The Year Civilization Collapsed (Princeton University Press).
Today the average person is reasonably likely to know that Ramses was the name of an Egyptian ruler. But very few people will have the faintest idea that anything of interest happened in 1177 B.C. It wasn't one of the 5,000 “essential names, phrases, dates, and concepts” constituting the “shared knowledge of literate American culture” that E. D. Hirsch identified in his best-seller Cultural Literacy (1988), nor did it make it onto the revised edition Hirsch issued in 2002. Just over 3,000 years ago, a series of catastrophic events demolished whole cities, destroying the commercial and diplomatic connections among distinct societies that had linked up to form an emerging world order. It seems like this would come up in conversation from time to time. I suspect it may do so more often in the future.
So what happened in 1177 B.C.? Well, if the account attributed to Ramses III is reliable, that was the date of a final, juggernaut-like offensive by what he called the Sea Peoples. By then, skirmishes between Egypt and the seafaring barbarians had been under way, off and on, for some 30 years. But 1177 was the climactic year when, in the pharaoh’s words, “They laid their hands upon the lands as far as the circuit of the earth, their hearts confident…. ” The six tribes of Sea Peoples came from what Ramses vaguely calls “the islands.” Cline indicates that one group, the Peleset, are "generally accepted” by contemporary scholars "as the Philistines, who are identified in the Bible as coming from Crete.” The origins of the other five remain in question. Their rampage did not literally take the Sea Peoples around “the circuit of the earth,” but it was an ambitious military campaign by any standard.
They attacked cities throughout the Mediterranean, in places now called Syria, Turkey, and Lebanon, among others. About one metropolis, Ramses says, the Sea Peoples “desolated” the population, “and its land was like that which has never come into being.”
Cline reproduces an inscription that shows the Sea Peoples invading Egypt by boat. You need a magnifying glass to see the details, but the battle scene is astounding even without one. Imagine D-Day depicted exclusively with two-dimensional figures. The images are flat, but they swarm with such density that the effect is claustrophobic. It evokes a sense of terrifying chaos, of mayhem pressing in on all sides, so thick that nobody can push through it. Some interpretations of the battle scene, Cline notes, contend that it shows an Egyptian ambush of the would-be occupiers.
Given that the Egyptians ultimately prevailed over the Sea Peoples, it seems plausible: they would have had reason to record and celebrate such a maneuver. Ramses himself boasts of leading combat so effectively that the Sea Peoples who weren't killed or enslaved went home wishing they’d never even heard of Egypt: “When they pronounce my name in their land, then they are burned up.”
Other societies were not so fortunate. One of them, the Hittite empire, at its peak covered much of Turkey and Syria. (If the name seems mildly familiar, that may be because the Hittites, like the Philistines, make a number of appearances in the Bible.) One zone under Hittite control was the harbor city of Ugarit, a mercantile center for the entire region. You name it, Ugarit had it, or at least someone there could order it for you: linen garments, alabaster jars, wine, wheat, olive oil, anything in metal…. In exchange for paying tribute, a vassal city like Ugarit enjoyed the protection of the Hittite armed forces. Four hundred years before the Sea Peoples came on the scene, the king of the Hittites could march troops into Mesopotamia, burn down Babylon, then march them back home — a thousand miles each way — without bothering to occupy the country, “thus,” writes Cline, “effectively conducting the longest drive-by shooting in history.”
But by the early 12th century, Ugarit had fallen. Archeologists have found, in Cline’s words, "that the city was burned, with a destruction level reaching two meters high in some places.” Buried in the ruins are “a number of hoards … [that] contained precious gold and bronze items, including figurines, weapons and tools, some of them inscribed.” They "appear to have been hidden just before the destruction took place,” but "their owners never returned to retrieve them.” Nor was Ugarit ever rebuilt, which raises the distinct possibility that there were no survivors.
Other Hittite populations survived the ordeal but declined in power, wealth, and security. One of the maps in The Year Civilization Collapsed marks the cities around the Mediterranean that were destroyed during the early decades of the 12th century B.C. — about 40 of them in all.
The overview of what happened in 1177 B.C. that we’ve just taken is streamlined and dramatic — far too much so not to invite skepticism. It’s monocausal. The Sea Peoples storm the beaches, one city after another collapses, but Ramses III survives to tell the tale…. One value of making a serious study of history, as somebody once said, is that you learn how things don’t happen.
Exactly what did happen becomes a serious challenge to determine, after a millennium or three. Cline’s book is a detailed but accessible synthesis of the findings and hypotheses of researchers concerned with the societies that developed around the Mediterranean throughout the second millennium B.C., with a special focus on the late Bronze Age, which came to an end in the decades just before and after the high drama of 1177. The last 20 years or so have been an especially productive and exciting time in scholarship concerning that region and era, with important work being done in fields such as archeoseismology and Ugaritic studies. A number of landmark conferences have fostered exchanges across micro-specialist boundaries, and 1177 B.C.: The Year Civilization Collapsed offers students and the interested lay antiquarian a sense of the rich picture that is emerging from debates among the ruins.
Cline devotes more than half of the book to surveying the world that was lost in or around the year in his title — with particular emphasis on the exchanges of goods that brought the Egyptian and Hittite empires, and the Mycenean civilization over in what we now call Greece, into closer contact. Whole libraries of official documents show the kings exchanging goods and pleasantries, calling each other “brother,” and marrying off their children to one another in the interest of diplomatic comity. When a ship conveying luxury items and correspondence from one sovereign to another pulled in to dock, it would also carry products for sale to people lower on the social scale. It then returned with whatever tokens of good will the second king was sending back to the first — and also, chances are, commercial goods from that king’s empire, for sale back home.
The author refers to this process as “globalization,” which seems a bit misleading given that the circuits of communication and exchange were regional, not worldwide. In any case, it had effects that can be traced in the layers of scattered archeological digs: commodities and artwork characteristic of one society catch on in another, and by the start of the 12th century a real cosmopolitanism is in effect. At the same time, the economic networks encouraged a market in foodstuffs as well as tin — the major precious resource of the day, something like petroleum became in the 20th century.
But evidence from the digs also shows two other developments during this period: a number of devastating earthquakes and droughts. Some of the cities that collapsed circa 1177 may have been destroyed by natural disaster, or so weakened that they succumbed far more quickly to the marauding Sea Peoples than they would have otherwise. For that matter, it is entirely possible that the Sea Peoples themselves were fleeing from such catastrophes. “In my opinion,” writes Cline, “… none of these individual factors would have been cataclysmic enough on their own to bring down even one of these civilizations, let alone all of them. However, they could have combined to produce a scenario in which the repercussions of each factor were magnified, in what some scholars have called a ‘multiplier effect.’ … The ensuing ‘systems collapse’ could have led to the disintegration of one society after another, in part because of the fragmentation of the global economy and the breakdown of the interconnections upon which each civilization was dependent."
Referring to 1177 B.C. will, at present, only get you blank looks, most of the time. But given how the 21st century is shaping up, it may yet become a common reference point -- and one of more than antiquarian relevance.
I do not know if he was an ancestor of the talk-show host, but one Jean-Baptiste Colbert served as minister of finance for Louis XIV. A page on the tourism-boosting website for Versailles notes that his name lived on "in the concept of colbertism, an economic theory involving strict state control and protectionism."
An apt phrase can echo down through the ages, and the 17th-century Colbert turned at least a couple of them. The idea that each nation has a "balance of trade" was his, for one. And in a piece of wit that surely went over well at court, Colbert explained that "the art of taxation consists in so plucking the goose as to obtain the largest amount of feathers with the least amount of hissing."
Procrastination makes tax resisters of us all, at one time or another. But mostly we submit, just to get it over with, and we keep the hissing to a prudent minimum. Not so the politicians, ideologues, and organizations chronicled by Romain D. Huret in American Tax Resisters (Harvard University Press). Relatively few of them carried rebellion so far as to risk imprisonment or bankruptcy in defense of their principles by outright refusing to pay up. But they were unrelentingly vocal about their fear that the state was hell-bent on reducing them to peonage.
American Tax Resisters proves a little more narrowly focused than its title would suggest; its central concern is with opposition to the income tax, though Huret's interest also extends to protest against any form of progressive taxation. The author is an associate professor of American history at the University of Lyon 2 in France, and writes that he’s now spent two decades pondering "why Americans had such a complex relationship with their federal government."
In selecting one aspect of that complex relationship to study, he makes some surprising though defensible choices. He says very little about the Boston Tea Party or Shays's Rebellion, for example. Instead, he takes the Civil War as the moment when anti-tax sentiments began to be expressed in terms that have persisted, with relatively little variation, ever since. The book is weighted more heavily toward narrative than analysis, but the role of major U.S. military commitments in generating and consolidating the country’s tax system does seem to be a recurrent theme.
Before taking office, Lincoln held that government funds ought to be raised solely through tariffs collected, he said, "in large parcels at a few commercial points.” Doing so would require "comparatively few officers in their collection.” In the early months of the war, his administration tried to supplement revenue through an income tax that largely went uncollected. With most of the country’s wealth concentrated in the Northeast, most of the burden would have fallen on a few states.
Instead, revenue came in through the sale of war bonds as well as the increased taxation of goods of all kinds, which meant driving up the prices of household commodities. By 1863, a Congressman from the North was warning of "the enslavement of the white race by debt and taxes and arbitrary power.” The link between anti-tax sentiment and racial politics only strengthened after the Confederacy’s defeat.
The need to pay off war debts, including interest on bonds, kept many of the new taxes imposed by the Lincoln administration in place into the 1880s. Businessmen who prospered during the conflict, as well as tycoons making new fortunes, resented any taxation of their incomes -- let alone the progressive sort, in which the rate increased as the amount of income did. Anti-tax writers insisted that progressive taxation was a policy of European origin, and “communistic,” and even a threat to the nation’s manhood, since it might (through some unspecified means) encourage women to assert themselves in public.
Another current of anti-tax sentiment reflected the anxiety of whites in Dixie, faced with the menace of African-American equality, backed up by the efforts of the Freedmen’s Bureau and other Reconstruction-era government agencies. Huret reprints an anti-tax poster from 1866 in which hard-working white men produce the riches taxed to keep a caricatural ex-slave in happy idleness.
The rhetoric and imagery of anti-tax protests from the late 19th century have shown themselves to be exceptionally durable (only the typography makes that poster seem old-fashioned) and they recur throughout Huret’s account of American tax resistance in the 20th century and beyond. With each new chapter, there is at least one moment when it feels as if the names of the anti-tax leaders and organizations have changed, but not much else. Certainly not the complaints.
Yet that’s not quite true. Something else does emerge in American Tax Resisters, particularly in the chapters covering more recent decades: people's increasingly frustrated and angry sense of the government encroaching on their lives.
By no means does the right wing have a monopoly on the sentiment. But every activist or group Huret writes about is politically conservative, as was also the case in Isaac William Martin's book Rich People’s Movements: Grassroots Campaigns to Untax the One Percent, published last year by Oxford University Press and discussed in this column.
Neither author mentions Edmund Wilson’s book The Cold War and the Income Tax: A Protest (1962), which criticizes “the Infernal Revenue Service,” as some resisters call it, in terms more intelligent and less hysterical than, say, this piece of anti-government rhetoric from 1968 that Huret quotes: “The federal bureaucracy has among its principle objectives the destruction of the family, the elimination of the middle class, and the creation of a vast mass of people who can be completely controlled.”
Wilson wrote his book after a prolonged conflict with the IRS, which had eventually noticed the author’s failure to file any returns between 1946 and 1955. Wilson explained that as a literary critic he didn’t make much money and figured he was under the threshold of taxable income. Plus which, his lawyer had died. The agents handling his case were unsympathetic, and Wilson’s encounter with the bureaucracy turned into a Kafkaesque farce that eventually drove him from excuses to rationalization: his growing hostility led Wilson to decide that failure to pay taxes was almost an ethical obligation, given that the military-industrial complex was out of control. He vowed never again to earn enough to owe another cent in income tax, though he and the IRS continued to fight it out until his death 10 years later.
I don’t offer this as an example of tax resistance at its most lucid and well-argued. On the contrary, there’s a reason it’s one of Wilson’s few books that fell out of print and stayed there.
But it is a lesson in how the confluence of personal financial strains and the cold indifference of a bureaucratic juggernaut can animate fiery statements about political principle. It’s something to consider, along with the implications of Plato's definition of man as a featherless biped.
The men who established the republic were no plaster saints of Red State moral uplift. Only one of the half-dozen figures Thomas A. Foster writes about in Sex and the Founding Fathers: The American Quest for a Relatable Past (Temple University Press) would escape denunciation by the Traditional Values Coalition if the Founders were around today.
Accusations of adultery or of fathering children out of wedlock (or both) were made against George Washington, Thomas Jefferson, Benjamin Franklin, and Alexander Hamilton; the last two admitted the truth of the charges. Gouverneur Morris managed to draft the Constitution between rounds of frequent, strenuous fornication -- exercise he pursued despite having a severely mangled right arm and amputated left leg.
Only the tightly wound John Adams seems to have escaped any hint of scandal. By all evidence, he and Abigail were strictly monogamous and not averse to finger-wagging at the other Founders' morals -- especially Franklin's, which were particularly relaxed. Besides writing a notorious essay on selecting a mistress, Franklin lived with a common-law wife; later, he conducted a good deal of his work as ambassador to France either in bed with well-born Parisian ladies or trying to get them there.
He was also broad-minded in ways that would be fodder for cable TV news today. He seems to have been on friendly terms with one Chevalier d'Eon, a French diplomat who preferred to dress in women's clothing. Poor Richard's ventriloquist was, as it's put nowadays, straight but not narrow.
Tabloid history? No, though much innuendo about the Founders did appear in frankly sensationalist publications of the day. (Negative campaigning goes way back.) Foster, an associate professor of history at DePaul University, is innocent of any muckraking intent. Everything in Sex and the Founding Fathers is a well-established part of the historical record, and in the case of Jefferson's relationship with his slave Sally Hemings, you'd have to have spent the last 20 years on a desert island not to have heard about it by now.
The author isn't interested in revealing the character or psychology of the early American statesmen. Rather, the book is a metahistory (not that Foster uses such jargon) of how their sex lives and their public roles were understood across the past 200 years or so. The biography of a major political figure is itself a political act. Historians and others writing about the Founders have dealt with their peccadilloes in different ways over time, the shifts in emphasis and judgment reflecting changes in the national political culture.
George Washington, for example, seems the most austerely virtuous of the country's early leaders, thanks especially to the moralizing fables of Parson Weems. Recent biographies suggest that he had a number of romantic relationships, consummated and otherwise, before marrying Martha. Writers of historical fiction depict the six-foot-three, athletically built military man as exerting powerful animal magnetism upon the colonial womenfolk. (Like Fabio, but with wooden teeth.) In real life, Washington addressed passionate letters to a married woman. If no further improprieties occurred, it was not for want of trying.
Foster notes the tendency to assume that earlier images of the first president were "disembodied" idealizations which have "only recently been humanized." But the record is more nuanced: "Even the earliest images emphasize both his domestic life and his military and government successes," Foster writes, with some 19th-century biographies and paintings "establish[ing] Washington as the romantic man" as well as "head of a prosperous household." But on that last point, one fact was somewhat problematic: Martha, who was a widow when they met, had a number of children by her first husband but never conceived with George.
"No early account hides the fact that he had no children of his own," Foster notes. "But 19th-century writers do not dwell on this aspect of his life, leaving some readers to their own devices to determine this aspect of his private family life." Biographers in the Victorian era "could not anticipate that readers would ever expect an answer to the very personal question of why he had no children."
Refusing to acknowledge the question did not make it go away, however. The lack of progeny was a seeming defect in Washington's status as embodiment of masculine ideals. One answer to the problem was sentimental: The couple could be depicted as blissfully compatible yet saddened by their plight, even without any evidence of it. ("Americans," Foster remarks, "have never hesitated to speak definitively about the loves and inner lives of the Founders, despite a lack of documentation.") Unfortunate as the situation was, Washington finally transcended it by becoming "father of his country." Another solution was to deny that Washington's virility was compromised at all, by claiming that he had an illegitimate son by the widow of one of his tenant farmers. See also the rumor that Washington died from a cold he caught "from leaping out a window, pants-less, after a romantic encounter with an 'overseer's wife.'"
No other figure in Sex and the Founding Fathers occupies so markedly paternal a role in public life, but in each case Foster brings out the complex and tightly knit relationship between sexual and political life. Even with Benjamin Franklin -- whose flirtatiousness is well-known, as is his earthy advice about the benefits of dating older women -- the author finds aspects of the record that add some nuance to the familiar portrait. I never appreciated just how disturbing a figure he was to his countrymen in the 19th century, when a senator struck his name from a list of candidates for a proposed national hall of fame on these grounds:
"Dr. Franklin's conduct of life was that of a man on a low plane. He was without idealism, without lofty principle, and one side of his character gross and immoral.... [His letter] on the question of keeping a mistress, which, making allowances for the manner of the time, and all allowance for the fact that he might have been in jest, is an abominable and wicked letter; and all his relation to women, and to the family life, were of that character."
Abominable? Well, he wasn't a hypocrite, and that's always a risky thing not to be. Consider also Alexander Hamilton. When accused of financial improprieties involving public funds, he denied it but admitted to having had a fling with a married woman whose husband then tried to blackmail him. "He chose to discuss the affair, in print, publicly, and in the greatest of documented detail to save his public honor," writes Foster. "He was not divorced. His wife did not denounce him. [George] Washington publicly supported him, as did others."
For a long time, biographers treated the matter evasively. They airbrushed the details out of his portrait as much as possible. Nowadays, Foster says, we get "warts-and-all hagiography -- ones that present failings only to dismiss them or have them overshadowed by an overarching theme of national greatness." Either way, he argues, the statesmen of the early republic stand apart from more recent politicians embroiled in sex scandals in one important way. Our contemporary lotharios can skulk off the public stage after a while, while the Founders never can. Their dirty linen hangs out for everyone to see, forever.
As Scott Jaschik points out in his January 13, 2014 article, “The Third Rail,” the terrible stress our newly minted Ph.D.s in English, comp lit, and foreign languages confront when they begin the job search seems only to be escalating rather than abating. Understandably, then, many Modern Language Association convention sessions, as well as a growing body of publications, have been taking up a variety of proposals for addressing the job crisis. Jaschik mentions the session I chaired, “Who Benefits? Competing Agendas and Graduate Education,” and he carefully articulates the basic positions of the panelists, as we were all in general agreement that shrinking the size of graduate programs in English would not be the best way to remedy the situation. But our reasons for favoring expansion rather than contraction seem to have slipped out of view. I would like to highlight them here.
Let me begin by stating the obvious nature of the suffering: When you defund public higher education, someone is going to have to pay, and it has been our colleagues forced to accept unethically precarious working conditions both during and after grad school, and students at all levels burdened with massively increasing educational debt. These are circumstances we must protest with all the solidarity we can muster. But all this misery, the sense of lives ruined, institutionalized failure, personal anguish — these horrors come not just from oversized grad programs, but from a much larger capitalist economy that is wreaking havoc on many workers and unemployed poor in and out of academia. As Marc Bousquet has explained, it is not a market; or, at least, it is not a “free market” in any real sense despite our common rhetorical reference to the horrors of the “job market.” It is a system we are caught in, and one orchestrated, it’s true, by our own institutional structures that have now been fine-tuned to serve the champions of privatization, defunding, and austerity. In this type of economic system, higher education has become a kind of laboratory for the production of a precarious, contingent, low-wage faculty. The economic inequality within the profession mirrors the economic inequality in the society. From any ethical perspective, it is a system that has gone terribly wrong.
What has been most missing from the discussion about graduate school size has been a concise understanding of why the market logic doesn’t work for English grad programs, and the main reason is that it is not an accurate description of how the system really works. If it were a case of supply and demand, it might make good ethical sense to reduce the overproduction of Ph.D.s to meet the lower demand for tenured professors. In short, if you could reduce the supply without altering demand, this equalizing would clearly make it easier for graduates to get tenured jobs for the simple reason that there would then be fewer Ph.D.s competing for the same number of jobs. But the system does not work that way. Rather, when you reduce supply by shrinking graduate programs, you also end up reducing demand (as I will explain in what follows): our system is so structured that we cannot reduce the one without reducing the other, and that’s a real ethical and political conundrum.
When you shrink graduate student enrollments (the supply side), you inevitably also shrink the size of graduate programs, which means, willy-nilly, that you decrease tenured faculty lines (the demand side) because they are the folks teaching in grad programs. Administrators would be happy to shrink our programs and eliminate some tenured lines through attrition and retirement because new, cheaper temp hires can easily fill in to teach the few undergraduate lower-division classes that some tenured faculty teach.
The gurus of supply and demand would like nothing better than for us graduate faculty to do our own regulating by cutting down of our own accord on producing so many new highly educated people schooled in the legacies of critique and dissent. We then serve the wishes of those seeking more power to hire and fire at will the most vulnerable among us who have no protections under a gutted system of tenure and diminished academic freedom. The system can play itself out under the contraction model, then, as a vicious cycle of reducing supply, which reduces demand for tenured faculty (while increasing the non-tenure-track share of the faculty), which calls for further reducing of supply. To believe that contracting the size of graduate programs can, in and of itself, improve the situation is a misattribution of cause and effect: The real cause of the job misery is the agenda for privatization and defunding public expenditures orchestrated by the global economic system that has been producing misery and suffering for millions of lives around the world as socioeconomic inequalities continue to magnify.
Now, having said all that, I also want to be very clear that there are strategic, local situations where reducing graduate student populations in order to expand funding and support for them, or in order to revise a program (hopefully without shrinking tenured faculty lines), can certainly be the most ethical thing to do. So I am speaking at a general level of overall tactics for the profession, and at that level, shrinking (without other forms of compensation) inevitably leads to weakening graduate education, not strengthening it through some mythical model of “right-sizing” to be achieved by a proposed matching of supply and demand.
But, of course, the pain is real, and it reaches fever pitch in the transitional moments of crisis when graduate students face the “market” for jobs. The wretched system we endure makes it impossible not to sympathize with graduate students who understandably often argue that we must reduce the supply of Ph.D.s to give them a better chance to get a job. Under these enormous tensions, the short-term, crisis-management model of supply and demand can especially seem like the only fair-minded option.
In those moments of anguish, which I myself witness every time one of my own students reaches this transition stage, our only ethical task is to support them and listen to them as best we can to help them navigate the transition. So I want to make sure that my remarks here are not intended to provide any specific advice other than the obvious need for support. Specific situations and contextual demands will have to be navigated with all the pragmatic skills and rhetorical resourcefulness possible. In contrast, then, to a focus on the crisis moment of the job search, I have framed my comments here in terms of a big picture narrative.
From the longer and larger perspective, what becomes most clear is that our system of having elite graduate faculty surrounded by masses of non-tenure-track teachers mostly fulfilling service functions of teaching lower-level humanities distribution courses and writing courses fuels that cycle of devolution. We need, then, to change the academic system over which we do have some control. Systemic changes can be difficult even to imagine, but they are by no means impossible as long as we understand that they will not happen in an overnight revolution. And the first step inevitably leads us to examine more critically the ethical and political work of both curricular revision and resource allocation. In short, it leads us to a careful analysis of the systemic class structure within the profession, bolstered as it is by procedures and policies, many of which we actually have some degree of professional autonomy to alter.
Of course, the resistance to institutional transformation remains overwhelming at times, and the struggle to mitigate our academic hierarchies and internal class stratifications is a long-term project, well beyond the scope of these comments. To even imagine such changes in our local institutional circumstances, we will have to make many arguments convincing our colleagues that a more collective and collaborative approach to teaching assignments will be beneficial for us all in the long run. And I have at least some evidence that something like what I have been suggesting can actually happen. Where I teach in the Pennsylvania State System of Higher Education (PASSHE), our collective bargaining agreement affecting all 14 universities with a total enrollment of over 100,000 students has created an anomaly in U.S. higher education: more than 75 percent of all faculty on all campuses are tenure-track lines (the inverse of the national percentage average), and all faculty teach all levels of courses.
Much work remains to be done, and we too continuously struggle against state underfunding and the pressure to hire more temporary faculty. But the potential benefits of these efforts, I believe, would make our profession less stratified and more responsive to public needs for high quality education at all levels, so that, ultimately, the humanities will become a more vital part of the social fabric of everyday life for more citizens. That is a goal we should never abandon.
David B. Downing is director of graduate studies in literature and criticism at Indiana University of Pennsylvania. He is the editor of Works and Days, and his most recent book (co-edited with Edward J. Carvalho) is Academic Freedom in the Post-9/11 Era.
Whether or not the humanities are truly in crisis, the current debates around them have a certain gun-to-the-head quality. “This is why you -- student, parent, Republican senator -- shouldn’t pull the trigger,” their promoters plead. “We deserve to live; we’re good productive citizens; we, too, contribute to the economy, national security, democracy, etc.” Most of these reasons are perfectly accurate. But it is nonetheless surprising that, in the face of what is depicted as an existential crisis, most believers shy away from existential claims (with some exceptions). And by not defending the humanities on their own turf, we risk alienating the very people on whose support the long-term survival of our disciplines depends: students.
One reason why our defenses can have a desperate ring to them is that we’re not used to justifying ourselves. Most humanists hold the value of the objects they study to be self-evident. The student who falls in love with Kant, Flaubert, or ancient Egypt does not need to provide an explanation for why she would like to devote years of her life to such studies. To paraphrase Max Weber, scholarship in the humanities is a vocation, a “calling” in the clerical sense. It chooses you, you don’t choose it. The problem with this kind of spiritual passion is that it is difficult to describe. To paraphrase another 20th-century giant, Jimi Hendrix, it’s more about the experience.
It’s not surprising, then, that when we humanists feel (or imagine) the budget axe tickling the hairs on the backs of our necks, we don’t have ready-made apologia with which to woo or wow our would-be executioners. And because a calling is hard to explain, we turn instead to more straightforward, utilitarian defenses -- “but employers say they like English majors!” -- which, while true, don’t capture the authentic spirit that moves the humanities student.
There is of course sound logic to this approach. Government and state funding is a zero-sum game, and politicians are more likely to be receptive to practical arguments than to existential propositions. But in the long run, it takes more than state and university budgets to maintain the health of the humanities. It also takes students. And by constantly putting our most productive foot forward, we may unintentionally end up selling ourselves short (disclosure: I, too, have sinned). The fundamental reason why students should devote hours of their weeks to novels, philosophy, art, music, or history is not so that they can hone their communication skills or refine their critical thinking. It is because the humanities offer students a profound sense of existential purpose.
The real challenge that we face today, then, lies in explaining to a perplexed, but not necessarily hostile, audience -- and perhaps even to ourselves -- why it is that the study of literature, anthropology, art history, or classics can be so meaningful, and why this existential rationale is just as important as other, more utilitarian ones. This line of argument stands in opposition to proclamations of the humanities’ uselessness: to declare that the humanities are of existential value is to affirm that they are very useful indeed.
So how might we go about defining this existential value? A good place to start would be with existentialism itself. A premise of existentialist philosophy is that we live in a world without inherent meaning. For atheists, this is often understood as the human condition following the death of God. But as Jean-Paul Sartre pointed out in “Existentialism is a Humanism,” even believers must recognize that they ultimately are the ones responsible for the production of meaning (in fact, many early existentialists were Christians). Abraham had to decide for himself whether the angel who commanded him to halt his sacrifice was genuinely a divine messenger. In Sartre's memorable formulation, man is “condemned to be free”; we have no choice but to choose. While it may feel as though a humanities vocation is a calling, you still have to decide to answer the call.
The realization that meaning isn’t something we receive from the outside, from others, but that it always must come from within us, from our conscious, deliberative choices, does not make us crave it any less. We are, existentialists insist, creatures of purpose, a thesis that psychological research has also confirmed.
Now what does this have to do with the humanities? It’s not that obvious, after all, how reading Madame Bovary, the Critique of Pure Reason, or The Book of the Dead can fill your life with purpose. At the same time, we also know that some people do find it deeply meaningful to peruse these works, and even to dedicate their careers to studying them.
What is it, then, that lovers of literature -- to consider but them for the moment -- find so existentially rewarding about reading? In a recent book, my colleague Joshua Landy argues that one of the more satisfying features of literature is that it creates the illusion of a meaningful world. “The poem forms a magic circle from within which all contingency is banished,” he writes apropos of Mallarmé’s celebrated sonnet en -yx. The order we discover in literary works may be magical, but it isn’t metaphysical; it comes from the sense that “everything is exactly what and where it has to be.” Art offers a reprieve from a universe governed by chance; what were merely sordid newspaper clippings can become, when transported into artful narratives, The Red and the Black or Madame Bovary. Landy suggests that fictions produce these illusions through a process of “overdetermination”: the ending of Anna Karenina, for instance, is foreshadowed by its beginning, when Anna witnesses a woman throwing herself under a train.
If art offered only illusions of necessity, it would hardly satisfy existential longing. Pretending that everything happens for a reason is precisely what the existentialists castigated as “bad faith.” Yet there’s an obvious difference between enjoying a novel and, say, believing in Providence. We don’t inhabit fictional worlds, we only pay them visits. No lover of literature actually believes her life is as determined as that of a literary heroine (even Emma Bovary wasn’t psychotic). So why does the semblance of an orderly universe enchant us so?
Well-ordered, fictional worlds attract us, it seems, because we, too, aspire to live lives from which contingency is kept at bay. Beauty, wrote Stendhal, is “only a promise of happiness.” As Alexander Nehamas suggested, in his book of this title, the beautiful work of art provides us with a tantalizing pleasure; beauty engages us in its pursuit. But what do we pursue? “To find something beautiful is inseparable from the need to understand what makes it so,” he writes. Behind the beautiful object -- sonnet, style, or sculpture -- we reach for the idea of order itself. The promise of happiness made by art is a promise of purpose.
But a promise of purpose is still a bird in the bush: it can disappear when you put down the book, or leave the concert hall. For the philosopher Immanuel Kant, art only provides us with an empty sense of purpose; or as he put it, in his distinctively Kantian way, "purposiveness without purpose" (it’s even better in German).
It’s true that few existential crises have been resolved by a trip to the museum or the download of a new album. But Kant may have underestimated how the sense of artistic purpose can also seep into our own lives. For instance, as Plato and every teenager know well, instrumental music can give voice to inexpressible feelings without the help of language. These emotional frameworks can convey a potent sense of purpose. When my youngest daughter spent six weeks in the neonatal ICU with a life-threatening condition, my mind kept replaying the second movement of Beethoven’s seventh symphony to tame my fears. Its somber, resolute progress, punctuated by brief moments of respite, helped to keep my vacillating emotions under control. As in films, sometimes it is the soundtrack that gives meaning to our actions.
The promise of order found in beautiful works of art, then, can inspire us to find purpose in our own lives. The illusion of a world where everything is in its place helps us view reality in a different light. This process is particularly clear -- indeed, almost trivial -- in those humanistic disciplines that do not deal primarily with aesthetic objects, such as philosophy. We aren't attracted to the worldviews of Plato, Kant, or Sartre purely for the elegance of their formal structure. If we’re swayed by their philosophies, it’s because they allow us to discover hitherto unnoticed patterns in our lives. Sometimes, when you read philosophy, it seems as though the whole world has snapped into place. This is not an experience reserved for professional philosophers, either: at the conclusion of a philosophy course that my colleagues Debra Satz and Rob Reich offer to recovering female addicts, one student declared, “I feel like a butterfly drawn from a cocoon.”
So where art initially appeals to us through intimations of otherworldly beauty, a more prolonged engagement with the humanities can produce a sense of order in the here and now. One could even say that Plato got things the wrong way around: first we’re attracted by an ideal universe, and then we’re led to discover that our own reality is not as absurd as it once seemed. And while particularly evident with philosophy, this sensation of finally making sense of the world, and of your own place in it, can come from many quarters of the humanities. In a delightful interview (originally conducted in French), Justice Stephen Breyer recently exclaimed, “It’s all there in Proust — all mankind!” Other readers have had similar responses to Dante, Shakespeare, Tolstoy, and many more.
But exploring the humanities is not like a trip to the mall: you don't set off to find an off-the-rack outfit to wear. Proust can change your life, but if you only saw the world through his novel, it would be a rather impoverished life. Worse, it would be inauthentic: no author, no matter how great, can tell you what the meaning of your life is. That is something we must cobble together for ourselves, from the bits and pieces of literature, philosophy, religion, history, and art that particularly resonate in us. “These fragments I have shored against my ruins,” T.S. Eliot wrote at the end of The Waste Land. No poem offers a better illustration of this cultural bricolage: Shakespeare answers Dante, and the Upanishads disclose what the Book of Revelation had suppressed.
So here we find an existential rationale for a liberal education. To be sure, the humanities do not figure alone in this endeavor: psychology, biology, and physics can contribute to our perception of ourselves in relation to the world, as can economics, sociology, and political science. But the more a discipline tends toward scientific precision, the more it privileges a small number of accepted, canonical explanations of those aspects of reality it aims to describe. If 20 biology professors lectured on Darwin’s theory of evolution, chances are they’d have a lot in common. But if 20 French professors lectured on Proust’s Recherche, chances are they’d be quite different. The same could be said, perhaps to a lesser extent, for 20 lectures on Plato’s Republic. The kinds of objects that the humanities focus on are generally irreducible to a single explanation. This is why they provide such good fodder for hungry minds: there are so many ways a poem, a painting, or a philosophy book can stick with you.
In his diatribe against the way the humanities have been taught since the '60s, Allan Bloom harrumphed, “On the portal of the humanities is written in many ways and many tongues, ‘There is no truth -- at least here.’ ” But the point of a liberal education is not to read great works in order to discover The Truth. Its point is to give students the chance to fashion purposeful lives for themselves. This is why authors such as Freud, whose truth-value is doubted by many, can still be a source of meaning for others. Conversely, this is also why humanities professors, many of whom are rightfully concerned about the truth-value of certain questions or interpretations, do not always teach the kinds of classes where students can serendipitously discover existential purpose.
There are more than existential reasons to study the humanities. Some are intellectual: history, for instance, responds to our profound curiosity about the past. Some are practical. To celebrate one is not to deny the others. The biggest difficulty with defending the humanities is the embarrassment of riches: because humanists are like foxes and learn many different things, it is hard to explain them to the hedgehogs of the world, who want to know what One Big Thing we do well. The danger is that, in compressing our message so it gets heard, we leave out precisely the part that naturally appeals to our future students. Yes, students and parents are worried about employment prospects. But what parent doesn’t also want their child to lead a meaningful life? We are betraying our students if, as a society, we do not tell them that purpose is what ultimately makes a life well-lived.
Dan Edelstein is a professor of French and (by courtesy) history at Stanford University. He directs the Stanford Summer Humanities Institute.
The nine muses are a motley bunch. We’ve boiled them down into a generic symbol for inspiration: a toga-clad young woman, possibly plucking a string instrument. But in mythology they oversaw an odd combination of arts and sciences. They were sisters, which allegorically implies a kinship among their fields of expertise. If so, the connections are hard to see.
Six of them divvied up the classical literary, dramatic, and musical genres -- working multimedia in the cases of Erato (inventor of the lyre and of love poetry) and Euterpe (who played the flute and inspired elegiac songs and poems). The other three muses handled choreography, astronomy, and history. That leaves an awful lot of creative and intellectual endeavor completely unsupervised. Then again, it’s possible that Calliope has become a sort of roaming interdisciplinary adjunct muse, since there are so few epic poets around for her to inspire these days.
An updated pantheon is certainly implied by Peter Charles Hoffer’s Clio Among the Muses: Essays on History and the Humanities (New York University Press). Clio, the demi-goddess in charge of history, is traditionally depicted with a scroll or a book. But as portrayed by Hoffer -- a professor of history at the University of Georgia – she is in regular communication with her peers in philosophy, law, the social sciences, and policy studies. I picture her juggling tablet, laptop and cellphone, in the contemporary manner.
Ten years ago Hoffer published Past Imperfect, a volume assessing professional misconduct by American historians. The book was all too timely, appearing as it did in the wake of some highly publicized cases of plagiarism and fraud. But Hoffer went beyond exposé and denunciation. He discussed the biases and sometimes shady practices of several well-respected American historians over the previous 200 years. By putting the recent cases of malfeasance into a broader context, Hoffer was not excusing them; on the contrary, he was clearly frustrated with colleagues who minimized the importance of dealing with the case of someone like Michael Bellesiles, a historian who fabricated evidence. But he also recognized that history itself, as a discipline, had a history. Even work that seemed perfectly sound might be shot through with problems only visible with the passing of time.
While by no means a sequel, Clio Among the Muses continues the earlier book’s effort to explain that revisionism is not a challenge to historical knowledge, but rather intrinsic to the whole effort to establish that knowledge in the first place. “If historians are fallible,” Hoffer writes, “there is no dogma in history itself, no hidden agenda, no sacred forms – not any that really matter – that are proof against revision… Worthwhile historical scholarship is based on a gentle gradualism, a piling up of factual knowledge, a sifting and reframing of analytical models, an ongoing collective enterprise that unites generation after generation of scholars to their readers and listeners.”
Hoffer’s strategy is to improve the public’s appreciation of history by introducing it to the elements of historiography. (That being the all-too-technical term for the history of what historians do, in all its methodological knottiness.) One way to do so would be through a comprehensive narrative, such as Harry Elmer Barnes offered in A History of Historical Writing (1937), a work of terrific erudition and no little tedium. Fortunately Hoffer took a different route.
Clio Among the Muses instead sketches the back-and-forth exchanges between history and other institutions and fields of study: religion, philosophy, law, literature, and public policy, among others. Historians explore the topics, and use the tools, created in these other domains. At the same time, historical research can exert pressure on, say, how a religious scripture is interpreted or a law is applied.
Clio’s dealings with her sisters are not always happy. One clear example is a passage Hoffer quotes from Charles Beard, addressing his colleagues at a meeting of the American Historical Association in 1933: “The philosopher, possessing little or no acquaintance with history, sometimes pretends to expound the inner secret of history, but the historian turns upon him and expounds the secret of the philosopher, as far as it may be expounded at all, by placing him in relation to the movement of ideas and interests in which he stands or floats, by giving to his scheme of thought its appropriate relativity.”
Sibling rivalry? The relationships are complicated, anyway, and Hoffer has his hands full trying to portray them. The essays are learned but fairly genial, and somehow not bogged down by the fundamental impossibility of what the author is trying to do. He covers the relationship between history and the social sciences -- all of them -- in just under two dozen pages. As with Evel Knievel jumping a canyon, you have to respect the fact that, knowing the odds, he just went ahead with it.
But then, one of Hoffer’s remarks suggests that keeping one’s nerve is what his profession ultimately requires:
“Historical writing is not an exercise in logical argument so much as an exercise in creative imagination. Historians try to do the impossible: retrieve an ever-receding and thus never reachable past. Given that the task is impossible, one cannot be surprised that historians must occasionally use fallacy – hasty generalization, weak analogy, counterfactual hypotheticals, incomplete comparisons, and even jumping around in past time and space to glimpse the otherwise invisible yesteryear.”
And if they did not do so, we’d see very little of it at all.
Supporters of the American Studies Association’s call for a boycott of Israeli universities are distorting what the boycott is -- and how it will affect academe. The "institutional boycott" is likely to function as a political test in a hidden form. It violates principles of academic freedom. And in practice, it has been, and is likely to continue to be, a campaign for the exclusion of individual scholars who work in Israel from the global academic community. It’s time to look with more care at the boycott and what it’s really about.
What the ASA Resolution Says
The ASA resolution reaffirms, in a general and abstract way, its support for the principle of academic freedom. It then says that it will “honor the call of Palestinian civil society for a boycott of Israeli academic institutions.” It goes on to offer guarantees that it will support the academic freedom of scholars who speak about Israel and who support the boycott; the implication here is that this refers to scholars who are opponents of Israel or of Israeli policy. The resolution does not specifically mention the academic freedom of individual Israeli scholars or students, nor does it mention protection for people to speak out against the boycott, nor does it say anything about the academic freedom of people to collaborate with Israeli colleagues.
What the ASA names "the call of Palestinian civil society for a boycott" is the Palestinian Campaign for the Academic and Cultural Boycott of Israel (PACBI) "Call for Academic and Cultural Boycott of Israel." The PACBI call explicitly says that the "vast majority of Israeli intellectuals and academics," that is to say individuals, have contributed to, or have been "complicit in through their silence," the Israeli human rights abuses which are the reasons given for boycott. There would be no sense in making this claim if no sanctions against individuals were envisaged. The PACBI guidelines state that "virtually all" Israeli academic institutions are guilty in the same way.
These claims, about the collective guilt of Israeli academics and institutions, are strongly contested empirically. Opponents of the boycott argue that Israeli academe is pluralistic and diverse and contains many individuals who explicitly oppose anti-Arab racism, Islamophobia, and the military and civilian occupations of the West Bank. These claims about the guilt of Israeli academe are also contested by those who hold that the principle of collective guilt is a violation of the norms of the global academic community and of natural justice. Opponents of the boycott argue that academics and institutions should be judged by the content of their work and by the nature of their academic norms and practices, not by the state in which they are employed.
The PACBI guidelines go on to specify what is meant by the "institutional" boycott. "[T]hese institutions, all their activities, and all the events they sponsor or support must be boycotted." And "[e]vents and projects involving individuals explicitly representing these complicit institutions should be boycotted." The guidelines then offer an exemption for some other classes of individual as follows: "Mere institutional affiliation to the Israeli academy is therefore not a sufficient condition for applying the boycott."
A Political Test by Another Name
Refusing to collaborate with academics on the basis of their nationality is, prima facie, a violation of the norms of academic freedom and of the principle of the universality of science. It seems to punish scholars not for something related to their work, nor for something that they have done wrong, but because of who they are.
In 2002 Mona Baker, an academic in Britain, fired two Israelis from the editorial boards of academic journals that she owned and edited. Gideon Toury and Miriam Shlesinger are both well-respected internationally as scholars and also as public opponents of Israeli human rights abuses, but nevertheless they were "boycotted." The boycott campaign sought a more sophisticated formulation which did not appear to target individuals just for being Israeli.
In 2003, the formulation of the "institutional boycott" was put into action with a resolution to the Association of University Teachers (AUT), an academic trade union in Britain, that members should "sever any academic links they may have with official Israeli institutions, including universities." Yet in the same year, Andrew Wilkie, an Oxford academic, rejected an Israeli who applied to do a Ph.D. with him, giving as a reason that the applicant had served in the Israeli armed forces. The boycott campaign in the UK supported Andrew Wilkie against criticism which focused on his boycott of an individual who had no affiliation of any kind to an Israeli academic institution. If the principle were accepted that anybody who had been in the Israeli armed forces was to be boycotted, then virtually every Israeli Jew would be thus targeted.
In 2006 the boycott campaign took a new tack, offering an exemption from the boycott to Israelis who could demonstrate their political cleanliness. The other British academic union, NATFHE, called for a boycott of Israeli scholars who failed to "publicly dissociate themselves" from "Israel’s apartheid policies." The political test opened the campaign up to a charge of McCarthyism: the implementation of a boycott on this basis would require some kind of machinery to be set up to judge who was allowed an exemption and who was not. The assertion that Israel is "apartheid" is emotionally charged and strongly contested. While it is possible for such analogies to be employed carefully and legitimately, it is also possible for such analogies to function as statements of loyalty to the Palestinians. They sometimes function as shortcuts to the boycott conclusion, and as ways of demonizing Israel, Israelis, and those who are accused of speaking on their behalf. In practice, the boycott campaign attempts to construct supporters of the boycott as friends of Palestine and opponents of the boycott as enemies of Palestine.
It is reasonable to assume that under the influence of the campaign for an "institutional boycott," much boycotting of individuals goes on silently and privately. It is also reasonable to assume that Israeli scholars may come to fear submitting papers to journals or conferences if they think they may be boycotted, explicitly or not; this would lead to a "self-boycott" effect. There are anecdotal examples of the kinds of things which are likely to happen under the surface even of an institutional boycott. An Israeli colleague contacted a British academic in 2008, saying that he was in town and would like to meet for a coffee to discuss common research interests. The Israeli was told that the British colleague would be happy to meet, but he would first have to disavow Israeli apartheid.
The PACBI call, endorsed by ASA, says that Israeli institutions are guilty, Israeli intellectuals are guilty, and Israeli academics who explicitly represent their institutions should be boycotted, but that an affiliation in itself is not grounds for boycott. The danger is that Israelis will be asked not to disavow Israel politically, but to disavow their university "institutionally," as a precondition for recognition as legitimate members of the academic community. Israelis may be told that they are welcome to submit an article to a journal or to attend a seminar or a conference as individuals: e.g., David Hirsh is acceptable; David Hirsh, Tel Aviv University, is not. Some Israelis will, as a matter of principle, refuse to appear only as individuals; others may be required by the institution which pays their salary, or by the institution which funds their research, not to disavow.
An "Institutional Boycott" Still Violates Principles of Academic Freedom
Academic institutions themselves, in Israel as anywhere else, are fundamentally communities of scholars; they protect scholars, they make it possible for scholars to research and to teach, and they defend the academic freedom of scholars. The premise of the "institutional boycott" is that in Israel, universities are bad but scholars are (possibly, exceptionally) good, that universities are organs of the state while individual scholars are employees who may be (possibly, exceptionally) not guilty of supporting Israeli "apartheid" or some similar formulation.
There are two fundamental elements that are contested by opponents of the boycott in the "institutional boycott" rhetoric. First, it is argued, academic institutions are a necessary part of the structure of academic freedom. If there were no universities, scholars would band together and invent them, in order to create a framework within which they could function as professional researchers and teachers, and within which they could collectively defend their academic freedom.
Second, opponents of the boycott argue that Israeli academic institutions are not materially different from academic institutions in other free countries: they are not segregated by race, religion, or gender; they have relative autonomy from the state; and they defend academic freedom and freedom of criticism, not least against government and political pressure. There are of course threats to academic freedom in Israel, as there are in the U.S. and elsewhere, but the record of Israeli institutions is a good one in defending their scholars from political interference. Neve Gordon, for example, still has tenure at Ben Gurion University, in spite of calling for a boycott of his own institution; Ilan Pappe left Haifa voluntarily, after having been protected by his institution even while traveling the world denouncing his institution and Israel in general as genocidal, Nazi, and worthy of boycott.
Jon Pike argued that the very business of academia does not open itself up to a clear distinction between individuals and institutions. For example, the boycott campaign has proposed that while Israelis may submit papers as individuals, they would be boycotted if they submitted them from their institutions. He points out that "papers that ‘issue from Israeli institutions' or are 'submitted from Israeli institutions' are worried over, written by, formatted by, referenced by, checked by, posted off by individual Israeli academics. Scientists, theorists, and researchers do their thinking, write it up and send it off to journals. It seems to me that Israeli academics can’t plausibly be so different from the rest of us that they have discovered some wonderful way of writing papers without the intervention of a human, individual, writer."
Boycotting academic institutions means refusing to collaborate with Israeli academics, at least under some circumstances if not others; and then we are likely to see the reintroduction of some form of "disavowal" test.
The Boycott Is an Exclusion of Jewish Scholars Who Work in Israel
In 2011 the University of Johannesburg decided, under pressure from the boycott campaign, to cut the institutional links it had with Ben Gurion University for the study of irrigation techniques in arid agriculture. Logically the cutting of links should have meant the end of the research with the Israeli scholars being boycotted as explicit representatives of their university. What in fact happened was that the boycotters had their public political victory and then the two universities quietly renegotiated their links under the radar, with the knowledge of the boycott campaign, and the research into agriculture continued. The boycott campaign portrayed this as an institutional boycott that didn’t harm scientific co-operation or Israeli individuals. The risks are that such pragmatism (and hypocrisy) will not always be the outcome and that the official position of "cutting links" will actually be implemented; in any case, the University of Johannesburg solution encourages a rhetoric of stigmatization against Israeli academics, even if it quietly neglects to act on it.
Another risk is that the targeting of Israelis by the "institutional boycott," or the targeting of the ones who are likely to refuse to disavow their institutional affiliations, is likely to impact Jews disproportionately. The risk here is that the institutional boycott has the potential to become, in its actual implementation, an exclusion of Jewish Israelis, although there will of course be exemptions for some "good Jews": anti-Zionist Jewish Israelis or Israeli Jewish supporters of the boycott campaign. The result would be a policy which harms Israeli Jews more than anybody else. Further, among scholars who insist on "breaking the institutional boycott" or on arguing against it in America, Jews are likely to be disproportionately represented. If there are consequences which follow these activities, which some boycotters will regard as scabbing, the consequences will impact most heavily on American Jewish academics. Under any accepted practice of equal-opportunities impact assessment, the policy of "institutional boycott" would cross the red lines which would normally constitute warnings of institutional racism.
The reality of the "institutional boycott" is that somebody will be in charge of judging who should be boycotted and who should be exempt. Even the official positions of ASA and PACBI are confusing and contradictory; they say there will be no boycott of individuals but they nevertheless make claims which offer justification for a boycott of individuals. But there is the added danger that some people implementing the boycott locally are likely not to have even the political sophistication of the official boycott campaign. There is a risk that there will still be boycotts of individuals (Mona Baker), political tests (NATFHE), breaking of scientific links (University of Johannesburg) and silent individual boycotts.
Even if nobody intends this, it is foreseeable that in practice the effects of a boycott may include exclusions, opprobrium, and stigma against Jewish Israeli academics who do not pass, or who refuse to submit to, one version or another of a test of their ideological purity; similar treatment may be visited upon those non-Israeli academics who insist on working with Israeli colleagues. There is a clear risk that an "institutional boycott," if actually implemented, would function as such a test.
PACBI is the "Palestinian Campaign for the Academic and Cultural Boycott of Israel." What it hopes to achieve is stated in its name. It hopes to institute an "academic boycott of Israel." The small print concerning the distinction between institutions and individuals is contradictory, unclear and small. It is likely that some people will continue to understand the term "academic boycott of Israel," in a common sense way, to mean a boycott of Israeli academics.
David Hirsh is lecturer in sociology at Goldsmiths, the University of London. He is founding editor of Engage, a network and website that opposes boycotts of Israel and anti-Semitism.