The folklore of Indonesia and Thailand tells of a frog who is born under half of a coconut-shell bowl and lives out his life there. In time, he draws the only sensible conclusion: the inside of the shell is the whole universe.
“The moral judgment in the image,” writes Benedict Anderson in Life Beyond Boundaries: A Memoir (Verso), “is that the frog is narrow-minded, provincial, stay-at-home, and self-satisfied for no good reason. For my part, I stayed nowhere long enough to settle down in one place, unlike the proverbial frog.”
Anderson, a professor emeritus of international studies, government and Asia studies at Cornell University, wrote major studies of the history and culture of Southeast Asia. A certain degree of cosmopolitanism went with the fieldwork. But the boundaries within a society can be patrolled just as insistently as its geographical borders -- and in the case of academic specialties, the guards inspecting passports tend to be quite unapologetically suspicious.
In that regard, Anderson was an even more remarkable citizen of the world, for his death late last year has been felt as a loss in several areas of the humanities as well as at least a couple of the social sciences. Nearly all of this reflects what someone writing in a scholarly journal once dubbed “Benedict Anderson’s pregnant phrase” -- i.e., the main title of his 1983 work Imagined Communities: Reflections on the Origin and Spread of Nationalism, which treated the mass production of books and periodicals in vernacular languages (what he called “print capitalism”) as a catalytic factor in creating a shared sense of identity and, with it, the desire for national sovereignty.
By the 1990s, people were pursuing tangents from Anderson’s argument with ever more tenuous connection to nationalism -- and still less to the specific emphasis on print capitalism. Any group formed and energized by some form of mass communication might be treated as an imagined community. Here one might do a search for “Benedict Anderson” and “World of Warcraft” to see why the author came to think of his best-known title as “a pair of words from which the vampires of banality have by now sucked almost all the blood.” Even so, Imagined Communities has shown remarkable longevity, and its landmark status is clearly international: it had been translated into more than 30 languages as of 2009, when it appeared in a Thai edition.
The reader of Life Beyond Boundaries soon understands why Anderson eventually developed mixed feelings about his “pregnant phrase” and its spawn. His sense of scholarship, and of life itself, was that it ought to be a mode of open-ended exploration, of using what you’ve learned to figure out what you could learn. Establishing a widely known line of thought must become frustrating once it is assumed to represent the only direction in which you can move. Professional interest is not the only kind of interest; what it recognizes as knowledge is no measure of the world outside the shell.
Anderson wrote the memoir by request: a Japanese colleague asked for it as a resource to show students something of the conduct of scholarship abroad and to challenge the “needlessly timid” ethos fostered by Japanese professors’ “patriarchal attitude.” Long retired -- and evidently reassured by the thought that few of his American colleagues would ever see the book -- Anderson was wry and spot-on in recounting the unfamiliar and not always agreeable experience of American academic life as he found it after emigrating to the United States from England as a graduate student in the late 1950s. For one thing, his professors looked askance at his papers, where he might indulge in a sardonic remark if so inspired, or pursue a digressive point in his footnotes.
“In a friendly way,” he writes, “my teachers warned me to stop writing like this …. It was really hard for me to accept this advice, as in previous schools I had always been told that, in writing, ‘dullness’ was the thing to be avoided at all cost.” He also underscores the paradox that the pragmatic American indifference to “grand theory” coexisted with an academic hunger for it, renewed on a seasonal basis:
“‘Theory,’ mirroring the style of late capitalism, has obsolescence built into it, in the manner of high-end commodities. In year X students had to read and more or less revere Theory Y while sharpening their teeth on passé Theory W. Not too many years later, they were told to sharpen their teeth on passé Theory Y, admire Theory Z, and forget about Theory W.”
Lest anyone assume this refers to the situation in the humanities, it’s worth clarifying that one example he gives is the “modernization theory” that once ruled the roost in the social sciences. And similar trend riding prevails in the choice of areas for research. The antidote, he found, came from leaving the academic coconut bowl to explore Indonesia, the Philippines and Thailand:
“I began to realize something fundamental about fieldwork: that it is useless to concentrate exclusively on one’s ‘research project.’ One has to be endlessly curious about everything, sharpen one’s eyes and ears, and take notes about anything …. The experience of strangeness makes all your senses much more sensitive than normal, and your attachment to comparison grows deeper. This is also why fieldwork is so useful when you return home. You will have developed habits of observation and comparison that encourage or force you to start noticing that your own culture is just as strange ….”
Unfortunately the author does not say how his intended Japanese public responded to Life Beyond Boundaries. A lot probably depends on how well the moments of humor and reverie translated. But in English they read wonderfully, and the book is a gem.
Prestige has its privileges. When a well-established award is announced -- as the 100th set of Pulitzer Prize winners was on Tuesday -- it tends to consume the available limelight. Anything less monumental disappears into its shadow.
But a couple of developments in the humanities this week strike me as being as newsworthy as the Pulitzers. If anything, they are possibly more consequential in the long run.
For one, we have the Whiting Foundation’s new Public Engagement Fellowship, which named its first recipients on Tuesday. The fellowship ought not to be confused with the Whiting Award, which since 1985 has been given annually to 10 authors “based on early accomplishment and the promise of great work to come.” The winners receive $50,000 each, along with, presumably, the professed esteem and subdued malice of their peers.
By contrast, the Public Engagement Fellowships go to professors who have shown “a demonstrated commitment to using their scholarly expertise to reach wider audiences,” in order to fund ambitious projects designed to have “direct and significant impact on a specific public outside the academy.” There are eight scholars in the fellowship’s inaugural cohort, including, for instance, Zoë Kontes, an associate professor of classics at Kenyon College, who will spend a semester creating a podcast to explore the black market in looted artifacts.
As with the literary prize, the fellowship comes with $50,000, with $10,000 earmarked for the project’s expenses and the rest covering the recipient’s stipend. Neither the number of fellows nor the apportionment of finances is set in stone, as I learned from Daniel Reid, the foundation’s executive director, when we met last week.
He explained that after more than 40 years of funding dissertations in the humanities at elite universities, the Whiting Foundation had decided it was time to direct its attention to a relatively underserved aspect of humanities scholarship: the cultivation of new ways of making connections with the world beyond the campus. Last year, the foundation contacted administrators at 40 universities, encouraging them to nominate faculty with projects that might be appropriate for funding.
“This has been a learning process on both sides,” Reid said, “for [the foundation] in running things and for the institutions in getting a sense of what we’re looking for.” He explained that the proposals were then evaluated by a group of seven people who had considerable experience with the communication of specialized knowledge to a wide public. The names are not public, though Reid indicates that a number of them are prominent figures in scholarship, publishing and museum or gallery curation. (The need for secrecy is understandable: publicizing the names would leave the Whiting judges as vulnerable as delegates to this summer’s political conventions are starting to feel.)
For the second group of Public Engagement Fellows, the Whiting Foundation will double the number of colleges and universities it contacts in search of nominations, with the long-term goal of making the process open to all higher education institutions. In the future, the number of recipients may range from six to 10. I cited Kontes’s podcast on the looting of antiquities as an example (not quite at random: consider me on the waiting list to subscribe) but hope the other projects stimulate interest, discussion and perhaps some healthy competition.
The other development from earlier in the week is Duke University Press’s announcement that it will be publishing an edition of the works of Stuart Hall, who can -- without exaggeration, if not without argument -- be called the founding father of cultural studies as an academic discipline, at least in Great Britain. The Wikipedia entry for Hall is surprisingly thorough, so anyone for whom the name does not signify might want to get up to speed there.
Hall is the case of a figure in the humanities whose impact is widely recognized yet difficult to assess for an American -- for the simple reason that, even at the peak of his influence, his work was remarkably difficult to find. A number of his major writings seem to have been published as mimeographed papers. He published books, but not that many found their way into American bookstores. So the prospect of having his scattered and fugitive writings in an edition from a major university press is appealing.
I heard that Ken Wissoker, the press's editorial director, might have some background information on why we are getting Hall’s work in this form only now, two years after his death. He confirmed my impression in an email note and gave a little background that seems worth putting into the record: “David Morley had edited two or three volumes of Stuart’s essays for Macmillan U.K. back in the late ’80s, but my understanding is that Stuart decided against having them come out (or delayed it into not happening). The original cultural studies essays were in a lot of different places …. Xeroxes and then PDFs circulated, but it would have been very difficult to track down all the originals …. Stuart saw the work as conjunctural and didn’t want it becoming scripture. Ironically, this was only a problem in English. There are translations to Mandarin and German (and I believe Spanish and/or Portuguese).”
The first of the two titles in the Duke edition will be out this fall, and the second will be published next spring. One is a set of lectures on the intellectual foundations of cultural studies, the other the first volume of Hall’s autobiography. “The memoir will have a second volume,” Wissoker says, “that will be more of an intellectual and political summation ‘what I think now’ book.” Farther down the line there will be a volume of selected essays, and Laura Sell, Duke's publicity and advertising manager, says that a number of thematically organized collections on “politics, race, photography, black aesthetics, Marxism and post-Marxism, [and] the Caribbean” will come in due course.
When Winston Smith discovers the blind spot in his apartment -- the niche just out of range of the telescreen, Big Brother’s combination video feed and surveillance system -- it is, George Orwell tells us, “partly the unusual geography of the room” that allows him to take the risk of writing in a diary.
Later Smith finds another room with no telescreen at all, where he and Julia create another zone of privacy: the shared kind, intimacy. It can’t last, of course, and it doesn’t, with brutal consequences for both of them. (Thoughtcrime does not pay.)
The dystopia of Orwell’s 1984 is very much the product of its era, which spanned roughly the period between Hitler’s ascension to power in 1933 and Stalin’s death 20 years later. And while the novel’s depiction of a world without privacy can still raise a reader’s hackles, its technology now looks both retrofuturist and surprisingly inefficient. The telescreens are menacing, but there’s always a chance that Big Brother’s watchers will overlook something. And look at the tools that Winston uses to carve out his own domain of personal memory and antitotalitarian sentiment: a pen and paper. The authorities manage to read his thoughts eventually, but it takes most of the novel to get to that point. Today, Winston would be destined for Room 101 before he powered down his notebook.
Last week, I noted that Meg Leta Jones’s book Ctrl+Z: The Right to Be Forgotten (NYU Press) arrives at a time when ever fewer activities or communicative exchanges occur without some form of information technology intervening. Digital traces generated along the way are gathered, analyzed, sold. And the right to privacy becomes a little more purely notional each time one’s eyes slide down the text of a user agreement on the way to clicking “accept.”
A kind of fatalism is involved, one resting on the tacit but powerful tendency to assume that technology itself defines what information will be gathered, and how, and the use to be made of it. Implied is a trade-off between privacy and various benefits -- with both the cost and the reward determined by what our devices do and require. Privacy is, in this view, a function of engineering necessities, not of political or moral decisions.
The initial, blunt challenge to technological determinism comes in Ctrl+Z’s opening chapters, where Jones, an assistant professor of communication, culture and technology at Georgetown University, contrasts how the European Union and the United States frame their policies concerning the availability of personal information online. Here personal information would include employment history, financial data and arrest records, as well as, say, material communicated via social media.
In the United States, she writes, the default attitude “permits the collection and transfer of personal information and prevents abuse through self-regulation and market forces,” while E.U. states “operate under comprehensive regimes that protect information across both the public and private sectors and are enforced by specialized data-protection agencies.”
The contrast becomes striking when “data protection” might be better described as protecting the reputation or well-being of the individual to whom the data pertains. Take the case of someone who, as a young adult, is arrested for vandalism and destruction of property and serves a jail sentence, all of it written up in a newspaper in 1990 as well as documented in official records. Once released, he swears off his old ways and spends the next 25 years in steady employment and overall irreproachable conduct. Then he awakes one day to find that the newspaper has digitized its archives and made them searchable via Google.
If our reformed graffiti artist lives in America, he can do little if anything about it, apart from asking the paper to take down its accurate but deeply embarrassing article. There is also a chance his conviction will be publicized on any of various websites dedicated to posting mug shots.
In a number of E.U. countries, by contrast, he could appeal to laws that forbid public reference to someone’s criminal record if it is no longer news or if the ex-con has undergone significant rehabilitation. He might also file a request with Google to remove links to sites mentioning the old transgression. In 2014, the Court of Justice of the European Union ruled that the search engine had to establish a take-down system for people who wanted personal information removed from its search results.
There are variations from country to country, but Jones finds that the E.U. “data subject” (in effect, the citizen’s digital doppelgänger) can claim a “general right to personality” -- a certain degree of dignified immunity from unwelcome attention. The American data subject, by contrast, is presumed to take the Wild West ethos of the Internet pretty much as a given, with any effort to delete information or limit its circulation being labeled, almost inevitably, as Orwellian. (Even so, a number of piecemeal efforts have been made in the United States to protect children and victims of harassment and bullying, including laws against revenge porn.)
But as Jones goes on to show, any preference for one of these frameworks over the other will soon enough be faced with the much harder matter of dealing with new and unanticipated shades of gray left out of the public/private distinction. And the other dichotomy -- between having every bit of personal data (flattering, humiliating or neither) either preserved forever in a digital archive or destined for the memory hole -- is also looking out of date. Jones’s book doesn’t predict what comes next, but it’s a great stimulant for anyone bracing themselves to think about it.
Privacy does not appear to have much of a future. As with the Arctic ice, we have no plan in place to halt the melting, let alone to restore what has already been lost. “The right to be let alone,” wrote U.S. Supreme Court Justice Louis D. Brandeis in a classic formulation from 1928, is “the most comprehensive of rights and the right most valued by civilized men.” The menace to privacy he was addressing came from the state; at issue was evidence collected by tapping a bar’s telephone during Prohibition. He issued his dissent with a definite sense of the shape of things to come.
“Ways may someday be developed,” he wrote, “by which the Government, without removing papers from secret drawers, can reproduce them in court, and by which it will be enabled to expose to a jury the most intimate occurrences of the home. Advances in the psychic and related sciences may bring means of exploring unexpressed beliefs, thoughts and emotions.”
What he did not foresee, and probably could not, was how the right to privacy might be undermined -- dissolved, really -- by forces within the sphere of private decision making. Today, access to most of the digital tools and conveniences we’ve grown to depend on comes in exchange for permission to record and analyze our use of them. Every search and purchase, each comment and click, can now be assumed to go on your permanent record.
The scope and intensity of the monitoring are difficult, and unpleasant, to imagine. Meg Leta Jones, an assistant professor of communication, culture and technology at Georgetown University, sketches some of the facts of everyday life in the digital panopticon in her book Ctrl+Z: The Right to Be Forgotten (NYU Press):
“When a user logs on to the Internet and visits a website, hundreds of electronic tracking files may be triggered to capture data from the user’s activity on the site and from other information held in existing stored files, and that data is sent to companies. A study done by The Wall Street Journal found that the nation’s top fifty websites installed an average of sixty-four pieces of tracking technology, and a dozen sites installed over one hundred …. Some files respawn, known as zombie cookies, even after the user actively deletes them. All of this information is then sold on data exchanges …. Sensors creating data about owners and others from phones, cars, credit cards, televisions, household appliances, wearable computing necklaces, watches and eyewear, and the growing list of ‘Internet of things’ devices, mean that more personal information is being disclosed, processed and discovered. All of these nodes of discoverability will be added to the structure of discoverability sooner or later, if they have not been already.”
The scale and voracity of big data (which is to surveillance what big pharma is to medication) are not the only forces at work in liquidating the private sphere. The Internet and the roaming devices that feed into it -- cameras, tablets, phones, etc. -- now form the individual’s auxiliary backup brain. Two imaginative exercises that Jones proposes will make clear the potential vulnerabilities this entails.
First, recall the most painful thing in your life: something so excruciating, regrettable or tinged with shame or guilt, or both, that you would burn out the very memory of it, were that an option. Then, think about the worst person you have ever known: someone whose capacity for, in Jones's words, “inconsiderate, nasty, spiteful or twisted behavior” you wish you had learned about early on and so avoided experiencing firsthand.
There’s no substitute for a thick skin or good intuition -- least of all the external, digitized gray matter used to store written and visual documents and to mediate what seems like a larger share of each succeeding generation’s social interaction. Instead, it has the potential (even, to go by anecdotal evidence, a talent) for making things much worse. The dark secret or agonizing experience can be recorded, stored, duplicated endlessly and broadcast for all the world to see. The barrier between public and private self can vaporize with a press of the “send” button, whether in haste or by accident. Conversely, the malicious person might be able to cover his own tracks -- taking down his slanderous blog or threatening comments, for example -- and carry on after the damage is done.
Both could happen at once, of course: the vilest person you know could get hold of whatever you most want to hide or escape, and thereupon, presumably, burn your life to the ground. But the situations that Jones describes need not come to pass to have an effect: “Worry or fear may curb your behavior on- and off-line to avoid the risks of unwanted attention, misinterpretation or abuse.”
It is tempting to reply that worry and fear can be useful emotions, and that learning “to curb your behavior on- and off-line to avoid the risks of unwanted attention, misinterpretation and abuse” is otherwise called growing up. One problem with moralism of this caliber is that it dodges a hard truth: yes, maturity includes knowing to avoid behavior that is dangerous or likely to have undesirable consequences. (Example: consuming a great deal of alcohol and telling someone what you really think of them.) Sometimes this knowledge must be acquired the hard way and mulled over in private; the important thing is that enough of it accumulates. The learning process involves recognizing and repudiating mistakes, and gaining a certain amount of distance from the earlier self who made them.
But what happens to the possibility of growth and change if the infraction can’t be escaped? If a document of the regrettable memory is out there, retrievable by others, perhaps impossible to live down? For that matter, what becomes of “the right to be let alone” when seemingly private life generates data that is compiled and analyzed and sold?
The legal and moral implications require a rethinking of much of what we take for granted, and Jones is plugged in to many of the conversations. Her perspective on the question of privacy is thoughtful enough that I’ll return to Ctrl+Z in next week’s column. Suffice to say that she is not prepared to call for an autopsy, just yet.
Fame can be fickle and destiny, perverse -- but what are we to call how posterity has treated Stefan Zweig? In the period between the world wars, he was among the best-known authors in the world and, by some reckonings, the most translated. Jewish by birth and Austrian by nationality, Zweig was perhaps most of all Viennese by sensibility. His fiction expressed the refined cynicism about sexual mores that readers associated with Sigmund Freud and Arthur Schnitzler, and he played the role of Vienna coffeehouse man of letters to perfection.
Zweig’s cosmopolitanism was of the essence: his biographical and critical studies of European literary and historical figures spanned the Renaissance through the early 20th century and even roamed far enough abroad to include Mary Baker Eddy, the very American if otherwise sui generis founder of Christian Science. (The portrait of her in his book Mental Healers is etched in acid, but it occupies a surprisingly appropriate spot: between the accounts of Franz Mesmer and Freud.)
His books fueled the Nazi bonfires. Even apart from the Nazis’ racial obsessions, Zweig was precisely the sort of author to drive Hitler and Goebbels into paroxysms, and after years of exile he committed suicide in Brazil in 1942. His reputation did not survive the war, either, at least not among English-language readers. After four decades or so of relatively far-flung reading, I think of him as one of those authors who seem only ever to show up in parentheses and footnotes, or sometimes pointed out as a biographer too prone to psychologizing or melodrama. Being barely remembered trumps being totally forgotten (it’s more than most of us will be, anyway), but Zweig hardly seemed like a figure poised for rediscovery when, not too long ago, the comeback began.
The essays and speeches collected in Messages From a Lost World: Europe on the Brink (Pushkin Press) form a supplement to the volume that launched the revival -- Zweig’s memoir, The World of Yesterday, which the University of Nebraska Press published in the new translation that Pushkin Press issued in 2009. (Finished just before the author’s suicide, the book first appeared in English in 1943. That earlier translation can also be had, in ebook format.) The recent NYRB Classics editions of his fiction have had a lot to do with it being more than a one-book revival, but e-publishing and print-on-demand operations account for nearly everything by Zweig in English now being available. A mixed blessing, given that some such “publishers” do little but sell you copies of material available for free from the Internet Archive or Project Gutenberg.
The World of Yesterday is less autobiography than a self-portrait of the European literati during the final years of the belle epoque -- the four decades of relative peace and prosperity on the continent that ended with World War I. The new communication technologies and modes of high-speed transport were shrinking the world, while the spread of education, scientific progress and humane cultural values would presumably continue. The earliest pieces in Messages From a Lost World contain Zweig’s musings on the spiritual impact of the war, written while it was still in progress and with no end in sight. They are the thoughts of a man trying to find his way out of what must have seemed a completely reasonable state of despair:
“Never since it came into being has the whole world been so communally seized by nervous energy. Until now a war was only an isolated flare-up in the immense organism that is humanity, a suppurating limb which could be cauterized and thus healed, whilst all the remaining limbs were free to perform their normal functions without the least hindrance …. But due to its steady conquest of the globe, humanity forged ever-closer links, so today a fever quivers within its whole organism; horrors easily traverse the entire cosmos. There is not a workshop, not an isolated farm, not a hamlet deep in a forest from which they have not torn a man so that he might launch himself into the fray, and each of these beings is intimately connected to others by myriad threads of feeling; even the most insignificant among them has breathed so much of the feverish heat, his sudden disappearance makes those that remain that much colder, more alone and empty.”
In pieces from the 1920s and early ’30s, Zweig takes it as a moral imperative to champion the cause of peace by reminding his readers and listeners that humanity could no longer afford the sort of belligerent nationalism that had led them into the Great War. Respect for the possibilities of human development should replace claims to military greatness:
“If men lived [in earlier eras] as if in the folds of a mountain, their sight limited by the peaks on either side, we today see as if from a summit all worldly happenings in the same moment, in their exact dimensions and proportions. And because we have this commanding view across the surface of the earth, we must now usher in new standards. It’s no longer a case of which country must be placed ahead of another at their expense, but how to accomplish universal movement, progress, civilization. The history of tomorrow must be a history of all humanity and the conflicts between individual countries must be seen as redundant alongside the common good of the community.”
If the world could be changed by elegantly expressed humanist sentiments, this passage, from a speech delivered in 1932, might have altered the course of history. But the way it tempted fate now looks even more bitterly ironic than it did after Hitler took office a few months later. For in spite of his lofty vantage point (“we today see as if from a summit all worldly happenings in the same moment, in their exact dimensions and proportions”) and a depth of historical vision giving him insight into Magellan, Casanova, Napoleon, Goethe, Nietzsche and Marie Antoinette (among others), Zweig was struck dumb by the course of events after 1933.
Not literally, of course; on the contrary, later selections in Messages From a Lost World show that he could turn on the spigots of eloquence all too easily. (The tendency of Zweig's prose to turn purple and kitschy has been mocked at length by the translator Michael Hofmann.) But he remained notoriously -- and to a great many people, unforgivably -- averse to speaking out against what was happening in Germany. He did say that, considering the company it put him in, having his book torched by the Nazis was an honor. Yet as a statement of solidarity against a regime that would, in time, burn people, that seems decidedly wanting.
Uncharitable people have accused Zweig of cowardice, while Hannah Arendt’s essay on him, found in Reflections on Literature and Culture (Stanford University Press, 2007), treats Zweig as an example of the cultural mandarin so determined to avoid the grubby realities of politics that he disgraced his own idealism. Whether or not that implies a lack of courage is a difficult question.
But surely it isn’t being excessively generous to wonder if Zweig’s failure was one of imagination, rather than of nerve: the inability, first of all, to grasp that Hitler was more than just another nationalist demagogue, followed by a paralysis at seeing the mad dream of the Thousand-Year Reich tearing into reality across most of Europe, with no plan to stop there. Against the horrible future, all he had left was nostalgia -- with a memory of what security and boundless optimism had been like, once.
A celebrity has been defined as somebody who is well-known for being well-known. And by that measure, Gary A. Olson’s Stanley Fish, America's Enfant Terrible: The Authorized Biography (Southern Illinois University Press) is a celebrity biography of sorts. At no point does the subject go into rehab, but other than that, the book hits most of the tabloid marks: humble origins, plucky self-fashioning, innovative and controversial work, exorbitant earnings (for a Milton scholar), and illicit romance (leading to marital bliss). Much of this was once English department gossip, but now posterity is the richer for it.
Olson, president of Daemen College in Amherst, N.Y., even reveals a brush with Hollywood. When plans were underway to film one of David Lodge’s novels about academe, the producer wanted to cast Walter Matthau as Morris Zapp, a character clearly based on Fish. Matthau wasn't interested, and the movie languished in development hell, but in Olson’s words, “Stanley let it be known that he would love the opportunity to play the role himself.” (Another trait of the celeb bio: subject called by first name.)
If Fish wrote a book of career advice, a suitable title might be The Art of the Deal. Describing his career in the late 1970s and early ’80s, Olson writes:
“Stanley’s annual schedule was unimaginably grueling to many faculty. He would jet from university to university, giving workshops, papers and presentations, both in the United States and abroad. He was in great demand and he knew it, so he pushed the boundaries of what universities would pay to bring in a humanities professor to speak. Whatever he was offered by a university, he would demand more -- and he usually got it. The host who invited him would scramble to meet his fee, asking various campus departments and entities to contribute to the event until the requisite fee had been collected. Throughout his career he must have visited almost every notable university in every state of the union.”
Then again, a book of career advice from Stanley Fish would be virtually useless to anyone else today, like a tourist guide to a city that's been hit by an earthquake. This fall will be the 50th anniversary of Fish receiving tenure at the University of California at Berkeley. He was 28 years old and had been teaching for four years. Berkeley was his first position; he turned down two previous offers before accepting it. Given the circumstances, Stanley Fish, America’s Enfant Terrible will pose a challenge to many readers that rarely comes up with a nonfiction book: that of suspending disbelief. The chapter from which I just quoted is called “Academic Utopia” -- and as with other utopias, you can’t really get there from here.
“From my point of view,” Olson reports Fish saying, “there are a lot of people out there making mistakes, and I’m just going to tell them they’re making mistakes.” As if to authenticate that statement, the title of Fish’s most recent book is Think Again: Contrarian Reflections on Life, Culture, Politics, Religion, Law, and Education (Princeton University Press), while his next, due this summer, is called Winning Arguments: What Works and Doesn't Work in Politics, the Bedroom, the Courtroom, and the Classroom (Harper).
Originally the “people out there” so targeted were his fellow scholars of 16th- and 17th-century English literature. The range of his corrective efforts grew to include leading figures in literary theory, followed by the legal profession and finally -- with his New York Times column, which ran from 1995 to 2013 -- the entire public sphere. Olson wrote about the theoretical and rhetorical force of Fish’s work in two earlier volumes. Here the author largely underplays Fish’s intellectual biography beyond a few references to professors who influenced him as a student. Instead, Olson emphasizes the tangible institutional power that Fish acquired and learned to wield in his route to becoming a university administrator of national prominence.
Starting in 1985, when he was chair of the English department at Johns Hopkins University, Fish made a series of unexpected and much-discussed moves -- first to Duke University, where he reshaped the respectable but inconspicuous English department into the high-theory powerhouse of the early 1990s, and later to the University of Illinois at Chicago, where he was dean of the College of Liberal Arts and Sciences. There, the biographer says, Fish’s “hiring of celebrity academics and the generating of positive publicity” were steps meant “to lift the miasma of an inferiority complex, to help faculty collectively feel that they were part of an enterprise that was known and admired by others.” (There were other appointments and stints and high professional honors along the way; the chronology at the back of the book is overstuffed with them.)
Like Morris Zapp, Stanley Fish has an appealing side, as Olson portrays him: he works hard (as a graduate student, he read Paradise Lost five times in one summer), he sounds like a good teacher and the qualities that read as brash and arrogant in some circumstances are probably gruff but lovable, like a Muppets character, in others. That said, letting decisions and values be determined by the need to feel “known and admired by others” is the very definition of what sociologists used to call “other-directedness.” More recently, and fittingly enough, it’s associated with the culture of celebrity.
No celebrity biography is complete without a catastrophe or two, mixed in with the triumphs -- and in the case of Fish’s role as a “developer” of intellectual and institutional real estate, the line between success and disaster is not so bright. He enhanced the reputation of Duke’s humanities programs (and that of the university press), but most of the stars he hired relocated within a few years. And Fish’s efforts at UIC are remembered for running at a deficit that lasted beyond his deanship.
In an email note to Olson, I asked about various things that had caused a knot to form in my stomach while reading his book. We learn that Fish arranged a summer stipend of $58,000 for one of his cronies. Olson says Fish once responded to an audience member's challenge to explain why anyone should believe him by saying, “Because I am Stanley Fish. I teach at Johns Hopkins University, and I make $75,000 a year.” (I recall hearing that one in the early 1980s; the equivalent now would be around $200,000.)
I've benefited from reading Fish in the past, but I find it hard to imagine anyone now racking up student debt that will take decades to pay off seeing the figure Olson presents as anything but a monster. To be fair, the final pages of the book depict Fish in more recent times as a less grandiose figure, even as touched with regret or disappointment. (Being a septuagenarian enfant terrible seems like a pretty melancholy prospect.)
Anyway, Olson replied to my question as follows: “Well, some do think of him as a monster, although probably not for those reasons. I don’t argue in the book that he should be admired -- or reviled. That’s an individual choice. You have to remember: Fish is from an older generation of academics, when higher education was growing exponentially and exuberantly. Fish represents the rise of high theory. Despite his passion for teaching, he is known most as an intellectual and a scholar; that’s what drew the high pay and the high praise. You don’t hire someone like Fish to enhance your teaching as much as you do to bring a certain prestige to your institution. That’s why Duke, UIC and other institutions wanted him. And, generally, these proved to be good institutional decisions.”
Perhaps. It seems to be a question of short-term versus long-term benefit. But it’s hard to understand how giving large barrels of money to a few transient scholarly A-listers “to help faculty collectively feel that they were part of an enterprise that was known and admired by others” was ever a good idea -- much less a sustainable one. After Fish, the deluge?