The folklore of Indonesia and Thailand tells of a frog who is born under half of a coconut-shell bowl and lives out his life there. In time, he draws the only sensible conclusion: the inside of the shell is the whole universe.
“The moral judgment in the image,” writes Benedict Anderson in Life Beyond Boundaries: A Memoir (Verso), “is that the frog is narrow-minded, provincial, stay-at-home, and self-satisfied for no good reason. For my part, I stayed nowhere long enough to settle down in one place, unlike the proverbial frog.”
Anderson, a professor emeritus of international studies, government and Asian studies at Cornell University, wrote major studies of the history and culture of Southeast Asia. A certain degree of cosmopolitanism went with the fieldwork. But the boundaries within a society can be patrolled just as insistently as its geographical borders -- and in the case of academic specialties, the guards inspecting passports tend to be quite unapologetically suspicious.
In that regard, Anderson was an even more remarkable citizen of the world, for his death late last year has been felt as a loss in several areas of the humanities as well as at least a couple of the social sciences. Nearly all of this reflects what someone writing in a scholarly journal once dubbed “Benedict Anderson’s pregnant phrase” -- i.e., the main title of his 1983 work Imagined Communities: Reflections on the Origin and Spread of Nationalism, which treated the mass production of books and periodicals in vernacular languages (what he called “print capitalism”) as a catalytic factor in creating a shared sense of identity and, with it, the desire for national sovereignty.
By the 1990s, people were pursuing tangents from Anderson’s argument with ever more tenuous connection to nationalism -- and still less to the specific emphasis on print capitalism. Any group formed and energized by some form of mass communication might be treated as an imagined community. Here one might do a search for “Benedict Anderson” and “World of Warcraft” to see why the author came to think of his best-known title as “a pair of words from which the vampires of banality have by now sucked almost all the blood.” Even so, Imagined Communities has shown remarkable longevity, and its landmark status is clearly international: it had been translated into more than 30 languages as of 2009, when it appeared in a Thai edition.
The reader of Life Beyond Boundaries soon understands why Anderson eventually developed mixed feelings about his “pregnant phrase” and its spawn. His sense of scholarship, and of life itself, was that it ought to be a mode of open-ended exploration, of using what you’ve learned to figure out what you could learn. Establishing a widely known line of thought must have become frustrating once it’s assumed to represent the only direction in which you can move. Professional interest is not the only kind of interest; what it recognizes as knowledge is no measure of the world outside the shell.
Anderson wrote the memoir by request: a Japanese colleague asked for it as a resource to show students something of the conduct of scholarship abroad and to challenge the “needlessly timid” ethos fostered by Japanese professors’ “patriarchal attitude.” Long retired -- and evidently reassured by the thought that few of his American colleagues would ever see the book -- Anderson was wry and spot-on in recounting the unfamiliar and not always agreeable experience of American academic life as he found it after emigrating to the United States from England as a graduate student in the late 1950s. For one thing, his professors looked askance at his papers, where he might indulge in a sardonic remark if so inspired, or pursue a digressive point in his footnotes.
“In a friendly way,” he writes, “my teachers warned me to stop writing like this …. It was really hard for me to accept this advice, as in previous schools I had always been told that, in writing, ‘dullness’ was the thing to be avoided at all cost.” He also underscores the paradox that the pragmatic American disinterest in “grand theory” coexisted with an academic hunger for it, renewed on a seasonal basis:
“‘Theory,’ mirroring the style of late capitalism, has obsolescence built into it, in the manner of high-end commodities. In year X students had to read and more or less revere Theory Y while sharpening their teeth on passé Theory W. Not too many years later, they were told to sharpen their teeth on passé Theory Y, admire Theory Z, and forget about Theory W.”
Lest anyone assume this refers to the situation in the humanities, it’s worth clarifying that one example he gives is the “modernization theory” that once ruled the roost in the social sciences. And similar riding of the trend wave prevails in the choice of areas for research. The antidote, he found, came from leaving the academic coconut bowl to explore Indonesia, the Philippines and Thailand:
“I began to realize something fundamental about fieldwork: that it is useless to concentrate exclusively on one’s ‘research project.’ One has to be endlessly curious about everything, sharpen one’s eyes and ears, and take notes about anything …. The experience of strangeness makes all your senses much more sensitive than normal, and your attachment to comparison grows deeper. This is also why fieldwork is so useful when you return home. You will have developed habits of observation and comparison that encourage or force you to start noticing that your own culture is just as strange ….”
Unfortunately the author does not say how his intended Japanese public responded to Life Beyond Boundaries. A lot probably depends on how well the moments of humor and reverie translated. But in English they read wonderfully, and the book is a gem.
Prestige has its privileges. When a well-established award is announced -- as the 100th set of Pulitzer Prize winners was on Tuesday -- it tends to consume the available limelight. Anything less monumental disappears into its shadow.
But a couple of developments in the humanities this week strike me as being as newsworthy as the Pulitzers. If anything, they may prove more consequential in the long run.
For one, we have the Whiting Foundation’s new Public Engagement Fellowship, which named its first recipients on Tuesday. The fellowship ought not to be confused with the Whiting Award, which since 1985 has been given annually to 10 authors “based on early accomplishment and the promise of great work to come.” The winners receive $50,000 each, along with, presumably, the professed esteem and subdued malice of their peers.
By contrast, the Public Engagement Fellowships go to professors who have shown “a demonstrated commitment to using their scholarly expertise to reach wider audiences,” in order to fund ambitious projects “designed to have direct and significant impact on a specific public outside the academy.” There are eight scholars in the fellowship’s inaugural cohort, including Zoë Kontes, an associate professor of classics at Kenyon College, who will spend a semester creating a podcast to explore the black market in looted artifacts.
As with the literary prize, the fellowship comes with $50,000 -- $10,000 earmarked for the project’s expenses and the rest covering the recipient’s stipend. Neither the number of fellows nor the apportionment of finances is set in stone, as I learned from Daniel Reid, the foundation’s executive director, when we met last week.
He explained that after more than 40 years of funding dissertations in the humanities at elite universities, the Whiting Foundation had decided it was time to direct its attention to a relatively underserved aspect of humanities scholarship: the cultivation of new ways of making connections with the world beyond the campus. Last year, the foundation contacted administrators at 40 universities, encouraging them to nominate faculty with projects that might be appropriate for funding.
“This has been a learning process on both sides,” Reid said, “for [the foundation] in running things and for the institutions in getting a sense of what we’re looking for.” He explained that the proposals were then evaluated by a group of seven people who had considerable experience with the communication of specialized knowledge to a wide public. The names are not public, though Reid indicates that a number of them are prominent figures in scholarship, publishing and museum or gallery curation. (The need for secrecy is understandable: publicizing the names would leave the Whiting judges as vulnerable as delegates to this summer’s political conventions are starting to feel.)
For the second group of Public Engagement Fellows, the Whiting Foundation will double the number of colleges and universities it contacts in search of nominations, with the long-term goal of making the process open to all higher education institutions. In the future, the number of recipients may range from six to 10. I cited Kontes’s podcast on the looting of antiquities as an example (not quite at random: consider me on the waiting list to subscribe) but hope the other projects stimulate interest, discussion and perhaps some healthy competition.
The other development from earlier in the week is Duke University Press’s announcement that it will be publishing an edition of the works of Stuart Hall, who can -- without exaggeration, if not without argument -- be called the founding father of cultural studies as an academic discipline, at least in Great Britain. The Wikipedia entry for Hall is surprisingly thorough, so anyone for whom the name does not signify might want to get up to speed there.
Hall is the case of a figure in the humanities whose impact is widely recognized yet difficult for an American to assess -- for the simple reason that, even at the peak of his influence, his work was remarkably difficult to find. A number of his major writings seem to have been published as mimeographed papers. He published books, but not that many found their way into American bookstores. So the prospect of having his scattered and fugitive writings in an edition from a major university press is appealing.
I heard that Ken Wissoker, the press's editorial director, might have some background information on why we are getting Hall’s work in this form only now, two years after his death. He confirmed my impression in an email note and gave a little background that seems worth putting into the record: “David Morley had edited two or three volumes of Stuart’s essays for Macmillan U.K. back in the late ’80s, but my understanding is that Stuart decided against having them come out (or delayed it into not happening). The original cultural studies essays were in a lot of different places …. Xeroxes and then PDFs circulated, but it would have been very difficult to track down all the originals …. Stuart saw the work as conjunctural and didn’t want it becoming scripture. Ironically, this was only a problem in English. There are translations to Mandarin and German (and I believe Spanish and/or Portuguese).”
The first of the two titles in the Duke edition will be out this fall, and the second will be published next spring. One is a set of lectures on the intellectual foundations of cultural studies, the other the first volume of Hall’s autobiography. “The memoir will have a second volume,” Wissoker says, “that will be more of an intellectual and political summation ‘what I think now’ book.” Farther down the line there will be a volume of selected essays, and Laura Sell, Duke's publicity and advertising manager, says that a number of thematically organized collections on “politics, race, photography, black aesthetics, Marxism and post-Marxism, [and] the Caribbean” will come in due course.
When Winston Smith discovers the blind spot in his apartment -- the niche just out of range of the telescreen, Big Brother’s combination video feed and surveillance system -- it is, George Orwell tells us, “partly the unusual geography of the room” that allows him to take the risk of writing in a diary.
Later Smith finds another room with no telescreen at all, where he and Julia create another zone of privacy: the shared kind, intimacy. It can’t last, of course, and it doesn’t, with brutal consequences for both of them. (Thoughtcrime does not pay.)
The dystopia of Orwell’s 1984 is very much the product of its era, which spanned roughly the period between Hitler’s ascension to power in 1933 and Stalin’s death 20 years later. And while the novel’s depiction of a world without privacy can still raise a reader’s hackles, its technology now looks both retrofuturist and surprisingly inefficient. The telescreens are menacing, but there’s always a chance that Big Brother’s watchers will overlook something. And look at the tools that Winston uses to carve out his own domain of personal memory and antitotalitarian sentiment: a pen and paper. The authorities manage to read his thoughts eventually, but it takes most of the novel to get to that point. Today, Winston would be bound for Room 101 before he powered down his notebook.
Last week, I noted that Meg Leta Jones’s book Ctrl+Z: The Right to Be Forgotten (NYU Press) arrives at a time when ever fewer activities or communicative exchanges occur without some form of information technology intervening. Digital traces generated along the way are gathered, analyzed, sold. And the right to privacy becomes a little more purely notional each time one’s eyes slide down the text of a user agreement on the way to clicking “accept.”
A kind of fatalism is involved, one resting on the tacit but powerful tendency to assume that technology itself defines what information will be gathered, and how, and the use to be made of it. Implied is a trade-off between privacy and various benefits -- with both the cost and the reward determined by what our devices do and require. Privacy is, in this view, a function of engineering necessities, not of political or moral decisions.
The initial, blunt challenge to technological determinism comes in Ctrl+Z’s opening chapters, where Jones, an assistant professor of communication, culture and technology at Georgetown University, contrasts how the European Union and the United States frame their policies concerning the availability of personal information online. Here personal information would include employment history, financial data and arrest records, as well as, say, material communicated via social media.
In the United States, she writes, the default attitude “permits the collection and transfer of personal information and prevents abuse through self-regulation and market forces,” while E.U. states “operate under comprehensive regimes that protect information across both the public and private sectors and are enforced by specialized data-protection agencies.”
The contrast becomes striking when “data protection” might be better described as protecting the reputation or well-being of the individual to whom the data pertains. Take the case of someone who, as a young adult, is arrested for vandalism and destruction of property and serves a jail sentence, all of which was written up in a newspaper in 1990 as well as being documented in official records. Once released, he swears off his old ways and spends the next 25 years in steady employment and overall irreproachable conduct. Then one day he awakes to find that the newspaper has digitized its archives and made them searchable via Google.
If our reformed graffiti artist lives in America, he can do little if anything about it, apart from asking the paper to take down its accurate but deeply embarrassing article. There is also a chance his conviction will be publicized on any of various websites dedicated to posting mug shots.
In a number of E.U. countries, by contrast, he could appeal to laws that forbid public reference to someone’s criminal record if it is no longer news or if the ex-con has undergone significant rehabilitation. He might also file a request with Google to remove links to sites mentioning the old transgression. In 2014, the Court of Justice of the European Union ruled that the search engine had to establish a take-down system for people who wanted personal information removed from its search results.
There are variations from country to country, but Jones finds that the E.U. “data subject” (in effect, the citizen’s digital doppelgänger) can claim a “general right to personality” -- a certain degree of dignified immunity from unwelcome attention. The American data subject, by contrast, is presumed to take the Wild West ethos of the Internet pretty much as a given, with any effort to delete information or limit its circulation being labeled, almost inevitably, as Orwellian. (Even so, a number of piecemeal efforts have been made in the United States to protect children and victims of harassment and bullying, including laws against revenge porn.)
But as Jones goes on to show, any preference for one of these frameworks over the other will soon enough be faced with the much harder matter of dealing with new and unanticipated shades of gray left out of the public/private distinction. And the other dichotomy -- between having every bit of personal data (flattering, humiliating or neither) either preserved forever in a digital archive or destined for the memory hole -- is also looking out of date. Jones’s book doesn’t predict what comes next, but it’s a great stimulant for anyone bracing themselves to think about it.
Fame can be fickle and destiny, perverse -- but what are we to call how posterity has treated Stefan Zweig? In the period between the world wars, he was among the best-known authors in the world and, by some reckonings, the most translated. Jewish by birth and Austrian by nationality, Zweig was perhaps most of all Viennese by sensibility. His fiction expressed the refined cynicism about sexual mores that readers associated with Sigmund Freud and Arthur Schnitzler, and he played the role of Vienna coffeehouse man of letters to perfection.
Zweig’s cosmopolitanism was of the essence: his biographical and critical studies of European literary and historical figures spanned the Renaissance through the early 20th century and even roamed far enough abroad to include Mary Baker Eddy, the very American if otherwise sui generis founder of Christian Science. (The portrait of her in his book Mental Healers is etched in acid, but it occupies a surprisingly appropriate spot: between the accounts of Franz Mesmer and Freud.)
His books fueled the Nazi bonfires. Even apart from their racial obsessions, Zweig was precisely the sort of author to drive Hitler and Goebbels into paroxysms, and after years of exile he committed suicide in Brazil in 1942. His reputation did not survive the war, either, at least not among English-language readers. After four decades or so of relatively far-flung reading, I think of him as one of those authors who seem only ever to show up in parentheses and footnotes, or to be pointed out as biographers too prone to psychologizing or melodrama. Being barely remembered trumps being totally forgotten (it’s more than most of us will be, anyway), but Zweig hardly seemed like a figure poised for rediscovery when, not too long ago, the comeback began.
The essays and speeches collected in Messages From a Lost World: Europe on the Brink (Pushkin Press) form a supplement to the volume that launched the revival -- Zweig’s memoir, The World of Yesterday, which the University of Nebraska Press published in the new translation that Pushkin Press issued in 2009. (Finished just before the author’s suicide, the book first appeared in English in 1943. That earlier translation can also be had, in ebook format.) The recent NYRB Classics editions of his fiction have had a lot to do with it being more than a one-book revival, but e-publishing and print-on-demand operations account for nearly everything by Zweig in English now being available. A mixed blessing, given that some such “publishers” do little but sell you copies of material available for free from the Internet Archive or Project Gutenberg.
The World of Yesterday is less autobiography than a self-portrait of the European literati during the final years of the belle epoque -- the four decades of relative peace and prosperity on the continent that ended with World War I. The new communication technologies and modes of high-speed transport were shrinking the world, while the spread of education, scientific progress and humane cultural values would presumably continue. The earliest pieces in Messages From a Lost World contain Zweig’s musings on the spiritual impact of the war, written while it was still in progress and with no end in sight. They are the thoughts of a man trying to find his way out of what must have seemed a completely reasonable state of despair:
“Never since it came into being has the whole world been so communally seized by nervous energy. Until now a war was only an isolated flare-up in the immense organism that is humanity, a suppurating limb which could be cauterized and thus healed, whilst all the remaining limbs were free to perform their normal functions without the least hindrance …. But due to its steady conquest of the globe, humanity forged ever-closer links, so today a fever quivers within its whole organism; horrors easily traverse the entire cosmos. There is not a workshop, not an isolated farm, not a hamlet deep in a forest from which they have not torn a man so that he might launch himself into the fray, and each of these beings is intimately connected to others by myriad threads of feeling; even the most insignificant among them has breathed so much of the feverish heat, his sudden disappearance makes those that remain that much colder, more alone and empty.”
In pieces from the 1920s and early ’30s, Zweig takes it as a moral imperative to champion the cause of peace by reminding his readers and listeners that humanity could no longer afford the sort of belligerent nationalism that had led them into the Great War. Respect for the possibilities of human development should replace claims to military greatness:
“If men lived [in earlier eras] as if in the folds of a mountain, their sight limited by the peaks on either side, we today see as if from a summit all worldly happenings in the same moment, in their exact dimensions and proportions. And because we have this commanding view across the surface of the earth, we must now usher in new standards. It’s no longer a case of which country must be placed ahead of another at their expense, but how to accomplish universal movement, progress, civilization. The history of tomorrow must be a history of all humanity and the conflicts between individual countries must be seen as redundant alongside the common good of the community.”
If the world could be changed by elegantly expressed humanist sentiments, this passage, from a speech delivered in 1932, might have altered the course of history. But the way it tempted fate now looks even more bitterly ironic than it did after Hitler took office a few months later. For in spite of his lofty vantage point (“we today see as if from a summit all worldly happenings in the same moment, in their exact dimensions and proportions”) and a depth of historical vision giving him insight into Magellan, Casanova, Napoleon, Goethe, Nietzsche and Marie Antoinette (among others), Zweig was struck dumb by the course of events after 1933.
Not literally, of course; on the contrary, later selections in Messages From a Lost World show that he could turn on the spigots of eloquence all too easily. (The tendency of Zweig's prose to turn purple and kitschy has been mocked at length by the translator Michael Hofmann.) But he remained notoriously -- and to a great many people, unforgivably -- averse to speaking out against what was happening in Germany. He did say that, considering the company it put him in, having his book torched by the Nazis was an honor. Yet as a statement of solidarity against a regime that would, in time, burn people, that seems decidedly wanting.
Uncharitable people have accused Zweig of cowardice, while Hannah Arendt’s essay on him, found in Reflections on Literature and Culture (Stanford University Press, 2007), treats Zweig as an example of the cultural mandarin so determined to avoid the grubby realities of politics that he disgraced his own idealism. Whether or not that implies a lack of courage is a difficult question.
But surely it isn’t being excessively generous to wonder if Zweig’s failure was one of imagination, rather than of nerve: the inability, first of all, to grasp that Hitler was more than just another nationalist demagogue, followed by a paralysis at seeing the mad dream of the Thousand-Year Reich tearing into reality across most of Europe, with no plan to stop there. Against the horrible future, all he had left was nostalgia -- with a memory of what security and boundless optimism had been like, once.
A celebrity has been defined as somebody who is well-known for being well-known. And by that measure, Gary A. Olson’s Stanley Fish, America's Enfant Terrible: The Authorized Biography (Southern Illinois University Press) is a celebrity biography of sorts. At no point does the subject go into rehab, but other than that, the book hits most of the tabloid marks: humble origins, plucky self-fashioning, innovative and controversial work, exorbitant earnings (for a Milton scholar), and illicit romance (leading to marital bliss). Much of this was once English department gossip, but now posterity is the richer for it.
Olson, president of Daemen College in Amherst, N.Y., even reveals a brush with Hollywood. When plans were underway to film one of David Lodge’s novels about academe, the producer wanted to cast Walter Matthau as Morris Zapp, a character clearly based on Fish. Matthau wasn't interested, and the movie languished in development hell, but in Olson’s words, “Stanley let it be known that he would love the opportunity to play the role himself.” (Another trait of the celeb bio: subject called by first name.)
If Fish wrote a book of career advice, a suitable title might be The Art of the Deal. Describing his career in the late 1970s and early ’80s, Olson writes:
“Stanley’s annual schedule was unimaginably grueling to many faculty. He would jet from university to university, giving workshops, papers and presentations, both in the United States and abroad. He was in great demand and he knew it, so he pushed the boundaries of what universities would pay to bring in a humanities professor to speak. Whatever he was offered by a university, he would demand more -- and he usually got it. The host who invited him would scramble to meet his fee, asking various campus departments and entities to contribute to the event until the requisite fee had been collected. Throughout his career he must have visited almost every notable university in every state of the union.”
Then again, a book of career advice from Stanley Fish would be virtually useless to anyone else today, like a tourist guide to a city that's been hit by an earthquake. This fall will be the 50th anniversary of Fish receiving tenure at the University of California at Berkeley. He was 28 years old and had been teaching for four years. Berkeley was his first position; he turned down two previous offers before accepting it. Given the circumstances, Stanley Fish, America’s Enfant Terrible will pose a challenge to many readers that rarely comes up with a nonfiction book: that of suspending disbelief. The chapter I just quoted is called “Academic Utopia” -- and as with other utopias, you can’t really get there from here.
“From my point of view,” Olson reports Fish saying, “there are a lot of people out there making mistakes, and I’m just going to tell them they’re making mistakes.” As if to authenticate that statement, the title of Fish’s most recent book is Think Again: Contrarian Reflections on Life, Culture, Politics, Religion, Law, and Education (Princeton University Press) while his next, due this summer, is called Winning Arguments: What Works and Doesn't Work in Politics, the Bedroom, the Courtroom, and the Classroom (Harper).
Originally the “people out there” so targeted were his fellow scholars of 16th- and 17th-century English literature. The range of his corrective efforts grew to include leading figures in literary theory, followed by the legal profession and finally -- with his New York Times column, which ran from 1995 to 2013 -- the entire public sphere. Olson wrote about the theoretical and rhetorical force of Fish’s work in two earlier volumes. Here the author largely underplays Fish’s intellectual biography beyond a few references to professors who influenced him as a student. Instead, Olson emphasizes the tangible institutional power that Fish acquired and learned to wield on his route to becoming a university administrator of national prominence.
Starting in 1985, when he was chair of the English department at Johns Hopkins University, Fish made a series of unexpected and much-discussed moves -- first to Duke University, where he reshaped the respectable but inconspicuous English department into the high-theory powerhouse of the early 1990s, and later to the University of Illinois at Chicago, where he was dean of the College of Liberal Arts and Sciences. There, the biographer says, Fish’s “hiring of celebrity academics and the generating of positive publicity” were steps meant “to lift the miasma of an inferiority complex, to help faculty collectively feel that they were part of an enterprise that was known and admired by others.” (There were other appointments and stints and high professional honors along the way; the chronology at the back of the book is overstuffed with them.)
Like Morris Zapp, Stanley Fish has an appealing side, as Olson portrays him: he works hard (as a graduate student, he read Paradise Lost five times in one summer), he sounds like a good teacher and the qualities that read as brash and arrogant in some circumstances are probably gruff but lovable, like a Muppets character, in others. That said, letting decisions and values be determined by the need to feel “known and admired by others” is the very definition of what sociologists used to call “other-directedness.” More recently, and fittingly enough, it’s associated with the culture of celebrity.
No celebrity biography is complete without a catastrophe or two, mixed in with the triumphs -- and in the case of Fish’s role as a “developer” of intellectual and institutional real estate, there is not such a bright line between success and disaster. He enhanced the reputation of Duke’s humanities programs (and that of the university press), but most of the stars he hired relocated within a few years. And Fish’s efforts at UIC are remembered for running at a deficit that lasted beyond his deanship.
In an email note to Olson, I asked about various things that had caused a knot to form in my stomach while reading his book. We learn that Fish arranged a summer stipend of $58,000 for one of his cronies. Olson also reports that Fish once responded to an audience member’s challenge to explain why anyone should believe him by saying, “Because I am Stanley Fish. I teach at Johns Hopkins University, and I make $75,000 a year.” (I recall hearing that one in the early 1980s; the equivalent now would be around $200,000.)
I've benefited from reading Fish in the past, but I find it hard to imagine that anyone racking up student debt that will take decades to pay off could regard the figure Olson presents as anything but a monster. To be fair, the final pages of the book depict Fish in more recent times as a less grandiose figure, even as touched with regret or disappointment. (Being a septuagenarian enfant terrible seems like a pretty melancholy prospect.)
Anyway, Olson replied to my question as follows: “Well, some do think of him as a monster, although probably not for those reasons. I don’t argue in the book that he should be admired -- or reviled. That’s an individual choice. You have to remember: Fish is from an older generation of academics, when higher education was growing exponentially and exuberantly. Fish represents the rise of high theory. Despite his passion for teaching, he is known most as an intellectual and a scholar; that’s what drew the high pay and the high praise. You don’t hire someone like Fish to enhance your teaching as much as you do to bring a certain prestige to your institution. That’s why Duke, UIC and other institutions wanted him. And, generally, these proved to be good institutional decisions.”
Perhaps. It seems to be a question of short-term versus long-term benefit. But it’s hard to understand how giving large barrels of money to a few transient scholarly A-listers “to help faculty collectively feel that they were part of an enterprise that was known and admired by others” was ever a good idea -- much less a sustainable one. After Fish, the deluge?
Early last month I started reading Maria Konnikova’s The Confidence Game: Why We Fall for It … Every Time (Viking) but had to put the book aside under the pressure of other matters -- only to have the expression “con artist” hit the headlines in short order. And in the context of electoral politics, no less, with one Republican presidential candidate applying it to another. (No names! Otherwise people with Google News alerts for them will just flood into the comments section to rail and vent.)
My notes from a few weeks ago show no inkling that the book might be topical. Instead, I listed movies about the confidence game (the most theatrical of crimes) and dredged up recollections of David W. Maurer’s The Big Con: The Story of the Confidence Man, first published in 1940. A reviewer of one of Maurer’s later books said that while he was “retired as an active professor of English at the University of Louisville” as of 1972, he was “the foremost student of the argots and subcultures in various precincts of the Wild Side … whose easy authority and unerring judgment brought the language and culture of the American underworld to the attention of scholars.” The tribute still stands, although Maurer’s studies of the slang of drug addicts, prostitutes, forgers and pickpockets (many available in JSTOR) are now period pieces: the scholarly equivalent of vintage pulp fiction.
Parts of The Big Con have a Depression-era flavor, but it is the author’s masterpiece and, if not the last word on the subject, certainly the definitive one. Other, lesser criminal subcultures generate argots, but for Maurer the confidence men are the professional elite of the underworld, and their specialized lingo is the concentrated expression of a well-tested understanding of human psychology. Mastery of it distinguishes those qualified for “the long con” (sustained and intricate operations extracting large sums from victims) from, e.g., small-timers running the three-card monte scam on a street corner.
The long con, as depicted on screen, is a marvel to witness. David Mamet wrote and directed the two best movies in this area: House of Games (1987) and The Spanish Prisoner (1997). The latter film takes its name from the quintessential long con, still viable after centuries of use. The contrast between short and long cons is blended into the Oedipal shenanigans of The Grifters (1990), adapted from the novel by pulp-noir virtuoso Jim Thompson, who I suspect read Maurer’s book somewhere along the way. And in a considerably lighter vein there is Dirty Rotten Scoundrels (1988) with Michael Caine as a smooth long-con artist and Steve Martin as the short-con operator who becomes both his nemesis and protégé.
Looking up those dates, I’m struck by the absence from the list of any more recent entry. What we have instead, it seems, is a string of movies such as Boiler Room (2000), The Wolf of Wall Street (2013) and The Big Short (2015), which treat the world of high finance as a racket. The most recent film’s title sounds like a nod to Maurer, and his monograph does cover a classic long con called “the rag,” which involves a phony stockbroker’s office. (There, investors have the chance to get rich through insider trading, or so they think until the office vanishes, along with the suckers’ money.)
“If confidence men operate outside the law, it must be remembered that they are not much further outside than many pillars of our society who go under names less sinister,” Maurer said in a passage that Maria Konnikova, a New Yorker contributor, quotes in The Confidence Game. She moves quickly to give recent evidence for the point, such as the U.S. Department of Justice’s suit against USIS, which she describes as “the contractor that used to supply two-thirds of the security clearances for much of the [U.S.] intelligence community.” The DOJ found that “the company had faked well over half a million background checks between 2008 and 2012 -- or 40 percent of total background checks.” Corruption on that scale lies beyond the most ambitious con artist’s dreams of success.
Konnikova’s basic argument -- developed through a mixture of anecdotes and behavioral-science findings, after the manner associated with journalist and author Malcolm Gladwell -- is that both the grifter’s manipulative skills and the victim’s susceptibility are matters of human neurobiology and everyday social psychology. She cites an investigation organized by “Charles Bond, a psychologist at Texas Christian University who has studied lying since the 1980s,” who, in 2006, gathered information in 43 languages on beliefs about lying in dozens of countries. Three-quarters of the responses in one phase of the study identified “gaze aversion” as a signal that someone was lying, while “two-thirds noted a shift in posture, another two-thirds that liars scratch and touch themselves more, and 62 percent said that [liars] tell longer stories. The answers spanned sixty-three countries.” But other studies show that these beliefs -- while cross-cultural, if not universal -- are poor guides to assessing a stranger’s truthfulness. Someone delivering a carefully worded lie while holding a steady gaze and not fidgeting is, in effect, already halfway to being believed (or elected, as the case may be).
Konnikova pays tribute to Maurer’s classic by linking each of her chapters to one of the phases or components of a long con, as itemized in the grifter’s lexicon. The first stage, called “the put-up,” is perhaps the most intuitive: the con artist identifies a mark (victim) by picking up signals of the individual’s interests, personality traits and self-perceptions, and then begins to cultivate casual familiarity or trust. “There’s nothing a con artist likes to do more than make us feel powerful and in control,” she writes. “We are the ones calling the shots, making the choices, doing the thinking. They are merely there to do our bidding. And so as we throw off ever more clues, we are becoming increasingly blind to the clues being thrown off by others.”
Here, the author overstates things, since the talented grifter is also busy creating signals. In a later chapter, she describes a fellow who arrives at a charity event in London, acting a bit drunk and overfriendly and seemingly unaware that he didn’t shave that morning. His mark is a woman of the world who knows the type: one of those would-be charming playboys, feckless but harmless, getting by on an allowance from his aristocratic relations. (Suffice to say that she takes a check from him and lives to regret it.)
With the dramatis personae in place, “the play” (the con itself) gets underway. Other characters may have walk-on parts, such as the “secretaries” and “brokers” in a phony stock-market game who get a cut of the criminal profits. The mark is made privy to whatever circumstances the con involves (e.g., a long-forgotten wine cellar full of rare and expensive vintages, an amazing real-estate deal, the chance to get in on the next big invention) and drawn into the secrets and moments of exhilaration that go with this once-in-a-lifetime opportunity.
But the most fascinating and disturbing aspect of a long con is how the con artist manages and redirects any anxieties or misgivings the victim may feel. After the “blowoff” -- when the victim is left with a case of bad wine with fancy labels, ownership of swampland, etc. -- a really well-executed con will leave the mark too embarrassed to complain. (Erving Goffman's classic paper "On Cooling the Mark Out" uses a late phase of the con game as a key to understanding how society at large reconciles the ordinary person to the disappointing realities of life.)
The lab experiments and social-scientific inquiries that Konnikova describes offer plausible (if seldom especially surprising) analyses of the cognitive and emotional forces at work. People prefer to think of themselves as smart, helpful, good judges of character and destined for lives better than the one they’ve settled for, thus far. And someone appealing to those feelings can end up with all of your money and no known forwarding address. Perhaps the most interesting and memorable thing about The Confidence Game is not that researchers can now explain the roots of our vulnerability, but rather the way it confirms something Maurer implied in his book 75 years ago: there’s no one better able to understand the individual human psyche than someone prepared to exploit it, undistracted by the slightest remorse.
Nearly a month has passed since the release of “The Costs of Publishing Monographs: Toward a Transparent Methodology,” a document prepared by Ithaka S+R, the consulting and research division of Ithaka. (Ithaka is also associated with JSTOR, the scholarly journals repository.) The report seems not to have drawn much attention outside the ranks of the Association of American University Presses, which is odd. It ought to be of some interest to the larger constituency of those who buy, read and/or write scholarly books.
If you mention the price of academic-press books to people who’ve never purchased one, the effect is akin to a cartoon character with eyeballs popping out and exclamation marks hovering in the air, with a thought balloon reading, “What a racket!” (On one occasion I heard it said aloud.) The dismay will usually cool off some as you explain how the specialist nature of scholarly publications tends to preclude economies of scale. A small audience means low press runs, yielding high per-unit costs. That’s not the whole story, of course, but it often suffices to explain why, say, a slender new book interpreting Moby Dick might cost five times as much as a Melville biography thick enough to serve as a doorstop -- and why no one in the family has purchased Aunt Louise’s book, even if they’re proud she got tenure for it.
The authors of the new Ithaka report mention a ballpark estimate of the expense to a press of preparing a scholarly book for publication (not printing, just getting it to that point) that has been bandied about over the past couple of years: $20,000. It’s problematic, but let’s imagine, for the sake of argument, that it costs that much to prepare and to print a monograph, and that every single one of its 400 copies is sold. In that case the absolute lowest wholesale price of a single volume has to be $50, just to break even. Many trade publishers would consider a print run 10 times that size to be small -- with each copy selling at a much lower price while still making a profit. It’s not that trade presses are models of efficiency that scholarly presses ought somehow to emulate -- not at all. They resemble one another about as much as an ostrich egg and a cannonball do, and the differences cannot be tinkered away.
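To make the break-even arithmetic explicit -- a back-of-the-envelope sketch, assuming the whole $20,000 covers a 400-copy run and every copy sells at the same wholesale price:

\[
\text{minimum price per copy} \;=\; \frac{\$20{,}000}{400 \text{ copies}} \;=\; \$50
\]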
Ithaka’s researchers collected information on the expenses involved in bringing out 382 books from the arts, humanities and social sciences published by 20 American university presses during their 2014 fiscal year. The data assembled were granular -- drawn from the sort of in-house bookkeeping each department (editorial, production, marketing, etc.) had to do while handling each title. Some expenses are more discretely defined than others. The cost of sending a manuscript out for copyediting, for example, is not too hard to determine; just look at the invoice. Calculating the fraction of an acquisition editor’s salary that went into a given book seems more difficult -- besides which there are the overhead expenses of clerical labor, rent, tech support and so on, some of them provided by the hosting university.
The 20 presses surveyed range from small presses (averaging roughly 11 employees publishing 46 titles per year, with an annual revenue from books of under $1.5 million) to powerhouses (circa 82 employees, 253 titles and more than $6 million annual revenue). They are segmented into four size categories, with five presses each, and with some effort made for geographical diversity and varying publishing foci (monographs, journals, regional titles).
In short, it must be one hell of a spreadsheet -- and the researchers establish three ways of defining cost per book to reflect the varying impacts of staff time, overhead expense and institutional support. One effect of the analysis is that the figure of $20,000 per book in preparation expenses goes right out the window: the study “yielded a wide range of costs per title, from a low of $15,140 to a high of $129,909, and the range of costs is wide both within and across groups.” Taking in the varying ways of assessing the expenses of almost 400 titles, the researchers find that the average cost per monograph is between $28,747 (using the minimal baseline) and not quite $40,000 (factoring in indirect overhead expenses). It bears repeating that this is not the final cost of publishing: printing, binding and warehousing monographs of the predigital sort would entail additional expense.
The Ithaka report focuses, rather, “on the costs of producing the first digital copy of ‘a high-quality digital monograph.’” For that to be the benchmark -- rather than the traditional hardback monograph -- is in keeping with the expectation that scholarship be made available in open-access form, as both federal mandates and the emerging academic ethos increasingly demand.
For scholarly publishing to meet the standards of quality established over the past century will require continued investment in the kinds of intensive, skilled labor that university presses foster. How to meet that demand while simultaneously developing ways of funding open-access publishing remains to be worked out. Ithaka S+R’s report doesn’t underestimate the difficulties; it just reminds us that the problem is on the agenda, or needs to be. Otherwise, the shape of things to come in scholarly publishing could get very messy -- and not in an especially creative way.
“One of the most profoundly exciting moments of my life,” Gertrude Stein recalled in a lecture at Columbia University in the mid-1930s, “was when at about 16 I suddenly concluded that I would not make all knowledge my province.” It is one of her more readily intelligible sentences, but I have never been able to imagine the sentiment it expresses. Why “profoundly exciting”? To me it sounds profoundly depressing, but then we’re all wired differently.
Umberto Eco, who died last week at the age of 84, once defined the polymath as someone “interested in everything, and nothing else.” (Now that’s more like it!) The formulation is paradoxical, or almost: the twist comes from taking “nothing else” to mean “nothing more.” It would be clearer to say that polymaths are “interested in everything, and nothing less,” but also duller. Besides, the slight flavor of contradiction is appropriate -- for Eco is describing an attitude of mind condemned to tireless curiosity and endless dissatisfaction, first of all with its own limits.
Eco’s work has been a model and an inspiration for this column for almost 30 years now, which is about 20 more than I’ve been writing it. The seed was planted by Travels in Hyperreality, the first volume of his newspaper and magazine writings to appear in English. Last year “Intellectual Affairs” celebrated the long-overdue translation of Eco’s book of sage advice on writing a thesis. An earlier essay considered the public dialogues that he and Jürgen Habermas were carrying on with figures from the Vatican. And now -- as if to make a trilogy of it -- saying farewell to him seems like an occasion to discuss perhaps the most characteristic quality of Eco’s mind: its rare and distinctive omnivorousness.
Eco himself evidently restricted his own comments on polymathy to that one terse definition. I must be garrulous by contrast but will try to make only two fairly brief points.
(1) As his exchange of open letters with Cardinal Carlo Maria Martini, the former archbishop of Milan, indicated, Eco was a lapsed but not entirely ex-Catholic: one who no longer believed but -- for reasons of personal background and of scholarly expertise as a medievalist -- still carried much of the church’s cultural legacy inside himself. His first book, published in 1956, was a study of St. Thomas Aquinas’s aesthetics that began as a thesis written “in the spirit of the religious worldview” of its subject. And the encyclopedic range and dialectical intricacies of the Angelic Doctor’s Summa Theologica never lost their hold on Eco’s imagination.
“Within Thomas's theological architecture,” Eco wrote in an essay in 1986, “you understand why man knows things, why his body is made in a certain way, why he has to examine facts and opinions to make a decision, and resolve contradictions without concealing them, trying to reconcile them openly …. He aligned the divergent opinions [of established philosophers and theologians], clarified the meaning of each, questioned everything, even the revealed datum, enumerated the possible objections, and essayed the final mediation.”
Eco regarded the Summa’s transformation into an authoritative statement of religious doctrine as nothing less than a disaster. In the hands of his successors, “Thomas's constructive eagerness for a new system” degenerated into “the conservative vigilance of an untouchable system.” Eco was -- like Étienne Gilson and Alasdair MacIntyre, among others -- part of the 20th-century rediscovery of Aquinas as the builder of a dynamo rather than the framer of a dogma. And there’s no question but that the medieval theologian exemplified “an interest in everything, and nothing else.”
(2) In the early 1960s, Eco was invited to participate in an interdisciplinary symposium on “demythicization and image” in Rome, along with an impressive array of philosophers, theologians, historians and classical scholars. Among them would be Jesuit and Dominican monks. He felt an understandable twinge of anxiety. “What was I going to say to them?” he recalled thinking. Remembering his enormous collection of comic books, Eco had a flash of inspiration:
“Basically [Superman] is a myth of our time, the expression not of a religion but of an ideology …. So I arrive in Rome and began my paper with a pile of Superman comics on the table in front of me. What will they do, throw me out? No sirree, half the comic books disappeared; would you believe it, with all the air of wishing to examine them, the monks with their wide sleeves spirited them away ….”
The anecdote might be used as an example of Eco’s interest in semiotics: the direction his work took after establishing himself as a medievalist. Comic books, Leonardo da Vinci paintings, treatises in Latin on demonology … all collections of signs in systems, and all potentially analyzable. Nor was his conference presentation on Superman the end of it. Not much later, Eco published an essay about the world of Charlie Brown called "On 'Krazy Kat' and 'Peanuts.'"
But in fact those two papers were written before Eco’s turn to semiotics -- or semiology, if you prefer -- in the late 1960s. (The one on Peanuts reads as being influenced by Sartre, as much as anyone else.) Eco’s attitude towards mass media and popular culture was never one either of slumming or of populist celebration. Nor was it a matter of showing off the power and sharpness of cool new theoretical tools by carving up otherwise neglected specimens. He took it as a given that cartoons, movies and the crappy books issued by Italy’s vanity-publishing racket were -- like theological speculation or political conflict -- things that merited analysis and critique or that could become so, given interesting questions about them.
At the end of his remarks on Aquinas 30 years ago, Eco tried to imagine how the author might conduct himself if suddenly returned to life. Of course there’s no way to judge the accuracy of such a thought experiment’s results, but Eco’s conclusion seems like a personal creed: “He would realize that one cannot and must not work out a definitive, concluded system, like a piece of architecture, but a sort of mobile system, a loose-leaf Summa, because in his encyclopedia of the sciences the notion of historical temporariness would have entered …. I know for sure that he would take part in celebrations of his work only to remind us that it is not a question of deciding how still to use what he thought, but to think new things.”
And, Eco might have added, how to avoid settling for less than everything your mind might drive itself to understand.