
Two landmark developments in the humanities this week

Prestige has its privileges. When a well-established award is announced -- as the 100th set of Pulitzer Prize winners was on Tuesday -- it tends to consume the available limelight, and anything less monumental disappears into its shadow.

But a couple of developments in the humanities this week strike me as being as newsworthy as the Pulitzers. If anything, they are possibly more consequential in the long run.

For one, we have the Whiting Foundation’s new Public Engagement Fellowship, which named its first recipients on Tuesday. The fellowship ought not to be confused with the Whiting Award, which since 1985 has been given annually to 10 authors “based on early accomplishment and the promise of great work to come.” The winners receive $50,000 each, along with, presumably, the professed esteem and subdued malice of their peers.

By contrast, the Public Engagement Fellowships go to professors with “a demonstrated commitment to using their scholarly expertise to reach wider audiences,” in order to fund ambitious projects “designed to have direct and significant impact on a specific public outside the academy.” There are eight scholars in the fellowship’s inaugural cohort, including, for instance, Zoë Kontes, an associate professor of classics at Kenyon College, who will spend a semester creating a podcast to explore the black market in looted artifacts.

As with the literary prize, the fellowship comes with $50,000, with $10,000 earmarked for the project’s expenses and the rest covering the recipient’s stipend. Neither the number of fellows nor the apportionment of finances is set in stone, as I learned from Daniel Reid, the foundation’s executive director, when we met last week.

He explained that after more than 40 years of funding dissertations in the humanities at elite universities, the Whiting Foundation had decided it was time to direct its attention to a relatively underserved aspect of humanities scholarship: the cultivation of new ways of making connections with the world beyond the campus. Last year, the foundation contacted administrators at 40 universities, encouraging them to nominate faculty with projects that might be appropriate for funding.

“This has been a learning process on both sides,” Reid said, “for [the foundation] in running things and for the institutions in getting a sense of what we’re looking for.” He explained that the proposals were then evaluated by a group of seven people who had considerable experience with the communication of specialized knowledge to a wide public. The names are not public, though Reid indicates that a number of them are prominent figures in scholarship, publishing and museum or gallery curation. (The need for secrecy is understandable: publicizing the names would leave the Whiting judges as vulnerable as delegates to this summer’s political conventions are starting to feel.)

For the second group of Public Engagement Fellows, the Whiting Foundation will double the number of colleges and universities it contacts in search of nominations, with the long-term goal of making the process open to all higher education institutions. In the future, the number of recipients may range from six to 10. I cited Kontes’s podcast on the looting of antiquities as an example (not quite at random: consider me on the waiting list to subscribe) but hope the other projects stimulate interest, discussion and perhaps some healthy competition.

The other development from earlier in the week is Duke University Press’s announcement that it will be publishing an edition of the works of Stuart Hall, who can -- without exaggeration, if not without argument -- be called the founding father of cultural studies as an academic discipline, at least in Great Britain. The Wikipedia entry for Hall is surprisingly thorough, so anyone for whom the name does not signify might want to get up to speed there.

Hall is a figure in the humanities whose impact is widely recognized yet, for an American, difficult to assess -- for the simple reason that, even at the peak of his influence, his work was remarkably difficult to find. A number of his major writings seem to have been published as mimeographed papers. He published books, but not that many found their way into American bookstores. So the prospect of having his scattered and fugitive writings in an edition from a major university press is appealing.

I heard that Ken Wissoker, the press's editorial director, might have some background information on why we are getting Hall’s work in this form only now, two years after his death. He confirmed my impression in an email note and gave a little background that seems worth putting into the record: “David Morley had edited two or three volumes of Stuart’s essays for Macmillan U.K. back in the late ’80s, but my understanding is that Stuart decided against having them come out (or delayed it into not happening). The original cultural studies essays were in a lot of different places …. Xeroxes and then PDFs circulated, but it would have been very difficult to track down all the originals …. Stuart saw the work as conjunctural and didn’t want it becoming scripture. Ironically, this was only a problem in English. There are translations to Mandarin and German (and I believe Spanish and/or Portuguese).”

The first of the two titles in the Duke edition will be out this fall, and the second will be published next spring. One is a set of lectures on the intellectual foundations of cultural studies, the other the first volume of Hall’s autobiography. “The memoir will have a second volume,” Wissoker says, “that will be more of an intellectual and political summation ‘what I think now’ book.” Farther down the line there will be a volume of selected essays, and Laura Sell, Duke's publicity and advertising manager, says that a number of thematically organized collections on “politics, race, photography, black aesthetics, Marxism and post-Marxism, [and] the Caribbean” will come in due course.


Review (part 2) of Meg Leta Jones, "Ctrl+Z: The Right to Be Forgotten"

When Winston Smith discovers the blind spot in his apartment -- the niche just out of range of the telescreen, Big Brother’s combination video feed and surveillance system -- it is, George Orwell tells us, “partly the unusual geography of the room” that allows him to take the risk of writing in a diary.

Later Smith finds another room with no telescreen at all, where he and Julia create another zone of privacy: the shared kind, intimacy. It can’t last, of course, and it doesn’t, with brutal consequences for both of them. (Thoughtcrime does not pay.)

The dystopia of Orwell’s 1984 is very much the product of its era, which spanned roughly the period between Hitler’s ascension to power in 1933 and Stalin’s death 20 years later. And while the novel’s depiction of a world without privacy can still raise a reader’s hackles, its technology now looks both retrofuturist and surprisingly inefficient. The telescreens are menacing, but there’s always a chance that Big Brother’s watchers will overlook something. And look at the tools that Winston uses to carve out his own domain of personal memory and antitotalitarian sentiment: a pen and paper. The authorities manage to read his thoughts eventually, but it takes most of the novel to get to that point. Today, Winston would be bound for Room 101 before he powered down his notebook.

Last week, I noted that Meg Leta Jones’s book Ctrl+Z: The Right to Be Forgotten (NYU Press) arrives at a time when ever fewer activities or communicative exchanges occur without some form of information technology intervening. Digital traces generated along the way are gathered, analyzed, sold. And the right to privacy becomes a little more purely notional each time one’s eyes slide down the text of a user agreement on the way to clicking “accept.”

A kind of fatalism is involved, one resting on the tacit but powerful tendency to assume that technology itself defines what information will be gathered, and how, and the use to be made of it. Implied is a trade-off between privacy and various benefits -- with both the cost and the reward determined by what our devices do and require. Privacy is, in this view, a function of engineering necessities, not of political or moral decisions.

The initial, blunt challenge to technological determinism comes in Ctrl+Z’s opening chapters, where Jones, an assistant professor of communication, culture and technology at Georgetown University, contrasts how the European Union and the United States frame their policies concerning the availability of personal information online. Here personal information would include employment history, financial data and arrest records, as well as, say, material communicated via social media.

In the United States, she writes, the default attitude “permits the collection and transfer of personal information and prevents abuse through self-regulation and market forces,” while E.U. states “operate under comprehensive regimes that protect information across both the public and private sectors and are enforced by specialized data-protection agencies.”

The contrast becomes striking when “data protection” might be better described as protecting the reputation or well-being of the individual to whom the data pertains. Take the case of someone who, as a young adult, is arrested for vandalism and destruction of property and serves a jail sentence, all of which was written up in a newspaper in 1990 as well as being documented in official records. Once released, he swears off his old ways and spends the next 25 years in steady employment and overall irreproachable conduct. Then one day he awakes to find that the newspaper has digitized its archives and made them searchable via Google.

If our reformed graffiti artist lives in America, he can do little if anything about it, apart from asking the paper to take down its accurate but deeply embarrassing article. There is also a chance his conviction will be publicized on any of various websites dedicated to posting mug shots.

In a number of E.U. countries, by contrast, he could appeal to laws that forbid public reference to someone’s criminal record if it is no longer news or if the ex-con has undergone significant rehabilitation. He might also file a request with Google to remove links to sites mentioning the old transgression. In 2014, the Court of Justice of the European Union ruled that the search engine had to establish a take-down system for people who wanted personal information removed from its search results.

There are variations from country to country, but Jones finds that the E.U. “data subject” (in effect, the citizen’s digital doppelgänger) can claim a “general right to personality” -- a certain degree of dignified immunity from unwelcome attention. The American data subject, by contrast, is presumed to take the Wild West ethos of the Internet pretty much as a given, with any effort to delete information or limit its circulation being labeled, almost inevitably, as Orwellian. (Even so, a number of piecemeal efforts have been made in the United States to protect children and victims of harassment and bullying, including laws against revenge porn.)

But as Jones goes on to show, any preference for one of these frameworks over the other will soon enough be faced with the much harder matter of dealing with new and unanticipated shades of gray left out of the public/private distinction. And the other dichotomy -- between having every bit of personal data (flattering, humiliating or neither) either preserved forever in a digital archive or destined for the memory hole -- is also looking out of date. Jones’s book doesn’t predict what comes next, but it’s a great stimulant for anyone bracing themselves to think about it.


New book critiques campus censorship movement and pushes for marketplace of ideas


New book argues that students involved in campus protests over controversial speakers or ideas should instead support a marketplace of ideas in which all notions are heard and the best rise to the top.

Review (part 1) of Meg Leta Jones, "Ctrl+Z: The Right to Be Forgotten"

Privacy does not appear to have much of a future. As with the Arctic ice, we have no plan in place to halt the erosion, let alone to restore what has already melted away. “The right to be let alone,” wrote U.S. Supreme Court Justice Louis D. Brandeis in a classic formulation from 1928, is “the most comprehensive of rights and the right most valued by civilized men.” The menace to privacy he was addressing came from the state; at issue, evidence collected by tapping a bar’s telephone during Prohibition. He issued his opinion with a definite sense of the shape of things to come.

“Ways may someday be developed,” he wrote, “by which the Government, without removing papers from secret drawers, can reproduce them in court, and by which it will be enabled to expose to a jury the most intimate occurrences of the home. Advances in the psychic and related sciences may bring means of exploring unexpressed beliefs, thoughts and emotions.”

What he did not foresee, and probably could not, was how the right to privacy might be undermined -- dissolved, really -- by forces within the sphere of private decision making. Today, access to most of the digital tools and conveniences we’ve grown to depend on comes in exchange for permission to record and analyze our use of them. Every search and purchase, each comment and click, can now be assumed to go on your permanent record.

The scope and intensity of the monitoring are difficult, and unpleasant, to imagine. Meg Leta Jones, an assistant professor of communication, culture and technology at Georgetown University, sketches some of the facts of everyday life in the digital panopticon in her book Ctrl+Z: The Right to Be Forgotten (NYU Press):

“When a user logs on to the Internet and visits a website, hundreds of electronic tracking files may be triggered to capture data from the user’s activity on the site and from other information held in existing stored files, and that data is sent to companies. A study done by The Wall Street Journal found that the nation’s top fifty websites installed an average of sixty-four pieces of tracking technology, and a dozen sites installed over one hundred …. Some files respawn, known as zombie cookies, even after the user actively deletes them. All of this information is then sold on data exchanges …. Sensors creating data about owners and others from phones, cars, credit cards, televisions, household appliances, wearable computing necklaces, watches and eyewear, and the growing list of ‘Internet of things’ devices, mean that more personal information is being disclosed, processed and discovered. All of these nodes of discoverability will be added to the structure of discoverability sooner or later, if they have not been already.”

The scale and voracity of big data (which is to surveillance what big pharma is to medication) are not the only forces at work in liquidating the private sphere. The Internet and the roaming devices that feed into it -- cameras, tablets, phones, etc. -- now form the individual’s auxiliary backup brain. Two imaginative exercises that Jones proposes will make clear the potential vulnerabilities this entails.

First, recall the most painful thing in your life: something so excruciating, regrettable or tinged with shame or guilt, or both, that you would burn out the very memory of it, were that an option. Then, think about the worst person you have ever known: someone whose capacity for, in Jones's words, “inconsiderate, nasty, spiteful or twisted behavior” you wish you had learned about early on and so avoided experiencing firsthand.

There’s no substitute for a thick skin or good intuition -- least of all the external, digitized gray matter used to store written and visual documents and to mediate what seems like a larger share of each succeeding generation’s social interaction. Instead, it has the potential (even, to go by anecdotal evidence, a talent) for making things much worse. The dark secret or agonizing experience can be recorded, stored, duplicated endlessly and broadcast for all the world to see. The barrier between public and private self can vaporize with a press of the “send” button, whether in haste or by accident. Conversely, the malicious person might be able to cover his own tracks -- taking down his slanderous blog or threatening comments, for example -- and carry on after the damage is done.

Both could happen at once, of course: the vilest person you know could get hold of whatever you most want to hide or escape, and thereupon, presumably, burn your life to the ground. But the situations that Jones describes need not come to pass to have an effect: “Worry or fear may curb your behavior on- and off-line to avoid the risks of unwanted attention, misinterpretation or abuse.”

It is tempting to reply that worry and fear can be useful emotions, and that learning “to curb your behavior on- and off-line to avoid the risks of unwanted attention, misinterpretation and abuse” is otherwise called growing up. One problem with moralism of this caliber is that it dodges a hard truth: yes, maturity includes knowing enough to avoid behavior that is dangerous or likely to have undesirable consequences. (Example: consuming a great deal of alcohol and telling someone what you really think of them.) Sometimes this knowledge must be acquired the hard way and mulled over in private; the important thing is that enough of it accumulates. The learning process involves recognizing and repudiating mistakes, and gaining a certain amount of distance from the earlier self who made them.

But what happens to the possibility of growth and change if the infraction can’t be escaped? If a document of the regrettable memory is out there, retrievable by others, perhaps impossible to live down? For that matter, what becomes of “the right to be let alone” when seemingly private life generates data that is compiled and analyzed and sold?

The legal and moral implications require a rethinking of much of what we take for granted, and Jones is plugged in to many of the conversations. Her perspective on the question of privacy is thoughtful enough that I’ll return to Ctrl+Z in next week’s column. Suffice it to say that she is not prepared to call for an autopsy just yet.

Author discusses new book on largely forgotten hero of desegregation


New book tells story of a largely forgotten man who won a landmark Supreme Court decision.

Review of Stefan Zweig, 'Messages From a Lost World: Europe on the Brink'

Fame can be fickle and destiny, perverse -- but what are we to call how posterity has treated Stefan Zweig? In the period between the world wars, he was among the best-known authors in the world and, by some reckonings, the most translated. Jewish by birth and Austrian by nationality, Zweig was perhaps most of all Viennese by sensibility. His fiction expressed the refined cynicism about sexual mores that readers associated with Sigmund Freud and Arthur Schnitzler, and he played the role of Vienna coffeehouse man of letters to perfection.

Zweig’s cosmopolitanism was of the essence: his biographical and critical studies of European literary and historical figures spanned the Renaissance through the early 20th century and even roamed far enough abroad to include Mary Baker Eddy, the very American if otherwise sui generis founder of Christian Science. (The portrait of her in his book Mental Healers is etched in acid, but it occupies a surprisingly appropriate spot: between the accounts of Franz Mesmer and Freud.)

His books fueled the Nazi bonfires. Even apart from their racial obsessions, Zweig was precisely the sort of author to drive Hitler and Goebbels into paroxysms, and after years of exile he committed suicide in Brazil in 1942. His reputation did not survive the war, either, at least not among English-language readers. After four decades or so of relatively far-flung reading, I think of him as one of those authors who seem only ever to show up in parentheses and footnotes, or to be pointed out now and then as biographers too prone to psychologizing or melodrama. Being barely remembered trumps being totally forgotten (it’s more than most of us will be, anyway), but Zweig hardly seemed like a figure poised for rediscovery when, not too long ago, the comeback began.

The essays and speeches collected in Messages From a Lost World: Europe on the Brink (Pushkin Press) form a supplement to the volume that launched the revival -- Zweig’s memoir, The World of Yesterday, which the University of Nebraska Press published in the new translation that Pushkin Press issued in 2009. (Finished just before the author’s suicide, the book first appeared in English in 1943. That earlier translation can also be had, in ebook format.) The recent NYRB Classics editions of his fiction have had a lot to do with it being more than a one-book revival, but e-publishing and print-on-demand operations account for nearly everything by Zweig in English now being available. A mixed blessing, given that some such “publishers” do little but sell you copies of material available for free from the Internet Archive or Project Gutenberg.

The World of Yesterday is less an autobiography than a self-portrait of the European literati during the final years of the belle epoque -- the four decades of relative peace and prosperity on the continent that ended with World War I. The new communication technologies and modes of high-speed transport were shrinking the world, while the spread of education, scientific progress and humane cultural values would presumably continue. The earliest pieces in Messages From a Lost World contain Zweig’s musings on the spiritual impact of the war, written while it was still in progress and with no end in sight. They are the thoughts of a man trying to find his way out of what must have seemed a completely reasonable state of despair:

“Never since it came into being has the whole world been so communally seized by nervous energy. Until now a war was only an isolated flare-up in the immense organism that is humanity, a suppurating limb which could be cauterized and thus healed, whilst all the remaining limbs were free to perform their normal functions without the least hindrance …. But due to its steady conquest of the globe, humanity forged ever-closer links, so today a fever quivers within its whole organism; horrors easily traverse the entire cosmos. There is not a workshop, not an isolated farm, not a hamlet deep in a forest from which they have not torn a man so that he might launch himself into the fray, and each of these beings is intimately connected to others by myriad threads of feeling; even the most insignificant among them has breathed so much of the feverish heat, his sudden disappearance makes those that remain that much colder, more alone and empty.”

In pieces from the 1920s and early ’30s, Zweig takes it as a moral imperative to champion the cause of peace by reminding his readers and listeners that humanity could no longer afford the sort of belligerent nationalism that had led them into the Great War. Respect for the possibilities of human development should replace claims to military greatness:

“If men lived [in earlier eras] as if in the folds of a mountain, their sight limited by the peaks on either side, we today see as if from a summit all worldly happenings in the same moment, in their exact dimensions and proportions. And because we have this commanding view across the surface of the earth, we must now usher in new standards. It’s no longer a case of which country must be placed ahead of another at their expense, but how to accomplish universal movement, progress, civilization. The history of tomorrow must be a history of all humanity and the conflicts between individual countries must be seen as redundant alongside the common good of the community.”

If the world could be changed by elegantly expressed humanist sentiments, this passage, from a speech delivered in 1932, might have altered the course of history. But the way it tempted fate now looks even more bitterly ironic than it did after Hitler took office a few months later. For in spite of his lofty vantage point (“we today see as if from a summit all worldly happenings in the same moment, in their exact dimensions and proportions”) and a depth of historical vision giving him insight into Magellan, Casanova, Napoleon, Goethe, Nietzsche and Marie Antoinette (among others), Zweig was struck dumb by the course of events after 1933.

Not literally, of course; on the contrary, later selections in Messages From a Lost World show that he could turn on the spigots of eloquence all too easily. (The tendency of Zweig's prose to turn purple and kitschy has been mocked at length by the translator Michael Hofmann.) But he remained notoriously -- and to a great many people, unforgivably -- averse to speaking out against what was happening in Germany. He did say that, considering the company it put him in, having his book torched by the Nazis was an honor. Yet as a statement of solidarity against a regime that would, in time, burn people, that seems decidedly wanting.

Uncharitable people have accused Zweig of cowardice, while Hannah Arendt’s essay on him, found in Reflections on Literature and Culture (Stanford University Press, 2007), treats Zweig as an example of the cultural mandarin so determined to avoid the grubby realities of politics that he disgraced his own idealism. Whether or not that implies a lack of courage is a difficult question.

But surely it isn’t being excessively generous to wonder if Zweig’s failure was one of imagination, rather than of nerve: the inability, first of all, to grasp that Hitler was more than just another nationalist demagogue, followed by a paralysis at seeing the mad dream of the Thousand-Year Reich tearing into reality across most of Europe, with no plan to stop there. Against the horrible future, all he had left was nostalgia -- the memory of what security and boundless optimism had been like, once.


Review of Gary A. Olson, 'Stanley Fish, America's Enfant Terrible: The Authorized Biography'

A celebrity has been defined as somebody who is well-known for being well-known. And by that measure, Gary A. Olson’s Stanley Fish, America's Enfant Terrible: The Authorized Biography (Southern Illinois University Press) is a celebrity biography of sorts. At no point does the subject go into rehab, but other than that, the book hits most of the tabloid marks: humble origins, plucky self-fashioning, innovative and controversial work, exorbitant earnings (for a Milton scholar), and illicit romance (leading to marital bliss). Much of this was once English department gossip, but now posterity is the richer for it.

Olson, president of Daemen College in Amherst, N.Y., even reveals a brush with Hollywood. When plans were underway to film one of David Lodge’s novels about academe, the producer wanted to cast Walter Matthau as Morris Zapp, a character clearly based on Fish. Matthau wasn't interested, and the movie languished in development hell, but in Olson’s words, “Stanley let it be known that he would love the opportunity to play the role himself.” (Another trait of the celeb bio: subject called by first name.)

If Fish wrote a book of career advice, a suitable title might be The Art of the Deal. Describing his career in the late 1970s and early ’80s, Olson writes:

“Stanley’s annual schedule was unimaginably grueling to many faculty. He would jet from university to university, giving workshops, papers and presentations, both in the United States and abroad. He was in great demand and he knew it, so he pushed the boundaries of what universities would pay to bring in a humanities professor to speak. Whatever he was offered by a university, he would demand more -- and he usually got it. The host who invited him would scramble to meet his fee, asking various campus departments and entities to contribute to the event until the requisite fee had been collected. Throughout his career he must have visited almost every notable university in every state of the union.”

Then again, a book of career advice from Stanley Fish would be virtually useless to anyone else today, like a tourist guide to a city that's been hit by an earthquake. This fall will be the 50th anniversary of Fish receiving tenure at the University of California at Berkeley. He was 28 years old and had been teaching for four years. Berkeley was his first position; he turned down two previous offers before accepting it. Given the circumstances, Stanley Fish, America’s Enfant Terrible will pose a challenge to many readers that rarely comes up with a nonfiction book: that of suspending disbelief. The chapter I just quoted is called “Academic Utopia” -- and as with other utopias, you can’t really get there from here.

“From my point of view,” Olson reports Fish saying, “there are a lot of people out there making mistakes, and I’m just going to tell them they’re making mistakes.” As if to authenticate that statement, the title of Fish’s most recent book is Think Again: Contrarian Reflections on Life, Culture, Politics, Religion, Law, and Education (Princeton University Press) while his next, due this summer, is called Winning Arguments: What Works and Doesn't Work in Politics, the Bedroom, the Courtroom, and the Classroom (Harper).

Originally the “people out there” so targeted were his fellow scholars of 16th- and 17th-century English literature. The range of his corrective efforts grew to include leading figures in literary theory, followed by the legal profession and finally -- with his New York Times column, which ran from 1995 to 2013 -- the entire public sphere. Olson wrote about the theoretical and rhetorical force of Fish’s work in two earlier volumes. Here the author largely underplays Fish’s intellectual biography beyond a few references to professors who influenced him as a student. Instead, Olson emphasizes the tangible institutional power that Fish acquired and learned to wield on his way to becoming a university administrator of national prominence.

Starting in 1985, when he was chair of the English department at Johns Hopkins University, Fish made a series of unexpected and much-discussed moves -- first to Duke University, where he reshaped the respectable but inconspicuous English department into the high-theory powerhouse of the early 1990s, and later to the University of Illinois at Chicago, where he was dean of the College of Liberal Arts and Sciences. There, the biographer says, Fish’s “hiring of celebrity academics and the generating of positive publicity” were steps meant “to lift the miasma of an inferiority complex, to help faculty collectively feel that they were part of an enterprise that was known and admired by others.” (There were other appointments and stints and high professional honors along the way; the chronology at the back of the book is overstuffed with them.)

Like Morris Zapp, Stanley Fish has an appealing side, as Olson portrays him: he works hard (as a graduate student, he read Paradise Lost five times in one summer), he sounds like a good teacher and the qualities that read as brash and arrogant in some circumstances are probably gruff but lovable, like a Muppets character, in others. That said, letting decisions and values be determined by the need to feel “known and admired by others” is the very definition of what sociologists used to call “other-directedness.” More recently, and fittingly enough, it’s associated with the culture of celebrity.

No celebrity biography is complete without a catastrophe or two, mixed in with the triumphs -- and in the case of Fish’s role as a “developer” of intellectual and institutional real estate, there is not such a bright line between success and disaster. He enhanced the reputation of Duke’s humanities programs (and that of the university press), but most of the stars he hired relocated within a few years. And Fish’s efforts at UIC are remembered for running at a deficit that lasted beyond his deanship.

In an email note to Olson, I asked about various things that had caused a knot to form in my stomach while reading his book. We learn that Fish arranged a summer stipend of $58,000 for one of his cronies. Olson reports that Fish once responded to an audience member's challenge to explain why anyone should believe him by saying, “Because I am Stanley Fish. I teach at Johns Hopkins University, and I make $75,000 a year.” (I recall hearing that one in the early 1980s; the equivalent now would be around $200,000.)

I've benefited from reading Fish in the past, but it is hard to imagine that anyone now racking up student debt that will take decades to pay off could regard the figure Olson presents as anything but a monster. To be fair, the final pages of the book depict Fish in more recent times as a less grandiose figure, even one touched with regret or disappointment. (Being a septuagenarian enfant terrible seems like a pretty melancholy prospect.)

Anyway, Olson replied to my question as follows: “Well, some do think of him as a monster, although probably not for those reasons. I don’t argue in the book that he should be admired -- or reviled. That’s an individual choice. You have to remember: Fish is from an older generation of academics, when higher education was growing exponentially and exuberantly. Fish represents the rise of high theory. Despite his passion for teaching, he is known most as an intellectual and a scholar; that’s what drew the high pay and the high praise. You don’t hire someone like Fish to enhance your teaching as much as you do to bring a certain prestige to your institution. That’s why Duke, UIC and other institutions wanted him. And, generally, these proved to be good institutional decisions.”

Perhaps. It seems to be a question of short-term versus long-term benefit. But it’s hard to understand how giving large barrels of money to a few transient scholarly A-listers “to help faculty collectively feel that they were part of an enterprise that was known and admired by others” was ever a good idea -- much less a sustainable one. After Fish, the deluge?


Author discusses book on college a cappella


Author discusses book about the history, inner workings and social role of college a cappella.

Q&A on new book that explores the history of for-profit institutions


Historian A. J. Angulo examines the history of for-profit colleges and universities and shows how many of the problems surrounding these institutions aren't new but are rooted in a past that goes back to the colonial era.

Review of Maria Konnikova, "The Confidence Game: Why We Fall for It ... Every Time"

Early last month I started reading Maria Konnikova’s The Confidence Game: Why We Fall for It … Every Time (Viking) but had to put the book aside under the pressure of other matters -- only to have the expression “con artist” hit the headlines in short order. And in the context of electoral politics, no less, with one Republican presidential candidate applying it to another. (No names! Otherwise people with Google News alerts for them will just flood into the comments section to rail and vent.)

My notes from a few weeks ago show no inkling that the book might be topical. Instead, I listed movies about the confidence game (the most theatrical of crimes) and dredged up recollections of David W. Maurer’s The Big Con: The Story of the Confidence Man, first published in 1940. A reviewer of one of Maurer’s later books said that while he was “retired as an active professor of English at the University of Louisville” as of 1972, he was “the foremost student of the argots and subcultures in various precincts of the Wild Side … whose easy authority and unerring judgment brought the language and culture of the American underworld to the attention of scholars.” The tribute still stands, although Maurer’s studies of the slang of drug addicts, prostitutes, forgers and pickpockets (many available in JSTOR) are now period pieces: the scholarly equivalent of vintage pulp fiction.

Parts of The Big Con have a Depression-era flavor, but it is the author’s masterpiece and, if not the last word on the subject, certainly the definitive one. Other, lesser criminal subcultures generate argots, but for Maurer the confidence men are the professional elite of the underworld, and their specialized lingo is the concentrated expression of a well-tested understanding of human psychology. Mastery of it distinguishes those qualified for “the long con” (sustained and intricate operations extracting large sums from victims) from, e.g., small-timers running the three-card monte scam on a street corner.

The long con, as depicted on screen, is a marvel to witness. David Mamet wrote and directed the two best movies in this area: House of Games (1987) and The Spanish Prisoner (1997). The latter film takes its name from the quintessential long con, still viable after centuries of use. The contrast between short and long cons is blended into the Oedipal shenanigans of The Grifters (1990), adapted from the novel by pulp-noir virtuoso Jim Thompson, who I suspect read Maurer’s book somewhere along the way. And in a considerably lighter vein there is Dirty Rotten Scoundrels (1988) with Michael Caine as a smooth long-con artist and Steve Martin as the short-con operator who becomes both his nemesis and protégé.

Looking up those dates, I’m struck by the absence from the list of any more recent entry. What we have instead, it seems, is a string of movies such as Boiler Room (2000), The Wolf of Wall Street (2013) and The Big Short (2015), which treat the world of high finance as a racket. The most recent film’s title sounds like a nod to Maurer, and his monograph does cover a classic long con called “the rag,” which involves a phony stockbroker’s office. (There, investors have the chance to get rich through insider trading, or so they think until the office vanishes, along with the suckers’ money.)

“If confidence men operate outside the law, it must be remembered that they are not much further outside than many pillars of our society who go under names less sinister,” Maurer said in a passage that Maria Konnikova, a New Yorker contributor, quotes in The Confidence Game. She moves quickly to give recent evidence for the point, such as the U.S. Department of Justice’s suit against USIS, which she describes as “the contractor that used to supply two-thirds of the security clearances for much of the [U.S.] intelligence community.” The DOJ found that “the company had faked well over half a million background checks between 2008 and 2012 -- or 40 percent of total background checks.” Corruption on that scale lies beyond the most ambitious con artist’s dreams of success.

Konnikova’s basic argument -- developed through a mixture of anecdotes and behavioral-science findings, after the manner associated with journalist and author Malcolm Gladwell -- is that both the grifter’s manipulative skills and the victim’s susceptibility are matters of human neurobiology and everyday social psychology. She cites an investigation organized by “Charles Bond, a psychologist at Texas Christian University who has studied lying since the 1980s,” who, in 2006, gathered information on beliefs about lying from dozens of countries in 43 languages. Three-quarters of the responses in one phase of the study identified “gaze aversion” as a signal that someone was lying, while “two-thirds noted a shift in posture, another two-thirds that liars scratch and touch themselves more, and 62 percent said that [liars] tell longer stories. The answers spanned sixty-three countries.” But other studies show that these beliefs -- while cross-cultural, if not universal -- are poor guides to assessing a stranger’s truthfulness. Someone delivering a carefully worded lie while holding a steady gaze and not fidgeting is, in effect, already halfway to being believed (or elected, as the case may be).

Konnikova pays tribute to Maurer’s classic by linking each of her chapters to one of the phases or components of a long con, as itemized in the grifter’s lexicon. The first stage, called “the put-up,” is perhaps the most intuitive: the con artist identifies a mark (victim) by picking up signals of the individual’s interests, personality traits and self-perceptions, and then begins to cultivate casual familiarity or trust. “There’s nothing a con artist likes to do more than make us feel powerful and in control,” she writes. “We are the ones calling the shots, making the choices, doing the thinking. They are merely there to do our bidding. And so as we throw off ever more clues, we are becoming increasingly blind to the clues being thrown off by others.”

Here, the author overstates things, since the talented grifter is also busy creating signals. In a later chapter, she describes a fellow who arrives at a charity event in London, acting a bit drunk and overfriendly and seemingly unaware that he didn’t shave that morning. His mark is a woman of the world who knows the type: one of those would-be charming playboys, feckless but harmless, getting by on an allowance from his aristocratic relations. (Suffice to say that she takes a check from him and lives to regret it.)

With the dramatis personae in place, “the play” (the con itself) gets underway. Other characters may have walk-on parts, such as the “secretaries” and “brokers” in a phony stock-market game who get a cut of the criminal profits. The mark is made privy to whatever circumstances the con involves (e.g., a long-forgotten wine cellar full of rare and expensive vintages, an amazing real-estate deal, the chance to get in on the next big invention) and drawn into the secrets and moments of exhilaration that go with this once-in-a-lifetime opportunity.

But the most fascinating and disturbing aspect of a long con is how the con artist manages and redirects any anxieties or misgivings the victim may feel. After the “blowoff” -- when the victim is left with a case of bad wine with fancy labels, ownership of swampland, etc. -- a really well-executed con will leave the mark too embarrassed to complain. (Erving Goffman's classic paper "On Cooling the Mark Out" uses a late phase of the con game as a key to understanding how society at large reconciles the ordinary person to the disappointing realities of life.)

The lab experiments and social-scientific inquiries that Konnikova describes offer plausible (if seldom especially surprising) analyses of the cognitive and emotional forces at work. People prefer to think of themselves as smart, helpful, good judges of character and destined for lives better than the one they’ve settled for, thus far. And someone appealing to those feelings can end up with all of your money and no known forwarding address. Perhaps the most interesting and memorable thing about The Confidence Game is not that researchers can now explain the roots of our vulnerability, but rather the way it confirms something Maurer implied in his book 75 years ago: there’s no one better able to understand the individual human psyche than someone prepared to exploit it, undistracted by the slightest remorse.

