Review of Lucas Graves's "Deciding What’s True: The Rise of Political Fact-Checking in American Journalism"

“Everyone is entitled to his own opinions,” the sociologist and politico Daniel Patrick Moynihan said, “but not to his own facts.” He may have been improving upon a similar if less trenchant remark (“…but no man has a right to be wrong in his facts”) attributed to the financier Bernard Baruch.

Until sitting down to write this I did not know about Baruch’s version. A certainty that my eagle-eyed editor would inquire about the source obliged me to vet the attribution to Moynihan; she requires more than my vague recollection of having read it somewhere. In checking my facts, she bolsters my conscience, enforcing Moynihan’s (and Baruch’s) point about accountability.

Lucas Graves, an assistant professor of journalism and mass communication at the University of Wisconsin at Madison, uses the expression “internal fact-checking” to describe this kind of preventative, behind-the-scenes work. It tries “to eliminate untruth, not call attention to it” by catching and correcting mistakes in an article before it goes to press. In Deciding What’s True: The Rise of Political Fact-Checking in American Journalism (Columbia University Press), Graves traces how internal fact-checking morphed into something almost antithetical: the very public evaluation of factual assertions made by politicians and other figures in the news.

News organizations such as PolitiFact and The Washington Post’s Fact Checker -- to name only the most nationally prominent -- intervene so frequently in American public discourse now that it seems counterintuitive to think they’ve only recently become a force in the world. Until the last two or three presidential election cycles, scrutiny of a candidate’s claims tended to be episodic and ad hoc -- and often enough conducted by the opposing campaign, which brought its own biases to the process. To the ethos of newspaper editors and reporters circa 1950, the idea of confirming or debunking a public figure’s statements of fact seemed perilously close to an expression of opinion, to be avoided at the risk of compromising one’s reputation for objectivity. Reporting that a fact was in dispute might be acceptable in some cases, but making a judgment call on it was best left to the pundits and thumb suckers.

The title Deciding What’s True is clearly an homage to Deciding What’s News by Herbert J. Gans, a classic study of newsroom culture, and Graves followed in his predecessor’s participant-observer footsteps by working for two major fact-checking organizations between 2010 and 2012. The book thus benefits from having two vantage points: the historical and sociological perspective available from media-studies scholarship, plus close ethnographic observation of how major fact-checking stories are discovered, investigated and debated in-house before being sent out to make their mark on the world.

His most striking insight, it seems to me, is how specific, self-defined and virtually self-contained the world of professionalized fact-checking tends to be. The naïve observer might think of fact-checking organizations as being akin to media watchdog groups such as the Media Research Center on the right and Media Matters for America on the left, with PolitiFact falling somewhere in between. But in reality the fact-checking milieu sees itself as unrelated to the partisan watchdog groups: it doesn’t work with or quote them, and Graves recounts one fact-checker saying he almost decided to kill an investigation when he saw that Media Matters was already interested in it. Likewise, fact-checking journalists see a bright line between their work and blogging.

This is not just a matter of professional amour propre. The major fact-checking organizations have ties to established media institutions, including journalism schools, and retain a belief (which watchdogs and bloggers alike tend to reject) in old newsroom ideals of objectivity, impartiality and conscientious reporting.

The ’00s put confidence in those ideals under enormous strain from a number of catastrophically bad judgment calls (reporting war propaganda and Wall Street shilling without due diligence) as well as cases of plagiarism or outright fabrication in major news publications. Compounding those problems, even inducing some of them, was the growing array of new media competing for public attention while also driving up the pace of the news cycle.

In an email exchange with Graves, I indicated that PolitiFact, Fact Checker and so on seemed like a response, in part, to the 24-hour news cycle that emerged around the time of the first Gulf War and intensified still more once the internet started to permeate everyday life. Rumors, misinformation and bogus statistics could spread faster, and farther, than ever before.

Graves agreed, but added, “Another way to think about that is that the traditional model of objectivity, for all of its flaws, made some sense in a world where professional journalists acted as gatekeepers and could effectively police the borders of political discourse. Then wild rumors about the president’s birthplace didn’t have to be debunked because they could be denied coverage altogether. But the opening up of political discourse after the 1960s and the fragmentation of the media beginning in the 1990s -- both healthy developments in many ways, and both with echoes in the 19th century -- also effectively spelled the end of the journalist as gatekeeper. And especially with the rise of the internet, that fragmentation calls for a more critical style of political reporting that’s willing to directly challenge false claims.”

In principle, at least, systematic and high-quality fact-checking ought to make politicians and other public figures more careful about the claims they make while giving the public a running lesson in critical thought at the same time. At times, Deciding What’s True seems to encourage that hope. But I’ve been reading the book between rounds of binge-watching campaign coverage, and it is not an experience to recommend. The idea that fact-checking can impose some kind of restraint on a candidate, or influence public response, seems utterly negated by the candidacy of Donald Trump. His well-documented but unrelenting dishonesty -- his talent for lying without restraint or regard for evidence, outright and brazenly, even after the facts have been established repeatedly -- never wavers, yet it makes no dent in his level of support. I asked Graves how that could be.

“This is a large and complicated question,” Graves responded, “and people who study journalism and political communication are trying to approach it in many different ways. But one answer is that there have always been fairly wide slices of the American electorate that are deeply suspicious of establishment discourse, sometimes with good reason. If you listen to Trump supporters in interviews, they seem to accept that he doesn’t have the grasp of policy that other politicians do, and they don’t necessarily believe everything he says or subscribe to all of his views. He seems to say whatever he thinks and embrace a common-sense approach that many people find appealing. Beyond that, none of us makes political calculations in the detached, rational way that political theorists sometimes imagine.

“And at the same time, fact-checking has made a difference,” Graves continued. “It arguably has helped to solidify the ceiling over Trump’s support, giving ammunition to both voters and politicians who say they’ll never back him. And it has had a tremendous influence on coverage of his campaign, with front-page articles on Trump’s extraordinary disregard for the facts and constant references to his falsehoods even in straight news reports. We have nothing to compare this race to, and it’s impossible to say where this thing would be if more journalists had stuck to the traditional ‘he said, she said’ formula.”


Review of Teddy Wayne, 'Loner: A Novel'

Teddy Wayne's Loner: A Novel (Simon & Schuster) is the second book I've read in as many weeks narrated by a manipulative and highly verbal straight white man possessing a degree of upward social mobility as well as the impulse to see how much emotional damage he can inflict on others. Journalistic custom requires three instances to spot a trend, but reader, I do not have it in me to endure any more such company. (Anyway, both narratives resonate with Aaron James's political and philosophical musings, discussed here earlier in the month.)

The other volume was Diary of an Oxygen Thief, an anonymous and purportedly autobiographical work that "went from self-published obscurity to best-sellerdom," as reported in Publishers Weekly this summer. Loner is set at Harvard University, more or less in the present day, while Diary roams between Ireland and the United States as the narrator works as an advertising art director around the turn of the millennium. Despite considerable differences, the books follow broadly comparable narrative arcs. Romantic entanglements between characters (not just the hooking-up part but the emotional upheaval sometimes accompanying it) generally turn out to be misunderstandings at best. Often enough the disasters are intentional.

Neither author seems to be aware of, or responding to, the other's work, but they seem to be mapping similar terrain. And the fairly positive reception for Loner and Diary of an Oxygen Thief suggests that readers find something recognizable about the emotional landscape they depict. To discuss the similarities without giving away significant plot turns means being carefully vague at times. Ultimately it is the narrator's attitude or verbal demeanor that sticks with the reader more than the events recounted.

David Federman, the narrator of Loner, arrives at Harvard as a freshman with an acute sense of his middling background as well as seemingly perfect confidence in his prospects as a member of an elite. Entitlement and embarrassment do not make for a stable combination, however, and it becomes increasingly volatile once he becomes aware of Veronica Morgan Wells, the figure he addresses in the second person from that point on: "[It] was obvious, from your clothes, your body language, the impervious confidence you projected, as if any affront would bounce off you like a battleship deflecting a BB pellet: you came from money …. It wasn’t just your financial capital that set you apart; it was your worldliness, your taste, your social capital. What my respectable, professional parents had deprived me of by their conventional ambitions and absence of imagination."

Not an unprecedented situation, of course, as the narrator himself realizes. But any similarity between Veronica Morgan Wells and Daisy Fay Buchanan is slight compared to the fact that Jay Gatsby, whatever else you might call him, wasn't a stalker. David Federman's unreliability as a narrator shows chiefly in the fact that he thinks Veronica accepts his carefully planned coincidental meetings at face value and that his effort to ingratiate himself is working. The campaign has its comic aspects. All of it unfolds against a background of campus sex codes, feminist cultural-studies seminars and expressions of concern about social inequality.

But David's increasingly fetishistic obsession with her, and his willingness to use another female character sexually as a means to gaining access to Veronica, grows very uncomfortable to witness from the inside. He goes from callow virgin to budding young psychopath very rapidly and without missing a step. He even manages to incorporate some of the campus sex code into his strategy.

The unnamed narrator of Diary of an Oxygen Thief is much less preoccupied with social status, or at least less overtly so, and his introspection never leaves the reader with any understanding of what drives his malevolence toward women. His sadism is purely emotional but well practiced. In ending things, he follows a scorched-earth policy:

“‘This is what I look like when I’m pretending to listen to your boring conversation.’ I froze my sweetest expression, my innocent blue eyes widening in pseudo-interest, the same expression I’d used on teachers. … ‘This is what I look like when I’m pretending to be in love with you …. I’m going to dismantle us tonight. And there’s nothing you can do about it. You’ll have to sit there and listen while I wrench the U from the S. You’ll question your own judgment. Maybe you’ll never really trust yourself again. I hope so. Because if I don’t want you, and believe me I don’t, then I don’t want you being happy with someone else when there’s any doubt that I might get another girl.’”

What makes it considerably nastier is that the narrator treats this not as a way to get out of a relationship but as the whole point of it -- a moment when the self-loathing that he otherwise numbs with alcohol can be off-loaded on the woman he's maneuvered into position to endure it.

At a crucial moment in each book, the axis pivots to reveal just how limited and self-deluded the narrator is about his sense of control over others and over himself. The manipulation rebounds on him, but not as revenge only. The reader is left in a position to see that his seemingly pathological mind games can also be understood as having a certain logic: "Though Hollywood would have us believe that all we seek in romantic relationships is love," one character says, "it is just one of several exchangeable commodities, along with sex, money, status, validation, services and so on." An exchange, furthermore, in which one side can only win at the other's expense. Failure to understand that is a guarantee of losing.

I'm not going to argue with anyone else's sense of these things: people who reach such bleak conclusions probably have grounds for doing so. Still, it would be good to think that readers aren't responding to these two page-turners simply as confirmation of their own experience, but in the spirit of facing a worst-case scenario in order to find the nerve to try again.


Essay on Edgar Cayce, sociology of religion, terahertz waves and 'Repo Man'

Around this time 20 years ago, I met an elderly gentleman who’d had what sounded like an exceptionally interesting and unusual dissertation-writing experience. A couple of recent coincidences bring the encounter to mind and so inspired this little causerie.

His name was Harmon Bro, and he was in his late 70s when we met. He’d spent the better part of 50 years as an ordained minister and Jungian psychotherapist. If anyone ever looked the part of a Jungian archetype, it was Harmon, who personified the Wise Old Man. In 1955, the University of Chicago Divinity School awarded him a Ph.D. after accepting a doctoral thesis called “The Charisma of the Seer: A Study in the Phenomenology of Religious Leadership.”

It was based in part on work Harmon did in his early 20s as an assistant to Edgar Cayce, “the sleeping prophet.” Despite minimal education, Cayce, it is said, could give long, extemporaneous discourses in response to questions posed to him while he was in a trance state. Among these “readings” were medically sophisticated diagnoses of people miles or continents away, as well as detailed accounts of ancient history and predictions of the future.

Cayce died in 1945, but he left a vast mass of transcripts of his “readings.” By the 1960s, publishers were mining them to produce a seemingly endless series of paperback books extolling Cayce’s powers. Insofar as the New Age can be said to have founding figures, he was one of them.

Harmon was clearly a believer in Cayce’s miraculous powers. I was not (and am not) but have always enjoyed the legends by and about him. As a schoolboy, for example, he would put a textbook under his pillow and absorb its contents while asleep. He graduated (so to speak) to the Akashic Records -- an ethereal library documenting life on Atlantis and in ancient Egypt, and much else besides. He could also see into the future, but the track record is not impressive: China did not convert to Christianity in 1968, nor did Armageddon arrive in 1999. Cayce also predicted that an earthquake in the 1960s would cause California to sink into the Pacific Ocean. It remains attached to the continental United States as of this writing.

Harmon didn’t take skepticism as a threat or an insult, and anyway I preferred listening to arguing. He stressed how very improbable Cayce had been as a subject for serious scholarly attention in the 1950s -- at the University of Chicago, no less. It took three or four tries to get his topic approved; by the time the dissertation was finished and accepted, it felt like every faculty member concerned with the history and psychology of religion had weighed in on it. He happily lent me a copy (when anyone expresses interest in a decades-old dissertation, its author will usually have one of two responses: pleasure or horror), and from reading it, I could see that the scrutiny had been all for the best. It obliged him to practice a kind of methodological agnosticism about Cayce’s powers, and he demonstrated a solid grounding in the social-scientific literature on religion -- in particular, Max Weber’s work on prophetic charisma.

But by 1996, Harmon Bro was not at all happy with the institutions routinizing that charisma. The man he’d known and studied had an ethical message -- “love thy neighbor as thyself,” more or less. The New Age ethos amounted to “love thyself and improve thy karma.” You didn’t have to share his worldview to see his point.

The timing was fortunate: we grew acquainted during what proved to be the final year of Harmon Bro’s life. His obituary in the Chicago Tribune in 1997 made no reference to Cayce, but looking it up just now leaves me with a definite feeling of synchronicity: Harmon died on Sept. 13, which is also the date I’m finishing this piece. A message from Harmon, via the cosmic unconscious?

Probably not, although it was another and even more far-flung coincidence that reminded me of him in the first place. On Friday, the journal Nature Communications published a paper called “Terahertz time-gated spectral imaging for content extraction through layered structures,” which the science-news website EurekAlert kindly translates into laymanese as “Researchers prototype system for reading closed books.” Not by putting them under a pillow and sleeping on them, alas, but it’s impressive even so.

Researchers at the Massachusetts Institute of Technology and the Georgia Institute of Technology collaborated in developing a system that uses bursts of terahertz radiation (“the band of electromagnetic radiation between microwaves and infrared light,” says EurekAlert) to create images of the surfaces of individual pieces of paper in a stack. Ink in a printed letter absorbs the radiation differently from the blank page around it; the contrast between the signals reflected back is fed into an algorithm that identifies the letter on the page. The prototype can “read” the surfaces of up to nine pages in a pile; with more work, reading at greater depths seems possible. The story quotes one of the researchers as saying, “The Metropolitan Museum in New York showed a lot of interest in this, because they want to, for example, look into some antique books that they don’t even want to touch.” The signal-sorting algorithm may yet enable spambots to defeat captchas. (Which arguably represents grounds for halting research right away, though that is unlikely.)

The train of association between breaking technological news from last week and the memory of one of the more generous and unusual people to cross my path is admittedly twisty and random. On the other hand, reading by terahertz radiation seems like another example of Clarke’s Third Law: “Any sufficiently advanced technology is indistinguishable from magic.”

[Image caption: Edgar Cayce]

Review of Aaron James's book on a new theory of Donald Trump

Doubleday published Aaron James’s thought-provoking little treatise Assholes: A Theory of Donald Trump in early May, but I have not seen a single reference to the book since the candidate clinched the Republican nomination later that month.

In the meantime, several million pundit-hours of commentary have gone to assessing the presidential horse race, mainly by people who live at the track. James, by contrast, is a professor and chair of philosophy at the University of California, Irvine. His major work of scholarship to date, Fairness in Practice: A Social Contract for a Global Economy (Oxford University Press, 2012), was well received by his peers, though it has been largely overshadowed by his pioneering work in asshole studies.

Let us first define terms. What, then, o Socrates, is an asshole? And how does the asshole differ from someone who is just a jerk?

The distinction is important. “The asshole,” James writes, “is the guy (they are mainly men) who systematically allows himself advantages in social relationships out of an entrenched (and mistaken) sense of entitlement that immunizes him against the complaints of other people.” His sense of entitlement is absolute; his self-aggrandizing behavior is spontaneous and noticeably lacking in inhibition. The asshole may recognize that violating certain norms of acceptable behavior may cause pain or give offense but feels no conflict over that possibility.

The jerk, by contrast, is aware it is normal to apologize or express embarrassment -- and does so, sincerely or not. Someone parking in a handicapped parking space without the appropriate plates or sticker may be either a jerk or an asshole, but only the jerk will feel the need to come up with, at least, an excuse.

More important, the asshole will, James writes, often “feel indignant when questions about his conduct are raised. That, from his point of view, shows he is not getting the respect he deserves.” Just such an escalation -- from habitual, self-centered indifference toward the feelings of others to rage at even the perception of being slighted -- became familiar as part of Trump’s debating style throughout the Republican primary debates.

It proved effective, and that is the puzzle, which only deepened in the course of the summer. Somehow the candidate’s incessant and tireless asshole behavior (he has been at it for more than a year now, full time; even from this side of the process, it feels like 10) has never seriously damaged his base of support.

H. L. Mencken once defined a demagogue as someone “who preaches doctrines he knows to be untrue to men he knows to be idiots.” Trump has commanded the national stage with greater success than any demagogue since the 1930s, and yet Mencken’s quip is, as James points out, doubly insufficient in characterizing the candidate. For one, Trump is not so much dishonest as completely uninterested in whether or not what he says is true. (See Harry G. Frankfurt's On Bullshit [Princeton University Press, 2005].) Nor are Trump supporters all idiots. For many, James theorizes, “Trump’s value is mainly as a stratagem of asshole management: when stuck with heaps of assholes, turn to an even bigger, better asshole, in hopes of bringing order for public benefit …. In a system where officials routinely thwart the public interest, capitalizing on their position for power and profit, only an asshole so skilled as to school the other assholes properly, and so to awe them into submission, would restore order and peace, for the greater good of everyone.”

The asshole, so elevated and empowered, sounds quite a bit like the sovereign in Leviathan, which is no accident. Assholes: A Theory of Donald Trump offers quick tutorials on Hobbes and Rousseau to suggest that the candidate’s rise makes a certain amount of sense in the context of a republic collapsing under strain.

Support for Trump, by this reading, is the perverse and rather paradoxical effect of 30 years (arguably more) of growing economic inequality and cultural atomization. Whatever communitarian spirit may have once glued the country together, the collage has been coming unstuck for a while now. Sustained growth over the first two or three decades following World War II made it seem at least possible that 21st-century American citizens would take stability, security and opportunity as birthrights. Economic crises would be the stuff of history lectures. The biggest problem would be managing all our free time.

The sense of having gone off course somehow runs deep. Yet we have largely lost any language for framing an alternative. The notion of the general welfare has grown quaint, if not suspect. The individual self is engaged in a zero-sum game with the rest of the world; for anything to count as a good, it must have the potential to generate invidious comparisons. “Each [of us] needing to affirm his or her own value,” says James, “we devolve into a destructive contest for rank and superiority.”

We live, it seems, in an asshole oligarchy. Nobody thinks of Trump as an exception. But he is the one guy saying -- over and over, between the insult tweets and explosive ranting -- that the status quo is bad, folks, you have no idea how bad, trust me. The whole thing must be put into bankruptcy, after which he’ll negotiate a new social contract for us. What have you got to lose?

James is under no illusions about the candidate’s sincerity, competence, self-control or emotional stability. He calls Trump’s campaign rallies “the modern version of executions for public entertainment; it’s the dynamics of crowds and power that, with the help of technology, made the 20th century the bloodiest in human history.” So, not an endorsement. The idea that putting Trump in office represents a “strategy of asshole management … a last-ditch effort at taming a corrupt political system” can be explained rationally. That doesn’t make it a rational idea, though, and patience with the thought experiment will probably decrease as the election draws closer.

Whatever Trump’s candidacy may reveal about the state of the social fabric, he’s torn a few more holes in it already. James quotes a line from Rousseau that arguably sums up the spirit of his book: “The manner in which public affairs are conducted gives a sufficiently accurate indication of the moral character and state of health of the body politic.” The implications of that sentence are almost as horrifying as the thought of Donald Trump with the nuclear launch codes.


Review of Anne Trubek, 'The History and Uncertain Future of Handwriting'

In late spring, I had to endorse a number of legal documents using a digital rendering of my name in a cursive script, chosen from a menu of simulated handwriting styles. It was like my signature, except legible. Beneath its surface, so to speak, was my identifying data -- confirmed by what the company producing the application calls “robust authentication methods.”

Indeed, if it were necessary to prove the authenticity of the “signature” in court, there is no question that the digital glyph could be verified rigorously, whereas a handwriting expert would be hard-pressed to find much uniformity between versions of the scrawl that I make on paper. The visible part of an electronic signature, with its imitation of penmanship, is just a formality. It accommodates our lingering sense that entering a binding legal obligation really ought to include the act of affirming one’s identity by one’s own hand. (With the whole hand, that is, not just the finger used to click “I accept.”) At this stage, the feeling is vestigial. Given another generation or two of kids who grow up knowing how to type before they can ride a bicycle, it could disappear entirely, and the practice of writing by hand could become an antiquarian hobby, like churning your own butter or making horseshoes.

But not necessarily. Anne Trubek covers a great deal of interesting ground in The History and Uncertain Future of Handwriting (Bloomsbury), though much of the comment it has received concerns the "uncertain future" part, in an elegiac mode. I found rather that many of the book's points are made most clearly at the start, when she discusses the earliest known system of writing, Sumerian cuneiform, which emerged around 3000 B.C.E. (Formerly an associate professor of rhetoric, composition and English at Oberlin College, Trubek is the founder of Belt magazine, devoted to urban life in the Rust Belt.)

Encouraged by a curator to pick up some cuneiform tablets for a closer look, Trubek is struck by how compact they are: roughly palm-sized, covered with tiny marks made on wet clay with a stylus. While the writing instruments were simple, mastering them was not: “The tip of the stylus was triangular, and turning it caused different slants and directions for the marks -- some went deeper than others, some went horizontally or vertically, and the bottom of the stylus would make a round mark -- each with a distinct meaning.” To develop competence required years of study at the edubba (the Sumerian word for school: literally, “tablet house”) and also, one assumes, regular offerings to Nabu, the god of scribes, wisdom and literature.

“By 1600 B.C.E.,” writes Trubek, “no Sumerian speakers were alive,” but the language continued to be taught as a classical one, while cuneiform remained in use for another thousand years. For all its difficulty, cuneiform was easier to learn than the Egyptian writing system; it was relatively utilitarian (often used for business contracts and tax records), while hieroglyphics “were as much an art form as they were a means of information storage.” And clay tablets “continued to be used even after most people had shifted to papyrus.”

It is a lesson in the durable habits of the late adopter -- and a reminder that “the uncertain future of handwriting” (or of any other aspect of literacy) will not be decided by the automatic workings of progress and obsolescence.

As she roams across the centuries, Trubek points out two or three factors that have largely set the terms for the development of handwriting, quite apart from the qualities of the tools we use. One, of course, is that only a small fraction of the population has been able to acquire the skill throughout most of history. In addition, people who could read did not always learn to write, as the skill was difficult and slow to master. Finally, people in authority have long tried to impose norms on how words should appear on the page. Charlemagne in the ninth century might not otherwise have much resembled American schoolteachers in the 20th century, but they shared a passion for standardized penmanship.

It’s easy to see how these three difficult realities -- widespread illiteracy, tortuous pedagogy, and the demand for uniformity -- would tend to reinforce one another. Trubek’s story is in part one of rapid changes followed by stubborn inertia. One side effect of Gutenberg’s invention of the printing press was that it put scribes out of work, compelling many of them to take up a new career: they became writing instructors. That can only have encouraged more efficient pedagogy -- and with it a broadening and deepening of the pool of those people able to write, as well as read, more books.

But the drive to standardize and to reinforce social hierarchy seems, if anything, to have intensified: handwriting became an index of status, rank and moral uprightness. A 17th-century writing manual recommended a particular script for women, “for as much as they (having not the patience to take any great paines, besides phantasticiall and humorsome) must be taught that which they may instantly learn.” Immigrants who practiced the Palmer Method -- one of the predominant forms of handwriting instruction available a hundred years ago -- would be more readily assimilated under its “powerful hygienic effect.” Good penmanship -- for a purer America!

The invention of the typewriter was one giant leap for standardization, and Trubek quotes one researcher’s conclusion that “familiarity with the typewriter makes students better penmen, not worse.” But it also provoked worries that teachers were neglecting handwriting instruction. In time, “the universal typewriter may swallow all.”

It didn’t, of course. What actually happened (and this is one of the most striking points in a book full of them) was that another attitude towards handwriting came into focus: a sense of it being like one’s fingerprints, with distinct qualities visible only to the trained eye. Graphologists make the still larger claim that handwriting analysis can reveal aspects of the personality. Because graphology ranks somewhere between dowsing and astrology in scientific rigor, I had assumed its origins were lost in the mists of antiquity. But it turns out the idea took shape no later than the 17th century, with efforts to systematize it really getting underway only in the 19th century, amidst worries about individuality being crushed by the march of progress.

Trubek’s history of handwriting is a story of metamorphosis, not of decline. Given my experience with “signing” digital documents a few months ago, I was interested and amused to learn it was a variation on a theme: A century or so back, one manufacturer “marketed -- unsuccessfully, it seems -- a typewriter whose letter keys were formed from handwriting of the buyer.” If the future of handwriting is uncertain, that’s in part because no one can tell what uses and meanings we may find for it yet.

Review of article on using clickbait techniques in scholarly titles

The wits of the Algonquin circle once held a competition to see which one of them could come up with the most sensational headline. If a prize was given, I assume it was something fermented. Dorothy Parker won -- because of course she did -- with “Pope Elopes.”

Well, that one would certainly sell some papers -- or, as we say now, go viral. Until recently, the art of the headline was largely defined by the haiku-like challenge to balance impact and brevity within the constraints of a newspaper format. The greater a headline’s prominence, the larger the type, but the fewer the syllables it could contain. Given those terms, Parker’s masterpiece seems difficult to surpass. (That said, the legendary New York tabloid headline “Headless Body in Topless Bar” merits a special commendation for accompanying a real-life story.) Digital publications don’t have to adjust the length of a title, or even an article, to Procrustean specifications, but they have to take into account that readers’ attention is under continual bombardment. A headline must tickle the curiosity or otherwise imply that the article will at least be worth the opportunity cost built into reading it.

The contemporary phenomenon of “clickbait” makes that promise and then breaks it almost immediately. The Oxford dictionary defines the neologism as referring to online material “whose main purpose is to attract attention and encourage visitors to click on a link to a particular webpage.” It subsumes a variety of what might be called, to be generous about it, fluff, including diet tips, sex advice, amazing new discoveries that you will not believe, lists of movies or TV shows (annotated to celebrate or mock them), photographs of celebrities (from high school yearbooks, the red carpet or mug shots) and video footage of animals engaged in adorable behavior. In taking the bait, visitors drive up site traffic and boost exposure for its advertisers. Clickbait content is to boredom what seawater is to thirst. If consuming it has any benefits, it's hard to imagine what they would be.

Two months ago, Gwilym Lockwood published a paper called “Academic Clickbait: Articles With Positively Framed Titles, Interesting Phrasing and No Wordplay Get More Attention Online” in The Winnower, an open-access online scholarly publishing platform. The author, a Ph.D. student in the neurobiology of language department at the Max Planck Institute for Psycholinguistics, describes his primary area of research as “a fairly niche topic: iconicity (or how much a word sounds like what it means) in Japanese ideophones (or words that are like onomatopoeia but much more so).” He notes that one of the papers based on that research “managed to get an Altmetric score of four,” while another proved “much more successful, with an Altmetric score of 49.” As of this writing, Lockwood’s paper in The Winnower displays a score of 284, which definitely counts as a breakout from the niche.

Calling something “academic clickbait” hardly seems like a recommendation -- least of all given that, as Lockwood writes, “clickbait content tends to be put together in a more cursory way” than, say, a newspaper article; “far more effort goes into attracting the click in the first place than creating content of value.” Far from enriching the vocabulary of scholarly insult, however, Lockwood intends to show how small but significant tweaks to a paper’s title can make it more likely to win the attention of one’s fellow specialists and possibly among wider circles as well.

He collected the titles of 2,136 articles appearing in the open-access journal Frontiers in Psychology in 2013 and 2014 and, with the aid of two assistants, determined how they scored on six factors studied by previous researchers interested in the sharing of newspaper articles as well as citation statistics for scientific papers. He also counted the number of words in each title and collected the article’s Altmetric score (which factors in discussion in mass media and on academic blogs, as well as citations in papers). Some of the findings included:

  • A short title did not necessarily give an article greater visibility, despite earlier research showing that articles with shorter titles are cited more often than those with longer titles.
  • Titles clearly stating that the research showed or proved something attracted more attention than titles that did not.
  • Likewise with what Lockwood calls “arousing phrasing,” which is marked by “more general and less technical terminology” and “interesting or eye-catching turns of phrase.”
  • Framing the title as a question can increase the frequency with which an article is downloaded (other studies have suggested as much), but it did not correspond to a stronger Altmetric score.
  • Titles were rated as having or lacking “social currency,” depending on whether “a nonacademic [would] sound impressive and interesting if they were talking about this topic to their nonacademic friends in the pub.” Not surprisingly, this was the factor for which Lockwood and his assistants’ scores showed the widest variation in judgment.
  • General conclusions: “The positive framing of an article's findings in the title and phrasing the title in an arousing way increases how much online attention an article gets, independently of nonclickbait measures like how interesting the topic is or the length of the title. However, including a question in the title makes no difference, and having wordplay in the title actively harms an article's Altmetric score. This suggests that academic media is treated similarly to nonacademic media by the public in terms of what initially attracts people's attention.”
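Some of the more mechanical factors in a study like this can be approximated automatically. The sketch below is purely illustrative -- the function name, keyword list and scoring rules are invented here and do not reproduce Lockwood's actual coding scheme, which relied on human raters:

```python
def title_features(title: str) -> dict:
    """Extract a few simple surface features from an article title.

    Approximates three of the factors discussed above: title length,
    question framing and "positive framing" (a title that asserts a
    finding outright). The marker-word list is a made-up example.
    """
    words = title.split()
    # Hypothetical markers of positive framing, i.e. a title that
    # states the research showed or proved something.
    positive_markers = {"shows", "proves", "demonstrates", "reveals"}
    return {
        "word_count": len(words),
        "is_question": title.rstrip().endswith("?"),
        "positive_framing": any(
            w.lower().strip(".,:;") in positive_markers for w in words
        ),
    }

features = title_features(
    "Academic Clickbait: Articles With Positively Framed Titles, "
    "Interesting Phrasing and No Wordplay Get More Attention Online"
)
```

Subtler factors such as “arousing phrasing” and “social currency” resist this kind of automation, which is presumably why Lockwood scored them with the help of two human assistants.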

For all the figures, tables and citations, the project seems like a bit of a lark -- or so one might take the disclosure that the two research assistants “were compensated by [Lockwood] for their time and effort with dinner and beer.” For that matter, the title “Academic Clickbait” embodies what it names: it’s designed to tempt the reader into having a look.

At the same time, however, the title also does the article itself something of a disservice. Lockwood's advice is in general sound; it explains some ways to convey a sense of the significance of research to a reasonably wide range of possible readers who might be interested in it. By contrast, clickbait enriches somebody, but it's definitely not the public.

Essay on scholarship concerning 'The Apprentice'

Many people -- including not a few members of his own party -- are dreaming of the day when they can point at the Republican presidential candidate and say, in their best imitation of his voice, “You’re fired!” But be careful: Donald Trump has attempted to trademark his catchphrase and the thumb-and-forefinger movement that accompanied it across 14 seasons of The Apprentice.

The U.S. Patent and Trademark Office rejected his application, but the man is almost compulsively litigious, and you might get a cease-and-desist order anyway. Trump’s claim on the expression and gesture as part of his brand was among the first revelations from my recent immersion in the scholarly literature concerning The Apprentice.

Yes, “the scholarly literature concerning The Apprentice” does sound like the premise for a bit of political satire, no denying it. But this report is nothing of the kind. No parody is intended, or necessary. My inquiry began on the assumption that I would probably find a couple of academic papers on Trump’s reality-television incarnation. Any hope of dashing off a quick-and-easy squib for this column disappeared as my reading queue filled up with a dozen articles from scholarly journals, not counting such tangential but pertinent material as Laurie Ouellette’s “Branding the Right: The Affective Economy of Sarah Palin.”

The literature varies considerably in emphasis and quality, and there is now, arguably, quite enough of it. Here are a few points gleaned from my reading.

The Apprentice debuted on NBC in January 2004; the first academic paper on it, Katherine N. Kinnick and Sabrena R. Parton's “Workplace Communication: What The Apprentice Teaches About Communication Skills,” appeared in the December 2005 issue of Business Communication Quarterly. The show’s basic template seems to have been preserved from season to season, as well as in the franchised productions in other countries:

Sixteen young professionals with impressive credentials and uncommon good looks compete in team challenges for a chance to earn a US $250,000 salary and the title of president of one of business mogul Donald Trump’s enterprises. At the conclusion of each episode, the losing team is called to Trump’s boardroom, where one player is eliminated with Trump’s now trademarked phrase, “You’re fired!”

The program rapidly established itself as the “highest-rated new show among the advertiser-coveted 18- to 49-year-old age group” in the United States, with the season finale drawing more than 40 million viewers. Subsequent papers on the British and Irish versions of The Apprentice demonstrate that the appeal was not strictly American. But the export failed in other markets. “Work, Power and Performance: Analyzing the 'Reality' Game of The Apprentice,” by Jo Littler and Nick Couldry (published in the journal Cultural Sociology in 2011), points out that the German and Finnish programs each lasted just one season.

At least some of the success of the British and American versions could derive from how the show translates the normally precarious conditions of employment under neoliberalism into the entertainment of a high-stakes game: “The fact that there are no safety nets for contestants on the program is constantly emphasized,” Littler and Couldry write. “Indeed, the risk of being cast aside is turned into a source of dramatic excitement and tension (‘You’re fired!’).” Viewers accustomed to social-democratic norms of employment might be less inclined to feel vicariously involved in the contestants’ hopes and fears. The authors note that on the short-lived Finnish show, “You’re fired!” was replaced with “You are free to leave” -- a less humiliating pronouncement, if somewhat anticlimactic by contrast.

In their pioneering discussion of “what The Apprentice teaches about communication skills” from 2005, Kinnick and Parton cited a letter to The Wall Street Journal in which “Trump claimed that many business schools have made The Apprentice mandatory viewing and that he has received many letters asking that the episodes be packaged for the educational market.”

Of late, the words “Trump claimed” inspire far more caution than they once did. Still, much of the scholarship on The Apprentice takes its pedagogical significance as a given. Whether or not Trump’s name appears on the syllabus, his reality-television program is, after all these years, an element of how students entering the classroom understand or imagine the white-collar workplace.

In their content analysis of the first season, Kinnick and Parton identified a number of what they called “Trumpisms”: statements made on camera by the tycoon offering advice on communication and persuasion in the business world. (One example may suffice: “Negotiation is a very delicate art. Sometimes you have to be tough; sometimes you have to be sweet as pie -- it depends upon who you are dealing with.”) Kinnick and Parton wrote very little about Trump’s apothegms, and later scholars have found even less to say. The show’s more important lessons are taught, rather, by example.

Two papers by Chit Cheung Matthew Sung -- “Exploring the Interplay of Gender, Discourse and (Im)politeness” (2012) and “Media Representations of Gender and Leadership From a Discourse Perspective” (2013) -- point out how Trump’s interaction with the losing team in each week’s episode establishes the de facto norms for acceptable communication.

Because the main objective of the meeting is for Trump to find out which member in the losing team is the weakest and should be fired, stereotypically masculine, aggressive behaviors such as insulting, criticizing, attacking others to put them down are not only common, but also considered normative at times. Indeed, the classification of the boardroom interaction as taking place in a “masculine” domain can be justified by, for instance, Trump’s usual style of speaking during the boardroom meetings over the 15 episodes: the frequent use of interruptions, the issuing of direct and unattenuated directives, the giving of cruel criticisms and negative evaluations without mitigation, and his dominance of the speaking floor.

The zero-sum approach leaves precious little room for communication styles that “place emphasis on the relational aspects of the interaction” while fostering “avoidance of confrontations,” “the use of politeness strategies and hedging devices, as well as minimal responses and supportive feedback.” Such interaction is typically identified as feminine. The association between gender and interaction style here is open to question, of course, but in the world of The Apprentice, it usually operates to women’s disadvantage in fairly direct ways: “While being tough may run the risk of being negatively perceived as ‘unwomanly,’” writes Sung, “acting in a feminine way may be seen as a sign of incompetence and viewed more negatively than being ‘rude.’”

A double bind, then, in effect, but with consequences going beyond any supposed “war of the sexes.” Writing in the Western Journal of Communication (2011), Daniel J. Lair finds another layer of mixed messages -- seeming to endorse old-fashioned diligence and virtue while simultaneously making considerable allowances for manipulative behavior and unbridled self-promotion. In “Surviving the Corporate Jungle: The Apprentice as Equipment for Living in the Contemporary Work World,” Lair writes:

On its surface, The Apprentice suggests that the key to success in the contingent new economy has not fundamentally changed, and that a “by your bootstrap” mentality is every bit the foundation for the reality television contestants of 21st-century late capitalism as it was for the characters of Horatio Alger’s early capitalism novels of the 19th century. Beneath that surface, however, savvy viewers uncover a strategy suggesting hard work, talent and perseverance are not enough, and that to really succeed one must adopt the cynical, detached attitudes governing the “game” of aestheticized work.

In “the ‘game’ of aestheticized work,” every decision and gesture is driven by the need to promote the self as brand. That, at least, is one business Trump has run without bankruptcy -- of the financial variety, anyway. And it’s one he can always go back to, whatever the electorate may decide.

Review of Ken Ono's 'My Search for Ramanujan: How I Learned to Count'

Now that “genius” has become the job title for the person who fixes your MacBook, we need something considerably stronger to describe the Indian mathematician Srinivasa Ramanujan. Awe seems like the only suitable response to the work Ramanujan did and how he did it.

He was born in the southern part of the country in 1887, one year following publication of A Synopsis of Elementary Results in Pure Mathematics by George Shoobridge Carr, a math tutor in London. The volume would have been long since completely forgotten had Ramanujan not come across it as a high school student. Carr assembled more than 6,000 formulas and theorems in order of growing complexity -- but without the full proofs. Those Ramanujan worked out for himself.

By his twenties, Ramanujan was filling notebooks with his own extremely advanced work in pure mathematics, samples of which he sent to G. H. Hardy, an eminent number theorist at Cambridge University, in 1913. Following the example of Carr’s Synopsis, Ramanujan presented his findings without spelling out the proofs. He also used notation that had grown out of date, and it is easy to imagine the Cambridge don throwing the letter with its attachments into a drawer, along with all the other pleas for attention from amateur mathematicians. Instead, Hardy examined Ramanujan's material, found it interesting and in some cases staggeringly original, and helped wrangle the fellowship that brought the young Indian savant to Cambridge in 1914.

Ramanujan spent most of the remainder of his short life in England, immersed in finding or inventing whole new domains of mathematics, even as tuberculosis undermined his health. Whether mathematicians discover concepts (as astronomers do galaxies) or create them (as composers do symphonies) is a matter of perennial controversy; for his part, Ramanujan said that ideas came to him in dreams sent by the Hindu goddess Namagiri. However one understands that claim, much of the work was so advanced that his colleagues were barely beginning to catch up when he died in India in 1920, at the age of 32.

The effort continues. Ken Ono's My Search for Ramanujan: How I Learned to Count (Springer) is the memoir of a mathematician who has devoted much of his career to working out the proofs and methods that his predecessor left unstated. And the story would be interesting enough as such, even if the author's life did not have its own twists and turns. Ono, a professor of mathematics and computer science at Emory University, wrote the book in collaboration with the late Amir D. Aczel, best known as the author of Fermat's Last Theorem. The input of a capable historian and popularizer of mathematics undoubtedly helped Ono create a smooth and compelling narrative out of extremely difficult material.

By anyone else's standard, Ono was, like his siblings, a gifted child, although fate seems to have rendered his talents a burden. His parents emigrated from Japan in the 1950s, and the author recalls his own childhood in the 1970s as defined by a "confusing and frustrating intersection of incompatible cultures." Even harder to reckon with was the unmeetable standard of Olympian intellect embodied by his father, Takashi Ono, a professor of mathematics (now emeritus) at Johns Hopkins University. As for his mother, Ken Ono describes her as "present[ing] herself as a martyr who had sacrificed all self-interest for the family," thus "instilling in us a sense of duty to succeed in the lives that they had planned for us."

And planned with unforgiving precision, it seems: his parents' only friends "were other professors with overachieving children who were being accepted by top private colleges and winning elite music competitions," establishing "models of perfection" that Ono and his brothers were reminded of constantly. He describes his parents as carrying the tiger mom outlook (that "if their children are not at the top of their class, then the parents aren't doing their job") to such an extreme that not even academic achievement merited praise. While anything less than perfection brought shame upon the family, mere excellence hardly merited notice.

One brother is now a biochemist and university president, the other is a music professor, and Ken himself has an imposingly long list of professional achievements. Judged simply by the results, then, the Ono parenting style was a success. But the cost was enormous: decades of anxiety, self-doubt and self-contempt, taking him to the verge of suicide. The sight of math prodigies so young that their legs didn't touch the floor when they sat down in the classroom made passing advanced undergraduate courses feel like proof of inadequacy. Harsh and unrelenting parental voices echoed in his head ("Ken-chan, you no can hide …. You must be one of the best, and right now you losing out to 10-year-old kid with Pac-Man watch").

But the push to overachieve also met inner resistance. He engaged in competitive bicycling and played gigs as a disc jockey, and it sounds like there were enough fraternity shenanigans for him to feel liberated from what Ono calls "my old image as Asian-American math nerd." He had brushes with what would count as academic humiliation even by standards far less exacting than his own. But behaving "like a goofball" (in the author's preferred expression) seems, on the whole, to have been therapeutic. Ono eventually received his Ph.D. -- an achievement his parents took as a given and so never commented on.

One remarkable thing about Ono's narrative is that he seldom, if ever, sounds angry. To understand is to forgive, the proverb runs -- and coming to an intellectual comprehension of one's parents' outlook and behavior is a necessary step toward dealing with the consequences. (The second- and third-generation offspring of immigrants often have to come to terms with how the first generation navigated the unfamiliar or hostile circumstances they faced.) But in Ken Ono's case, there is another, equally compelling force: a series of encounters with the example and legacy of Ramanujan -- sometimes accidental and, at other times, sounding very much like destiny. I am reluctant to say much more than that because part of the book's emotional power comes from the element of surprise at how developments unfold. Suffice it to say that mathematics, which for obvious reasons Ono came to consider an unpleasant and compulsory part of his lot in life, comes alive for him with all the beauty and mind-blowing glory that Ramanujan implied in referring to the goddess.

But that revelation has a much more human aspect in Ono's memoir, which is an account of the life-enhancing (and quite possibly life-saving) influence of a few friends and mentors. When G. H. Hardy responded to Ramanujan's letter in 1913 and fostered the promise of his early work, it saved a genius from the threat of oblivion and made possible an extraordinary flourishing of mathematical creativity. It will not give too much away to say that My Search for Ramanujan tells a comparable story, and does so in a way that pays tribute to collegiality as something more than a form of courtesy.

Overview (part 2) of fall 2016 books from university presses (essay)

Last month, while looking over thousands of listings for forthcoming books in dozens of university-press catalogs for this fall, I flagged 300 titles for further consideration as possible topics for future columns. Within that selection, a few clusters of books seemed to reflect trends, or interesting coincidences at least, and I noted a few of them here.

That survey, however unscientific and incomplete, was fairly well received. Here’s part two. As in the first installment, material in quotation marks is from catalog descriptions of the books. I’ve been sparing with links, but more information on each title is available from its publisher’s website, easily located via the Association of American University Presses directory.

Scholarly publishers might count as pioneers of what Jacob H. Rooksby calls The Branding of the American Mind: How Universities Capture, Manage and Monetize Intellectual Property and Why It Matters (Johns Hopkins University Press, October), although the aggregate profits from every monograph ever published must be small change compared to one good research partnership with Big Pharma. Rooksby explores “higher education’s love affair with intellectual property itself, in all its dimensions” and challenges “the industry’s unquestioned and growing embrace of intellectual property from the perspective of research in law, higher education and the social sciences.” (Sobering thought: In this context, “the industry” refers to higher education.)

Making intellectual property more profitable is Fredrik Erixon and Björn Weigel’s concern in The Innovation Illusion: How So Little Is Created by So Many Working So Hard (Yale University Press, October), which treats “existing government regulations and corporate practices” as a menace to economic growth and prosperity: “Capitalism, they argue, has lost its mojo.”

If so, Google is undoubtedly developing an algorithm to look for it. At least three books on Big Data try to chart its impact on research, policy and the way we live now. Contributors to Big Data Is Not a Monolith, edited by Cassidy R. Sugimoto, Hamid R. Ekbia and Michael Mattioli (The MIT Press, October), assess the scope and heterogeneity of practices and processes subsumed under that heading. Roberto Simanowski’s Data Love: The Seduction and Betrayal of Digital Technologies (Columbia University Press, September) warns of the codependent relationship between “algorithmic analysis and data mining,” on the one hand, and “those who -- out of stinginess, convenience, ignorance, narcissism or passion -- contribute to the amassing of ever more data about their lives, leading to the statistical evaluation and individual profiling of their selves.” Christine L. Borgman focuses on the implications of data mining for scholarly research in Big Data, Little Data, No Data: Scholarship in the Networked World (The MIT Press, September), first published last year and now appearing in paperback. While “having the right data is usually better than having more data” and “little data can be just as valuable as big data,” the future of scholarship demands “massive investment in knowledge infrastructures,” whatever the scale of data involved.

Events in real time occasionally rush ahead of the publishing schedule. Several months ago, David Owen advised the British public to “vote leave” in The U.K.’s In-Out Referendum: E.U. Foreign and Defence Policy Reform (Haus Publishing, distributed by the University of Chicago Press), but it reaches the American market only this month. Christopher Baker-Beall analyzes The European Union’s Fight Against Terrorism: Discourse, Policies, Identity (Manchester University Press, September) with an eye to “the wider societal impact of the ‘fight against terrorism’ discourse” in the European Union and “the various ways in which this policy is contributing to the ‘securitization’ of social and political life within Europe.” Recent developments suggest this will be a growing field of study.

The E.U.’s days are numbered, according to Larry Elliott and Dan Atkinson, because Europe Isn’t Working (Yale University Press, August). Or rather, more precisely, the euro isn’t. The currency “has failed to deliver on its promise of more jobs, more growth and greater equality,” and the E.U.’s “current policy of kicking the can down the road and hoping that something will turn up” can’t continue forever. A less fatalistic account of The Euro and the Battle of Ideas by Markus K. Brunnermeier et al. (Princeton University Press, August) traces the currency’s vicissitudes to “the philosophical differences between the founding countries of the Eurozone, particularly Germany and France.” But “these seemingly incompatible differences can be reconciled to ensure Europe’s survival.”

Meanwhile, on this side of the Atlantic, it’s time to start phasing out paper money, argues Kenneth S. Rogoff in The Curse of Cash (Princeton, August). The bigger denominations ($100 and up) enable “tax evasion, corruption, terrorism, the drug trade, human trafficking and the rest of a massive global underground economy” and have also “paralyzed monetary policy in virtually every advanced economy.” Small bills and coins are not such a problem, but the Franklins (and larger) could be replaced by a state-backed digital currency. For now, Arvind Narayanan et al. reveal “everything you need to know about the new global money for the internet age” in Bitcoin and Cryptocurrency Technologies: A Comprehensive Introduction (Princeton, August), complete with “an accompanying website that includes instructional videos for each chapter, homework problems, programming assignments and lecture slides.” Perfectly honest and law-abiding people will find the book of interest, but it seems like a must-read for anyone with a professional commitment to tax evasion, the drug trade and the like.

As it happens, the fall brings a bumper crop of scholarship on crime, punishment and policing, at varying levels of abstraction and grit. Andrew Millie’s Philosophical Criminology (Policy Press, distributed by the University of Chicago Press, November) is described as “the first book to foreground this emerging field” -- which it certainly is not. Whatever the contribution of the book itself, hype at this level counts as a species of counterfeiting. The anthropologists Jean Comaroff and John L. Comaroff compare developments in South Africa, the United States and the United Kingdom in The Truth About Crime: Sovereignty, Knowledge, Social Order (University of Chicago, December), while the contributors to Accusation: Creating Criminals, edited by George Pavlich and Matthew P. Unger (University of British Columbia, October), consider “the founding role that accusation plays in creating potential criminals.” Here we find another large claim: the book “launches an important new field of inquiry.” As an armchair criminologist, I am curious to learn how this differs from the venerable and well-worked field of labeling theory.

Closer to the street, Michael D. White and Henry F. Fradella consider Stop and Frisk: The Use and Abuse of a Controversial Policing Tactic (NYU Press, October) -- a practice much in the headlines in recent years, usually in connection with the issue of racial profiling. Their conclusions -- that “stop and frisk did not contribute as greatly to the drop in New York’s crime rates as many proponents … have argued,” but also that “it can be judiciously used to help deter crime in a way that respects the rights and needs of citizens” -- are sure to provoke arguments from a variety of perspectives.

Forrest Stuart was stopped on the street for questioning 14 times in the first year of field work for Down, Out and Under: Arrest Policing and Everyday Life in Skid Row (University of Chicago Press, August), “often for doing little more than standing there.” He finds that the “distrust between police and the residents of disadvantaged neighborhoods” is “a tragedy built on mistakes and misplaced priorities more than on heroes and villains”; parties on both sides “are genuinely trying to do the right thing, yet too often come up short.”

Another ethnographic dispatch from the extremes of poverty, Christopher P. Dum’s Exiled in America: Life on the Margins in a Residential Motel (Columbia University Press, September) reports on the “squalid, unsafe and demeaning circumstances” of the housing of last resort “for many vulnerable Americans -- released prisoners, people with disabilities or mental illness, struggling addicts, the recently homeless, and the working poor.” The catalog entry for the book doesn’t mention it, but you feel the police presence all the same.

The overcrowding of American prisons is often explained as the byproduct of draconian mandatory sentencing laws, but Wisconsin Sentencing in the Tough-on-Crime Era: How Judges Retained Power and Why Mass Incarceration Happened Anyway by Michael M. O’Hear (Wisconsin, January) argues that even in “a state where judges have considerable discretion in sentencing … the prison population has ballooned anyway, increasing nearly tenfold over forty years.” Over the same period, long-term solitary confinement has grown increasingly commonplace, as discussed in a column from six months ago concerning an anthology of writings by scholars, activists and prisoners. Keramet Reiter offers a case study in 23/7: Pelican Bay Prison and the Rise of Long-Term Solitary Confinement (Yale University Press, October). The title refers to how many hours a day prisoners spend “in featureless cells, with no visitors or human contact for years on end,” where “they are held entirely at administrators’ discretion.”

The practice signals that prison authorities have not just abandoned the idea of reformation but moved on to something more severe: a clear willingness to destroy prisoners’ minds. By contrast, Daniel Karpowitz’s College in Prison: Reading in an Age of Mass Incarceration (Rutgers University Press, February) describes Bard College’s program offering undergraduate education to New York state prisoners. The book serves as “a study in how institutions can be reimagined and reformed in order to give people from all walks of life a chance to enrich their minds and expand their opportunities” while making “a powerful case for why liberal arts education is still vital to the future of democracy in the United States.”

Daniel LaChance’s Executing Freedom: The Cultural Life of Capital Punishment in the United States (University of Chicago Press, October) asks why, by “the mid-1990s, as public trust in big government was near an all-time low,” a staggering 80 percent of Americans supported the death penalty. “Why did people who didn’t trust government to regulate the economy or provide daily services nonetheless believe that it should have the power to put its citizens to death?” The question implies a belief in the consistency and coherence of public opinion that is either naïve or rhetorical; in any case, the author maintains that “the height of 1970s disillusion” led to a belief in “the simplicity and moral power of the death penalty” as “a potent symbol for many Americans of what government could do” -- and, presumably, get right. That confidence has been shaken by a long string of overturned convictions in recent years, which “could prove [the death penalty’s] eventual undoing in the United States.”

Given the brazen, methodical and massively destructive corruption leading to the near collapse of the world’s financial system eight years ago, Mary Kreiner Ramirez and Steven A. Ramirez call for a new variety of capital punishment in The Case for the Corporate Death Penalty: Restoring Law and Order on Wall Street (NYU Press, January). “Despite overwhelming proof of wide-ranging and large-scale fraud on Wall Street before, during and after the crisis,” the government’s response amounted to “fines that essentially punished innocent shareholders instead of senior leaders at the megabanks.” Crony capitalism and white-collar crime will continue, the authors argue, until the danger of corporate conviction -- having the company’s charter revoked, i.e., putting it out of business -- is credibly on the table.

In effect, if corporations enjoy the legal protection granted them by the Supreme Court’s dubious but effective interpretation of the 14th Amendment, they also should face the possibility of being put to death -- after due process, of course. And fair enough, although the last word here comes from that bumper sticker saying “I’ll believe corporations are people when Texas executes one.”


