Books

Review of Anne Trubek, 'The History and Uncertain Future of Handwriting'

In late spring, I had to endorse a number of legal documents using a digital rendering of my name in a cursive script, chosen from a menu of simulated handwriting styles. It was like my signature, except legible. Beneath its surface, so to speak, was my identifying data -- confirmed by what the company producing the application calls “robust authentication methods.”

Indeed, if it were necessary to prove the authenticity of the “signature” in court, there is no question that the digital glyph could be verified rigorously, whereas a handwriting expert would be hard-pressed to find much uniformity between versions of the scrawl that I make on paper. The visible part of an electronic signature, with its imitation of penmanship, is just a formality. It accommodates our lingering sense that entering into a binding legal obligation really ought to include the act of affirming one’s identity by one’s own hand. (With the whole hand, that is, not just the finger used to click “I accept.”) At this stage, the feeling is vestigial. Given another generation or two of kids who grow up knowing how to type before they can ride a bicycle, it could disappear entirely, and the practice of writing by hand could become an antiquarian hobby, like churning your own butter or making horseshoes.

But not necessarily. Anne Trubek covers a great deal of interesting ground in The History and Uncertain Future of Handwriting (Bloomsbury), though much of the commentary it has received concerns the "uncertain future" part, in an elegiac mode. I found, rather, that many of the book's points are made most clearly at the start, when she discusses the earliest known system of writing, Sumerian cuneiform, which emerged around 3000 B.C.E. (Formerly an associate professor of rhetoric, composition and English at Oberlin College, Trubek is the founder of Belt magazine, devoted to urban life in the Rust Belt.)

Encouraged by a curator to pick up some cuneiform tablets for a closer look, Trubek is struck by how compact they are: roughly palm-sized, covered with tiny marks made on wet clay with a stylus. While the writing instruments were simple, mastering them was not: “The tip of the stylus was triangular, and turning it caused different slants and directions for the marks -- some went deeper than others, some went horizontally or vertically, and the bottom of the stylus would make a round mark -- each with a distinct meaning.” To develop competence required years of study at the edubba (the Sumerian word for school: literally, “tablet house”) and also, one assumes, regular offerings to Nabu, the god of scribes, wisdom and literature.

“By 1600 B.C.E.,” writes Trubek, “no Sumerian speakers were alive,” but Sumerian continued to be taught as a classical language, while cuneiform remained in use for another thousand years. For all its difficulty, cuneiform was easier to learn than the Egyptian writing system; it was relatively utilitarian (often used for business contracts and tax records), while hieroglyphics “were as much an art form as they were a means of information storage.” And clay tablets “continued to be used even after most people had shifted to papyrus.”

It is a lesson in the durable habits of the late adopter -- and a reminder that “the uncertain future of handwriting” (or of any other aspect of literacy) will not be decided by the automatic workings of progress and obsolescence.

As she roams across the centuries, Trubek points out three factors that have largely set the terms for the development of handwriting, quite apart from the qualities of the tools we use. One, of course, is that only a small fraction of the population has been able to acquire the skill throughout most of history. In addition, people who could read did not always learn to write, as the skill was difficult and slow to master. Finally, people in authority have long tried to impose norms on how words should appear on the page. Charlemagne in the ninth century might not otherwise have much resembled American schoolteachers in the 20th century, but they shared a passion for standardized penmanship.

It’s easy to see how these three difficult realities -- widespread illiteracy, tortuous pedagogy, and the demand for uniformity -- would tend to reinforce one another. Trubek’s story is in part one of rapid changes followed by stubborn inertia. One side effect of Gutenberg’s invention of the printing press was that it put scribes out of work, compelling many to take up a new career: they became writing instructors. That can only have encouraged more efficient pedagogy -- and with it a broadening and deepening of the pool of people able to write, as well as read, more books.

But the drive to standardize and to reinforce social hierarchy seems, if anything, to have intensified: handwriting became an index of status, rank and moral uprightness. A 17th-century writing manual recommended a particular script for women, “for as much as they (having not the patience to take any great paines, besides phantasticiall and humorsome) must be taught that which they may instantly learn.” Immigrants who practiced the Palmer Method -- one of the predominant forms of handwriting instruction available a hundred years ago -- would be more readily assimilated under its “powerful hygienic effect.” Good penmanship -- for a purer America!

The invention of the typewriter was one giant leap for standardization, and Trubek quotes one researcher’s conclusion that “familiarity with the typewriter makes students better penmen, not worse.” But it also provoked worries that teachers were neglecting handwriting instruction and that, in time, “the universal typewriter may swallow all.”

It didn’t, of course. What actually happened (and this is one of the most striking points in a book full of them) was that another attitude towards handwriting came into focus: a sense of it being like one’s fingerprints, with distinct qualities visible only to the trained eye. Graphologists make the still larger claim that handwriting analysis can reveal aspects of the personality. Because graphology ranks somewhere between dowsing and astrology in scientific rigor, I had assumed its origins were lost in the mists of antiquity. But it turns out the idea took shape only in the 17th century, with efforts to systematize it really getting underway in the 19th century, amidst worries about individuality being crushed by the march of progress.

Trubek’s history of handwriting is a story of metamorphosis, not of decline. Given my experience with “signing” digital documents a few months ago, I was interested and amused to learn it was a variation on a theme: A century or so back, one manufacturer “marketed -- unsuccessfully, it seems -- a typewriter whose letter keys were formed from handwriting of the buyer.” If the future of handwriting is uncertain, that’s in part because no one can tell what uses and meanings we may find for it yet.

Review of article on using clickbait techniques in scholarly titles

The wits of the Algonquin circle once held a competition to see which one of them could come up with the most sensational headline. If a prize was given, I assume it was something fermented. Dorothy Parker won -- because of course she did -- with “Pope Elopes.”

Well, that one would certainly sell some papers -- or, as we say now, go viral. Until recently, the art of the headline was largely defined by the haiku-like challenge to balance impact and brevity within the constraints of a newspaper format. The greater a headline’s prominence, the larger the type, but the fewer the syllables it could contain. Given those terms, Parker’s masterpiece seems difficult to surpass. (That said, the legendary New York tabloid headline “Headless Body in Topless Bar” merits a special commendation for accompanying a real-life story.) Digital publications don’t have to adjust the length of a title, or even an article, to Procrustean specifications, but they have to take into account that readers’ attention is under continual bombardment. A headline must tickle the curiosity or otherwise imply that the article will at least be worth the opportunity cost built into reading it.

The contemporary phenomenon of “clickbait” makes that promise and then breaks it almost immediately. The Oxford dictionary defines the neologism as referring to online material “whose main purpose is to attract attention and encourage visitors to click on a link to a particular webpage.” It subsumes a variety of what might be called, to be generous about it, fluff, including diet tips, sex advice, amazing new discoveries that you will not believe, lists of movies or TV shows (annotated to celebrate or mock them), photographs of celebrities (from high school yearbooks, the red carpet or mug shots) and video footage of animals engaged in adorable behavior. In taking the bait, visitors drive up site traffic and boost exposure for its advertisers. Clickbait content is to boredom what seawater is to thirst. If consuming it has any benefits, it's hard to imagine what they would be.

Two months ago, Gwilym Lockwood published a paper called “Academic Clickbait: Articles With Positively Framed Titles, Interesting Phrasing and No Wordplay Get More Attention Online” in The Winnower, an open-access online scholarly publishing platform. The author, a Ph.D. student in the neurobiology of language department at the Max Planck Institute for Psycholinguistics, describes his primary area of research as “a fairly niche topic: iconicity (or how much a word sounds like what it means) in Japanese ideophones (or words that are like onomatopoeia but much more so).” He notes that one of the papers based on that research “managed to get an Altmetric score of four,” while another proved “much more successful, with an Altmetric score of 49.” As of this writing, Lockwood’s paper in The Winnower displays a score of 284, which definitely counts as breaking out of the niche.

Calling something “academic clickbait” hardly seems like a recommendation -- least of all given that, as Lockwood writes, “clickbait content tends to be put together in a more cursory way” than, say, a newspaper article; “far more effort goes into attracting the click in the first place than creating content of value.” Far from enriching the vocabulary of scholarly insult, however, Lockwood intends to show how small but significant tweaks to a paper’s title can make it more likely to win the attention of one’s fellow specialists, and possibly of wider circles as well.

He collected the titles of 2,136 articles appearing in the open-access journal Frontiers in Psychology in 2013 and 2014 and, with the aid of two assistants, determined how they scored on six factors studied by previous researchers interested in the sharing of newspaper articles as well as citation statistics for scientific papers. He also counted the number of words in each title and collected the article’s Altmetric score (which factors in discussion in mass media and on academic blogs, as well as citations in papers). Among the findings (a rough sketch of this sort of feature-versus-score comparison appears after the list):

  • A short title did not necessarily give an article greater visibility, despite earlier research showing that articles with shorter titles are cited more often than those with longer titles.
  • Titles clearly stating that the research showed or proved something attracted more attention than titles that did not.
  • Likewise with what Lockwood calls “arousing phrasing,” which is marked by “more general and less technical terminology” and “interesting or eye-catching turns of phrase.”
  • Framing the title as a question can increase the frequency with which an article is downloaded (other studies have suggested as much), but it did not correspond to a stronger Altmetric score.
  • Titles were rated as having or lacking “social currency,” depending on whether “a nonacademic [would] sound impressive and interesting if they were talking about this topic to their nonacademic friends in the pub.” Not surprisingly, this was the factor for which Lockwood and his assistants’ scores showed the widest variation in judgment.
  • General conclusions: “The positive framing of an article's findings in the title and phrasing the title in an arousing way increases how much online attention an article gets, independently of nonclickbait measures like how interesting the topic is or the length of the title. However, including a question in the title makes no difference, and having wordplay in the title actively harms an article's Altmetric score. This suggests that academic media is treated similarly to nonacademic media by the public in terms of what initially attracts people's attention.”
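
To make the method concrete, here is a minimal sketch, in Python, of the kind of feature-versus-attention comparison the study describes. Everything in it is invented for illustration -- the titles, the binary ratings and the Altmetric scores are all hypothetical, and the actual study rated 2,136 titles on six factors with more careful statistics -- but it shows the shape of the exercise: tag each title with features, then compare average attention scores with and without each feature.

```python
from statistics import mean

# Hypothetical data: (title, positive_framing, arousing_phrasing, wordplay,
# altmetric_score). All values are invented for illustration; the real study
# rated 2,136 Frontiers in Psychology titles on six factors.
articles = [
    ("Study shows mindfulness reduces stress",     1, 1, 0, 38),
    ("On the phenomenology of rumination",         0, 0, 0,  4),
    ("Why your brain loves a pun: wordplay works", 0, 1, 1,  9),
    ("Sleep proves critical for memory",           1, 0, 0, 27),
]

FEATURES = [(1, "positive framing"), (2, "arousing phrasing"), (3, "wordplay")]

def mean_score(records, index, present):
    """Mean Altmetric score over titles where the feature is present (1) or absent (0)."""
    scores = [r[-1] for r in records if r[index] == present]
    return mean(scores) if scores else float("nan")

for index, name in FEATURES:
    print(f"{name}: mean Altmetric score "
          f"{mean_score(articles, index, 1):.1f} with vs. "
          f"{mean_score(articles, index, 0):.1f} without")
```

On the invented numbers above, positively framed and arousingly phrased titles come out ahead while wordplay lags, mirroring the pattern Lockwood reports.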

For all the figures, tables and citations, the project seems like a bit of a lark -- or so one might take the disclosure that the two research assistants “were compensated by [Lockwood] for their time and effort with dinner and beer.” For that matter, the title “Academic Clickbait” embodies what it names: it’s designed to tempt the reader into having a look.

At the same time, however, the title also does the article itself something of a disservice. Lockwood's advice is in general sound; it explains some ways to convey a sense of the significance of research to a reasonably wide range of possible readers who might be interested in it. By contrast, clickbait enriches somebody, but it's definitely not the public.

Essay on scholarship concerning 'The Apprentice'

Many people -- including not a few members of his own party -- are dreaming of the day when they can point at the Republican presidential candidate and say, in their best imitation of his voice, “You’re fired!” But be careful: Donald Trump has attempted to trademark his catchphrase and the thumb-and-forefinger movement that accompanied it across 14 seasons of The Apprentice.

The U.S. Patent and Trademark Office rejected his application, but the man is almost compulsively litigious, and you might get a cease-and-desist order anyway. Trump’s claim on the expression and gesture as part of his brand was among the first revelations from my recent immersion in the scholarly literature concerning The Apprentice.

Yes, “the scholarly literature concerning The Apprentice” does sound like the premise for a bit of political satire, no denying it. But this report is nothing of the kind. No parody is intended, or necessary. My inquiry began on the assumption that I would probably find a couple of academic papers on Trump’s reality-television incarnation. Any hope of dashing off a quick-and-easy squib for this column disappeared as my reading queue filled up with a dozen articles from scholarly journals, not counting such tangential but pertinent material as Laurie Ouellette’s “Branding the Right: The Affective Economy of Sarah Palin.”

The literature varies considerably in emphasis and quality, and there is now, arguably, quite enough of it. Here are a few points gleaned from my reading.

The Apprentice debuted on NBC in January 2004; the first academic paper on it, Katherine N. Kinnick and Sabrena R. Parton's “Workplace Communication: What The Apprentice Teaches About Communication Skills,” appeared in the December 2005 issue of Business Communication Quarterly. The show’s basic template seems to have been preserved from season to season, as well as in the franchised productions in other countries:

Sixteen young professionals with impressive credentials and uncommon good looks compete in team challenges for a chance to earn a US $250,000 salary and the title of president of one of business mogul Donald Trump’s enterprises. At the conclusion of each episode, the losing team is called to Trump’s boardroom, where one player is eliminated with Trump’s now trademarked phrase, “You’re fired!”

The program rapidly established itself as the “highest-rated new show among the advertiser-coveted 18- to 49-year-old age group” in the United States, with the season finale drawing more than 40 million viewers. Subsequent papers on the British and Irish versions of The Apprentice demonstrate that the appeal was not strictly American. But the export failed in other markets. “Work, Power and Performance: Analyzing the 'Reality' Game of The Apprentice,” by Jo Littler and Nick Couldry (published in the journal Cultural Sociology in 2011), points out that the German and Finnish programs each lasted just one season.

At least some of the success of the British and American versions could derive from how the show translates the normally precarious conditions of employment under neoliberalism into the entertainment of a high-stakes game: “The fact that there are no safety nets for contestants on the program is constantly emphasized,” Littler and Couldry write. “Indeed, the risk of being cast aside is turned into a source of dramatic excitement and tension (‘You’re fired!’).” Viewers accustomed to social-democratic norms of employment might be less inclined to feel vicariously involved in the contestants’ hopes and fears. The authors note that on the short-lived Finnish show, “You’re fired!” was replaced with “You are free to leave” -- a less humiliating pronouncement, if somewhat anticlimactic by contrast.

In their pioneering discussion of “what The Apprentice teaches about communication skills” from 2005, Kinnick and Parton cited a letter to The Wall Street Journal in which “Trump claimed that many business schools have made The Apprentice mandatory viewing and that he has received many letters asking that the episodes be packaged for the educational market.”

Of late, the words “Trump claimed” inspire far more caution than they once did. Still, much of the scholarship on The Apprentice takes its pedagogical significance as a given. Whether or not Trump’s name appears on the syllabus, his reality-television program is, after all these years, an element of how students entering the classroom understand or imagine the white-collar workplace.

In their content analysis of the first season, Kinnick and Parton identified a number of what they called “Trumpisms”: statements made on camera by the tycoon offering advice on communication and persuasion in the business world. (One example may suffice: “Negotiation is a very delicate art. Sometimes you have to be tough; sometimes you have to be sweet as pie -- it depends upon who you are dealing with.”) Kinnick and Parton wrote very little about Trump’s apothegms, and later scholars have found even less to say. The show’s more important lessons are taught, rather, by example.

Two papers by Chit Cheung Matthew Sung -- “Exploring the Interplay of Gender, Discourse and (Im)politeness” (2012) and “Media Representations of Gender and Leadership From a Discourse Perspective” (2013) -- point out how Trump’s interaction with the losing team in each week’s episode establishes the de facto norms for acceptable communication.

Because the main objective of the meeting is for Trump to find out which member in the losing team is the weakest and should be fired, stereotypically masculine, aggressive behaviors such as insulting, criticizing, attacking others to put them down are not only common, but also considered normative at times. Indeed, the classification of the boardroom interaction as taking place in a “masculine” domain can be justified by, for instance, Trump’s usual style of speaking during the boardroom meetings over the 15 episodes: the frequent use of interruptions, the issuing of direct and unattenuated directives, the giving of cruel criticisms and negative evaluations without mitigation, and his dominance of the speaking floor.

The zero-sum approach leaves precious little room for communication styles that “place emphasis on the relational aspects of the interaction” while fostering “avoidance of confrontations,” “the use of politeness strategies and hedging devices, as well as minimal responses and supportive feedback.” Such interaction is typically identified as feminine. The association between gender and interaction style here is open to question, of course, but in the world of The Apprentice, it usually operates to women’s disadvantage in fairly direct ways: “While being tough may run the risk of being negatively perceived as ‘unwomanly,’” writes Sung, “acting in a feminine way may be seen as a sign of incompetence and viewed more negatively than being ‘rude.’”

A double bind, then, in effect, but with consequences going beyond any supposed “war of the sexes.” Writing in the Western Journal of Communication (2011), Daniel J. Lair finds another layer of mixed messages: the show seems to endorse old-fashioned diligence and virtue while simultaneously making considerable allowances for manipulative behavior and unbridled self-promotion. In “Surviving the Corporate Jungle: The Apprentice as Equipment for Living in the Contemporary Work World,” Lair writes:

On its surface, The Apprentice suggests that the key to success in the contingent new economy has not fundamentally changed, and that a “by your bootstrap” mentality is every bit the foundation for the reality television contestants of 21st-century late capitalism as it was for the characters of Horatio Alger’s early capitalism novels of the 19th century. Beneath that surface, however, savvy viewers uncover a strategy suggesting hard work, talent and perseverance are not enough, and that to really succeed one must adopt the cynical, detached attitudes governing the “game” of aestheticized work.

In “the ‘game’ of aestheticized work,” every decision and gesture is driven by the need to promote the self as brand. That, at least, is one business Trump has run without bankruptcy -- of the financial variety, anyway. And it’s one he can always go back to, whatever the electorate may decide.

Review of Ken Ono's 'My Search for Ramanujan: How I Learned to Count'

Now that “genius” has become the job title for the person who fixes your MacBook, we need something considerably stronger to describe the Indian mathematician Srinivasa Ramanujan. Awe seems like the only suitable response to the work Ramanujan did and how he did it.

He was born in southern India in 1887, one year following publication of A Synopsis of Elementary Results in Pure Mathematics by George Shoobridge Carr, a math tutor in London. The volume would have been long since completely forgotten had Ramanujan not come across it as a high school student. Carr assembled more than 6,000 formulas and theorems in order of growing complexity -- but without the full proofs. Those Ramanujan worked out for himself.

By his twenties, Ramanujan was filling notebooks with his own extremely advanced work in pure mathematics, samples of which he sent to G. H. Hardy, an eminent number theorist at Cambridge University, in 1913. Following the example of Carr’s Synopsis, Ramanujan presented his findings without spelling out the proofs. He also used notation that had grown out of date, and it is easy to imagine the Cambridge don throwing the letter with its attachments into a drawer, along with all the other pleas for attention from amateur mathematicians. Instead, Hardy examined Ramanujan's material, found it interesting and in some cases staggeringly original, and helped wrangle the fellowship that brought the young Indian savant to Cambridge in 1914.

Ramanujan spent most of the remainder of his short life in England, immersed in finding or inventing whole new domains of mathematics, even as tuberculosis undermined his health. Whether mathematicians discover concepts (as astronomers do galaxies) or create them (as composers do symphonies) is a matter of perennial controversy; for his part, Ramanujan said that ideas came to him in dreams sent by the Hindu goddess Namagiri. However one understands that claim, much of the work was so advanced that his colleagues were barely beginning to catch up when he died in India in 1920, at the age of 32.

The effort continues. Ken Ono's My Search for Ramanujan: How I Learned to Count (Springer) is the memoir of a mathematician who has devoted much of his career to working out the proofs and methods that his predecessor left unstated. And the story would be interesting enough as such, even if the author's life did not have its own twists and turns. Ono, a professor of mathematics and computer science at Emory University, wrote the book in collaboration with the late Amir D. Aczel, best known as the author of Fermat's Last Theorem. The input of a capable historian and popularizer of mathematics undoubtedly helped Ono create a smooth and compelling narrative out of extremely difficult material.

By anyone else's standard, Ono was, like his siblings, a gifted child, although fate seems to have rendered his talents a burden. His parents emigrated from Japan in the 1950s, and the author recalls his own childhood in the 1970s as defined by a "confusing and frustrating intersection of incompatible cultures." Even harder to reckon with was the unmeetable standard of Olympian intellect embodied by his father, Takashi Ono, a professor of mathematics (now emeritus) at Johns Hopkins University. As for his mother, Ken Ono describes her as "present[ing] herself as a martyr who had sacrificed all self-interest for the family," thus "instilling in us a sense of duty to succeed in the lives that they had planned for us."

And planned with unforgiving precision, it seems: his parents' only friends "were other professors with overachieving children who were being accepted by top private colleges and winning elite music competitions," establishing "models of perfection" that Ono and his brothers were reminded of constantly. He describes his parents as carrying the tiger mom outlook (that "if their children are not at the top of their class, then the parents aren't doing their job") to such an extreme that not even academic achievement merited praise. While anything less than perfection brought shame upon the family, mere excellence hardly merited notice.

One brother is now a biochemist and university president, the other is a music professor, and Ken himself has an imposingly long list of professional achievements. Judged simply by the results, then, the Ono parenting style was a success. But the cost was enormous: decades of anxiety, self-doubt and self-contempt, taking him to the verge of suicide. The sight of math prodigies so young that their legs didn't touch the floor when they sat down in the classroom made passing advanced undergraduate courses feel like proof of inadequacy. Harsh and unrelenting parental voices echoed in his head ("Ken-chan, you no can hide …. You must be one of the best, and right now you losing out to 10-year-old kid with Pac-Man watch").

But the push to overachieve also met inner resistance. He engaged in competitive bicycling and played gigs as a disc jockey, and it sounds like there were enough fraternity shenanigans for him to feel liberated from what Ono calls "my old image as Asian-American math nerd." He had brushes with what would count as academic humiliation even by standards far less exacting than his own. But behaving "like a goofball" (in the author's preferred expression) seems, on the whole, to have been therapeutic. Ono eventually received his Ph.D. -- an achievement his parents took as a given and so never commented on.

One remarkable thing about Ono's narrative is that he seldom, if ever, sounds angry. To understand is to forgive, the proverb runs -- and coming to an intellectual comprehension of one's parents' outlook and behavior is a necessary step toward dealing with the consequences. (The second- and third-generation offspring of immigrants often have to come to terms with how the first generation navigated the unfamiliar or hostile circumstances they faced.) But in Ken Ono's case, there is another, equally compelling force: a series of encounters with the example and legacy of Ramanujan -- sometimes accidental and, at other times, sounding very much like destiny. I am reluctant to say much more than that, because part of the book's emotional power comes from the element of surprise at how developments unfold. Suffice it to say that mathematics, which for obvious reasons Ono came to consider an unpleasant and compulsory part of his lot in life, comes alive for him with all the beauty and mind-blowing glory that Ramanujan implied in referring to the goddess.

But that revelation has a much more human aspect in Ono's memoir, which is an account of the life-enhancing (and quite possibly life-saving) influence of a few friends and mentors. When G. H. Hardy responded to Ramanujan's letter in 1913 and fostered the promise of his early work, it saved a genius from the threat of oblivion and made possible an extraordinary flourishing of mathematical creativity. It will not give too much away to say that My Search for Ramanujan tells a comparable story, and does so in a way that pays tribute to collegiality as something more than a form of courtesy.

Overview (part 2) of fall 2016 books from university presses (essay)

Last month, while looking over thousands of listings for forthcoming books in dozens of university-press catalogs for this fall, I flagged 300 titles for further consideration as possible topics for future columns. Within that selection, a few clusters of books seemed to reflect trends, or interesting coincidences at least, and I noted some of them at the time.

That survey, however unscientific and incomplete, was fairly well received. Here’s part two. As in the first installment, material in quotation marks is from catalog descriptions of the books. I’ve been sparing with links, but more information on each title is available from its publisher’s website, easily located via the Association of American University Presses directory.

Scholarly publishers might count as pioneers of what Jacob H. Rooksby calls The Branding of the American Mind: How Universities Capture, Manage and Monetize Intellectual Property and Why It Matters (Johns Hopkins University Press, October), although the aggregate profits from every monograph ever published must be small change compared to one good research partnership with Big Pharma. Rooksby explores “higher education’s love affair with intellectual property itself, in all its dimensions” and challenges “the industry’s unquestioned and growing embrace of intellectual property from the perspective of research in law, higher education and the social sciences.” (Sobering thought: In this context, “the industry” refers to higher education.)

Making intellectual property more profitable is Fredrik Erixon and Björn Weigel’s concern in The Innovation Illusion: How So Little Is Created by So Many Working So Hard (Yale University Press, October), which treats “existing government regulations and corporate practices” as a menace to economic growth and prosperity: “Capitalism, they argue, has lost its mojo.”

If so, Google is undoubtedly developing an algorithm to look for it. At least three books on Big Data try to chart its impact on research, policy and the way we live now. Contributors to Big Data Is Not a Monolith, edited by Cassidy R. Sugimoto, Hamid R. Ekbia and Michael Mattioli (The MIT Press, October), assess the scope and heterogeneity of practices and processes subsumed under that heading. Roberto Simanowski’s Data Love: The Seduction and Betrayal of Digital Technologies (Columbia University Press, September) warns of the codependent relationship between “algorithmic analysis and data mining,” on the one hand, and “those who -- out of stinginess, convenience, ignorance, narcissism or passion -- contribute to the amassing of ever more data about their lives, leading to the statistical evaluation and individual profiling of their selves.” Christine L. Borgman focuses on the implications of data mining for scholarly research in Big Data, Little Data, No Data: Scholarship in the Networked World (The MIT Press, September), first published last year and now appearing in paperback. While “having the right data is usually better than having more data” and “little data can be just as valuable as big data,” the future of scholarship demands “massive investment in knowledge infrastructures,” whatever the scale of data involved.

Events in real time occasionally rush ahead of the publishing schedule. Several months ago David Owen advised the British public to “vote leave” in The U.K.’s In-Out Referendum: E.U. Foreign and Defence Policy Reform (Haus Publishing, distributed by the University of Chicago Press), but it reaches the American market only this month. Christopher Baker-Beall analyzes The European Union’s Fight Against Terrorism: Discourse, Policies, Identity (Manchester University Press, September) with an eye to “the wider societal impact of the ‘fight against terrorism’ discourse” in the European Union and “the various ways in which this policy is contributing to the ‘securitization’ of social and political life within Europe.” Recent developments suggest this will be a growing field of study.

The E.U.’s days are numbered, according to Larry Elliott and Dan Atkinson, because Europe Isn’t Working (Yale University Press, August). Or, more precisely, the euro isn’t. The currency “has failed to deliver on its promise of more jobs, more growth and greater equality,” and the E.U.’s “current policy of kicking the can down the road and hoping that something will turn up” can’t continue forever. A less fatalistic account of The Euro and the Battle of Ideas by Markus K. Brunnermeier et al. (Princeton University Press, August) traces the currency’s vicissitudes to “the philosophical differences between the founding countries of the Eurozone, particularly Germany and France.” But “these seemingly incompatible differences can be reconciled to ensure Europe’s survival.”

Meanwhile, on this side of the Atlantic, it’s time to start phasing out paper money, argues Kenneth S. Rogoff in The Curse of Cash (Princeton, August). The bigger denominations ($100 and up) enable “tax evasion, corruption, terrorism, the drug trade, human trafficking and the rest of a massive global underground economy” and have also “paralyzed monetary policy in virtually every advanced economy.” Small bills and coins are not such a problem, but the Franklins (and larger) could be replaced by a state-backed digital currency. For now, Arvind Narayanan et al. reveal “everything you need to know about the new global money for the internet age” in Bitcoin and Cryptocurrency Technologies: A Comprehensive Introduction (Princeton, August), complete with “an accompanying website that includes instructional videos for each chapter, homework problems, programming assignments and lecture slides.” Perfectly honest and law-abiding people will find the book of interest, but it seems like a must-read for anyone with a professional commitment to tax evasion, the drug trade and the like.

As it happens, the fall brings a bumper crop of scholarship on crime, punishment and policing, at varying levels of abstraction and grit. Andrew Millie’s Philosophical Criminology (Policy Press, distributed by the University of Chicago Press, November) is described as “the first book to foreground this emerging field” -- which it certainly is not. Whatever the contribution of the book itself, hype at this level counts as a species of counterfeiting. The anthropologists Jean Comaroff and John L. Comaroff compare developments in South Africa, the United States and the United Kingdom in The Truth About Crime: Sovereignty, Knowledge, Social Order (University of Chicago, December), while the contributors to Accusation: Creating Criminals, edited by George Pavlich and Matthew P. Unger (University of British Columbia, October) consider “the founding role that accusation plays in creating potential criminals.” Here we find another large claim: “[t]his book launches an important new field of inquiry.” As an armchair criminologist, I am curious to learn how this differs from the venerable and well-worked field of labeling theory.

Closer to the street, Michael D. White and Henry F. Fradella consider Stop and Frisk: The Use and Abuse of a Controversial Policing Tactic (NYU Press, October) -- a practice much in the headlines in recent years, usually in connection with the issue of racial profiling. Their conclusions -- that “stop and frisk did not contribute as greatly to the drop in New York’s crime rates, as many proponents … have argued,” but also that “it can be judiciously used to help deter crime in a way that respects the rights and needs of citizens” -- are sure to provoke arguments from a variety of perspectives.

Forrest Stuart was stopped on the street for questioning 14 times in the first year of field work for Down, Out, and Under Arrest: Policing and Everyday Life in Skid Row (University of Chicago Press, August), “often for doing little more than standing there.” He finds that the “distrust between police and the residents of disadvantaged neighborhoods” is “a tragedy built on mistakes and misplaced priorities more than on heroes and villains”; parties on both sides “are genuinely trying to do the right thing, yet too often come up short.”

Another ethnographic dispatch from the extremes of poverty, Christopher P. Dum’s Exiled in America: Life on the Margins in a Residential Motel (Columbia University Press, September) reports on the “squalid, unsafe and demeaning circumstances” of the housing of last resort “for many vulnerable Americans -- released prisoners, people with disabilities or mental illness, struggling addicts, the recently homeless, and the working poor.” The catalog entry for the book doesn’t mention it, but you feel the police presence all the same.

The overcrowding of American prisons is often explained as the byproduct of draconian mandatory sentencing laws, but Wisconsin Sentencing in the Tough-on-Crime Era: How Judges Retained Power and Why Mass Incarceration Happened Anyway by Michael M. O’Hear (Wisconsin, January) argues that even in “a state where judges have considerable discretion in sentencing … the prison population has ballooned anyway, increasing nearly tenfold over forty years.” Over the same period, long-term solitary confinement has grown increasingly commonplace, as discussed in a column from six months ago concerning an anthology of writings by scholars, activists and prisoners. Keramet Reiter offers a case study in 23/7: Pelican Bay Prison and the Rise of Long-Term Solitary Confinement (Yale University Press, October). The title refers to how many hours a day prisoners spend “in featureless cells, with no visitors or human contact for years on end, and they are held entirely at administrators’ discretion.”

The practice signals that prison authorities have not just abandoned the idea of reformation but moved on to something more severe: a clear willingness to destroy prisoners’ minds. By contrast, Daniel Karpowitz’s College in Prison: Reading in an Age of Mass Incarceration (Rutgers University Press, February) describes Bard College’s program offering undergraduate education to New York state prisoners. The book serves as “a study in how institutions can be reimagined and reformed in order to give people from all walks of life a chance to enrich their minds and expand their opportunities” while making “a powerful case for why liberal arts education is still vital to the future of democracy in the United States.”

Daniel LaChance’s Executing Freedom: The Cultural Life of Capital Punishment in the United States (University of Chicago Press, October) asks why, by “the mid-1990s, as public trust in big government was near an all-time low,” a staggering 80 percent of Americans supported the death penalty. “Why did people who didn’t trust government to regulate the economy or provide daily services nonetheless believe that it should have the power to put its citizens to death?” The question implies a belief in the consistency and coherence of public opinion that is either naïve or rhetorical; in any case, the author maintains that “the height of 1970s disillusion” led to a belief in “the simplicity and moral power of the death penalty” as “a potent symbol for many Americans of what government could do” -- and, presumably, get right. That confidence has been shaken by a long string of reversals of verdict in recent years, which “could prove [the death penalty’s] eventual undoing in the United States.”

Given the brazen, methodical and massively destructive corruption leading to the near collapse of the world’s financial system eight years ago, Mary Kreiner Ramirez and Steven A. Ramirez call for a new variety of capital punishment in The Case for the Corporate Death Penalty: Restoring Law and Order on Wall Street (NYU Press, January). “Despite overwhelming proof of wide-ranging and large-scale fraud on Wall Street before, during and after the crisis,” the government’s response amounted to “fines that essentially punished innocent shareholders instead of senior leaders at the megabanks.” Crony capitalism and white-collar crime will continue until the danger of corporate conviction -- having the company’s charter revoked, i.e., putting it out of business -- is credibly on the table.

In effect, if corporations enjoy the legal protection granted them by the Supreme Court’s dubious but effective interpretation of the 14th Amendment, they also should face the possibility of being put to death -- after due process, of course. And fair enough, although the last word here comes from that bumper sticker saying “I’ll believe corporations are people when Texas executes one.”

Review of Robert Legvold, "Return to Cold War"

Historical analogy is a blunt and clumsy tool, one that serves better as a rhetorical device than as a method of analysis. The so-called law of the instrument -- i.e., “if all you have is a hammer, everything looks like a nail” -- applies to historical analogy with double force. And not just because the stock of examples is usually narrow and cliché-addled, as with the entirely too familiar Munich Pact formula: “X is the new Hitler; Y’s policy resembles that of Neville Chamberlain in 1938; therefore doing Z would exhibit Churchill-like foresight.” Nearly always the analogy is blatant propaganda on behalf of Z. You never find it used for heuristic purposes, such as determining who the current Wernher von Braun might be.

The deeper problem is that historical analogy is always just on the verge of a cognitive short circuit. Finding patterns in the world is one of the evolutionarily adaptive knacks of the human brain, but we’re still learning to test and fine-tune it -- an especially difficult prospect when the patterns we find (or think we find) belong to the realm of human action. What looks like historical parallel from one angle may well turn out to be self-fulfilling prophecy. This can be a problem, especially if large weapons systems come into play.

While never so dramatic as analogies drawn from the Weimar-to-Nuremberg continuum, framing contemporary geopolitics as a Cold War-like standoff between two superpowers has been a regular temptation over the years -- at least, for the one superpower left standing. The main candidates to take the erstwhile Soviet Union’s place have been China and the global jihadist movement, with Putin-era Russia as a more recent nominee.

Indeed, books and articles with “New Cold War” in the title began appearing even before the old one was quite finished -- indications, perhaps, of a wish for a certain degree of familiarity and continuity between eras, a recognizable and navigable lineup of affiliations and hostilities. The passing of a quarter century has also made the bipolar thermonuclear quagmire of an earlier era look more orderly and stable than the anarchic system of free-floating multilateral anxiety that prevails today.

For the past couple of weeks, I was on the verge of reading Return to Cold War (Polity) by Robert Legvold, a professor emeritus of political science at Columbia University, but kept putting it off. Perhaps it was the lack of a question mark in the title: Return to Cold War sounds like an imperative. The cover shows an upside-down dove, depicted as if in the middle of a kamikaze dive or following airborne contact with a very high wall. The whole thing seemed designed to squelch any flicker of optimism that had somehow survived the day's news.

But once I actually opened the book, I found such apprehensions were misplaced: Legvold is not given to simplistic analogy, nor does he indulge any notion that a return to long-term, two-sided geopolitical stalemate is possible, much less desirable. If relations between the United States and Russia have deteriorated to the point that comparisons to the Cold War status quo are appropriate, it is only within the limits defined by the absence, as yet, of ideological differences that call for a fight to the death of one system or the other. The deterioration was not inevitable, and even with it underway, there have been episodes of cooperation, albeit growing fewer and narrower as the mutual distrust continues. The common denominator between the countries has been the failure to assess things at all equitably: “If one searched for a leader, policy maker or politician on either side who included somewhere in her or his analysis thoughts about missteps or failings on both sides, the quest would have been in vain.”

Not that foresight was impossible. Legvold quotes a striking comment by George F. Kennan, author of the American policy of containment at the start of the Cold War. “Expanding NATO,” wrote Kennan in 1997 in The New York Times, “would be the most fateful error of American policy in the entire post-Cold War era. Such a decision may be expected to inflame the nationalist, anti-Western and militaristic tendencies in Russian opinion; to have an adverse effect on the development of Russian democracy; to restore the atmosphere of the Cold War in East-West relations; and to impel Russian foreign policy in directions decidedly not to our liking.” By no means is that the key explaining the entire course of the past 20 years, but as predictions go, it has its merits.

The author's presentation is succinct, lucid, fairly dispassionate and almost incessantly even-handed. I got the sense that he wrote it as if addressing an assembly of the policy-making elites of both sides, pointing out the confluence of blunders and rationalizations that worsened steadily to create a situation that, if not necessarily irreversible, now looks likely to continue in the same direction for some time to come.

Overview of fall 2016 books from university presses

Over the weekend I went through the fall 2016 catalog of every publisher belonging to the Association of American University Presses. Or at least I tried -- a number of fall catalogs have not been released yet, or else the publishers have hidden the PDFs on their websites with inexplicable cunning. (You would think savvy publicists would insist that catalogs be featured so prominently on the homepage that it’s almost impossible to overlook them; perhaps half my time went to playing “Where’s Waldo?,” so evidently not.) A few sites hadn’t been updated in at least a year. At one of them, the most recent catalog is from 2012, although the press itself seems still to be in existence. Let’s just hope everyone there is OK.

After assembling roughly 70 catalogs, I began to cull a list of books to consider for this column in the months ahead, which now runs to 400 titles, give or take a few, with more to be added as the search for Waldo continues. When you take an overview of a whole season’s worth of university-press output in one marathon survey, you can detect certain patterns or themes. A monograph on the white-power music underground? Duly noted. A second one, publishing a month later? That is a bit more striking. (The journalistic rule of thumb is that three makes a trend; for now, we’re left with a menacing coincidence.)

Some of the convergences seemed to merit notice, even in advance of the books themselves being available. Here are a few topical clusters that readers may find of interest. The text below in quotation marks after each book comes from the publisher’s description of it, unless otherwise specified. I have been sparing about the use of links, but more information on the books and authors can be readily found online.

“Whither democracy?” seems like an apt characterization of quite a few titles appearing this autumn and early winter. Last year, Jennifer L. Hochschild and Katherine Levine Einstein asked, Do Facts Matter? Information and Misinformation in American Politics, published by the University of Oklahoma Press and out in paperback this month, concluding that “citizens’ inability or unwillingness to use the facts they know in their political decision making may be frustrating,” but the real danger comes from “their acquisition and use of incorrect ‘knowledge’” put out by unscrupulous “political elites.” By contrast, James E. Campbell’s Polarized: Making Sense of a Divided America (Princeton University Press, July) maintains that if the two major parties are “now ideologically distant from each other and about equally distant from the political center” it’s because “American politics became highly polarized from the bottom up, not the top down, and this began much earlier than often thought,” meaning the 1960s.

Frances E. Lee sets the date later, and the locus of polarization higher in the body politic, in Insecure Majorities: Congress and the Perpetual Campaign (University of Chicago Press, September). She sees developments in the 1980s unleashing “competition for control of the government [that] drives members of both parties to participate in actions that promote their own party’s image and undercut that of the opposition, including the perpetual hunt for issues that can score political points by putting the opposing party on the wrong side of public opinion.”

Democracy: A Case Study by David A. Moss (Harvard University Press, January 2017) takes fierce partisanship as a given in American political life -- not a bug but a feature -- and recounts and analyzes 19 episodes of conflict, from the Constitutional Convention onward. Wasting no time in registering his dissent, the libertarian philosopher Jason Brennan comes out Against Democracy (Princeton, August) on the grounds that competent governance requires rational and informed decision making, while “political participation and democratic deliberation actually tend to make people worse -- more irrational, biased and mean.” The alternative he proposes is “epistocracy”: rule by the knowledgeable. Good luck with that! Reaching that utopia from here will be quite an adventure, especially given that some voters regard “irrational, biased and mean” as qualifications for office.

Fall, when the current election cycle ends, will also be the season of books on the Anthropocene -- the idea that human impact on the environment has been so pronounced that we must define a whole phase of planetary history around it. There is an entry for the term in Fueling Culture: 101 Words for Energy and Environment (Fordham University Press, January), and it appears in the title of at least three books: one from Monthly Review Press (distributed by NYU Press) in September and one each from Princeton and Transcript Verlag (distributed by Columbia University Press) in November. Stacy Alaimo’s Exposed: Environmental Politics and Pleasures in Posthuman Times (University of Minnesota Press, October) opens with the statement “The Anthropocene is no time to set things straight.” (The author calls for “a material feminist posthumanism,” and it sounds like she draws on queer theory as well, so chances are “straight” is an overdetermined word choice.)

The neologism is tweaked in Staying With the Trouble: Making Kin in the Chthulucene (Duke University Press, September) by Donna J. Haraway, who “eschews referring to our current epoch as the Anthropocene, preferring to conceptualize it as what she calls the Chthulucene, as it more aptly and fully describes our epoch as one in which the human and nonhuman are inextricably linked in tentacular practices.” Someone in a position to know tells me that Haraway derives her term from “chthonic” (referring to the subterranean) rather than Cthulhu, the unspeakable ancient demigod of H. P. Lovecraft’s horror fiction. Maybe so, but the reference to tentacles suggests otherwise.

A couple of titles from Columbia University Press try to find a silver lining in the clouds of Anthropocene smog -- or at least to start dispersing them before it’s too late. Michael E. Mann and Tom Toles pool their skills as atmospheric scientist and Pulitzer-winning cartoonist (respectively) in The Madhouse Effect: How Climate Change Denial Is Threatening Our Planet, Destroying Our Politics and Driving Us Crazy (September), which satirizes “the intellectual pretzels into which denialists must twist logic to explain away the clear evidence that man-made activity has changed our climate.” Despite its seemingly monitory title, Geoffrey Heal’s Endangered Economies: How the Neglect of Nature Threatens Our Prosperity (December) is actually an argument for “conserving nature and boosting economic growth” as mutually compatible goals.

If so, it will be necessary to counter the effects of chickenization -- which, it turns out, is U.S. Department of Agriculture slang for “the transformation of all farm animal production” along factory lines, as described in Ellen K. Silbergeld’s Chickenizing Farms and Food: How Industrial Meat Production Endangers Workers, Animals and Consumers (Johns Hopkins University Press, September). Tiago Saraiva shows that the Germans began moving in the same direction, under more sinister auspices, in Fascist Pigs: Technoscientific Organisms and the History of Fascism (The MIT Press, September): “specially bred wheat and pigs became important elements in the institutionalization and expansion of fascist regimes …. Pigs that didn’t efficiently convert German-grown potatoes into pork and lard were eliminated.” A different sociopolitical matrix governs the contemporary American “pasture-raised pork market,” of which Brad Weiss offers an ethnographic account in Real Pigs: Shifting Values in the Field of Local Pork (Duke University Press, August).

And finally -- for this week, anyway -- there is the ecological and biomedical impact of the free-ranging creatures described in Peter P. Marra and Chris Santella’s Cat Wars: The Devastating Consequences of a Cuddly Killer (Princeton, September). Besides the fact that cats kill “birds and other animals by the billions” in the United States, the authors warn of “the little-known but potentially devastating public health consequences of rabies and parasitic Toxoplasma passing from cats to humans at rising rates.” The authors also maintain that “a small but vocal minority of cat advocates has campaigned successfully for no action in much the same way that special interest groups have stymied attempts to curtail smoking and climate change.” I write this while wearing a T-shirt that reads “Crazy Cat Guy” but will be the first to agree that the problem here is primarily human. There’s a reason it’s called the Anthropocene and not the Felinocene.

A number of other themes and topics from the university-press fall offerings might bear mentioning in another column, later this summer. With luck, the pool of candidates will grow in the meantime; we’ll see if any new trends crystallize out in the process.

Review of Maurizio Viroli's "How to Choose a Leader: Machiavelli's Advice to Citizens"

For most people the word “Machiavellian” carries no connotation of virtue, and it’s never meant as praise. A stock theatrical character of the Elizabethan era was the Machiavel, who “delights in his own villainy and gloats over his successes in lengthy soliloquies,” as one literary historian puts it, with Shakespeare’s Iago and Richard III being prime examples. A Newsweek article from last year characterizes Tony Blair as “a Machiavel with a Messiah complex,” surely one of the more inventive insults in recent memory.

Otherwise it is the adjectival form of the Italian statesman’s name that turns up most often -- usually in a political context, though also in articles about Game of Thrones, reality television and (this seems odd) professional soccer. I notice that one of the major American presidential candidates seems to be described as Machiavellian more often than the other. That doesn’t necessarily imply greater concern about moral turpitude; it could just be that her opponent lacks the impulse control required of a true Machiavel.

Be that as it may, Maurizio Viroli’s How to Choose a Leader: Machiavelli’s Advice to Citizens (Princeton University Press) challenges the longstanding tendency to make the Renaissance author’s name synonymous with the art of political skulduggery. Viroli (a professor of government at the University of Texas at Austin and professor emeritus of politics at Princeton University) offers us a kinder, gentler Machiavelli -- one notably free from cynicism, with nothing but the common good in mind.

Counterintuitive though his perspective may sound, Viroli’s presentation of Machiavelli reflects an understanding of the Florentine thinker that has become well established, if not incontrovertible, over the past 40 years or so. (On which more anon.) The element of novelty comes, rather, from how Viroli has put that interpretation to work. He builds an election-year handbook around 20 pithy quotations from Machiavelli, which he then glosses and expands upon through references to American history and longer extracts from Machiavelli’s work (chiefly the Discourses on Livy). There is no mention of the current campaign cycle as such; the manuscript was undoubtedly turned in well before the primaries started. All the more striking, then, that How to Choose a Leader occasionally offers pointed criticisms of people and developments in the news. The effect is particularly impressive when the remark in question was made 500 years ago.

A couple of passages from Machiavelli epitomize his thinking on civic virtue. Neither of them squares at all with his familiar, sinister reputation.

The first we might call, however anachronistically, a statement of populist confidence:

“As for prudence and stability of purpose, I affirm that a people is more prudent, more stable and of better judgment than a prince. Nor is it without reason that the voice of the people has been likened to the voice of God; for we see that widespread beliefs fulfill themselves. … As to the justice of their opinions on public affairs, [they] seldom find that after hearing two speakers of equal ability urging them in opposite directions, they do not adopt the sounder view, or are unable to decide on the truth of what they hear.”

Viroli likes this passage so much that he quotes it twice within a few pages. Machiavelli’s other crucial idea concerns endurance, corruption and renewal. “All the things of this world,” Machiavelli writes, making clear that he has republics in mind, “have a limit to their existence.” The institutions that survive longest and most perfectly “possess the intrinsic means of frequently renewing themselves” by returning to the principles and virtues “by means of which they obtain their first growth and reputation.” Return and renewal are necessary because an institution’s excellence or defining quality “in the process of time … becomes corrupted [and] will of necessity destroy the body unless something intervenes to bring it back to its normal condition.”

This outlook may sound deeply conservative, although Hannah Arendt, as Viroli notes, called Machiavelli “the spiritual father of revolution in the modern sense.” His influence on John Adams and Alexander Hamilton has been taken up in the scholarship. One might also note that the Italian communist Antonio Gramsci took him as a guide to thinking through political strategy. No interpretation can exhaust him; he is a large thinker, containing multitudes.

Still, we can be reasonably certain that possible applications to electoral politics in a nation of more than 300 million people never crossed Machiavelli’s mind. But Viroli understands the voting process as, in principle, an opportunity for renewal and revitalization -- and an election year, perhaps, as especially such an opportunity. In general, at least. (This time, maybe not so much.)

“Poverty never was allowed to stand in the way of the achievement of any rank or honor,” writes Machiavelli apropos the Roman republic, “and virtue and merit were sought for under whatever roof they dwelt ….” So what is the contemporary application?

“A president of the United States of America,” writes Viroli, “therefore must be wholeheartedly committed to the principle that the republic must offer all its citizens the same opportunities to be rewarded according to their merit and virtue.” Viroli offers the G.I. Bill of Rights as an example of egalitarian and meritocratic policy à la Machiavelli, who warns that “corruption and incapacity to maintain free institutions result from a great inequality.”

Furthermore, a worthy leader will be distinguished by a close knowledge of history: “As regards the exercise of the mind, [the leader] should read history, and therein study the actions of eminent men,” writes Machiavelli, in order to “examine the causes of their victories and defeats, so that he may imitate the former and avoid the latter.”

The past also provides models of deportment: “Great men and powerful republics preserve an equal dignity and courage in prosperity and adversity.”

Viroli glosses this as: “We must have at the helm of the republic a person who is not so inebriated by success as to become abject in the face of defeat.”

But it’s a longish passage on terrible leaders from the Discourses on Livy that should earn Machiavelli a spot as cable news pundit of the week: “Made vain and intoxicated by good fortune, they attribute their success to merits which they do not possess, and this makes them odious and insupportable to all around them. And when they have afterwards to meet a reverse of fortune, they quickly fall into the other extreme, and become abject and vile.”

Machiavelli also warns of the dangers of an old boys’ club, which are unlikely to be mitigated when a few girls join it: “Prolonged commands brought Rome to servitude.”

The reference here is to how prolonged military commands led to cronyism, but Viroli takes it as having other implications: “Politicians who remain in power for a long time tend to form networks of private allegiances. Through favors and contacts, they often manage to attain the support of many citizens who regard them, not the republic, as the principal object of their loyalty.”

Whether or not How to Choose a Leader is, as the saying goes, “the right book at the right time,” it’s certainly an odd book for an odd time. Presenting itself as a guide to democratic decision making, it reads instead like a roundabout exposé of how badly eroded any meaningful sense of the common good has become -- something the politicians can barely even gesture toward, much less pursue.


