Overview of fall 2016 books from university presses

Over the weekend I went through the fall 2016 catalog of every publisher belonging to the Association of American University Presses. Or at least I tried -- a number of fall catalogs have not been released yet, or else the publishers have hidden the PDFs on their websites with inexplicable cunning. (It seems as if savvy publicists would insist that catalogs be featured so prominently on the homepage that it’s almost impossible to overlook them. Perhaps half my time went to playing “Where’s Waldo?”, so evidently not.) A few sites hadn’t been updated in at least a year. At one of them, the most recent catalog is from 2012, although the press itself seems still to be in existence. Let’s just hope everyone there is OK.

After assembling roughly 70 catalogs, I began to cull a list of books to consider for this column in the months ahead, which now runs to 400 titles, give or take a few, with more to be added as the search for Waldo continues. When you take an overview of a whole season’s worth of university-press output in one marathon survey, you can detect certain patterns or themes. A monograph on the white-power music underground? Duly noted. A second one, published a month later? That is a bit more striking. (The journalistic rule of thumb is that three makes a trend; for now, we’re left with a menacing coincidence.)

Some of the convergences seemed to merit notice, even in advance of the books themselves being available. Here are a few topical clusters that readers may find of interest. The text below in quotation marks after each book comes from the publisher’s description of it, unless otherwise specified. I have been sparing about the use of links, but more information on the books and authors can be readily found online.

“Whither democracy?” seems like an apt characterization of quite a few titles appearing this autumn and early winter. Last year, Jennifer L. Hochschild and Katherine Levine Einstein asked, Do Facts Matter? Information and Misinformation in American Politics, published by the University of Oklahoma Press and out in paperback this month, concluding that “citizens’ inability or unwillingness to use the facts they know in their political decision making may be frustrating,” but the real danger comes from “their acquisition and use of incorrect ‘knowledge’” put out by unscrupulous “political elites.” By contrast, James E. Campbell’s Polarized: Making Sense of a Divided America (Princeton University Press, July) maintains that if the two major parties are “now ideologically distant from each other and about equally distant from the political center” it’s because “American politics became highly polarized from the bottom up, not the top down, and this began much earlier than often thought,” meaning the 1960s.

Frances E. Lee sets the date later, and the locus of polarization higher in the body politic, in Insecure Majorities: Congress and the Perpetual Campaign (University of Chicago Press, September). She sees developments in the 1980s unleashing “competition for control of the government [that] drives members of both parties to participate in actions that promote their own party’s image and undercut that of the opposition, including the perpetual hunt for issues that can score political points by putting the opposing party on the wrong side of public opinion.”

Democracy: A Case Study by David A. Moss (Harvard University Press, January 2017) takes fierce partisanship as a given in American political life -- not a bug but a feature -- and recounts and analyzes 19 episodes of conflict, from the Constitutional Convention onward. Wasting no time in registering his dissent, the libertarian philosopher Jason Brennan comes out Against Democracy (Princeton, August) on the grounds that competent governance requires rational and informed decision making, while “political participation and democratic deliberation actually tend to make people worse -- more irrational, biased and mean.” The alternative he proposes is “epistocracy”: rule by the knowledgeable. Good luck with that! Reaching that utopia from here will be quite an adventure, especially given that some voters regard “irrational, biased and mean” as qualifications for office.

Fall, when the current election cycle ends, will also be the season of books on the Anthropocene -- the idea that human impact on the environment has been so pronounced that we must define a whole phase of planetary history around it. There is an entry for the term in Fueling Culture: 101 Words for Energy and Environment (Fordham University Press, January), and it appears in the title of at least three books: one from Monthly Review Press (distributed by NYU Press) in September and one each from Princeton and Transcript Verlag (distributed by Columbia University Press) in November. Stacy Alaimo’s Exposed: Environmental Politics and Pleasures in Posthuman Times (University of Minnesota Press, October) opens with the statement “The Anthropocene is no time to set things straight.” (The author calls for “a material feminist posthumanism,” and it sounds like she draws on queer theory as well, so chances are “straight” is an overdetermined word choice.)

The neologism is tweaked in Staying With the Trouble: Making Kin in the Chthulucene (Duke University Press, September) by Donna J. Haraway, who “eschews referring to our current epoch as the Anthropocene, preferring to conceptualize it as what she calls the Chthulucene, as it more aptly and fully describes our epoch as one in which the human and nonhuman are inextricably linked in tentacular practices.” Someone in a position to know tells me that Haraway derives her term from “chthonic” (referring to the subterranean) rather than Cthulhu, the unspeakable ancient demigod of H. P. Lovecraft’s horror fiction. Maybe so, but the reference to tentacles suggests otherwise.

A couple of titles from Columbia University Press try to find a silver lining in the clouds of Anthropocene smog -- or at least to start dispersing them before it’s too late. Michael E. Mann and Tom Toles pool their skills as atmospheric scientist and Pulitzer-winning cartoonist (respectively) in The Madhouse Effect: How Climate Change Denial Is Threatening Our Planet, Destroying Our Politics and Driving Us Crazy (September), which satirizes “the intellectual pretzels into which denialists must twist logic to explain away the clear evidence that man-made activity has changed our climate.” Despite its seemingly monitory title, Geoffrey Heal’s Endangered Economies: How the Neglect of Nature Threatens Our Prosperity (December) is actually an argument for “conserving nature and boosting economic growth” as mutually compatible goals.

If so, it will be necessary to counter the effects of chickenization -- which, it turns out, is U.S. Department of Agriculture slang for “the transformation of all farm animal production” along factory lines, as described in Ellen K. Silbergeld’s Chickenizing Farms and Food: How Industrial Meat Production Endangers Workers, Animals and Consumers (Johns Hopkins University Press, September). Tiago Saraiva shows that the Germans began moving in the same direction, under more sinister auspices, in Fascist Pigs: Technoscientific Organisms and the History of Fascism (The MIT Press, September): “specially bred wheat and pigs became important elements in the institutionalization and expansion of fascist regimes …. Pigs that didn’t efficiently convert German-grown potatoes into pork and lard were eliminated.” A different sociopolitical matrix governs the contemporary American “pasture-raised pork market,” of which Brad Weiss offers an ethnographic account in Real Pigs: Shifting Values in the Field of Local Pork (Duke University Press, August).

And finally -- for this week, anyway -- there is the ecological and biomedical impact of the free-ranging creatures described in Peter P. Marra and Chris Santella’s Cat Wars: The Devastating Consequences of a Cuddly Killer (Princeton, September). Besides the fact that cats kill “birds and other animals by the billions” in the United States, the authors warn of “the little-known but potentially devastating public health consequences of rabies and parasitic Toxoplasma passing from cats to humans at rising rates.” The authors also maintain that “a small but vocal minority of cat advocates has campaigned successfully for no action in much the same way that special interest groups have stymied attempts to curtail smoking and climate change.” I write this while wearing a T-shirt that reads “Crazy Cat Guy” but will be the first to agree that the problem here is primarily human. There’s a reason it’s called the Anthropocene and not the Felinocene.

A number of other themes and topics from the university presses’ fall offerings might bear mentioning in another column, later this summer. With luck, the pool of candidates will grow in the meantime; we’ll see if any new trends crystallize out in the process.


Review of Maurizio Viroli's "How to Choose a Leader: Machiavelli's Advice to Citizens"

For most people the word “Machiavellian” carries no connotation of virtue, and it’s never meant as praise. A stock theatrical character of the Elizabethan era was the Machiavel, who “delights in his own villainy and gloats over his successes in lengthy soliloquies,” as one literary historian puts it, with Shakespeare’s Iago and Richard III being prime examples. A Newsweek article from last year characterizes Tony Blair as “a Machiavel with a Messiah complex,” surely one of the more inventive insults in recent memory.

Otherwise it is the adjectival form of the Italian statesman’s name that turns up most often -- usually in a political context, though also in articles about Game of Thrones, reality television and (this seems odd) professional soccer. I notice that one of the major American presidential candidates seems to be described as Machiavellian more often than the other. That doesn’t necessarily imply greater concern about moral turpitude; it could just be that her opponent lacks the impulse control required of a true Machiavel.

Be that as it may, Maurizio Viroli’s How to Choose a Leader: Machiavelli’s Advice to Citizens (Princeton University Press) challenges the longstanding tendency to make the Renaissance author’s name synonymous with the art of political skulduggery. Viroli (a professor of government at the University of Texas at Austin and professor emeritus of politics at Princeton University) offers us a kinder, gentler Machiavelli -- one notably free from cynicism, with nothing but the common good in mind.

Counterintuitive though his perspective may sound, Viroli’s presentation of Machiavelli reflects an understanding of the Florentine thinker that has become well established, if not incontrovertible, over the past 40 years or so. (On which more anon.) The element of novelty comes, rather, from how Viroli has put that interpretation to work. He builds an election-year handbook around 20 pithy quotations from Machiavelli, which he then glosses and expands upon through references to American history and longer extracts from Machiavelli’s work (chiefly the Discourses on Livy). No mention of the current campaign cycle is made, as such; the manuscript was undoubtedly turned in well before the primaries started. All the more striking, then, that How to Choose a Leader occasionally offers pointed criticisms of people and developments in the news. The effect is particularly impressive when the remark in question was made 500 years ago.

A couple of passages from Machiavelli epitomize his thinking on civic virtue. Neither of them squares at all with his familiar, sinister reputation.

The first we might call, however anachronistically, a statement of populist confidence:

“As for prudence and stability of purpose, I affirm that a people is more prudent, more stable and of better judgment than a prince. Nor is it without reason that the voice of the people has been likened to the voice of God; for we see that widespread beliefs fulfill themselves. … As to the justice of their opinions on public affairs, [they] seldom find that after hearing two speakers of equal ability urging them in opposite directions, they do not adopt the sounder view, or are unable to decide on the truth of what they hear.”

Viroli likes this passage so much that he quotes it twice within a few pages. Machiavelli’s other crucial idea concerns endurance, corruption and renewal. “All the things of this world,” Machiavelli writes, making clear that he has republics in mind, “have a limit to their existence.” The institutions that survive longest and most perfectly “possess the intrinsic means of frequently renewing themselves” by returning to the principles and virtues “by means of which they obtain their first growth and reputation.” Return and renewal are necessary because an institution’s excellence or defining quality “in the process of time … becomes corrupted [and] will of necessity destroy the body unless something intervenes to bring it back to its normal condition.”

This outlook may sound deeply conservative, although Hannah Arendt, as Viroli notes, called Machiavelli “the spiritual father of revolution in the modern sense.” His influence on John Adams and Alexander Hamilton has been taken up in the scholarship. One might also note that the Italian communist Antonio Gramsci took him as a guide to thinking through political strategy. No interpretation can exhaust him; he is a large thinker, containing multitudes.

Still, we can be reasonably certain that possible applications to electoral politics in a nation of more than 300 million people never crossed Machiavelli’s mind. But Viroli understands the voting process as, in principle, an opportunity for renewal and revitalization. And perhaps especially such an opportunity in an election year -- at least, in general. (This time, maybe not so much.)

“Poverty never was allowed to stand in the way of the achievement of any rank or honor,” writes Machiavelli apropos the Roman republic, “and virtue and merit were sought for under whatever roof they dwelt ….” So what is the contemporary application?

“A president of the United States of America,” writes Viroli, “therefore must be wholeheartedly committed to the principle that the republic must offer all its citizens the same opportunities to be rewarded according to their merit and virtue.” Viroli offers the G.I. Bill of Rights as an example of egalitarian and meritocratic policy à la Machiavelli, who warns that “corruption and incapacity to maintain free institutions result from a great inequality.”

Furthermore, a worthy leader will be characterized by having a close knowledge of history: “As regards the exercise of the mind, [the leader] should read history, and therein study the actions of eminent men,” writes Machiavelli, in order to “examine the causes of their victories and defeats, so that he may imitate the former and avoid the latter.”

The past also provides models of deportment: “Great men and powerful republics preserve an equal dignity and courage in prosperity and adversity.”

Viroli glosses this as: “We must have at the helm of the republic a person who is not so inebriated by success as to become abject in the face of defeat.”

But it’s a longish passage on terrible leaders from the Discourses on Livy that should earn Machiavelli a spot as cable news pundit of the week: “Made vain and intoxicated by good fortune, they attribute their success to merits which they do not possess, and this makes them odious and insupportable to all around them. And when they have afterwards to meet a reverse of fortune, they quickly fall into the other extreme, and become abject and vile.”

Machiavelli also warns of the dangers of an old boys’ club, which are unlikely to be mitigated when a few girls join it: “Prolonged commands brought Rome to servitude.”

The reference here is to how prolonged military commands led to cronyism, but Viroli takes it as having other implications: “Politicians who remain in power for a long time tend to form networks of private allegiances. Through favors and contacts, they often manage to attain the support of many citizens who regard them, not the republic, as the principal object of their loyalty.”

Whether or not How to Choose a Leader is, as the saying goes, “the right book at the right time,” it’s certainly an odd book for an odd time. Presenting itself as a guide to democratic decision making, it reads instead like a roundabout exposé of how badly eroded any meaningful sense of the common good has become -- something the politicians can barely even gesture toward, much less pursue.


Review of Robert L. Belknap's "Plots"

The story is told of how, during an interview at a film festival in the 1960s, someone asked the avant-garde director Jean-Luc Godard, “But you must at least admit that a film has to have a beginning, a middle and an end?” To which Godard replied, “Yes, but not necessarily in that order.”

Touché! Creative tampering with established patterns of storytelling (or with audience expectations, which is roughly the same thing) is among the basic prerogatives of artistic expression -- one to be exercised at whatever risk of ticket buyers demanding their money back. Most of the examples of such tampering that Robert L. Belknap considers in Plots (Columbia University Press) are drawn from literary works now at least a century old. That we still read them suggests their narrative innovations worked -- so well, in fact, that they may go unnoticed now, taken as given. And the measure of Belknap’s excellence as a critic is how rewarding his close attention to them proves.

The late author, a professor of Slavic languages at Columbia University, delivered the three lectures making up Plots in 2011. Belknap’s preface to the book indicates that he considered the manuscript ready for publication at the time of his death in 2014. Plots has an adamantine quality, as if decades of thought and teaching were being crystallized and enormously compressed. Yet it is difficult to read the final paragraphs as anything but the author’s promise to say a great deal more.

Whether the lectures were offered as the overture to Belknap’s magnum opus or in lieu of one, Plots shuttles between narrative theory (from Aristotle to the Russian formalists) and narrative practice (Shakespeare and Dostoevsky, primarily) at terrific speed and with a necessary minimum of jargon. Because the jargon contains an irreducible core of the argument, we might as well start (even though Belknap does not) with the Russian formalists’ contrast between fabula and siuzhet.

Each can be translated as “plot.” The more or less standard sense of fabula, at least as I learned it in ancient times, is the series of events or actions as they might be laid out on a timeline. The author tweaks this a little by defining fabula as “the relationship among the incidents in the world the characters inhabit,” especially cause-and-effect relationships. By contrast, siuzhet is how events unfold within the literary narrative or, as Belknap puts it, “the relationship among the same incidents in the world of the text.”

To frame the contrast another way, siuzhet is how the story is told, while fabula is what “really” happened. The scare quotes are necessary because the distinction applies to fiction and drama as well as, say, memoir and documentary film. “In small forms, like fairy tales,” Belknap notes, fabula and siuzhet “tend to track one another rather closely, but in larger forms, like epics or novels, they often diverge.” (Side note: A good deal of short fiction is also marked by that divergence. An example that comes to mind is “The Tell-Tale Heart” by Edgar Allan Poe, where the siuzhet of the narrator’s account of what happened and why is decidedly different from the fabula to be worked out by the police appearing at the end of the story.)

Belknap returns to Aristotle for the original effort to understand the emotional impact of a certain kind of siuzhet: the ancient tragedies. An effective drama, by the philosopher’s lights, depicted the events of a single day, in a single place, through a sequence of actions so well integrated that no element could be omitted without the whole narrative coming apart. “This discipline in handling the causal relationship between incidents,” says Belknap, “produces the sense of inevitability that characterizes the strongest tragedies.” The taut siuzhet chronicling a straightforward fabula reconciled audiences to the workings of destiny.

Turning Aristotle’s analysis into a rule book, as happened in later centuries, was like forcing playwrights to wear too-small shoes. The fashion could not last. In the second lecture, Belknap turns to Shakespeare, who found another way to work:

“He sacrificed the causal tightness that had served classic drama so well in order to build thematic tightness around parallel plots. Usually the parallel plots involve different social levels -- masters and servants, kings and courtiers, supernatural beings and humans -- and usually the plots are not too parallel to intersect occasionally and interact causally at some level, though never enough to satisfy Aristotle’s criterion that if any incident be removed, the whole plot of the play should cease to make sense …. Similarity in plots can be represented as the overlap between two areas, and those areas may be broken down into individual points of similarity, dissimilarity, contrast, etc. Without knowing it, a Shakespearean audience is making such analyses all the time it watches a play, and the points of overlap and contrast enter their awareness.”

It’s not clear whether Belknap means to include the modern Shakespearean audience -- possibly not, since contemporary productions tend to trim down the secondary plots, if not eliminate them. But the Bard had other devices in hand for complicating fabula-siuzhet arrangements -- including what Belknap identifies as “a little-discussed peculiarity of Shakespearean plotting, the use of lies.” In both classical and Shakespearean drama, there are crucial scenes in which a character’s identity or situation is revealed to others whose confusion or deception has been important for the plot. But whereas mistakes and lies “are about equally prevalent” in the ancient plays, Shakespeare has a clear preference: “virtually every recognition scene is generated primarily out of a lie, not an error.”

In a striking elaboration of that point, Belknap treats the lie as a kind of theatrical performance -- “a little drama, with at least the rudiments of a plot” -- that often “express[es] facts about the liar, the person lied to or the person lied about.” The lie is a manipulative play within a play in miniature. And in Hamlet, at least, the (literal) play within a play is the prince’s means of trying to force his uncle to tell the truth.

Now, such intricate developments at the level of form also involve changes in how the writer and the audience understand the world (and, presumably, themselves). The Shakespearean cosmos gets messier than that of classical drama, but loosening the chains of cause and effect does not create absolute chaos. The motives and consequences of the characters’ actions make manifest their otherwise hidden inner lives. To put it another way, mutations in siuzhet (how the story is told) reflect changes in fabula (what really happens in the world) and vice versa. Belknap suggests -- tongue perhaps not entirely in cheek -- that Shakespeare was on the verge of inventing the modern psychological novel and might have, had he lived a few more years.

By the final lecture, on Dostoevsky’s Crime and Punishment, Belknap has come home to his area of deepest professional interest. (He wrote two well-regarded monographs on The Brothers Karamazov.) Moving beyond his analysis of parallel plots in Shakespeare, he goes deep into the webs of allusion and cross-referencing among Russian authors of the 19th century to make the case that Crime and Punishment contains a much more deliberate narrative architecture than it is credited with having. (Henry James’s characterization of Russian novels as “fluid puddings” notwithstanding.)

He even makes a bid for the novel’s epilogue as being aesthetically and thematically integral to the book as a whole. Other readers may find that argument plausible. I’ll just say that Plots reveals that with Belknap’s death, we lost a critic and literary historian of great power and considerable ingenuity.


Essay on 18th-century note taking

Matthew Daniel Eddy’s fascinating paper “The Interactive Notebook: How Students Learned to Keep Notes During the Scottish Enlightenment” is bound to elicit a certain amount of nostalgia in some readers. (The author is a professor of philosophy at Durham University; the paper, forthcoming in the journal Book History, is available for download from his Academia page.)

Interest in the everyday, taken-for-granted aspects of scholarship (the nuts and bolts of the life of the mind) has grown among cultural historians over the past couple of decades. At the same time, and perhaps not so coincidentally, many of those routines have been in flux, with card catalogs and bound serials disappearing from university libraries and scholarship itself seeming to drift ever closer to a condition of paperlessness. The past few years have seen a good deal of work on the history of the notebook, in all its many forms. I think Eddy’s contribution to this subspecialty may prove a breakthrough work, as Anthony Grafton’s The Footnote: A Curious History (1997) and H. J. Jackson’s Marginalia: Readers Writing in Books (2001) were in the early days of metaerudition.

“Lecture notes,” Eddy writes, “as well as other forms of writing such as letters, commonplace books and diaries, were part of a larger early modern manuscript world which treated inscription as an active force that shaped the mind.” It’s the focus on note taking itself -- understood as an activity bound up with various cultural imperatives -- that distinguishes notebook studies (pardon the expression) from the research of biographers and intellectual historians who use notebooks as documents.

Edinburgh in the late 18th century was buzzing with considerable philosophical and scientific activity, but the sound in the lecture notes Eddy describes came mainly from student noses being held to the grindstone. For notebook keeping was central to the pedagogical experience -- a labor-intensive and somewhat costly activity, deeply embedded in the whole social system of academe. Presumably the less impressive specimens became kindling, but the lecture notebooks Eddy describes were the concrete embodiment of intellectual discipline and craftsmanship -- multivolume works worthy of shelf space in the university library or handed down to heirs. Or, often enough, sold, whether to less diligent students or to the very professors who had given the lectures.

The process of notebook keeping, as Eddy reconstructs it, ran something like this: before a course began, the student would purchase a syllabus and a supply of writing materials -- including “quares” of loose papers or “paper books” (which look pocket-size in a photo) and a somewhat pricier “note book” proper, bound in leather.

The syllabus included a listing of topics covered in each lecture. Eddy writes that “most professors worked very hard to provide lecture headings that were designed to help students take notes in an organized fashion” as they tried to keep up with “the rush of the learning process as it occurred in the classroom.” Pen or pencil in hand, the student filled up his quares or paper book with as much of the lecture material as he could grasp and condense, however roughly. The pace made it difficult to do more than sketch the occasional diagram, and Eddy notes that “many students struggled to even write basic epitomisations of what they had heard.”

The shared challenge fostered the student practice of literally comparing notes -- and in any case, even the most nimble student was far from through when the lecture was done. Then it was necessary to “fill out” the rough notes, drawing on memory of what the professor said, the headings in the syllabus and the course readings -- a time-consuming effort that could run late into the night. “Extending my notes taken at the Chemical and Anatomical lectures,” one student wrote in his diary, “employs my whole time and prevents my doing any thing else. Tired, uneasy & low-spirited.”

As his freshman year ended, another wrote, “My late hours revising my notes taken at the lectures wore on my constitution, and I longed for the approach of May and the end of the lectures.”

Nor was revision and elaboration the end of it. From the drafts, written on cheap paper, students copied a more legible and carefully edited text into their leather notebooks, with title pages in imitation of those found in printed books. The truly devoted student would prepare an index. “While many of them complained about the time this activity required,” Eddy writes, “I have found no one who questioned the cognitive efficacy that their teachers attached to the act of copying.”

Making a lecture notebook was the opposite of multitasking. It meant doing the same task repeatedly, with deeper attention and commitment at each stage. Eddy surmises that medical students who prepared especially well-crafted lecture notebooks probably attended the same course a number of times, adding to and improving the record over the course of years.

At the same time, this single-minded effort exercised a number of capacities. Students developed “various reading, writing and drawing skills that were woven together into note-taking routines … that were in turn infused with a sense of purpose, a sense that the acts of note taking and notebook making were just as important as the material notebook that they produced.”

You can fill an immaterial notebook with a greater variety of content (yay, Evernote!), but I’m not sure that counts as either an advantage or an improvement.


Review of Michael Shermer, "Skeptic: Viewing the World With a Rational Eye"

The one really startling moment in Michael Shermer’s Skeptic: Viewing the World With a Rational Eye (Henry Holt and Company) comes in chapter 38, which begins with a description of his abduction by, presumably, space aliens. This was in 1983, “in the wee hours of the morning,” somewhere along a lonely rural highway approaching Haigler, Nebraska.

That it took place near Haigler, Neb., is not the surprising part, of course: the extraterrestrials show a definite preference for such locales. Nobody ever gets abducted from the White House lawn. (They must have their reasons.)

But by chapter 38 -- each chapter being a stand-alone essay originally appearing in Shermer’s monthly column for Scientific American -- the author has debunked Atlantis, The Bible Code, Bigfoot and psychic powers, as well as several “alternative medicine” practices now prevalent enough, and in some cases dangerous enough, to make you wonder what the Food and Drug Administration people actually do all day. The polemics against pseudoscience appear alongside explanations and reflections on science, proper. They express the confidence of a rational mind in an intelligible universe -- and that context makes the author’s Nebraska interlude seem doubly odd.

“[A] large craft with bright lights overtook me and forced me to the side of the road. Alien beings exited the craft and abducted me for 90 minutes, after which I found myself back on the road with no memory of what transpired inside the ship. The experience was real and I can prove that it happened because I recounted it to a film crew shortly after, and I am still in contact with some of the aliens.”

Scientific American must have received some strange letters to the editor after that month’s column, I’d imagine, but Shermer’s close encounter can be explained in fairly mundane terms, which we’ll get to in a moment.

Shermer’s willingness to go on the record with the experience distinguishes his essays from the work of the late Martin Gardner, whose 1952 classic Fads and Fallacies in the Name of Science (Dover Publications) and other writings are the obvious point of reference for comparison. Their perspectives on the world have a large margin of overlap, and Gardner would find nothing to argue with when Shermer writes:

“The price to pay for liberty, in addition to eternal vigilance, is eternal patience with the vacuous blather occasionally expressed beneath the shield of free speech. … In a free society skeptics are the watchdogs of irrationalism -- the consumer advocates of bad ideas. Yet debunking is not simply the divestment of bunk; its utility is found in offering a viable alternative, along with a lesson on how thinking goes wrong.”

In another creedal statement, Shermer writes, “What does it mean to have an open mind? It is to find the essential balance between orthodoxy and heresy, between a total commitment to the status quo and the blind pursuit of new ideas, between being open-minded enough to accept radical new ideas and being so open-minded that your brains fall out. Skepticism is about finding that balance.”

For his part, Shermer acknowledges Gardner as a model. He also quotes with approval a point made by the science fiction writer and hard-science popularizer Isaac Asimov: “When people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.” (Shermer calls this Asimov’s Axiom.) The irreplaceable and epoch-defining power of scientific research forms the bedrock of each of these figures’ worldviews -- to be defended against relapses into magical thinking as much as possible. Frustrating the effort, if not rendering it completely pointless, is the mass media’s general disinclination to report on real science in any depth while covering the ersatz varieties with all the skill and dedication it can muster.

The situation is not entirely the fault of the executives and producers behind such pseudoreality TV fare as the program about the “controversy” over whether the Apollo moon landings were faked. (You decide!) For part of the problem, as Shermer complains, is that the work of explaining the sciences to a nonspecialist public is typically neglected and even despised by scientists themselves. Carl Sagan and Stephen Jay Gould were notable exceptions, but only at the cost of damage to their professional reputations.

Sagan and Gould are also among the inspirations mentioned in Skeptic, and a number of the essays carry on their work of popularizing the nature and the results of scientific inquiry. But more than a third of the pieces are dedicated to exposing and combating various kinds of quackery, crackpottery, delusions and outright fraud -- a string of updates and supplements to Gardner’s Fads and Fallacies in the Name of Science, in effect. Shermer indicates that he expects to bring out more essay collections in the future, and I greatly hope that when he does, the largest portion of them will be devoted to explaining and interpreting real scientific developments.

For pseudoscience, by contrast, does not develop -- not really. Bad ideas never really appear or disappear; they just change form. Discrediting claims about lost continents or the mysterious energy of the crystals, unknown to physics, may be a matter of public intellectual hygiene, but at best the reader will learn some basic principles of critical thought (such as “after all this time, it makes sense to talk about the ape man once there’s a corpse to examine; otherwise let’s not”) which are expressed just as well in skeptical writings from years past. Long stretches of Shermer’s book are given over to such exercises in debunkery, and reading them was like watching a man shoot fish in a barrel -- dead fish, at that.

Much more interesting are the essays in which he confronts the factors disposing us to believe weird things. What can make the writing of avowed skeptics frustrating is the tendency to establish a neat dichotomy between the reasonable, rigorous, scientific, intelligent people (who don’t succumb to irrational beliefs) and the superstitious, slovenly, harebrained idiots (who do).

That gets boring to read after a while, and, more to the point, it is wrong. For, as Shermer aptly puts it, “smart people believe weird things because they are skilled at defending beliefs they arrived at for nonsmart reasons.” Indeed, cognitive skill and cognitive vulnerability may be linked in inconvenient ways.

“Humans evolved brains that are pattern-recognition machines,” writes Shermer, “designed to detect signals that enhance or threaten survival amid a very noisy world. Also known as association learning (associating A and B as causally connected), we are very good at it, or at least good enough to have survived and passed on the genes for the capacity of association learning. Unfortunately, the system has flaws. Superstitions are false associations -- A appears to be connected to B, but it is not (the baseball player who doesn’t shave and hits a home run). Las Vegas was built on false association learning.” Humans’ unique, language-fostered capacity to create and emit narrative adds another layer of complication and potential trouble: “Like all other animals, we evolved to connect the dots between events in nature to discern meaningful patterns for our survival. Like no other animals, we tell stories about the patterns we find.”

And it is a mark of our success as a species (so far) that we have not had to perfect the skills needed to recognize and break false associations: “The problem is that although true pattern recognition helps us survive, false pattern recognition does not necessarily get us killed, and so the overall phenomenon endured the winnowing process of natural selection.”

Although not explicitly framed as an example, the author’s abduction in Nebraska more than 30 years ago is relevant. Shermer mentions his love of long-distance biking in a number of essays, and in 1983 he was “in the opening days of the 3,100-mile nonstop transcontinental Race Across America,” accompanied by a support crew in a motor home.

After “83 straight hours and 1,259 miles” on his bike, Shermer says, the driver of the motor home flashed its headlights “and pulled alongside while my crew entreated me to take a sleep break.” In a waking dream state, his brain began calling up memories of a science-fiction program he’d watched in childhood: “Suddenly my support team was transmogrified into aliens,” he writes. After 90 minutes of sleep, Shermer found that they had returned to human form, and he then “recounted [the hallucination] to the ABC’s Wide World of Sports television crew” covering the race.

The circumstances were fortunate. He was able to recognize what had happened, to disentangle experience from reality, without trouble or distress. “But at the time the experience was real,” he writes, “and that’s the point. The human capacity for self-delusion is boundless, and the effects of belief are overpowering.” The lesson is better taught by someone who has experienced being overpowered than by someone who thinks of self-delusion as other people’s problem.
