
Review of Paul R. Ehrlich and Michael Charles Tobias, 'Hope on Earth: A Conversation'

Last month, both the Pentagon and the United Nations' Intergovernmental Panel on Climate Change issued projections of the long-term impact of hydrocarbon emissions. The effects, they warned, could "slow down economic growth, make poverty reduction more difficult, further erode food security, and prolong existing and create new poverty traps, the latter particularly in urban areas and emerging hot spots of hunger."

That was the wording of the U.N. report, but the Pentagon sounded much the same, warning that climate change "will influence resource competition while placing additional burdens on economies, societies, and governance institutions around the world." The U.S. military characterizes these tendencies as "threat multipliers that will aggravate stressors abroad such as poverty, environmental degradation, political instability, and social tensions – conditions that can enable terrorist activity and other forms of violence."

Isn't such talk rather alarmist, considering all the research by scientists who reject the idea of anthropogenic global warming? Consider a recent survey of the literature appearing in scientific journals between 1991 and 2012. By my calculation, scientists rejecting man-made climate change published an impressive 0.17 percent of the peer-reviewed papers containing the phrases "global warming" or "global climate change." That's almost one-fifth of one percent!

Clearly the debate is far from over. But a couple of weeks back, an assistant professor of philosophy at Rochester Institute of Technology named Lawrence Torcello argued, in a much-discussed article, that we have "good reason to consider the funding of climate denial to be criminally and morally negligent." His comments inspired an enormous amount of hate mail, including a number of threats of violence. Many of his correspondents were so outraged that they could not bring themselves actually to read the article, relying instead on second- and thirdhand accounts of Torcello's argument that refuted, not what he wrote, but what he could almost certainly be imagined to have intended to say.

Lest anyone feel too sympathetic to Torcello, I must point out that he failed to consider other explanations for why 99.83 percent of the scientific papers discussing climate change assessed it to be a real problem. It is possible, for example, that the researchers who wrote them were funded by the dirty tree-hugging hippies running the Pentagon.

Now, irony regarding any topic that elicits hate mail seldom turns out well. The people you don't anger, you tend to confuse. But it proves almost irresistibly tempting once a debate has reached a standoff. Pieces remain in play on the chess board but neither side makes any progress. That's where things stand now. More than two-thirds of the American public thinks that global warming is real despite the fact that it still gets cold in winter, just as they believe the earth to be spherical even though the front yard is, plainly, flat. Many will stick to those opinions, no matter how well-funded the denialists may be. (There's just no reasoning with some people.)

The more substantial discussion now seems to focus on the processes of climate change -- about whether, say, the continued melting of polar ice will trigger the release of enormous amounts of methane into the atmosphere. If so, how soon? And how fast, once it starts?

The particular mechanisms involved in climate change don't much interest Paul R. Ehrlich and Michael Charles Tobias in their book Hope on Earth: A Conversation, from the University of Chicago Press. Ehrlich, the senior author, is a professor of population studies at Stanford University, where he is also president of the Center for Conservation Biology; Tobias is a writer and documentary director primarily interested in environmental issues. But the question of the tempo of ecological disaster hovers over the discussion as a whole.

Ehrlich's The Population Bomb (1968) became one of the most ubiquitous of alarming paperbacks during the 1970s. It was the neo-Malthusian equivalent of one of Hal Lindsey's books about the End Times; extrapolating from birth rates and the rate of growth of food supplies, it projected worldwide famine and social collapse by about 1980. The promotional copy for Hope on Earth identifies Ehrlich as one of "the world's leading interdisciplinary environmental scientists," and undoubtedly he does remain one of the best-known. But it is important to keep in mind that some ecologists were sharply critical of The Population Bomb even at the height of its popularity, seeing it as reductive and alarmist. Ehrlich overemphasized the environmental impact of poor countries while underemphasizing that of pollution and wastefulness in consumerist societies. He also failed to grasp either the increased agricultural output that came with the Green Revolution or the environmental impact of the pesticides that made it possible.

Ehrlich ventures no such prognostication in his dialogue with Tobias, conducted over a couple of days at the Rocky Mountain Biological Laboratory in Colorado. As far as I can determine, the exchange took place during 2011, with revisions and elaborations of the transcript from both parties continuing over the following year. In spite of the protracted effort, it remains very much a conversation, for good and for ill. It roams without a map or a set agenda, however provisional, and the speakers are prone to data-dump monologues whose pertinence is not always obvious.

The conversation can be interesting when Ehrlich and Tobias butt heads. They approach environmental matters from distinct and sometimes conflicting perspectives. Tobias seems exemplary of a generation of environmentalists who emerged in Ehrlich's wake -- one for which preserving biodiversity and the wilderness are concerns inseparable from the defense of animal rights and from an attachment to ascetic mysticism. (He's reminiscent of the "level 5 vegan" who appears in one episode of "The Simpsons": "I won't eat anything that casts a shadow.") By contrast, Ehrlich clings fast to both a secular worldview and a belief that overpopulation, as such, is a major driving factor in ecological problems. He enjoys the comforts and conveniences available in advanced industrial society and can eat a chicken sandwich with few, if any, moral qualms. "The thing I hate about vegetarians," he says, "is that they're not put off by the screaming of cabbage."

One substantial issue sometimes emerges from their bull session, only to sink back out of view again. It could be called the problem of ecological triage: of how to decide what can be saved and what can't, and on what basis such judgments can be made.

"One of the troubles," says Ehrlich, "is that there are far too limited funds going into trying to save our life-support systems. It's a big allocation issue, how much to spend on description and cataloging and protecting species, as opposed to focusing on populations and the ecosystem services they provide…. Is it more important, for instance, to maintain pest control services in the grain baskets of the world or to protect narrow endemic species in tropical hotspots? Not an easy question to answer, and one with ethical implications."

And one that will come to the fore more and more over the next few decades, if the effects of climate change are felt on anything like the scale that scientists are discussing. Tobias characterizes it -- more generally, if also with more wool -- as the problem of "ascertaining those points of convergence wherein there are thematic flash points, positive pathways that the majority of scientists and people in general can agree upon in an effort to improve the conditions of life on Earth -- both for our species and others."

By "points of convergence" and "thematic flash points," he seems to mean whatever minimal bases of agreement about core values and priorities can serve as a basis for deciding how many arks can be built, and who gets a compartment. The upshot of Ehrlich and Tobias's discussion -- if not their actual conclusion, since they seem not to reach one -- is that no such basis can be identified at present. The effects of climate change may become severe within a couple of decades. The authors probably meant their title to be encouraging, but irony seems to have prevailed, because Hope on Earth offers precious little of it.


Review of Frédéric Gros, 'A Philosophy of Walking'

In December, the journal Brain Connectivity published a paper called "Short- and Long-Term Effects of a Novel on Connectivity in the Brain," based on a study conducted at Emory University. The researchers did MRI scans of the brains of 21 undergraduate students in the days before, during, and after they read a best-selling historical page-turner called Pompeii over the course of nine evenings. A significant increase of activity in "the left angular supramarginal gyri and right posterior temporal gyri" occurred during the novel-reading phase of the experiment and fell off rapidly once the subjects finished the book -- the gyri being, the report explained, regions of the brain "associated with perspective taking and story comprehension."

Not a big surprise; you'd figure as much. But the researchers also found that an elevated level of activity continued in the bilateral somatosensory cortex for some time after the subjects were done reading. In the novel, a young Roman aqueduct engineer visiting the ancient vacation resort of Pompeii "soon discovers that there are powerful forces at work -- both natural and man-made -- threatening to destroy him." Presumably the readers had identified with the protagonist, and parts of their brains were still running away from the volcano for up to five days after they finished the book.

So one might construe the findings, anyway. The authors are more cautious. But they raise the question of whether the experience of reading novels "is sufficiently powerful to cause a detectable reorganization of cortical networks" -- what they call a "hybrid mentalizing-narrative network configuration." Or to put it another way, a long-term rearrangement of the mind's furniture.

It isn't a work of fiction, and I am but a solitary reader without so much as access to an electroencephalograph, but A Philosophy of Walking by Frédéric Gros, a French best-seller from 2011 just published in English by Verso, seems to have been setting up its own "hybrid mentalizing-narrative network configuration" within my head over the past few days. Maybe it's the season: after so many months of cold weather and leaden skies, Gros's evocation of the pleasures of being outside, moving freely, in no particular hurry, stirs something deep within.

The author, a professor of philosophy at the University of Paris, has, among other things, edited volumes in the posthumous edition of Michel Foucault's lectures at the Collège de France. But the authority Gros brings to his reflections on walking comes only in part from knowing the lives and writings of ambulatory thinkers across the millennia, beginning in ancient Greece. He is a scholar but also a connoisseur -- someone who has hiked and wandered enough in his time, over a sufficient variety of terrains, to know at first hand the range of moods (ecstasy, monotony, exhaustion) that go with long walks.

Gros's book is a work of advocacy, and of propaganda against sedentary thinking. The first of his biographical essays is on Nietzsche, who took up walking in the open air while suffering from migraine headaches, eyestrain, and late-night vomiting spasms. It did not cure him, but it did transform him. He might have been the one spending time at health resorts, but it was contemporary intellectual life that manifested invalidism.

"We do not belong to those who have ideas only among books, when stimulated by books," Nietzsche wrote. "It is our habit to think outdoors -- walking, leaping, climbing, dancing, preferably on lonely mountains or near the sea where even the trails become thoughtful. Our first questions about the value of a book, of a human being, or of a musical composition, are: Can they walk? Even more, can they dance?" Long, solitary hikes such as those taken by Nietzsche -- and also by Rousseau, the subject of another essay -- are only one mode of philosophical pedestrianism. The precisely timed daily constitutional that Kant took each day, so regular that his neighbors could set their watches by it, has gone down in history as an example of his extreme rigor (one easily recognized even by the layman who can't tell his an a posteriori from his elbow). Gros adds a telling detail to this otherwise commonplace biographical fact: Kant took care to walk at a measured, even pace, since he was profoundly averse to sweating.

At the other extreme was the ancient philosophical school known as the Cynics, with its cultivation of an almost violent indifference to comfort and propriety. The Cynics were homeless vagrants on principle. They denied themselves, as much as possible, every luxury, or even convenience, taken for granted by their fellow Greeks.

That included footwear: "They had done so much walking," Gros says, "that they hardly needed shoes or even sandals, the soles of their feet being much like leather." When the Cynics showed up in a town square, their constant exposure to nature's elements gave a jagged edge to the harangues in which they attacked commonplace ideas and values. Gros sees walking, then, as the foundation of the Cynics' philosophical method:

"Philosophers of the type one might call sedentary enjoy contrasting the appearance with the essence of things. Behind the curtain of tangible sights, behind the veil of visibilities, they try to identify what is pure and essential, hoping perhaps to display, above the colors of the world, the glittering, timeless transparency of their own thought…. The Cynic cut through that classic opposition. He was not out to seek or reconstruct some truth behind appearances. He would flush it out from the radical nature of immanence: just below the world's images, he was searching for what supported them. The elemental: nothing true but sun, wind, earth and sky; their truth residing in their unsurpassable vigor."

Walking is not a sport, Gros takes care to emphasize. You don't need any equipment (not even shoes, for an old-school Cynic), nor is any instruction required. The skill set is extremely limited and mastered by most people in infancy. Its practice is noncompetitive.

But in a paradox that gives the book much of its force, we don't all do it equally well. It's not just that some of us are clumsy or susceptible to blisters. Gros contrasts the experience of a group of people talking to one another while marching their way through a walking tour (an example of goal-driven and efficiency-minded behavior) with the unhurried pace of someone for whom the walk has become an end in itself, a point of access to the sublimely ordinary. And so he has been able to give the matter a lot of thought:

"Basically, walking is always the same, putting one foot in front of the other. But the secret of that monotony is that it constitutes a remedy for boredom. Boredom is immobility of the body confronted with emptiness of mind. The repetitiveness of walking eliminates boredom, for, with the body active, the mind is no longer affected by its lassitude, no longer drawn from its inertia the vague vertigo in an endless spiral.… The body's monotonous duty liberates thought. While walking, one is not obliged to think, to think this or that. During that continuous but automatic effort of the body, the mind is placed at one's disposal. It is then that thoughts can arise, surface or take shape."

As for the clumsiness and blisters, I hope they will disappear soon. It's the practice of walking, not reading about it, that makes all the difference. But no book has rewired my bilateral somatosensory cortex so thoroughly in a long while.


Review of Göran Therborn, 'The Killing Fields of Inequality'

In one of those cases where satire cannot trump cold hard fact, the power brokers and heavy thinkers who gathered at the Alpine resort of Davos, Switzerland, for the World Economic Forum last month expressed great concern about the danger that growing inequality poses to social stability everywhere. As well they might.

Strictly speaking, "widening income disparities" was only one of 10 issues flagged by the Forum's Outlook on the Global Agenda 2014 report, along with "a lack of values in leadership" and "the rapid spread of misinformation online." But a couple of concerns on the list -- "persistent structural unemployment" and "the diminishing confidence in economic policies" -- were variations on the same theme. Two or three other topics were related to income disparity only a little less directly

In case you didn't make it to Davos last month (my invitation evidently got lost in the mail this year ... as it has every year, come to think of it), another gathering this summer will cover much of the same ground. The 18th World Congress of the International Sociological Association -- meeting in Yokohama, Japan, in mid-July -- has as its theme "Facing an Unequal World: Challenges for Global Sociology." The scheduling of their events notwithstanding, it was the sociologists who were really farsighted about the issue of growing inequality, not the "Davos men." The ISA announced the theme for its congress as early as December 2010.

And the conversation in Japan is sure to be more focused and substantive. A lot of business networking goes on during the World Economic Forum. By some accounts, the topic of inequality figured more prominently in the news releases than in actual discussions among participants. It's almost as if all of Bono's efforts at Davos were for nought.

Available a solid six months before the sociologists put their heads together in Yokohama, Göran Therborn's The Killing Fields of Inequality (Polity) ought to steer the public's thinking into deeper waters than anything that can be reached with a reductive notion like "widening income disparities." Money provides one measure of inequality, but so do biomedical statistics, which record what Therborn, a professor emeritus of sociology at the University of Cambridge, calls "vital inequality." (Income disparities fall under the heading of "resource inequalities," along with disparities in access to nutrition, education, and other necessities of life.)

A third, less quantifiable matter is "existential inequality," which Therborn defines as "the unequal allocation of personhood, i.e., of autonomy, dignity, degrees of freedom, and of rights to respect and self-development." A big-tent concept of Therborn's own making, existential inequality covers the limitations and humiliations imposed by racism, sexism, and homophobia but also the experience of "people with handicaps and disabilities or just the indigent overlorded by poorhouse wardens or condescending socio-medical powerholders," among others.

While analytically distinct, the three forms of inequality tend to be mutually reinforcing, often in perfectly understandable but no less miserable ways: "Nationwide U.S. surveys of the last decade show that the lower the income of their parents, the worse is the health of the children, whether measured in overall health assessment, limitations on activity, school absence for illness, emergency ward visits, or hospital days."

The differences in health between the offspring of well-off and low-income parents "have been measured from the child's age of two, and the differentials then grow with age." A study of mortality rates among men in Central and East European countries shows a pattern of higher education corresponding to a longer life; men with only a primary education not only died earlier but were more prone to longstanding illnesses. (The patterns among women were comparable "but differentials are smaller, less than half the male average.")

Such inequalities within countries look small compared to those between countries, of course -- and Therborn piles up the examples of so many varieties of inequality from such diverse places that it becomes, after a while, either numbing or unbearable. Generalization is hazardous, but the pattern seems to be that a considerable variety of inequalities, both inter- and intranational, has sharpened over the past 30 years or so. Not even the author's own country of origin, Sweden -- so long the promised land for social democrats -- has been spared. Therborn's study of income developments in the Stockholm Metropolitan area between 1991 and 2010 showed that "the less affluent 80 percent of the population saw their income share decline, while the most prosperous 10 percent had their share augmented from 25 to 32 percent."

Furthermore, the share of the income that top tenth earned from playing the Stockholm Stock Exchange grew 282 percent over the same period. In Sweden as elsewhere, "the top side of intra-national inequality is driven primarily by capital expansion and concentration, and that at the bottom by (politically alterable) policies to keep the poor down and softened up to accept anything."

It seems unlikely that the CEOs, financiers, and politicians at Davos ever had it put to them quite like that. But Therborn seems equally unhappy with his own discipline, which he thinks has somehow managed to dodge thinking about inequality as such.

"Among the fifty odd Research Committees of the International Sociological Association," he writes, "there is not one focused on inequality." The closest approximation is the one on "Social Stratification," which he says "has mainly been interested in intergenerational social mobility."

That mobility having been, for the most part, upwards. But the distance from the bottom of society to its top verges ever more on the dystopian. In a rare flourish, Therborn invokes the alternative: "the positive lure of enlightened societies governed by rational and inclusive deliberation, where nobody is outcast or humiliated, and where everybody has a chance to develop his/her abilities."

To reach it, or even to move in that direction, implies a battle. "Nobody knows how it will end," he concludes. "Which side will you be on?"

I don't think he's asking just the people who will be there in Yokohama this summer.


Essay on 'Creditocracy' by Andrew Ross and 'The Falling Rate of Learning and the Neoliberal Endgame' by David J. Blacker

Intellectual Affairs

Two great models of eloquence in the English language are The Book of Common Prayer and the translation of the Bible usually called the King James Version. A memorable passage that appears in both volumes crossed my mind as I was thinking about a couple of recent works of social criticism. (It also happens that Princeton University Press recently brought out The Book of Common Prayer: A Biography by Alan Jacobs, a professor of humanities at Baylor University, which a couple of readers have highly recommended.)

The text in question appears a couple of times in the New Testament as part of what's usually called "the Lord's Prayer." The Book of Common Prayer, the older of the two volumes, renders one line of the prayer as "Forgive us our trespasses, as we forgive those who trespass against us." The KJV rendering says, "Forgive us our debts, as we forgive our debtors."

To my ear, "trespasses" works better rhythmically, and it expresses the notion of "sin" or "offense" in a slightly more elegant manner. By contrast, "debt" or "debtor" expresses the same thought in a blunt and harsh way, and even conjures the old cartoon image of St. Peter recording good and evil deeds in a big ledger at the gates of heaven. Puzzled by the contrast, I consulted an extremely literal translation by J.N. Darby -- a Victorian Biblical scholar of uncompromising severity -- who suggests that "debt" is indeed what the original text says.  

Around the time Darby was working on his translation, Friedrich Nietzsche fleshed out an argument about the interrelationship among guilt, debt, and memory. Bringing up an atheist philosopher pretty well guarantees someone is now offended. But The Genealogy of Morals spells out in bleak and somewhat lurid terms a point left implicit in the prayer: The debtor is at the mercy of the creditor, who has the right (or at least the power) to inflict suffering -- even bloody revenge -- when payment is not made.

Whatever else it may signify, the brutal connotations of "debt" make forgiveness sound much more demanding and consequential than "trespass" would imply. (Awkward recollection: Learning the prayer as a little kid, I pictured God being unhappy that people were ignoring a sign on His lawn.)

Homo economicus never spent all that much time on moral accounting. But at least the old bourgeois virtues included restraint and a residual belief that self-interest was justified insofar as it served a larger good. The issues that concern Andrew Ross in his new book Creditocracy (discussed in last week's column) unfold in a world where debt itself is a kind of demigod, answerable to no higher power of any kind -- and certainly not to the state.

As the example of credit-default swaps on subprime mortgages in the go-go '00s made clear, the alchemists of finance are able to create profitable investment opportunities out of the risk (i.e., the degree of likelihood) of non-repayment -- making possible the creation of enormous fortunes from loans that cannot be repaid, at least not in full. That is but one link in a complex chain of debt-creation. Should the speculative bubble burst, the job of preventing economic meltdown falls to the government (which already has its own deficits, of course) at whatever risk to allocations for education, infrastructure, etc.

Add to it an average household debt that, Ross notes, grew from 43 percent of gross domestic product in 1980 to 97 percent in 2008 -- across three decades of stagnating wages. Throughout that period, 60 percent of income gains went to the country's wealthiest 1 percent -- a trend that changed dramatically when the economic crisis hit. Since then, 95 percent of income gains have gone to that debt-creating (if not job-creating) sliver.

David J. Blacker, a professor of philosophy of education and legal studies at the University of Delaware, characterizes the situation with a simple image in The Falling Rate of Learning and the Neoliberal Endgame (Zero Books):

"Imagine a casino in which you play with the house money and if you win you get to keep all the winnings to yourself, whereas if you lose, the house covers your bets. The literally astronomical public sums required to continue this arrangement for the minutest percentage of the population is the proximal cause of the squeeze on public resources. Schoolchildren, the poor, the sick, the disabled, the elderly etc., must all sacrifice so elites no longer have to undergo the risks that are officially supposed to be inherent in their role as fearless capitalist risk-takers. ..."  But genuine competition and risk are reserved "for small businesses and other little people like private and public sector employees."

Ross responds to the debt-driven status quo by challenging a whole series of moral reflexes that have traditionally accompanied debt: the feelings of obligation and culpability, of shame and implied weakness, that the prayer rendered in the King James translation takes as a given. When access to socially necessary goods (particularly higher education) is restricted or undermined by an economy making debt all but inescapable for countless people, someone ought to feel guilty when students default on their loans -- just not the students themselves. The next step is to call for large-scale fiscal disobedience: a social movement of millions of people pledging to default on their student loans. On the far side of that and other radical confrontations with the debt machine, Ross conceives the possibility of morally sound, humanely responsible systems of finance, based on communitarian social forms. Not utopia, perhaps, but a long way from here.

Massive default is a strategy I find it easier to admire, or at least to daydream about, than to recommend. It is not impossible that a million people might make such a pledge. Carrying out the action is another matter -- and if only a fraction see it through, the result is bound to be martyrdom of an uninspiring and ineffectual kind. In any case, I have no student debt to default on in solidarity, and calling for others to do so would be a case of telling them, "Let's you and him go fight."

Like Creditocracy, David Blacker's book was written in the wake of Occupy Wall Street. But where Ross occasionally sounds like Pierre-Joseph Proudhon -- with his vision of a mutualist society of small producers, exchanging goods and services with a new form of money that doesn't promote inequality -- Blacker thinks along much more classically Marxist lines. The predatory forms of financial speculation that led to the crisis five years ago will not be regulated out of existence, nor are they deviations or tumors growing on a fundamentally healthy economy. The casino will keep rewarding the high rollers when they win and shaking the rest of society down when they lose. Such investment in manufacture as continues to be made will need workers with skills and the capacity to adapt to technological developments -- but ever fewer of them.

Most of the population will be an object of social control, rather than of schooling proper. At some level most of us sense this already, making the whole notion of "education as investment in the future" an ever more problematic principle. Blacker has written probably the gloomiest book I have read in years, but in some ways it seems like a practical one. He is not a survivalist. He thinks pedagogy still has a role, provided it's geared to understanding the dire probabilities and finding ways to respond to them. It helps that Blacker is a sharp and forceful writer, giving his analysis something of the vividness and urgency of an Old Testament prophet delivering warnings that nobody really wants to hear.


Review of Peter Charles Hoffer, 'Clio Among the Muses: Essays on History and the Humanities'

Intellectual Affairs

The nine muses are a motley bunch. We’ve boiled them down into a generic symbol for inspiration: a toga-clad young woman, possibly plucking a string instrument. But in mythology they oversaw an odd combination of arts and sciences. They were sisters, which allegorically implies a kinship among their fields of expertise. If so, the connections are hard to see.

Six of them divvied up the classical literary, dramatic, and musical genres – working multimedia in the cases of Erato (inventor of the lyre and of love poetry) and Euterpe (who played the flute and inspired elegiac songs and poems). The other three muses handled choreography, astronomy, and history. That leaves an awful lot of creative and intellectual endeavor completely unsupervised. Then again it’s possible that Calliope has become a sort of roaming interdisciplinary adjunct muse, since there are so few epic poets around for her to inspire these days.

An updated pantheon is certainly implied by Peter Charles Hoffer’s Clio Among the Muses: Essays on History and the Humanities (New York University Press). Clio, the demi-goddess in charge of history, is traditionally depicted with a scroll or a book. But as portrayed by Hoffer -- a professor of history at the University of Georgia – she is in regular communication with her peers in philosophy, law, the social sciences, and policy studies. I picture her juggling tablet, laptop and cellphone, in the contemporary manner.

Ten years ago Hoffer published Past Imperfect, a volume assessing professional misconduct by American historians. The book was all too timely, appearing as it did in the wake of some highly publicized cases of plagiarism and fraud. But Hoffer went beyond exposé and denunciation. He discussed the biases and sometimes shady practices of several well-respected American historians over the previous 200 years. By putting the recent cases of malfeasance into a broader context, Hoffer was not excusing them; on the contrary, he was clearly frustrated with colleagues who minimized the importance of dealing with the case of someone like Michael Bellesiles, a historian who fabricated evidence. But he also recognized that history itself, as a discipline, had a history. Even work that seemed perfectly sound might be shot through with problems only visible with the passing of time.

While by no means a sequel, Clio Among the Muses continues the earlier book’s effort to explain that revisionism is not a challenge to historical knowledge, but rather intrinsic to the whole effort to establish that knowledge in the first place. “If historians are fallible,” Hoffer writes, “there is no dogma in history itself, no hidden agenda, no sacred forms – not any that really matter – that are proof against revision… Worthwhile historical scholarship is based on a gentle gradualism, a piling up of factual knowledge, a sifting and reframing of analytical models, an ongoing collective enterprise that unites generation after generation of scholars to their readers and listeners.”

Hoffer’s strategy is to improve the public’s appreciation of history by introducing it to the elements of historiography. (That being the all-too-technical term for the history of what historians do, in all its methodological knottiness.) One way to do so would be through a comprehensive narrative, such as Harry Elmer Barnes offered in A History of Historical Writing (1937), a work of terrific erudition and no little tedium. Fortunately Hoffer took a different route.

Clio Among the Muses instead sketches the back-and-forth exchanges between history and other institutions and fields of study: religion, philosophy, law, literature, and public policy, among others. Historians explore the topics, and use the tools, created in these other domains. At the same time, historical research can exert pressure on, say, how a religious scripture is interpreted or a law is applied.

Clio’s dealings with her sisters are not always happy. One clear example is a passage Hoffer quotes from Charles Beard, addressing his colleagues at a meeting of the American Historical Association in 1933: “The philosopher, possessing little or no acquaintance with history, sometimes pretends to expound the inner secret of history, but the historian turns upon him and expounds the secret of the philosopher, as far as it may be expounded at all, by placing him in relation to the movement of ideas and interests in which he stands or floats, by giving to his scheme of thought its appropriate relativity.”

Sibling rivalry? The relationships are complicated, anyway, and Hoffer has his hands full trying to portray them. The essays are learned but fairly genial, and somehow not bogged down by the fundamental impossibility of what the author is trying to do. He covers the relationship between history and the social sciences – all of them – in just under two dozen pages. Like Evel Knievel jumping a canyon, you have to respect the fact that, knowing the odds, he just went ahead with it.

But then, one of Hoffer’s remarks suggests that keeping one’s nerve is what his profession ultimately requires:

“Historical writing is not an exercise in logical argument so much as an exercise in creative imagination. Historians try to do the impossible: retrieve an ever-receding and thus never reachable past. Given that the task is impossible, one cannot be surprised that historians must occasionally use fallacy – hasty generalization, weak analogy, counterfactual hypotheticals, incomplete comparisons, and even jumping around in past time and space to glimpse the otherwise invisible yesteryear.”

And if they did not do so, we’d see very little of it at all.


Interview with Tim Lacy on 'The Dream of a Democratic Culture: Mortimer J. Adler and the Great Books Idea'

Originally published by Encyclopedia Britannica in 1952, Great Books of the Western World offered a selection of core texts representing the highest achievements of European and North American culture. That was the ambition. But today the set is perhaps best remembered as a peculiar episode in the history of furniture.

Many an American living room displayed its 54 volumes -- “monuments of unageing intellect,” to borrow a phrase from Yeats. (The poet himself, alas, did not make the grade as Great.) When it first appeared, the set cost $249.50, the equivalent of about $2,200 today. It was a shrewd investment in cultural capital, or at least it could be, since the dividends came only from reading the books. Mortimer Adler – the philosopher and cultural impresario who envisioned the series in the early 1940s and led it through publication and beyond, into a host of spinoff projects – saw the Great Books authors as engaged in a Great Conversation across the centuries, enriching the meaning of each work and making it “endlessly rereadable.”

Adler's vision must have sounded enticing when explained by the Britannica salesman during a house call. Also enticing: the package deal, with Bible and specially designed bookcase, all for $10 down and $10 per month. But with some texts the accent was on endless more than rereadable (the fruits of ancient biological and medical research, for example, are dry and stony) and it is a good bet that many Great Books remained all but untouched by human hands.

Well, that’s one way to tell the Great Books story: High culture meets commodity fetishism amidst Cold War anxiety over the state of American education. But Tim Lacy gives a far more generous and considerably more complex analysis of the phenomenon in The Dream of a Democratic Culture: Mortimer J. Adler and the Great Books Idea, just published by Palgrave Macmillan. The book provides many unflattering details about how Adler’s pedagogical ambitions were packaged and marketed, including practices shady enough to have drawn Federal Trade Commission censure in the 1970s. (These included bogus contests, luring people into "advertising research analysis surveys" that turned into sales presentations, and misleading "bundling" of additional Great Books-related products without making clear the additional expense.) At the same time, it makes clear that Adler had more in mind than providing a codified and “branded” set of masterpieces that the reader should passively absorb (or trudge through, as the case may be).

The Dream of a Democratic Culture started life as a dissertation at Loyola University in Chicago, where Lacy is currently an academic adviser at the university’s Stritch School of Medicine. In its final pages, he describes the life-changing impact on him, some 20 years ago, of studying Adler’s How to Read a Book (1940), a longtime bestseller. He owns and is reading his way through the Great Books set, and his study reflects close attention to Adler’s own writings and the various supplementary Great Books projects. But in analyzing the life and work of “the Great Bookie,” as one of Adler’s friends dubbed him, Lacy is never merely celebratory. In the final dozen years or so before his death in 2001, Adler became one of the more splenetic culture warriors – saying, for example, that the reason no black authors appeared in the expanded 1990 edition of the Great Books was because they “didn’t write any good books.”

Other such late pronouncements have been all too memorable -- but Lacy, without excusing them, makes a case that they ought not to be treated as Adler’s definitive statements. On the contrary, they seem to betray principles expressed earlier in his career. Lacy stops short of diagnosing the aging philosopher’s bigoted remarks as evidence of declining mental powers, though it is surely a tempting explanation. Then again, working at a medical school would probably leave a non-doctor chary about that sort of thing.

I found The Dream of a Democratic Culture absorbing and was glad to be able to interview the author about it by email; the transcript follows. Between questions, I looked around a used-books website to check out what the market in secondhand copies of Great Books of the Western World is like. One listing for the original 1952 edition is especially appealing, and not just because of its price (under $250, in today’s currency). “The whole set is in very good condition,” the bookseller writes, “i.e., not read at all.”

Q: How did your personal encounter with the Great Books turn into a scholarly project?

A: I started my graduate studies in history, at Loyola University Chicago, during the 1997-98 academic year. My initial plan was to work on U.S. cultural history, zooming in on either urban environmental history or intellectual history in an urban context. I was going to earn an M.A. and then see about my possibilities for a Ph.D. program.

By the end of 1998 the only thing that had become clear to me was that I was confused. I had accumulated some debt and a little bit of coursework, but I needed a break to rethink my options. I took a leave of absence for the 1999 calendar year. During that period I decided three things: (1) I wanted to stay at Loyola for my Ph.D. work; (2) Environmental history was not going to work for me there; (3) Cultural and intellectual history would work for me, but I would need to choose my M.A. thesis carefully to make it work for doctoral studies.

Alongside this intense re-education in the discipline of history I had maintained, all through the 1997 to 1999 period, my reading of the Britannica's Great Books set. I had also accumulated more books on Adler, including his two autobiographies, during stress-relief forays into Chicago's most excellent used bookstore scene. Given Adler's Chicago connections, one almost always saw two or three of his works in the philosophy sections of these stores.

During a cold December day in 1999, while sitting in a Rogers Park coffee shop near Loyola, this all came together in a sudden caffeine-laced epiphany: Why not propose the Great Books themselves as the big project for my graduate study? I sat on the idea for a few days, both thinking about all the directions I could take for research and pounding myself on the head for not having thought of the project sooner. I knew at this point that Adler hadn't been studied much, and I had a sense that this could be a career's worth of work.

The project was going to bring together professional and personal interests in a way that I had not imagined possible when thinking about graduate school.

Q: Did you meet any resistance to working on Adler and the Great Books? They aren’t exactly held in the highest academic esteem.

A: The first resistance came late in graduate school, and after, when I began sending papers, based on my work, out to journals for potential publication. There I ran into some surprising resistance, in two ways. First, I noticed a strong reluctance toward acknowledging Adler's contributions to American intellectual life. As is evident in my work and in the writings of others (notably Joan Shelley Rubin and Lawrence Levine, and more recently Alex Beam), Adler had made a number of enemies in the academy, especially in philosophy. But I had expected some resistance there. I know Adler was brusque, and had written negatively about the increasing specialization of the academy (especially in philosophy but also in the social sciences) over the course of the 20th century.

The second line of resistance, which was somewhat more surprising, came because I took a revisionist, positive outlook on the real and potential contributions of the great books idea. Of course this resistance linked back to Adler, who late in his life -- in concert with conservative culture warriors -- declared that the canon was set and not revisable. Some of the biggest promoters of the great books idea had, ironically, made it unpalatable to a great number of intellectuals. I hadn't anticipated the fact that Adler and the Great Books were so tightly intertwined, synonymous even, in the minds of many academics.

Q: Selecting a core set of texts was only part of Adler's pedagogical program. Your account shows that it encompassed a range of forms of instruction, in various venues (on television and in newspapers as well as in classrooms and people’s homes). The teaching was, or is, pitched at people of diverse age groups, social backgrounds, and so on -- with an understanding that there are numerous ways of engaging with the material. Would you say something about that?

A: The great books idea in education -- whether higher, secondary, or even primary -- was seen by its promoters as intellectually romantic, adventurous even. It involved adults and younger students tackling primary texts instead of textbooks. As conceived by Adler and Hutchins, the great books idea focused people on lively discussion rather than boring Ben Stein-style droning lectures, or PowerPoints, or uninspiring, lowest-common-denominator student-led group work.

One can of course pick up bits of E.D. Hirsch-style "cultural literacy" (e.g., important places, names, dates, references, and trivia) through reading great books, or even acquire deeper notes of cultural capital as described in John Guillory's excellent but complex work, Cultural Capital: The Problem of Literary Canon Formation (1993). But the deepest goal of Adler's model of close reading was to lead everyday people into the high-stakes world of ideas. This was no mere transaction in a "marketplace of ideas," but a full-fledged dialogue wherein one brought all her or his intellectual tools to the workbench.

Adler, Hutchins, John Erskine, Jacques Barzun, and Clifton Fadiman prided themselves on being good discussion leaders, but most promoters also believed that this kind of leadership could be passed to others. Indeed, the Great Books Foundation trained (and still trains) people to lead seminars in a way that would've pleased Erskine and Adler. Education credentials matter to institutions, but the Foundation was willing to train people off the street to lead great books reading groups.

This points to the fact that the excellent books by famous authors promoted by the great books movement, and the romance inherent in the world of ideas, mattered more than the personality or skill of any one discussion moderator. All could access an engagement with excellence, and that excellence could manifest in texts from a diverse array of authors.

Q: It seems like the tragedy of Adler is that he had this generous, capacious notion that could be called the Great Books as a sort of shorthand – but what he's remembered for is just the most tangible and commodified element of it. A victim of his own commercial success?

A: Your take on the tragedy of Adler is pretty much mine. Given his lifelong association with the great books project, his late-life failings almost guaranteed that the larger great books idea would be lost in the mess of both his temporary racism and promotion of Britannica's cultural commodity. The idea came to be seen as a mere byproduct of his promotional ability. The more admirable, important, and flexible project of close readings, critical thinking, and good citizenship devolved into a sad Culture Wars spectacle of sniping about race, class, and gender. This is why I tried, in my "Coda and Conclusion," to end on a more upbeat note by discussing the excellent work of Earl Shorris and my own positive adventures with great books and Adler's work.

Q: Was it obvious to you from the start that writing about Adler would entail a sort of prehistory of the culture wars, or did that realization come later?

A: At first I thought I would be exploring Adler's early work on the great books during my graduate studies. I saw myself intensely studying the 1920s-1950s period. Indeed, that's all I covered for my master's project, which was completed in 2002.

However, I began to see the Culture Wars more clearly as I began to think in more detail about the dissertation. It was right around this time that I wrote a short, exploratory paper on Adler's 1980s-era Paideia Project. When I mapped Paideia in relation to "A Nation at Risk" and William Bennett, I began to see that my project would have to cover Bloom, the Stanford Affair, and the 1990 release of the second edition of Britannica's set. Around the same time I also wrote a paper on Adler's late 1960s books. When I noticed the correlation between his reactions to "The Sixties" and those of conservative culture warriors, it was plain to me that I would have to explore Adler as the culture warrior.

So even though I never set out to write about the Culture Wars, I got excited when I realized how little had been done on the topic, and that the historiography was thin. My focus would limit my exploration (unlike Andrew Hartman's forthcoming study), but I was pleased to know that I might be hanging around with a vanguard of scholars doing recent history on the Culture Wars.

Q: While Adler’s response to the upheaval of the 1960s was not enthusiastic, he was also quite contemptuous of Allan Bloom’s The Closing of the American Mind. How aware of Bloom's book and its aftermath were you when you bought and started reading the Great Books?

A: Honestly, I had little knowledge of Allan Bloom or his ubiquitous The Closing of the American Mind until the mid-1990s. This requires a little background explanation. I started college in 1989 and finished in 1994. As a small-town Midwestern teenager and late-1980s high schooler, I was something of a rube when I started college. I was only vaguely aware, in 1989, that there was even a culture war ongoing out there (except in relation to HIV and AIDS).

I'm ashamed to admit, now, how unaware I was of the cultural scene generally. Moreover, I was insulated from some of it, and its intensity, during my early college years, when it was at its height, because I began college as an engineering student. Not only was my area of study far outside the humanities, but the intensity of coursework in engineering also sheltered me from all news beyond sports (my news-reading outlet at the time). Even when I began to see that engineering wasn't for me, around 1992, my (then) vocational view of college caused me to move to chemistry rather than a humanities subject.

My own rudimentary philosophy of education kept me from thinking more about the Culture Wars until my last few years as a college student. It was then that I first heard about Bloom and his book. Even so, I only read passages in it, through the work of others, until I bought a copy of the book around 2000. I didn't read The Closing of the American Mind, word-for-word, until around 2003-04 while dissertating.

Q: There was no love lost between Adler and Bloom – you make that clear!

A: In my book you can see that Adler really wanted it known that he believed Leo Strauss and all his disciples, especially Bloom, were elitists. Adler believed that the knowledge (philosophy, history, theology, psychology, etc.) contained in great books was accessible to all. While scholarship and the knowledge of elites could add to what one gained from reading great books, there was a great deal in those works that was accessible to the common man and hence available to make better citizens.

So while Adler was sort of a comic-book character, you might say he was a clown for democratic citizenship -- a deceptively smart clown champion for democratizing knowledge and for raising the bar on intelligent discourse. This analogy is faulty, however, because of the intensity and seriousness with which he approached his intellectual endeavors. He loved debate with those who were sincerely engaged in his favorite topics (political philosophy, education, common sense philosophy, etc.).

I see only advantages in the fact that I was not personally or consistently engaged in the culture wars of the late 1980s and early 1990s. It has given me an objective distance, emotionally and intellectually, that I never believed possible for someone working on a topic that had occurred in her/his lifetime. Even though I started graduate school as something of a cultural and religious conservative (this is another story), I never felt invested in making my developing story into something that affirmed my beliefs about religion, culture, and America in general.

A belief that tradition and history had something to offer people today led me to the great books, but that did not confine me to a specific belief about what great books could, or should, offer people today. I was into great books for the intellectual challenge and personal development as a thinker, not for what great books could tell me about today's political, social, cultural, and intellectual scene.

Q: You defend Adler and the Great Books without being defensive, and I take it that you hope your book might help undo some of the damage to the reputation of each -- damage done by Adler himself, arguably, as much as by those who denounced him. But is that really possible, at this late a date? Won’t it take a generation or two? Or is there something about Adler's work that can be revived sooner, or even now?

A: Thank you very much for the compliment in your distinction about defending and being defensive. I did indeed seek to revise the way in which Adler is covered in the historiography. Because most other accounts about him have been, in the main, mocking and condescending, any revisionary project like mine would necessarily have to be more positive -- to inhabit his projects and work, which could result in something that might appear defensive. I think my mentor, Lewis Erenberg, and others will confirm that I did not always strike the right tone in my early work. It was a phase I had to work through to arrive at a mature, professional take on the whole of Adler's life and the Great Books Movement.

As for salvaging Adler's work as a whole, I don't know if that's possible. Some of it is dated and highly contextual. But there is much worth reviewing and studying in his corpus. My historical biography, focused on the great books in the United States, makes some headway in that area.

Some of Adler's other thinking about great books on the international scene will make it into a manuscript, on which I'm currently working, about the transnational history of the great books idea. If all goes well (fingers crossed), that piece will be paired with another by a philosopher and published as "The Great Books Controversy" in a series edited by Jonathan Zimmerman and Randall Curren.

I think a larger book on Adler's work in philosophy is needed, especially his work in his own Institute for Philosophical Research. I don't know if my current professional situation will give me the time and resources to accomplish much more on Adler. And even if my work situation evolves, I do have interests in other historical areas (anti-intellectualism, Chicago's intellectual history, a Jacques Maritain-in-America project). Finally, I also need to keep up my hobby of reading more great books!


Review of Janet Roitman, 'Anti-Crisis'

The Chinese word for “crisis,” as generations of commencement speakers have reminded us, is written using the same character as “opportunity.” Whatever inspirational quality this chestnut may possess does not grow with repetition – and it is a curmudgeonly pleasure to learn that it’s wrong, or at best only fractionally true.

In fact both “crisis” and “opportunity” are written with two characters. The one they share can mean “quick-witted” or “device,” depending on context, and can be combined with another glyph to write “airplane.” (An airplane is uplifting, albeit not motivationally.) And Victor H. Mair, the professor of Chinese at the University of Pennsylvania who debunked this hardy linguistic urban legend, points out that apart from the Sinological blunder, it’s terrible advice: “Any would-be guru who advocates opportunism in the face of crisis should be run out of town on a rail, for his/her advice will only compound the danger of the crisis.”

But you don’t uproot a cultural weed all that easily -- especially not when crisis-mindedness has become the norm. That’s a paradox, but it’s also indisputable. A quick search of Google News finds 89.5 million articles with the word “crisis” in them as of this writing. Rhetorical inflation has a lot to do with it, of course. But it’s also the long-term effect of a state of mind that Susan Sontag characterized so well in an essay from 1988: “A permanent modern scenario: apocalypse looms … and it doesn’t occur. And it still looms. […] Modern life accustoms us to live with the intermittent awareness of monstrous, unthinkable – but, we are told, quite probable – disasters.”

The instances she had in mind were the threat of nuclear war and the AIDS epidemic. In the 25 years since, neither has disappeared, though other catastrophes (actual and potential) have moved to the fore. The crises change, but not the structure of feeling.

Anti-Crisis by Janet Roitman, published by Duke University Press, digs deeper than Sontag’s comments on apocalypse fatigue. Roitman, an associate professor of anthropology at the New School, approaches the ongoing discussion of the subprime mortgage "crisis" (as it’s hard not to think of it) with questions about the assumptions and implicit limitations of a word so ubiquitous that it is normally taken for granted.

She does so by way of the late Reinhart Koselleck’s approach to intellectual history, known by a term even some of his English-language commentators have preferred to leave untranslated: Begriffsgeschichte. No way am I going to try to type that again, so let’s just refer to it as “conceptual history.” But arguably use of the full Teutonic monty is justified in order to distinguish Koselleck’s work from what, in the Anglo-American tradition, is called the history of ideas.

As Koselleck writes in an entry for a major conceptual-history handbook on social and political ideas, the term “crisis” played an important role in the work of the Young Hegelians, who took their master’s thinking about the philosophy of history as a starting point for the critique of existing institutions. Given that a key term in Hegel’s system is Begriff (the Concept) and that one of the Young Hegelians was Karl Marx, who maintained that recurrent crisis was an inescapable part of the history of capitalism itself – well, given all that, it’s possible to see how the word Begriffsgeschichte might carry layers of implication soon lost in translation.

The argument of Anti-Crisis is nothing if not oblique, and self-reflexive to boot, so paraphrasing it seems a fool’s errand. It is a good idea to grapple with Koselleck’s essay on crisis before reading Roitman’s book (so I learned, the hard way), and there will be no hard feelings on my part if you do so before finishing this column.

So now to run that errand. For Roitman, "crisis" is not simply a clichéd label for -- among other things -- recent economic developments, but a fraught and dubious concept. The word itself has roots in an ancient Greek medical term referring to the phase of an illness which will either kill the patient or end in recovery. It came into frequent use to describe social, political, and cultural phenomena beginning late in the 18th century -- one element in a very complex series of shifts of meaning between religious concepts of social and cosmic order and a (seemingly?) secular pattern of life.

The French Revolution, with the spectacle of comprehensive upheaval, doubtless made the word especially vivid. But Koselleck also cites Thomas Paine’s The Crisis, from 1776.  “To Paine, the War of Independence was no mere political or military event,” he writes; “rather it was the completion of a universal world historical process, the final Day of Judgment that would entail the end of all tyranny and the ultimate victory over hell... .”

In sum, then, “crisis” came to possess a range of theological, political, and other connotations. Calling something a crisis implies its urgency or consequentiality. But it also posits that elements of the crisis are intelligible. They are the effects of departures from a norm, or aspects of the unfolding of some grand narrative. The crisis has causes, which we can discover. It has effects, which we begin to interpret even while enduring them.

“Crisis is a blind spot that enables the production of knowledge,” writes Roitman. “… More precisely, it is a distinction that secures ‘a world’ for observation.” The process rests upon “a distinction that generates and refers to an ‘inviolate level’ of order (not crisis)” that “is seen to be contingent (historical crises) and yet is likewise posited as beyond the play of contingency, being a logical necessity that is affirmed in paradox (the formal possibility of crisis).”

Now, assuming I understand her argument correctly, Roitman holds that calling the great vertigo of financial free-fall a few years ago a “crisis” comes at the risk of assuming we understand what it was, how it happened, and why.

That, in turn, posits that our ideas and information are adequate to the task: that government regulation distorts the healthy functioning of the marketplace (if you’re a neoclassicist), or that insufficient government regulation tips the market advantage to the unscrupulous (if you’re Keynes-minded), or that crisis is built into capitalism because of the tendency of the rate of profit to fall (as Marx believed, or didn’t believe, depending on which Marxist you ask).

The problem, in any case, is that the causal explanations now available rest on understandings of the economy that don’t take into account how crises (or, rather, judgments about the risk of crisis) are not only a factor in how decisions are made in financial markets but also operate within the instruments involved in the functioning of those markets.

Derivatives and credit default swaps are the examples everyone has heard of by now, at least. More have been invented, and still more will be. Risk management is a thriving field. So can we judge something to be a crisis when expectations of crisis (and of profit from crisis) are operational – and bound to become more so? That isn’t a rhetorical question. I have no idea one way or the other, and if Anti-Crisis answers it, I did not mark the page.

“We persevere,” the author says, “in the hope that we can perceive the moments when history is alienated in terms of its philosophy – that is, that we can perceive a dissonance between historical events and representations…. We are left in a chasm: perplexed and immobilized by the supposed radical dissonance between the value of houses and the value of derivatives of houses.”

Perplexed? Yes. Immobilized? Not necessarily. (Epistemologically induced paralysis is only one of the possible responses to a foreclosed mortgage.) I respect Anti-Crisis for making me think hard, even if it occasionally felt like thinking in circles. Meanwhile, it turns out that Simon & Schuster will be publishing something now listed simply as Untitled Financial Crisis Book, appearing under the company’s Books for Young Readers imprint in early 2015. Whatever baggage its conceptual history has saddled it with, the notion of crisis seems to be making itself very much at home.
