Author discusses new book on history of area studies and Middle East studies

Author discusses his new book on the evolution of Middle East studies in particular and area studies more broadly.

Review of Michael Shermer, "Skeptic: Viewing the World With a Rational Eye"

The one really startling moment in Michael Shermer’s Skeptic: Viewing the World With a Rational Eye (Henry Holt and Company) comes in chapter 38, which begins with a description of his abduction by, presumably, space aliens. This was in 1983, “in the wee hours of the morning, somewhere along a lonely rural highway approaching Haigler, Nebraska.”

That it took place near Haigler, Neb., is not the surprising part, of course: the extraterrestrials show a definite preference for such locales. Nobody ever gets abducted from the White House lawn. (They must have their reasons.)

But by chapter 38 -- each chapter being a stand-alone essay originally appearing in Shermer’s monthly column for Scientific American -- the author has debunked Atlantis, The Bible Code, Bigfoot and psychic powers, as well as several “alternative medicine” practices now prevalent enough, and in some cases dangerous enough, to make you wonder what the Food and Drug Administration people actually do all day. The polemics against pseudoscience appear alongside explanations of, and reflections on, science proper. They express the confidence of a rational mind in an intelligible universe -- and that context makes the author’s Nebraska interlude seem doubly odd.

“[A] large craft with bright lights overtook me and forced me to the side of the road. Alien beings exited the craft and abducted me for 90 minutes, after which I found myself back on the road with no memory of what transpired inside the ship. The experience was real and I can prove that it happened because I recounted it to a film crew shortly after, and I am still in contact with some of the aliens.”

Scientific American must have received some strange letters to the editor after that month’s column, I’d imagine, but Shermer’s close encounter can be explained in fairly mundane terms, which we’ll get to in a moment.

Shermer’s willingness to go on the record with the experience distinguishes his essays from the work of the late Martin Gardner, whose 1952 classic Fads and Fallacies in the Name of Science (Dover Publications) and other writings are the obvious point of reference for comparison. Their perspectives on the world have a large margin of overlap, and Gardner would find nothing to argue with when Shermer writes:

“The price to pay for liberty, in addition to eternal vigilance, is eternal patience with the vacuous blather occasionally expressed beneath the shield of free speech. … In a free society skeptics are the watchdogs of irrationalism -- the consumer advocates of bad ideas. Yet debunking is not simply the divestment of bunk; its utility is found in offering a viable alternative, along with a lesson on how thinking goes wrong.”

In another creedal statement, Shermer writes, “What does it mean to have an open mind? It is to find the essential balance between orthodoxy and heresy, between a total commitment to the status quo and the blind pursuit of new ideas, between being open-minded enough to accept radical new ideas and being so open-minded that your brains fall out. Skepticism is about finding that balance.”

For his part, Shermer acknowledges Gardner as a model. He also quotes with approval a point made by the science fiction writer and hard-science popularizer Isaac Asimov: “When people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.” (Shermer calls this Asimov’s Axiom.) The irreplaceable and epoch-defining power of scientific research forms the bedrock of each of these figures’ worldviews -- to be defended against relapses into magical thinking as much as possible. Frustrating the effort, if not rendering it completely pointless, is the mass media’s general disinclination to report on real science in any depth while covering the ersatz varieties with all the skill and dedication it can muster.

The situation is not entirely the fault of the executives and producers behind such pseudoreality TV fare as the program about the “controversy” over whether the Apollo moon landings were faked. (You decide!) For part of the problem, as Shermer complains, is that the work of explaining the sciences to a nonspecialist public is typically neglected and even despised by scientists themselves. Carl Sagan and Stephen Jay Gould were notable exceptions, but only at the cost of damage to their professional reputations.

Sagan and Gould are also among the inspirations mentioned in Skeptic, and a number of the essays carry on their work of popularizing the nature and the results of scientific inquiry. But more than a third of the pieces are dedicated to exposing and combating various kinds of quackery, crackpottery, delusions and outright fraud -- a string of updates and supplements to Gardner’s Fads and Fallacies in the Name of Science, in effect. Shermer indicates that he expects to bring out more essay collections in the future, and I greatly hope that when he does, the largest portion of them will be devoted to explaining and interpreting real scientific developments.

For pseudoscience, by contrast, does not develop -- not really. Bad ideas never truly appear or disappear; they just change form. Discrediting claims about lost continents or the mysterious energy of the crystals, unknown to physics, may be a matter of public intellectual hygiene, but at best the reader will learn some basic principles of critical thought (such as “after all this time, it makes sense to talk about the ape man once there’s a corpse to examine; otherwise let’s not”) which are expressed just as well in skeptical writings from years past. Long stretches of Shermer’s book are given over to such exercises in debunkery, and reading them was like watching a man shoot fish in a barrel -- dead fish, at that.

Much more interesting are the essays in which he confronts the factors disposing us to believe weird things. What can make the writing of avowed skeptics frustrating is the tendency to establish a neat dichotomy between the reasonable, rigorous, scientific, intelligent people (who don’t succumb to irrational beliefs) and the superstitious, slovenly, harebrained idiots (who do).

That gets boring to read after a while, and, more to the point, it is wrong. For, as Shermer aptly puts it, “smart people believe weird things because they are skilled at defending beliefs they arrived at for nonsmart reasons.” Indeed, cognitive skill and cognitive vulnerability may be linked in inconvenient ways.

“Humans evolved brains that are pattern-recognition machines,” writes Shermer, “designed to detect signals that enhance or threaten survival amid a very noisy world. Also known as association learning (associating A and B as causally connected), we are very good at it, or at least good enough to have survived and passed on the genes for the capacity of association learning. Unfortunately, the system has flaws. Superstitions are false associations -- A appears to be connected to B, but it is not (the baseball player who doesn’t shave and hits a home run). Las Vegas was built on false association learning.” Humans’ unique, language-fostered capacity to create and emit narrative adds another layer of complication and potential trouble: “Like all other animals, we evolved to connect the dots between events in nature to discern meaningful patterns for our survival. Like no other animals, we tell stories about the patterns we find.”

And it is a mark of our success as a species (so far) that we have not had to perfect the skills needed to recognize and break false associations: “The problem is that although true pattern recognition helps us survive, false pattern recognition does not necessarily get us killed, and so the overall phenomenon endured the winnowing process of natural selection.”

Although not explicitly framed as an example, the author’s abduction in Nebraska more than 30 years ago is relevant. Shermer mentions his love of long-distance biking in a number of essays, and in 1983 he was “in the opening days of the 3,100-mile nonstop transcontinental Race Across America,” accompanied by a support crew in a motor home.

After “83 straight hours and 1,259 miles” on his bike, Shermer says, the driver of the motor home flashed its headlights “and pulled alongside while my crew entreated me to take a sleep break.” In a waking dream state, his brain began calling up memories of a science-fiction program he’d watched in childhood: “Suddenly my support team was transmogrified into aliens,” he writes. After 90 minutes of sleep, Shermer found that they had returned to human form, and he then “recounted [the hallucination] to the ABC’s Wide World of Sports television crew” covering the race.

The circumstances were fortunate. He was able to recognize what had happened, to disentangle experience from reality, without trouble or distress. “But at the time the experience was real,” he writes, “and that’s the point. The human capacity for self-delusion is boundless, and the effects of belief are overpowering.” The lesson is better taught by someone who has experienced being overpowered than by someone who thinks of self-delusion as other people’s problem.

Author discusses his new book about why higher education matters

Author discusses his new book about why higher education still matters.

Review of Sara L. Crosby, "Poisonous Muse: The Female Poisoner and the Framing of Popular Authorship in Jacksonian America"

The mythological creature called the lamia is something like a hybrid of mermaid and vampire: a beautiful woman from the waist up, atop a serpent’s body, driven by an unwholesome appetite. The indispensable Brewer’s Dictionary of Phrase and Fable elaborates: “A female phantom, whose name was used by the Greeks and Romans as a bugbear to children. She was a Libyan queen beloved by Jupiter, but robbed of her offspring by the jealous Juno; and in consequence she vowed vengeance against all children, whom she delighted to entice and murder.”

Somewhere along the way, the Libyan queen’s proper name turned into the generic term for a whole subspecies of carnivorous nymph. Nor was the menu limited to children. In some tellings, the lamia could conceal her snaky side long enough to lure unwary human males to their deaths. (A femme fatale, if ever.) If the lamia outlived most of the other gods and monsters of antiquity in the Western cultural imagination, I suspect it is in part because of the coincidence that she embodies two aspects of Eden: the beguiling female and the deceiving reptile, merged, literally, into one.

That this overtly misogynistic image might ever have played a part in the political culture of the United States seems improbable -- a little less so in this election year, perhaps, though it remains difficult to picture. And it’s certainly true that the lamia underwent considerable mutation in crossing the Atlantic and finding a place in American literature and party politics. Sara L. Crosby’s Poisonous Muse: The Female Poisoner and the Framing of Popular Authorship in Jacksonian America (University of Iowa Press) follows the lamia’s transformation from the monster known to a classically educated elite to the sympathetic, vulnerable and all-too-human character accepted by the new mass public of early 19th-century America.

The author, an associate professor of English at Ohio State University at Marion, follows the lamia’s trail from antiquity (in Roman literature “she appeared as a dirty hermaphroditic witch who raped young men”) through the poetry of John Keats and on to such early American page-turners as The Female Land Pirate; or Awful, Mysterious, and Horrible Disclosures of Amanda Bannorris, Wife and Accomplice of Richard Bannorris, a Leader in That Terrible Band of Robbers and Murderers, Known Far and Wide as the Murrell Men. (Sample passage: “My whole nature was changed. All the dark passions of Hell seemed to have centered into one; that one into the core of my heart, and that one was revenge! REVENGE!! REVENGE!!!”) There are close readings of stories by Edgar Allan Poe and Nathaniel Hawthorne as well as of the case of Mrs. Hannah Kinney, alleged poisoner of husbands, acquitted after a trial that riveted the country’s newspaper readers.

From this array Crosby builds an argument in layers that may be synopsized roughly along these lines:

A standard version of the lamia story is presented by the Athenian author Philostratus in The Life of Apollonius of Tyana. A young philosopher named Menippus falls under the charms of “a foreign woman, who was good-looking and extremely dainty,” and to all appearances very wealthy as well. He prepares to marry her. Unfortunately, the older and wiser philosopher Apollonius intervenes in time to set the young man straight: “You are a fine youth and are hunted by fine women, but in this case you are cherishing a serpent, and a serpent cherishes you.” Menippus resists this advice, but Apollonius has a verbal showdown with the foreign lady and forces her to admit that she is a lamia, and places her under his control.

Menippus thus receives instruction on the difference between appearance and reality -- and in time to avoid being eaten. The situation can also be read as a kind of political fable: a wise authority figure intervenes to prevent a naïve young person from succumbing to the deceptive, seductive and destructive powers of a woman. For the figure of the lamia is congruent with a whole tradition of misogynistic attitudes, as expressed by the medieval theologian Albertus Magnus: “What [a woman] cannot get, she seeks to obtain through lying and diabolical deceptions. And so, to put it briefly, one must be on one’s guard with every woman, as if she were a poisonous snake and the horned devil.” (This is only one such passage Crosby cites by an authoritative figure maintaining that authority itself is endangered unless men with power practice epistemological as well as moral vigilance.)

But with his 1820 poem “Lamia,” John Keats offers a revisionist telling of the story. To wed her human lover, Lamia sacrifices both her venom and her immortality. In Keats’s telling, the confrontation with Apollonius makes her vanish, and presumably kills her, and her beloved immediately falls dead from grief. Having been savaged by reviewers and dismissed as a “Cockney poet” by the literary establishment, Keats recasts the story as a defense of beauty and a challenge to authority. The older man’s knowledge is faulty and obtuse, his power callous and deadly. Poe was an ardent admirer of Keats, and his critical writings are filled with expressions of contempt for the cultural gatekeepers of his day; Crosby interprets the title character of “Ligeia” (a very strange short story that Poe himself considered his best work) as “a revamped Romantic lamia” akin to Keats’s.

The continuity is much easier to see in the case of "Rappaccini's Daughter" by Nathaniel Hawthorne, in which Beatrice (the title figure) is a lamia-like menace to every living thing that crosses her path. This is through no fault of her own; suffice to say that a man with authority has turned her into a kind of walking biological weapon. Once again, the Philostratusian version of the story has been reconfigured. The misogynist vision of the lamia as a force for deception and destruction is abandoned. Her story becomes a fable of oppression, corruption, the illegitimacy of established authority.

These literary reimaginings took shape against a political backdrop that added another layer of significance to the transformation. In the first half century of the United States, citizens “practiced a ‘politics of assent,’ in which a relatively small population of mostly rural voters bowed to the leadership of local elites,” Crosby writes. Editorials and sermons issued Apollonius-like warnings about the need to subdue desire and cultivate virtue. One widely circulated and much-reprinted story told of a daughter who rapidly went from sassing her parents to poisoning them. Clearly the republic’s young females in particular were on the slipperiest of slopes.

“But by the time Andrew Jackson won the presidential election of 1828,” Crosby continues, “the nation was transitioning to a far more raucous and partisan ‘mass democracy,’ characterized by a ‘politics of affiliation’ in which larger populations of newly enfranchised white ‘common men’ identified with national political organizations.” Those organizations issued newspapers and magazines, to which publishers added an enormous flow of relatively cheap books, pamphlets and broadsides.

The old elites (largely associated with the Whig Party) dismissed most of this output as trash, and they may have had a point, if “revenge! REVENGE!! REVENGE!!!” is anything to go by. At the same time, Poe was arguing that, in Crosby’s words, “genius occurred in that space of free exchange between writer and reader” that could open up if Americans could shed their cultural subservience to the Old World. As for Hawthorne, he was a Democratic Party functionary who idolized Jackson, and "Rappaccini's Daughter" was first published in a Democratic Party magazine.

So the basic thematic implications of the “old” (Philostratusian) and “new” (Keatsian) lamia stories lined up fairly well with Whiggish and Jacksonian-Democratic cultural attitudes, respectively. For one side, the American people needed guidance from Apollonius the Whig to avoid the madness of excessive democracy (let the French revolution be a warning) and the lamia-like seductions of the new mass media. For the Democrat, danger came from corrupt authorities, out to manipulate the citizen into believing the worst about the innocent and moral female sex.

The political allegory took on flesh in the case of a number of women accused of murdering with poison -- an act of deception and homicide of decidedly lamia-like character. The Boston trial of Hannah Kinney -- whose third husband was found to have died from arsenic poisoning -- is both fascinating in itself and a striking corroboration of the author’s point about the lamia as a sort of template for public narrative. Early newspaper reports and gossip depicted her as a bewitching temptress of questionable morals and undoubted guilt. But as the trial continued, Democratic journalists described her as a pleasant, somewhat matronly woman whose late husband was mentally disturbed and had been trying to get over syphilis with the help of a shady “doctor.” (The arsenic in his system might well have gotten there through quackery or suicide.)

The jury acquitted her, which cannot have surprised the junior prosecuting attorney: “Recent experience has shown how difficult, if not impossible it has been to obtain a verdict of condemnation, in cases of alleged murders by secret poison, when females have been the parties accused, and men were the persons murdered.” By contrast, a number of British women accused of poisoning during the same period were dispatched to the gallows with haste. Factors besides "the lamia complex" may account for the difference, but the contrast is striking even so.

It’s unlikely that many Americans in the 1840s had read The Life of Apollonius of Tyana, or heard of Keats, for that matter. Cultural influence need not be that direct to be effective; it can be transmitted on the lower frequencies through repurposed imagery and fragments of narrative, through relays and remixes. Perhaps that is what we’re seeing now -- with who knows what archetypes being mashed up on the political stage.

Researcher clashes with publisher over book chapter on open education resources

Psychology instructor withdraws book chapter after refusing to add language that he says the publisher demanded and that he deemed too flattering to the textbook industry.

Essay on Barbara Ehrenreich's 'Living With a Wild God'

Examples of atheist spiritual autobiography are not plentiful, although the idea is not as self-contradictory as it perhaps sounds. A quest story that ends without the grail being located or the ring destroyed may not satisfy most audiences, but it's a quest story even so.

The one example that comes to mind is Twelve Years in a Monastery (1897) by Joseph McCabe, an English Franciscan who spent much of his clerical career grappling with doubts that eventually won out. Twelve Years is framed mainly as a critique and exposé of clerical life, but its interest as a memoir comes in part from McCabe’s struggle to accept the moral and intellectual demands imposed by his growing skepticism. For all of its defects, monasticism offered a career in which McCabe’s talents were recognized and even, within ascetic limits, rewarded. Leaving it meant answering the call of a new vocation: He went on to write on an encyclopedic array of scientific, cultural and historical topics.

McCabe also became the translator and primary English spokesman for Ernst Haeckel, the German evolutionary theorist and advocate of pantheism, which seems to have squared easily enough with the lapsed monk’s atheism. (There may be more than a semantic difference between thinking of God and the universe as identical and believing there is no God, just universe. But if so, it is largely in the eye of the beholder.)

Barbara Ehrenreich’s background could not be more different from Joseph McCabe’s. In Living With a Wild God: A Nonbeliever’s Search for the Truth About Everything (Hachette/Twelve) she describes her working-class family as consisting of atheists, rationalists, and skeptics for at least a couple of generations back. “God is an extension of human personality,” she wrote in her journal as an adolescent, “brought into the world and enslaved as man’s glorifier.” McCabe would have had to do penance for such a thought; in Ehrenreich’s case, it was just dinner-table wisdom -- expressed with precocious verbal finesse, later honed into a sharp instrument of social criticism put to work in Nickel and Dimed, among other books.

Her memoir appeared two years ago, though I’ve only just read it, making this column more rumination than review. The usual blurb-phrases apply: it’s brilliantly written, thought-provoking, and often very funny, taking aphoristic turns that crystallize complex feelings into the fewest but most apt words. For example: “[I]f you’re not prepared to die when you’re almost 60, then I would say you’ve been falling down on your philosophical responsibilities as a grown-up human being.” Or: “As a child I had learned many things from my mother, like how to sew a buttonhole and scrub a grimy pot, but mostly I had learned that love and its expressions are entirely optional, even between a mother and child.”   

So, a recommended read. (More in the reviewerly vein is to be found here and here.) Additional plaudits for Living With a Wild God won’t count for much at this late date, while its literary ancestry might still be worth a thought. For it seems entirely possible, even likely, that Ehrenreich’s parents and grandparents in Butte, Montana would have read McCabe -- “the trained athlete of disbelief,” as H.G. Wells called him, in recognition of McCabe’s countless books and pamphlets on science, progress and the benefits of godlessness. Many of them appeared in newsprint editions circulating widely in the United States during the first half of the 20th century, with one publisher reckoning he’d sold over 2.3 million booklets by McCabe between the 1920s and the 1940s.

The inner life Ehrenreich depicts herself leading as a teenager certainly resembles the world of rationality and order that McCabe evokes, and that her family took as a given: “[E]very visible, palpable object, every rock or grain of sand, is a clue to the larger mystery of how the universe is organized and put together -- a mystery that it was our job, as thinking beings, to solve.” Ehrenreich took up the challenge as an adolescent with particular rigor. Faced with the hard questions that death raises about the value and point of life, she began taking notes in the expectation of working out the answer: “I think it is best to start out with as few as possible things which you hold to be unquestionably true and start from there.”

The problem, as Descartes discovered and Ehrenreich soon did, was that “the unquestionably true” is a vanishingly small thing to determine. You are left with “I exist” and no path to any more inclusive confidence than that. Descartes eventually posits the existence of God, but arguably bends the rules in doing so. Ehrenreich does not, and lands in the quicksand of solipsism.

The mature Ehrenreich can see how her younger self’s philosophical conflicts took shape in the midst of more mundane family problems involving alcoholism, career frustration and each parent’s set of inescapable disappointments. (After all, solipsism means never having to say you’re sorry.) But she also recognizes that the quandaries of her teenage prototype weren’t just symptoms: “Somehow, despite all the peculiarities of my gender, age, class, and family background, I had tapped into the centuries-old mainstream of Western philosophical inquiry, of old men asking over and over, one way or another, what’s really going on here?”

In pursuing answers that never quite hold together, she undergoes what sounds very much like the sort of crisis described by the saints and mystics of various traditions. First, there were repeated moments of being overwhelmed by the sheer strangeness and “there”-ness of the world itself. Then, in May 1959, a few months before leaving for college, she underwent a shattering and effectively inexpressible experience that left her earlier musings in ashes. No consciousness-altering chemicals were ingested beforehand; given the circumstances, it is easy to appreciate why the author spent the 1960s avoiding them:    

“[T]he world flamed into life,” she writes. “There were no visions, no prophetic voices or visits by totemic animals, just this blazing everywhere. Something poured into me and I poured out into it. This was not the passive beatific merger with ‘the All,’ as promised by the Eastern mystics. It was a furious encounter with a living substance that was coming at me through all things at once, and one reason for the terrible wordlessness of the experience is that you cannot observe fire really closely without becoming part of it.”

This kind of thing could not be discussed without the risk of involuntary commitment, and Ehrenreich herself refers to a medical hypothesis suggesting that ecstatic states may result when “large numbers of neurons start firing in synchrony, until key parts of the brain are swept up in a single pattern of activity, an unstoppable cascade of electrical events, beginning at the cellular level and growing to encompass the entire terrain that we experience as ‘consciousness.’”

Ehrenreich did not take McCabe’s course in reverse -- going from atheism into the waiting arms of an established faith. For that matter, she remains more or less an agnostic, at least willing to consider the possible merits of a polytheistic cosmos. "My adolescent solipsism is incidental compared to the collective solipsism our species has embraced for the last few centuries in the name of modernity and rationality," she writes, "a worldview in which there exists no consciousness or agency other than our own, where nonhuman animals are dumb mechanisms, driven by instinct, where all other deities or spirits have been eliminated in favor of the unapproachable God...." Whether a nonreligious mysticism can go beyond "modernity and rationality" without turning anti-modern and irrationalist is something we'll take up in this column on a future occasion.

U of Cincinnati to Start University Press

The University of Cincinnati announced Thursday that it is starting a university press, which it said would focus on social justice and community engagement. The press plans to publish both print and e-books, and also to support creative works in digital media, web-based digital scholarship, multi-authored databases, library special collections and archives. The press will operate in and be supported by the university's library system.

Shimer Will Become Part of North Central College

North Central College is poised to acquire Shimer College under an agreement between the two Illinois institutions. Leaders hope the deal will make Shimer, a small four-year Great Books college, its own division within the larger institution.

Shimer, which has an enrollment of about 70, and North Central, an independent liberal arts and sciences college with a combined undergraduate and graduate enrollment of more than 2,900, announced on Thursday that they intend to pursue the deal. The two institutions will now move forward with negotiating a final agreement intended to close at the beginning of March 2017. If successful, the move would create a Shimer Great Books School reporting directly to North Central’s provost for the fall 2017 term, said Troy D. Hammond, North Central’s president. The goals are to expand the amenities Shimer can offer students and to grow its enrollment modestly, he said. But North Central wants to keep Shimer’s identity.

 “If we weren’t going to do that, we wouldn’t be having the conversation,” Hammond said. “We recognize the strength and value of what’s unique about Shimer.”

The move toward an acquisition comes a decade after Shimer decided to relocate from Waukegan, Ill., to lease space at the Illinois Institute of Technology in Chicago. Shimer’s lease in Chicago was not expiring, said the college’s president, Susan E. Henking. But the college and its trustees wanted to find a strategy that would preserve it for the future, she said.

“There are some things you can do as a tenant and some things you can do as a kind of partner to another institution,” Henking said. “We have a mission that says we should be small, but that’s a challenge in today’s environment. If we want to keep with our mission of very small classes and the kind of core curriculum we do, we’ve got to find a different structure.”

Shimer faculty would become faculty at North Central, and its students would become North Central students. The colleges said that would give Shimer students access to North Central’s activities, arts and athletics.

University of Florida, Elsevier explore interoperability in the publishing space

U of Florida connects its institutional repository to Elsevier's ScienceDirect platform to try to increase the visibility of the university's intellectual work.

Review of Terry Eagleton's "Culture"

If ideas are tools -- “equipment for living,” so to speak -- we might well imagine culture as a heavily patched-up conceptual backpack that has been around the world a few times. It has been roughly handled along the way.

The stitches strain from the sheer quantity and variety of stuff crammed into it over the years: global culture, national culture, high culture, popular culture, classical and print and digital cultures, sub- and countercultures, along with cultures of violence, of affluence, of entitlement, of critical discourse …. It’s all in there, and much else besides. How it all fits -- what the common denominator might be -- is anyone’s guess. We could always draw on the useful clarifying distinction between: (1) culture as a category of more or less aesthetic artifacts, perhaps especially those that end up in museums and libraries, and (2) culture as the shared elements of a way of life.

The difference is, in principle, one of kind, not of quality, although assumptions about value assert themselves anyway. The first variety is sometimes called “the Matthew Arnold idea of culture,” after that Victorian worthy’s reference, in his book Culture and Anarchy, to “the best which has been thought and said.” Presumably music and painting also count, but Arnold’s emphasis on verbal expression is no accident: culture in his use of the term implies literacy.

By contrast “culture in the anthropological sense” -- as the second category is often called -- subsumes a good deal that can be found in societies without writing: beliefs about the nature of the world, ways of dressing, gender roles, assumptions about what may be eaten and what must be avoided, how emotions are expressed (or not expressed) and so on. Culture understood as a way of life includes rules and ideas that are highly complex though not necessarily transmitted through formal education. You absorb culture by osmosis, often through being born into it, and much of it goes without saying. (This raises the question of whether animals such as primates or dolphins may be said to have cultures. If not, why not? But that means digging through a whole other backpack.)

The dichotomy isn’t airtight, by any means, but it has served in recent years as a convenient pedagogical starting point: a way to get students (among others) to think about the strange ubiquity and ambiguity of culture as a label we stick on almost everything from the Code of Hammurabi to PlayStation 4, while also using it to explain quite a bit. Two people with a common background will conclude a discussion of the puzzling beliefs or behavior of a third party by agreeing, “That’s just part of their culture.” This seems more of a shrug than an explanation, really, but it implies that there isn’t much more to say.

One way to think of Terry Eagleton’s new book, Culture (Yale University Press), is as a broad catalog of the stuff that comes out when you begin unpacking the concept in its title -- arranging the contents along a spectrum rather than sorting them into two piles. In doing so, Eagleton, a distinguished professor of English literature at the University of Lancaster, follows closely the line of thought opened by the novelist and critic Raymond Williams, who coined the expression “culture as a whole way of life.” Williams probably derived the concept in turn, not from the anthropologists, but from T. S. Eliot. In distinguishing “culture as ordinary” (another Williams phrase) from culture as the work that artists, writers, etc. produce, the entire point was to link them: to provoke interest in how life and art communicated, so to speak.

For Williams, the operative word in “culture as a whole way of life” was, arguably, “whole”: something integral, connected and coherent, but also something that could be shattered or violated. Here, too, Eagleton is unmistakably Williams’s student. His assessment of how ideas about culture have taken shape over the past 200 years finds in them a pattern of responses to both industrialism (along with its spoiled heir, consumerism) and the French revolution (the definitive instance of “a whole way of life” exploding, or imploding, under its own strains). “If it is the cement of the social formation,” Eagleton writes, culture “is also its potential point of fracture.”

It may be that I am overemphasizing how closely Eagleton follows Williams. If so, it is still a necessary corrective to the way Williams has slowly turned into just another name in the Cultural Studies Hall of Fame rather than a felt moral and intellectual influence. His emphasis on culture as “a whole way of life” -- expressed with unabashed love and grief for the solidarity and community he knew when growing up in a Welsh mining community -- would sound remarkably anachronistic (if not ideologically totalizing and nostalgically uncritical) to anyone whose cultural reference points are of today’s commodified, virtual and transnational varieties.

And to that extent, Eagleton’s general survey of ideas about culture comes to a sharp point -- aimed directly at how the concept functions now in a capitalist society that, he says, “relegates whole swaths of its citizenry to the scrap heap, but is exquisitely sensitive about not offending their beliefs.”

He continues, in a vein that Williams would have appreciated: “Culturally speaking, we are all to be granted equal respect, while economically speaking the gap between the clients of food banks and the clients of merchant banks looms ever larger. The cult of inclusivity helps to mask these material differences. The right to dress, worship or make love as one wishes is revered, while the right to a decent wage is denied. Culture acknowledges no hierarchies, but the educational system is riddled with them.” This may explain why culture is looking so raggedy and overburdened as a term. Pulled too tight, stretched too thin, it covers too many things that it would be difficult to face straight on.
