Review of Beth Shapiro, "How to Clone a Mammoth: The Science of De-Extinction"

So it turns out that -- title notwithstanding -- Beth Shapiro’s How to Clone a Mammoth: The Science of De-Extinction (Princeton University Press) is not a do-it-yourself manual. What’s more, cloned mammoths are, in the author’s considered opinion, impossible. Likewise, alas, with regard to the dodo.

But How Not to Clone a Dodo would never cut it in the marketplace. Besides, the de-extinction of either creature seems possible (and in the case of the mammoth, reasonably probable) in the not-too-distant future. The process involved won’t be cloning, per se, but rather one of a variety of forms of bioengineering that Shapiro -- an associate professor of ecology and evolutionary biology at the University of California at Santa Cruz -- explains in moderate detail, and in an amiable manner.

Her approach is to present a step-by-step guide to how an extinct creature could be restored to life given the current state of scientific knowledge and the available (or plausibly foreseeable) advances in technology. There are obstacles. Removing some of them is, by Shapiro’s account, a matter of time and of funding. Whether or not the power to de-exterminate a species is worth pursuing is a question with many parts: ethical and economic, of course, but also ecological. And it grows a little less hypothetical all the time. De-extinction is on the way. (The author allows that the whole topic is hard on the English language, but “resurrection” would probably cause more trouble than it’s worth.)

The subject tickles the public’s curiosity and stirs up powerful emotions. Shapiro says she has received her share of fan and hate mail over the years, including someone’s expressed wish that she be devoured by a flesh-eating mammoth of her own making. Perhaps the calmest way into the discussion is by considering why reviving the mammoth or the dodo is possible, but would not be the same thing as cloning one. (And dinosaur cloning is also right out, just to make that part clear without further delay.)

To clone something, in short, requires genetic material from a living cell with an intact genome. “No such cell has ever been recovered from remains of extinct species recovered from the frozen tundra,” writes Shapiro, whose research has involved the search for mammoth remains in Siberia. Flash freezing can preserve the gross anatomy of a mammoth for thousands of years, but nucleases -- the enzymes that fight off pathogens when a cell is alive -- begin breaking down DNA as soon as the cell dies.

What can be recovered, then, is paleogenetic material at some level of dismantling. The challenge is to reconstruct an approximation of the extinct creature’s original genome -- or rather, to integrate the fragments into larger fragments, since rebuilding the whole genetic structure through cut-and-paste efforts is too complex and uncertain a task. The reconstituted strings of genetic data can then be “inserted” at suitable places in the genome of a related creature from our own era. In the case of the woolly mammoth, that would mean genetic material from the Asian elephant; they parted ways on the evolutionary tree a mere 2.5 million years ago. In principle, at least, something similar could be done using DNA from the taxidermy-preserved dodo birds in various collections around the world, punched into the pigeon genome.

“Key to the success of genome editing,” writes Shapiro, “has been the discovery and development of different types of programmable molecular scissors. Programmability allows specificity, which means we can make the cuts we want to make where we want to make them, and we can avoid making cuts that kill the cell.”

Cells containing the retrofitted genome could then be used to spawn a “new” creature that reproduces aspects of the extinct one -- pending the solution of various technical problems. For that matter, scraping together enough raw material from millennia past presents its own difficulties: “In order to recover DNA from specimens that have very little preserved DNA in them, one needs a very sensitive and powerful method for recovering the DNA. But the more sensitive and powerful the method is, the more likely it is to produce spurious results.”

Also a factor is the problem of contamination, whether found in the sample (DNA from long-dead mold and bacteria) or brought into the lab in spite of all precautions. Shapiro leaves the reader aware of both the huge barriers to be overcome before some species is brought back from extinction and the strides being made in that direction. She predicts the successful laboratory creation of mammoth cells, if not of viable embryos, within the next few years.

It will be hailed as the cloning of an extinct animal -- headlines that Shapiro (whose experiences with the media do not sound especially happy) regards as wrong but inevitable. The reader comes to suspect that one motive for writing the book was to encourage reporters to ask her informed questions when that news breaks, as opposed to trying to get her to speculate about the dangers of Tyrannosaurus rex 2.0.

Besides its explanations of the genetics and technology involved, How to Clone a Mammoth insists on the need to think about what de-extinction would mean for the environment. Returning the closest bioengineerable approximation of a long-lost species to the landscape it once inhabited will not necessarily mean a happy reunion. The niche that animal occupied in the ecosystem might no longer exist. Indeed, the ecosystem could have developed in ways that doom the creature to re-extinction.

Shapiro is dismissive of the idea that being able to revive a species would make us careless about biodiversity (or more careless, perhaps), and she comes close to suggesting that de-extinction techniques will be necessary for preserving existing species. But those things are by no means incompatible. The author herself admits that some species are more charismatic than others: we're more likely to see the passenger pigeon revived than, say, the desert rat, even though the latter plays an ecological role. The argument may prove harder to accept on behalf of the humbler species once members of Congress decide which ones to freeze-dry for eventual relaunching, should that prove necessary.

By now we should know better than to underestimate the human potential for creating a technology that goes from great promise to self-inflicted disaster in under one generation. My guess is that it will take about that long for the horrible consequences of the neo-dodo pet ownership craze of the late 2020s to make themselves fully felt.

Review of Naomi Zack, "White Privilege and Black Rights: The Injustice of U.S. Police Racial Profiling and Homicide"

You don’t hear much about the United States being a “postracial society” these days, except when someone is dismissing bygone illusions of the late ’00s, or just being sarcastic. With the Obama era beginning to wind down (as of this week, the president has just under 18 months left in office), American life is well into its post-post-racial phase.

Two thoughts: (1) Maybe we should retire the prefix. All it really conveys is that succession does not necessarily mean progress. (2) It is easy to confuse an attitude of cold sobriety about the pace and direction of change with cynicism, but they are different things. For one, cynicism is much easier to come by. (Often it’s just laziness pretending to be sophisticated.) Lucid assessment, on the other hand, is hard work and not for the faint of spirit.

Naomi Zack’s White Privilege and Black Rights: The Injustice of U.S. Police Racial Profiling and Homicide (Rowman & Littlefield) is a case in point. It consists of three essays plus a preface and conclusion. Remarks by the author indicate it was prepared in the final weeks of last year, with the events in Ferguson, Mo., fresh in mind. But don’t let the title or the book’s relative brevity fool you. The author is a professor of philosophy at the University of Oregon -- and when she takes up terms such as “white privilege” or “black rights,” it is to scrutinize the concepts rather than to use them in slogans.

Despite its topicality, Zack’s book is less a commentary on recent events than part of her continuing effort to think, as a philosopher, about questions of race and justice that are long-standing, but also prone to flashing up, on occasion, with great urgency -- demanding a response, whether or not philosophers (or anyone else) are prepared to answer them.

Zack distinguishes between two ways of philosophizing about justice. One treats justice as an ideal that can be defined and reasoned about, even if no real society in human history ever “fully instantiates or realizes an ideal of justice for all members of that society.” Efforts to develop a theory of justice span the history of Western philosophy.

The other approach begins with injustice and seeks to understand and correct it. Of course, that implies that the philosopher already has some conception of what justice is -- which would seem to beg the question. But Zack contends that theories of justice also necessarily start out from pre-existing beliefs about what it is, which are then strengthened or revised as arguments unfold.

“However it may be done and whatever its subject,” Zack writes, “beginning with concrete injustice and ending with proposals for its correction is a very open-ended and indeterminate task. But it might be the main subject of justice about which people who focus on real life and history genuinely care.”

The philosopher Zack describes may not start out with a theory of what justice is. But that’s OK -- she can recognize justice, paradoxically enough, when it's gone missing.

I wish the author had clarified the approach in the book’s opening pages, rather than two-thirds of the way through, because it proves fundamental to almost everything else she says. She points out how police killings of young, unarmed African-American males over the past couple of years are often explained with references to “white privilege” and “the white supremacist system” -- examples of a sort of ad hoc philosophizing about racial injustice in the United States, but inadequate ones in Zack’s analysis.

Take the ability to walk around talking on the phone carrying a box of Skittles. It is not a “privilege” that white people enjoy, as should be obvious from the sheer absurdity of putting it that way. It is one of countless activities that a white person can pursue without even having to think about it. “That is,” Zack writes, “a ‘privilege’ whites are said to have is sometimes a right belonging to both whites and nonwhites that is violated when nonwhites are the ones who [exercise] it.”

In the words of an online comment the author quotes, “Not fearing the police will kill your child for no reason isn’t a privilege, it’s a right.” The distinction is more than semantic. What Zack calls “the discourse of white privilege” not only describes reality badly but fosters a kind of moral masochism, inducing “self-paralysis in the face of its stated goals of equality.” (She implies that white academics are particularly susceptible to "hold[ing] … progressive belief structures in intellectual parts of their life that are insulated from how they act politically and privately …")

Likewise, “the white supremacist power structure” is a way of describing and explaining oppression that is ultimately incapacitating: “After the civil rights movement, overt and deliberate discrimination in education, housing and employment were made illegal and explicitly racially discriminatory laws were prohibited.” While “de facto racial discrimination is highly prevalent in desirable forms of education, housing and employment,” it does no one any good to assume that “an officially approved ideology of white supremacy” remains embodied in the existing legal order.

None of which should be taken to imply that Zack denies the existence of deep, persisting and tenacious racial inequality, expressed and reinforced through routine practices of violence and humiliation by police seldom held accountable for their actions. But, she says, "What many critics may correctly perceive as societywide and historically deep antiblack racism in the United States does not have to be thoroughly corrected before the immediate issue of police killings of unarmed young black men can be addressed."

She is not a political strategist; her analyses of the bogus logic by which racial profiling and police killings are rationalized are interesting, but how to translate them into action is not exactly clear. In the end, though, justice and injustice are not problems for philosophers alone.

Review of Stephen Siff, "Acid Hype: American News Media and the Psychedelic Experience"

If you can remember the 1960s, the old quip goes, you weren’t really part of them. By that standard, the most authentic participants ended up as what used to be called “acid casualties”: those who took spiritual guidance from Timothy Leary’s injunction to “turn on, tune in and drop out” and ended up stranded in some psychedelic heaven or hell. Not that they’ve forgotten everything, of course. But the memories aren’t linear, nor are they necessarily limited to the speaker’s current incarnation on this particular planet.

Fortunately Stephen Siff can draw on a more stable and reliable stratum of cultural memory in Acid Hype: American News Media and the Psychedelic Experience (University of Illinois Press). At the same time, communicating about the world as experienced through LSD or magic mushrooms was ultimately as difficult for a sober newspaper reporter, magazine editor or video documentarian as conversation tends to be for someone whose mind has been completely blown. The author, an assistant professor of journalism at Miami University in Ohio, is never less than shrewd and readable in his assessment of how various news media differed in method and attitude when covering the psychedelic beat. The slow and steady buildup of hype (a word Siff uses in a precise sense) precipitated an early phase of the culture wars -- sometimes in ways that partisans now might not expect.

Papers on experimentation with LSD were published in American medical journals as early as 1950, and reports on its effects from newspaper wire services began tickling the public interest by 1954. The following year, mass-circulation magazines were devoting articles to LSD research, followed in short order by a syndicated TV show’s broadcast of film footage showing someone under the influence. The program, Confidential File, sounds moderately sleazy (the episode in question was described as featuring “an insane man in a sensual trance”) but much of the early coverage was perfectly respectable, treating LSD as a potential source of insight into schizophrenia, or a potential expressway to the unconscious for psychoanalysts.

But the difference between rank sensationalism and science-boosting optimism may count for less, in Siff’s interpretation, than how sharply coverage of LSD broke with prevailing media trends that began coming into force in the 1920s.

After the First World War, with wounded soldiers coming back with a morphine habit, newspapers carried on panic-stricken anti-drug crusades (“The diligent dealer in deadly drugs is at your door!”) and any publication encouraging recreational drug use, or treating it as a fact of life, was sure to fall before J. Edgar Hoover’s watchful eye. Early movie audiences enjoyed the comic antics of Douglas Fairbanks Sr.’s detective character Coke Ennyday (always on the case, syringe at the ready), or in a more serious mood they could go to For His Son, D. W. Griffith’s touching story of a man’s addiction to Dopokoke, the cocaine-fueled soft drink that made his father rich. But by the time the talkies came around, the Motion Picture Production Code categorically prohibited any depiction of drug use or trafficking, even as a criminal enterprise. Siff notes that in the 20 years following the code’s establishment in 1930, “not a single major Hollywood film dealing with drug use was distributed to the public.”

Not that depictions of substance abuse were a forbidden fruit the public was craving, exactly. But the relative openness of the mid-1950s (emphasis on “relative”) allowed editors to risk publishing stories on what was, after all, serious research on a potential new wonder drug. Siff points out that general-assignment newspaper reporters attending a scientific or medical conference, unable to tell what sessions were worth covering, could feel reasonably confident that a title mentioning LSD would probably yield a story.

At the same time, writers for major newsmagazines and opinion journals were following the lead of Aldous Huxley, the novelist and late-life religious searcher, who wrote about mystical experiences he had while taking mescaline. In 1955, when the editors of Life magazine decided to commission a feature on hallucinogenic mushrooms, they turned to Wall Street banker and amateur mycologist R. Gordon Wasson. He traveled to Mexico and became, in his own words, one of “the first white men in recorded history to eat the divine mushroom” -- and if not, then surely the first to give an eyewitness report on “the archetypes, the Platonic ideals, that underlie the imperfect images of everyday life” in the pages of a major newsweekly.

Suffice it to say that by the time Timothy Leary and associates come on the scene (wandering around Harvard University in the early 1960s, with continuously dilated pupils and only the thinnest pretense of scientific research) it is rather late in Siff’s narrative. And Leary’s legendary status as psychedelic shaman/guru/huckster seems much diminished by contrast with the less exhibitionistic advocacy of LSD by Henry and Clare Boothe Luce. Beatniks and nonconformists of any type were mocked regularly in the pages of Time or Life, but the Luce publications were for many years very enthusiastic about the potential benefits of LSD. The power couple tripped frequently, and hard. (Some years ago, when I helped organize Mrs. Luce’s papers at the Library of Congress, the LSD notes were a confidence not to be breached, but now the experiments are a matter of public record.)

The hippies, in effect, seem like a late and entirely unintentional byproduct of industrial-strength hype. “During an episode of media hype,” Siff writes, “news coverage feeds on itself, as different news outlets follow and expand on one another’s stories, reacting among themselves and to real-world developments. Influence seems to flow from the larger news organizations to smaller ones, as editors at smaller or more marginal media operations look toward the decisions made by major outlets for ideas and confirmation of their own judgment.”

That is the process, broadly conceived. In Acid Hype, Siff charts the details -- especially how the feedback bounced around between news organizations, not just of different sizes, but with different journalistic cultures. Newspaper coverage initially stuck to the major talking points of LSD researchers; it tended to stress the potential wonder-drug angle, even when the evidence for it was weak. Major magazines wanted to cover the phenomenon in greater depth -- among other things, with firsthand reports on the psychedelic universe by people who’d gone there on assignment. Meanwhile, the art directors tried to figure out how to convey far-out experiences through imagery and layout -- as, in time, did TV producers. (Especially on Dragnet, if memory serves.)

Some magazine editors seem to have been put off by the religious undercurrents of psychedelic discourse. Siff exhibits a passage in a review that quotes Huxley’s The Doors of Perception but carefully removes any biblical or mystical references. But someone like Leary, who proselytized about psychedelic revolution, was eminently quotable -- plus he looked good on TV because (per the advice of Marshall McLuhan) he smiled constantly.

The same hype-induction processes that made hallucinogens seem like the next step toward improving the American way of life (or, conversely, the escape route to an alternative to it) also went into effect when the tide turned: just as dubious claims about LSD's healing properties were reported without question (it'll cure autism!), so were horror stories about side effects (it'll make you stare at the sun until you go blind!).

The reaction seems to have been much faster and more intense than the gradual pro-psychedelic buildup. Siff ends his account of the period in 1969 -- oddly enough, without ever mentioning the figure who emerged into public view that year as the embodiment of LSD's presumed demons: Charles Manson. You didn't hear much about the drug's spiritual benefits after Charlie began explaining them. That was probably for the best.

Juan Felipe Herrera Is Next U.S. Poet Laureate

The Library of Congress today named Juan Felipe Herrera, professor emeritus of creative writing at the University of California at Riverside, the next U.S. poet laureate. Herrera is the author of 28 books of poetry, novels for young adults and collections for children. In a statement, James H. Billington, the librarian of Congress, said, “I see in Herrera’s poems the work of an American original -- work that takes the sublimity and largesse of 'Leaves of Grass' and expands upon it. His poems engage in a serious sense of play -- in language and in image -- that I feel gives them enduring power. I see how they champion voices and traditions and histories, as well as a cultural perspective, which is a vital part of our larger American identity.”

Herrera will be the first Latino to hold the position of poet laureate.

His two most recent collections of poems, both published by the University of Arizona Press, are Senegal Taxi and Half of the World in Light.

Review of Edna Greene Medford, "Lincoln and Emancipation"

Reading the Emancipation Proclamation for the first time is an unforgettable experience. Nothing prepares you for how dull it turns out to be. Ranking only behind the Declaration of Independence and the Constitution in its consequences for U.S. history, the document contains not one sentence that has passed into popular memory. It was the work, not of Lincoln the wordsmith and orator, but of Lincoln the attorney. In fact, it sounds like something drafted by a group of lawyers, with Lincoln himself just signing off on it.

Destroying an institution of systematic brutalization -- one in such contradiction to the republic’s professed founding principles that Jefferson’s phrase “all men are created equal” initially drew protests from slave owners -- would seem to require a word or two about justice. But the proclamation is strictly a procedural document. The main thrust comes from an executive order issued in late September 1862, “containing, among other things, the following, to wit: ‘That on the first day of January, in the year of our Lord one thousand eight hundred and sixty-three, all persons held as slaves within any State or designated part of a State, the people whereof shall then be in rebellion against the United States, shall be then, thenceforward, and forever free….’”

Then -- as if to contain the revolutionary implications of that last phrase -- the text doubles down on the lawyerese. The proclamation itself was issued on the aforesaid date, in accord with the stipulations of the party of the first part, including the provision recognizing “the fact that any State, or the people thereof, shall on that day be, in good faith, represented in the Congress of the United States by members chosen thereto at elections wherein a majority of the qualified voters of such State shall have participated, shall, in the absence of strong countervailing testimony, be deemed conclusive evidence that such State, and the people thereof, are not then in rebellion against the United States.”

In other words: “If you are a state, or part of a state, that recognizes the union enough to send representatives to Congress, don’t worry about your slaves being freed right away and without compensation. We’ll work something out.”

Richard Hofstadter got it exactly right in The American Political Tradition (1948) when he wrote that the Emancipation Proclamation had “all the moral grandeur of a bill of lading.” It is difficult to believe the same author could pen the great memorial speech delivered at Gettysburg a few months later -- much less the Second Inaugural Address.

But to revisit the proclamation after reading Edna Greene Medford’s Lincoln and Emancipation (Southern Illinois University Press) is also a remarkable experience -- a revelation of how deliberate, even strategic, its lawyerly ineloquence really was.

Medford, a professor of history at Howard University, was one of the contributors to The Emancipation Proclamation: Three Views (Louisiana State University Press, 2006). Her new book is part of SIUP’s Concise Lincoln Library, now up to 17 volumes. Medford’s subject overlaps with topics covered by earlier titles in the series (especially the ones on race, Reconstruction and the Thirteenth Amendment) as well as with works such as Eric Foner’s The Fiery Trial: Abraham Lincoln and American Slavery (Norton, 2010).

Even so, Medford establishes her own approach by focusing not only on Lincoln’s ambivalent and changing sense of what he could and ought to do about slavery (a complex enough topic in its own right) but also on the attitudes and activities of a heterogeneous and dispersed African-American public with its own priorities.

For Lincoln, abolishing the institutionalized evils of slavery was a worthy goal but not, as such, an urgent one. As of 1860, his primary concern was that it not spread to the new states. After 1861, it was to defeat the slaveholders’ secession -- but without making any claim to the power to end slavery itself. He did support efforts to phase it out by compensating slave owners for manumission. (Property rights must be respected, after all, went the thinking of the day.) His proposed long-term solution for racial conflict was to send the emancipated slaves to Haiti, Liberia, or someplace in Central America to be determined.

Thanks in part to newspapers such as The Weekly Anglo-African, we know how free black citizens in the North responded to Lincoln, and it is clear that some were less than impressed with his antislavery credentials. “We want Nat Turner -- not speeches,” wrote one editorialist; “Denmark Vesey -- not resolutions; John Brown -- not meetings.” Especially galling, it seems, were Lincoln’s plans to reimburse former slave owners for their trouble while uprooting ex-slaves from land they had worked for decades. African-American commentators argued that Lincoln was getting it backward. They suggested that the ex-slaves be compensated and their former masters shipped off instead.

To boil Medford’s succinct but rich narrative down into something much more schematic, I’ll just say that Lincoln’s cautious regard for the rights of property backfired. Frederick Douglass wrote that the slaves “[gave] Mr. Lincoln credit for having intentions towards them far more benevolent and just than any he is known to cherish…. His pledges to protect and uphold slavery in the States have not reached them, while certain dim, undefined, but large and exaggerated notions of his emancipating purpose have taken firm hold of them, and have grown larger and firmer with every look, nod, and undertone of their oppressors.” African-American Northerners and self-emancipating slaves alike joined the Union army, despite all the risks and the obstacles.

The advantage this gave the North, and the disruption it created in the South, changed abolition from a moral or political concern to a concrete factor in the balance of forces -- and the Emancipation Proclamation, for all its uninspired and uninspiring language, was Lincoln’s concession to that reality. He claimed the authority to free the slaves of the Confederacy “by virtue of the power in me vested as Commander-in-Chief, of the Army and Navy of the United States in time of actual armed rebellion against the authority and government of the United States, and as a fit and necessary war measure for suppressing said rebellion.”

Despite its fundamentally practical motivation and its avoidance of overt questions about justice, the proclamation was a challenge to the American social and political order that had come before. And it seems to have taken another two years before the president himself could spell out its implications in full, in his speech at the Second Inaugural. The depth of the challenge is reflected in each week's headlines, though to understand it better you might want to read Medford's little dynamite stick of a book first.

Review of Stjepan Mestrovic, "The Postemotional Bully"

You don’t often come across references to “the moral sciences” these days, unless you read a lot of biographies of well-educated Victorians, and maybe not even then. The term covers economics, psychology, anthropology and other fields in what are now usually called the social sciences. I’m not sure when the one gave way to the other. If the older expression sounds odd to the modern ear, that’s probably because anything called a science now implicitly rests on a fundamental distinction between fact and value.

A science sticks to “is” rather than “ought” -- or it ought to, anyway. (Whether or not the fact/value dichotomy is valid or coherent is a long discussion in itself.) Stjepan Mestrovic, a professor of sociology at Texas A&M University, does not overtly challenge the principle of value-free social science in The Postemotional Bully (SAGE). He uses concepts and distinctions from the canon of classical social theory to interpret contemporary cultural and behavioral trends.

But the ideas he draws on carry a certain amount of residue from the era of the moral sciences, while the phenomena he analyzes (the happy-face sadism at Abu Ghraib, for example) are too disturbing for studied neutrality to seem like anything but complicity.

George Orwell provides the book with its point of departure. “What,” Orwell asked in an article from 1946 that Mestrovic quotes, “is the special quality in modern life that makes a major human motive out of the impulse to bully others? If we could answer that question -- seldom asked, never followed up -- there might occasionally be a bit of good news on the front page of your morning paper.”

Sociology’s founding fathers never tackled the question, as such. But they did create a whole array of fundamental concepts about how the social system emerging over the past two hundred-odd years (often called modernity, a.k.a. capitalism, industrial society, mass society and related aliases) differed from the smaller-scale, slower-moving patterns of life that had gone before. And so Mestrovic can draw on Ferdinand Tönnies’s contrast between Gemeinschaft (community: organic, strong bonds, face-to-face relationships prevail) and Gesellschaft (society: change and dislocation common, regular contact with strangers, many interactions involve an exchange of money). Or on David Riesman’s interpretation of American society as moving from an inner-directed era (during which the sense of personal identity was shaped by values absorbed from parents and authorities) to one that is other-directed (the individual “is group oriented, conformist and changes values constantly to fit into norms and values that are in constant flux,” in Mestrovic’s words).

Other theorists and concepts also enter the discussion, but here it might be best to look in the general direction that the author steers them. An overarching schema of recent decades posits a sort of three-stage movement from a traditional order -- the Gemeinschaft, more or less -- to modern society, where the inner-directed people, at least, functioned with a sense of individual identity and established obligations, despite the alienation and other distractions of the Gesellschaft, including the antics of the other-directed. And beyond that? The postmodern condition, of course, which is endlessly defined and disputed without even reaching a plurality (much less a consensus) as to its meaning.

As an alternative, Mestrovic proposed his concept of “postemotional society” in the 1990s. An awkward expression, it has the sole virtue of allowing its creator to avoid sinking into the postmodernist quicksand without a trace. Perhaps the clearest way to explain postemotionality is to treat it as an extension and updating of Riesman’s characterization of other-directedness. The other-directed person relies on a peer group, rather than a deeply rooted and stringent superego, in determining what’s important and how to behave. Those accepted standards, in turn, often reflect current trends in film, advertising and mass media. By contrast Mestrovic’s postemotional type takes his or her cues from a culture more volatile and ephemeral than anything Riesman, writing in the early 1950s, could have imagined.

Postemotionality is -- for want of a more elegant way of putting it -- hyper-other-directed. It involves relationships that are “not intimate but also not alien.” The individual is “plugged and hooked into his or her electronic screen devices… pretend[ing] to be ‘in touch’ with others and the world” yet in reality “trapped in an electronic solitary confinement.”

But the condition is more than a symptom of the new digital order. In an insight combining Émile Durkheim’s thoughts on the division of labor with Donald Rumsfeld’s doctrine of manpower deployment (“People are fungible. You can have them here or there”), Mestrovic stresses that workplaces and institutions are increasingly prone to “the reduction of the human being as a fungible asset” that is “replaceable and interchangeable.” This aspect of the argument remains underdeveloped, but seems to echo recent discussions of precarity in employment. At the same time, society “aims much of its mechanical, intellectual, artificial and productive powers at the task of systematically faking community and its traits: the managed heart, fake sincerity, false kindness,” and so on, creating “new hybrid forms of emotional life that are neither entirely fake nor sincere.”

And beneath the forced smile -- that emblem of a “social life based upon dead emotions from the past” -- there lurks a considerable potential for cruelty. The author quotes Veblen’s remark that in modern society “simple aggression and unrestrained violence in great measure give place to shrewd practices and chicanery, as the best-approved method of accumulating wealth.”

But the process is not irreversible. The book takes up three cases of contemporary brutalization, postemotional style -- Abu Ghraib, the prolonged and ultimately fatal beating of an unresisting prisoner in an American jail, and a soldier driven to suicide by racial slurs and physical torture. I’ll forgo any discussion of them beyond noting that in each case, nobody within the chain of command was held accountable, the perpetrators’ rationalizations were more or less accepted by the court, and little or no punishment followed.

Add to that the prevailing tone of “screen culture” -- indignation, rage, contempt, malice and a certain cheerful callousness -- and the case can be made that Mestrovic has identified a real tendency, at least in American culture. I doubt the expression “postemotional” will ever catch on, though. It’s just the new normal.
