Review of Jacques Le Goff, 'Must We Divide History Into Periods?'

George Orwell opened one of his broadcasts on the BBC in the early 1940s by recounting how he’d learned history in his school days. The past, as his teachers depicted it, was “a sort of long scroll with thick black lines ruled across it at intervals,” he said. “Each of these lines marked the end of what was called a ‘period,’ and you were given to understand that what came afterwards was completely different from what had gone before.”

The thick black lines were like borders between countries that didn’t know one another’s languages. “For instance,” he explained, “in 1499 you were still in the Middle Ages, with knights in plate armour riding at one another with long lances, and then suddenly the clock struck 1500, and you were in something called the Renaissance, and everyone wore ruffs and doublets and was busy robbing treasure ships on the Spanish Main. There was another very thick black line drawn at the year 1700. After that it was the Eighteenth Century, and people suddenly stopped being Cavaliers and Roundheads and became extraordinarily elegant gentlemen in knee breeches and three-cornered hats … The whole of history was like that in my mind -- a series of completely different periods changing abruptly at the end of a century, or at any rate at some sharply defined date.”

His complaint was that chopping up history and simplistically labeling the pieces was a terrible way to teach the subject. It is a sentiment one can share now only up to a point. Orwell had been an average student; today it would be the mark of a successful American school district if the average student knew that the Renaissance came after the Middle Ages, much less that it started around 1500. (A thick black line separates his day and ours, drawn at 1950, when television sales started to boom.)

Besides, the disadvantages of possessing a schematic or clichéd notion of history are small by contrast to the pleasure that may come later, from learning that the past was richer (and the borders between periods more porous) than the scroll made it appear.

Must We Divide History Into Periods? asked Jacques Le Goff in the title of his last book, published in France shortly before his death in April 2014 and translated by M. B. DeBevoise for the European Perspectives series from Columbia University Press. A director of studies at the École des Hautes Études en Sciences Sociales in Paris, Le Goff was a prolific and influential historian with a particular interest in medieval European cities. He belonged to the Annales school of historians, which focused on social, economic and political developments over very long spans of time -- although his work also exhibits a close interest in medieval art, literature and philosophy (where changes were slow by modern standards, but faster than those in, say, agricultural technique).

Le Goff’s final book revisits ideas from his earlier work, but in a manner of relaxed erudition clearly meant to address people whose sense of the past is roughly that of young Orwell. And in fact it is that heavy mark on the scroll at the year 1500 -- the break between the Middle Ages and the Renaissance -- that Le Goff especially wants the student to reconsider. (I say “student” rather than “reader” because time with the book feels like sitting in a lecture hall with a memorably gifted teacher.)

He quotes one recent historian who draws the line a little earlier, with the voyage of Christopher Columbus in 1492: “The Middle Ages ended, the modern age dawned, and the world became suddenly larger.” Le Goff is not interested in the date but in the stark contrast that is always implied. Usually the Middle Ages are figured as “a period of obscurity whose outstanding characteristic was ignorance” -- happily dispelled by a new appreciation for ancient, secular literature and a sudden flourishing of artistic genius.

Calling something “medieval” is never a compliment; the image that comes to mind is probably that of a witch trial. By contrast, “Renaissance” would more typically evoke a page from Leonardo da Vinci’s notebooks. Such invidious comparison is not hard to challenge. Witch trials were rare in the Middle Ages, while the Malleus Maleficarum appeared in “the late fifteenth century,” Le Goff notes, “when the Renaissance was already well underway, according to its advocates.”

Given his expertise on the medieval city -- with its unique institutional innovation, the university -- Le Goff makes quick work of demolishing the notion that the Middle Ages had a perpetually bovine and stagnant cultural life. The status of the artist as someone “inspired by the desire to create something beautiful” who had “devoted his life to doing just this” in pursuit of “something more than a trade, nearer to a destiny,” was recognized by the 13th century. And a passage from John of Salisbury describes the upheaval underway in the 12th century, under the influence of Aristotle:

“Novelty was introduced everywhere, with innovations in grammar, changes in dialectic, rhetoric declared irrelevant and the rules of previous teachers expelled from the very sanctuary of philosophy to make way for the promulgation of new systems …”

I can’t say that the name meant anything to me before now, but the entry on John of Salisbury in the Stanford Encyclopedia of Philosophy makes it sound as if Metalogicon (the work just quoted) was the original higher ed polemic. It was “ostensibly written as a defense of the study of logic, grammar and rhetoric against the charges of the pseudonymous Cornificius and his followers. There was probably not a single person named Cornificius; more likely John was personifying common arguments against the value of a liberal education. The Cornificians believe that training is not relevant because rhetorical ability and intellectual acumen are natural gifts, and the study of language and logic does not help one to understand the world. These people want to seem rather than to be wise. Above all, they want to parlay the appearance of wisdom into lucrative careers. John puts up a vigorous defense of the need for a solid training in the liberal arts in order to actually become wise and succeed in the ‘real world.’”

That's something an Italian humanist might write four hundred years later to champion “the new learning” of the day. And that is no coincidence. Le Goff contends that “a number of renaissances, more or less extensive, more or less triumphant,” took place throughout the medieval era -- an elaboration of the argument by the American historian Charles H. Haskins in The Renaissance of the Twelfth Century (1927), a book that influenced scholars without, as Le Goff notes, having much effect on the larger public. The Renaissance, in short, was a renaissance -- one of many -- and in Le Goff’s judgment “the last subperiod of a long Middle Ages.”

So, no bold, clean stroke of the black Magic Marker; more of a watercolor smear, with more than one color in the mix. Le Goff treats the Middle Ages as having a degree of objective reality, insofar as certain social, economic, religious and political arrangements emerged and developed in Europe over roughly a thousand years.

At the same time, he reminds us that the practice of breaking up history into periods has its own history -- deriving, in its European varieties, from Judeo-Christian ideas, and laden with ideas of decline or regeneration. “Not only is the image of a historical period liable to vary over time,” he writes, “it always represents a judgment of value with regard to sequences of events that are grouped together in one way rather than another.”

I'm not entirely clear how, or if, he reconciled the claim to assess periodization on strictly social-scientific grounds with its status as a concept with roots in the religious imagination, but it's a good book that leaves the reader with interesting questions.

Essay calls for a new approach to college textbooks

The Book Industry Study Group just reported that 52 percent of college students surveyed agreed that “I would rather pay $100 for a learning solution that improves my result by one letter grade and reduces my study time by 25 percent than $50 for my current textbook.” As a professor, I am troubled by declines in the effort many in my classes are willing to put into doing the reading I assign. But as an administrator, I also recognize students’ concerns with scoring high grades, juggling internships and part-time jobs, and minimizing expenses.

Multiple factors are at play here: grade inflation, social pressures, student debt, the iffy job market. Also relevant is the amount of time students report studying each week (now an average of 15 hours, down from about 24 in the 1960s). Yet one of the major culprits is the price tag on textbooks and other course materials, estimated at around $1,200 a year -- assuming you buy them.

Faculty members and students alike are in a quandary over how to handle textbook costs, especially for those hefty tomes often used in introductory courses. Increasingly, students are opting not to purchase these books -- or even to rent them. Digital formats (and rentals of any kind) tend to be less expensive than buying print, though frequently the decision is not to acquire the materials at all. The U.S. Public Interest Research Group reports that two-thirds of students have refrained from purchasing at least one assigned textbook because of price.

Recently, American University ran focus groups with our undergraduates, looking to get a sense of how they make textbook decisions. For courses in their major, they are willing to lay out more money than for general education classes, which they perceive (often wrongly) not to require much work anyway. Overall, the common sentiment is that spending more than about $50 for a book is excessive. And of course there are plenty of college textbooks with prices that exceed $50.

This message was reinforced by an anecdote shared with me by Michael Rosenwald, a reporter for The Washington Post. While interviewing American University students for a story on college reading and book-purchasing habits, Rosenwald asked, “Who buys course materials from the campus store these days?” Their answer: “Freshmen,” revealing that once students settle into campus life, they discover less expensive ways to get their books -- or simply decide how much of the reading they will actually do.

For faculty members, the challenge is to find a workable balance between the amount of reading we would like those in our classes to complete and realistic expectations for student follow-through. While some full-length books may remain on our required list, their numbers have shrunk over time. These days, assignments that used to call for complete books are being slimmed down to single chapters or articles. Our aspirations for our students to encounter and absorb substantial amounts of written material increasingly rub up against their notions of how much is worth reading.

The numbers tell the tale. That same Book Industry Study Group report noted that between 2010 and 2013, the percentage of students indicating that classes they were taking required “no formal course materials” rose from 4 percent to 11 percent.

Student complaints are equally revealing. When Robert Putnam’s Bowling Alone came out, I assigned the book to a group of honors undergraduates, eager for them to experience careful, hypothesis-driven, data-rich social science research. One member of the class balked. In fact, she publicly berated me, demanding to know why I hadn’t told the group about the “short version” of the book -- meaning an article Putnam had written years earlier, before his full study was completed. She went on to inform the class what she had learned from a teacher in high school: books aren’t worth reading, only articles. The rest of what’s in books is just padding.

The author and teacher in me cringed at how this young woman perceived the intellectual enterprise.

For students, besides the understandable limitations on time and finances, there is the question of value proposition. If the objective is learning that lasts, maybe buying the book (and reading it) is worth it. But if the goal is getting a better grade, maybe not. All too often today, it is the grade that triumphs.

One player that faculty members generally leave out of the equation is the publishing industry, including not just the companies whose names are on the spines but the people who print the books, supply the paper and ink, and operate the presses. Recently I spoke at the Book Manufacturers’ Institute Conference and was troubled by the disconnect I perceived between those who produce and distribute textbooks and those who consume them. As students buy fewer books, publishers do smaller print runs, resulting in higher prices, which in turn reinforces the spiral of lower sales.

A potential compensatory financial strategy for publishers is issuing revised editions, intended to render obsolete those already in circulation. In reality, students often take a pass on these new offerings, waiting until they appear on the used book market. Yes, sometimes there is fresh, timely material in the new versions, but how often do we really need to update textbooks on the structure of English grammar or the history of early America?

When speaking with participants in the book manufacturers’ conference, I became increasingly convinced that the current model of book creation, distribution and use is not sustainable. What to do?

There is a pressing need for meaningful collaboration between faculty members and the publishing industry to find ways of producing materials designed to foster learning that reaches beyond the test -- and that students can be reasonably expected to procure and use. I would like to hope that textbook publishers (who I know are financially suffering) are in conversation not just with authors seeking book contracts but with faculty members who can share their own assignment practices, along with personal experiences about how students are voting with their feet regarding purchasing and reading decisions.

To help foster such dialogue, here are some suggestions:

  • Gather data on shifts in the amount and nature of reading that faculty assign, say, over the past 10-20 years.
  • Reconsider publishing strategies regarding those handsome, expensive, color-picture-laden texts, whose purpose is apparently to entice students to read them. If students aren’t willing to shell out the money, the book likely isn’t being read. Focus instead on producing meaningful material written with clear, engaging prose.
  • Rethink when a new edition is really warranted and when not. In many instances, issuing a smaller update, to be used as a supplement to the existing text, is really all that’s needed. (Think of those encyclopedia annuals with which many of us are familiar.) Far more students will be willing to pay $9.95 for an update to an older book than $109.95 for a new one. McDonald’s learned long ago that you can turn a handsome profit through high volume on low-cost items. The publishing industry needs to do the math.
  • Make faculty members aware of the realities of both textbook prices (some professors never look before placing book orders) and student reading patterns. I heartily recommend hanging out in the student union (or equivalent) and eavesdropping. You will be amazed at how cunning -- and how honest -- students are about their study practices.
  • Encourage professors to assign readings (especially ones students are asked to pay for) that maximize long-term educational value.
  • Educate students about the difference between gaming the assignment system (either for grades or cost savings) and learning.

The result could be a win-win for both the publishing industry and higher education.

Naomi S. Baron is executive director of the Center for Teaching, Research, and Learning at American University and author of Words Onscreen: The Fate of Reading in a Digital World.

Authors discuss reasoning behind high levels of Asian American achievement

Authors discuss new book on high levels of Asian-American achievement in education in which they argue that it's not about culture.

University of Akron says it hasn't eliminated its university press, but has eliminated all press staff jobs

U of Akron denies killing off its press, even though the university eliminated the jobs of all employees. Many are dubious.

Akron Layoffs Include Everyone at University Press

The University of Akron announced a few weeks ago that it would eliminate more than 200 jobs to deal with a budget deficit. As employees losing their jobs were notified this week, it has become clear that the university is eliminating its university press. All employees of the press are among those losing their jobs, Northeast Ohio Media Group reported. "We have essentially been shut down," said Thomas Bacher, director of the press. "Another blow against culture by a short-sighted administration. It's sad that the university values beans over brains." The press is a small one with a focus on Ohio-related topics, and it also publishes some poetry.

The university layoffs also include all employees of Akron's multicultural center, although the university says that other offices will support the programming offered by the center.

Faculty members have been complaining that the university is refusing to consider cuts to the football program, which loses money and attracts few fans to watch its games.

Review of Beth Shapiro, "How to Clone a Mammoth: The Science of De-Extinction"

So it turns out that -- title notwithstanding -- Beth Shapiro’s How to Clone a Mammoth: The Science of De-Extinction (Princeton University Press) is not a do-it-yourself manual. What’s more, cloned mammoths are, in the author’s considered opinion, impossible. Likewise, alas, with regard to the dodo.

But How Not to Clone a Dodo would never cut it in the marketplace. Besides, the de-extinction of either creature seems possible (and in the case of the mammoth, reasonably probable) in the not-too-distant future. The process involved won’t be cloning, per se, but rather one of a variety of forms of bioengineering that Shapiro -- an associate professor of ecology and evolutionary biology at the University of California at Santa Cruz -- explains in moderate detail, and in an amiable manner.

Her approach is to present a step-by-step guide to how an extinct creature could be restored to life given the current state of scientific knowledge and the available (or plausibly foreseeable) advances in technology. There are obstacles. Removing some of them is, by Shapiro’s account, a matter of time and of funding. Whether or not the power to de-exterminate a species is worth pursuing is a question with many parts: ethical and economic, of course, but also ecological. And it grows a little less hypothetical all the time. De-extinction is on the way. (The author allows that the whole topic is hard on the English language, but “resurrection” would probably cause more trouble than it’s worth.)

The subject tickles the public’s curiosity and stirs up powerful emotions. Shapiro says she has received her share of fan and hate mail over the years, including someone’s expressed wish that she be devoured by a flesh-eating mammal of her own making. Perhaps the calmest way into the discussion is by considering why reviving the mammoth or the dodo is possible, but would not be the same thing as cloning one. (And dinosaur cloning is also right out, just to make that part clear without further delay.)

To clone something, in short, requires genetic material from a living cell with an intact genome. “No such cell has ever been recovered from remains of extinct species recovered from the frozen tundra,” writes Shapiro, whose research has involved the search for mammoth remains in Siberia. Flash freezing can preserve the gross anatomy of a mammoth for thousands of years, but nucleases -- the enzymes that fight off pathogens when a cell is alive -- begin breaking down DNA as soon as the cell dies.

What can be recovered, then, is paleogenetic material at some level of dismantling. The challenge is to reconstruct an approximation of the extinct creature’s original genome -- or rather, to integrate the fragments into larger fragments, since rebuilding the whole genetic structure through cut-and-paste efforts is too complex and uncertain a task. The reconstituted strings of genetic data can then be “inserted” at suitable places in the genome of a related creature from our own era. In the case of the woolly mammoth, that would mean genetic material from the Asian elephant; they parted ways on the evolutionary tree a mere 2.5 million years ago. In principle, at least, something similar could be done using DNA from the taxidermy-preserved dodo birds in various collections around the world, punched into the pigeon genome.

“Key to the success of genome editing,” writes Shapiro, “has been the discovery and development of different types of programmable molecular scissors. Programmability allows specificity, which means we can make the cuts we want to make where we want to make them, and we can avoid making cuts that kill the cell.”

Cells containing the retrofitted genome could then be used to spawn a “new” creature that reproduces aspects of the extinct one -- pending the solution of various technical obstacles. For that matter, scraping together enough raw material from millennia past presents its own problems: “In order to recover DNA from specimens that have very little preserved DNA in them, one needs a very sensitive and powerful method for recovering the DNA. But the more sensitive and powerful method is, the more likely it is to produce spurious results.”

Also a factor is the problem of contamination, whether found in the sample (DNA from long-dead mold and bacteria) or brought into the lab in spite of all precautions. Shapiro leaves the reader aware of both the huge barriers to be overcome before some species is brought back from extinction and the strides being made in that direction. She predicts the successful laboratory creation of mammoth cells, if not of viable embryos, within the next few years.

It will be hailed as the cloning of an extinct animal -- headlines that Shapiro (whose experiences with the media do not sound especially happy) regards as wrong but inevitable. The reader comes to suspect one motive for writing the book was to encourage reporters to ask her informed questions when that news breaks, as opposed to trying to get her to speculate about the dangers of Tyrannosaurus rex 2.0.

Besides its explanations of the genetics and technology involved, How to Clone a Mammoth insists on the need to think about what de-extinction would mean for the environment. Returning the closest bioengineerable approximation of a long-lost species to the landscape it once inhabited will not necessarily mean a happy reunion. The niche that animal occupied in the ecosystem might no longer exist. Indeed, the ecosystem could have developed in ways that doom the creature to re-extinction.

Shapiro is dismissive of the idea that being able to revive a species would make us careless about biodiversity (or more careless, perhaps), and she comes close to suggesting that de-extinction techniques will be necessary for preserving existing species. But those things are by no means incompatible. The author herself admits that some species are more charismatic than others: we're more likely to see the passenger pigeon revived than, say, desert rats, even though the latter play an ecological role. The argument may prove harder to take for the humbler species once members of Congress decide to freeze-dry them for eventual relaunching, should that prove necessary.

By now we should know better than to underestimate the human potential for creating a technology that goes from great promise to self-inflicted disaster in under one generation. My guess is that it will take about that long for the horrible consequences of the neo-dodo pet ownership craze of the late 2020s to make themselves fully felt.

A new funding program at the NEH hopes to bring more humanities research to the general public

New grants from National Endowment for the Humanities aim to encourage books based on humanities research that are accessible to nonscholars.

Review of Naomi Zack, "White Privilege and Black Rights: The Injustice of U.S. Police Racial Profiling and Homicide"

You don’t hear much about the United States being a “postracial society” these days, except when someone is dismissing bygone illusions of the late ’00s, or just being sarcastic. With the Obama era beginning to wind down (as of this week, the president has just under 18 months left in office) American life is well into its post-post-racial phase.

Two thoughts: (1) Maybe we should retire the prefix. All it really conveys is that succession does not necessarily mean progress. (2) It is easy to confuse an attitude of cold sobriety about the pace and direction of change with cynicism, but they are different things. For one, cynicism is much easier to come by. (Often it’s just laziness pretending to be sophisticated.) Lucid assessment, on the other hand, is hard work and not for the faint of spirit.

Naomi Zack’s White Privilege and Black Rights: The Injustice of U.S. Police Racial Profiling and Homicide (Rowman & Littlefield) is a case in point. It consists of three essays plus a preface and conclusion. Remarks by the author indicate it was prepared in the final weeks of last year, with the events in Ferguson, Mo., fresh in mind. But don’t let the title or the book’s relative brevity fool you. The author is a professor of philosophy at the University of Oregon -- and when she takes up terms such as “white privilege” or “black rights,” it is to scrutinize the concepts rather than to use them in slogans.

Despite its topicality, Zack’s book is less a commentary on recent events than part of her continuing effort to think, as a philosopher, about questions of race and justice that are long-standing, but also prone to flashing up, on occasion, with great urgency -- demanding a response, whether or not philosophers (or anyone else) are prepared to answer them.

Zack distinguishes between two ways of philosophizing about justice. One treats justice as an ideal that can be defined and reasoned about, even if no real society in human history ever “fully instantiates or realizes an ideal of justice for all members of that society.” Efforts to develop a theory of justice span the history of Western philosophy.

The other approach begins with injustice and seeks to understand and correct it. Of course, that implies that the philosopher already has some conception of what justice is -- which would seem to beg the question. But Zack contends that theories of justice also necessarily start out from pre-existing beliefs about what it is, which are then strengthened or revised as arguments unfold.

“However it may be done and whatever its subject,” Zack writes, “beginning with concrete injustice and ending with proposals for its correction is a very open-ended and indeterminate task. But it might be the main subject of justice about which people who focus on real life and history genuinely care.”

The philosopher Zack describes may not start out with a theory of what justice is. But that’s OK -- she can recognize justice, paradoxically enough, when it's gone missing.

I wish the author had clarified the approach in the book’s opening pages, rather than two-thirds of the way through, because it proves fundamental to almost everything else she says. She points out how police killings of young, unarmed African-American males over the past couple of years are often explained with references to “white privilege” and “the white supremacist system” -- examples of a sort of ad hoc philosophizing about racial injustice in the United States, but inadequate ones in Zack’s analysis.

Take the ability to walk around talking on the phone carrying a box of Skittles. It is not a “privilege” that white people enjoy, as should be obvious from the sheer absurdity of putting it that way. It is one of countless activities that a white person can pursue without even having to think about it. “That is,” Zack writes, “a ‘privilege’ whites are said to have is sometimes a right belonging to both whites and nonwhites that is violated when nonwhites are the ones who [exercise] it.”

In the words of an online comment the author quotes, “Not fearing the police will kill your child for no reason isn’t a privilege, it’s a right.” The distinction is more than semantic. What Zack calls “the discourse of white privilege” not only describes reality badly but fosters a kind of moral masochism, inducing “self-paralysis in the face of its stated goals of equality.” (She implies that white academics are particularly susceptible to "hold[ing] … progressive belief structures in intellectual parts of their life that are insulated from how they act politically and privately …")

Likewise, “the white supremacist power structure” is a way of describing and explaining oppression that is ultimately incapacitating: “After the civil rights movement, overt and deliberate discrimination in education, housing and employment were made illegal and explicitly racially discriminatory laws were prohibited.” While “de facto racial discrimination is highly prevalent in desirable forms of education, housing and employment,” it does no one any good to assume that “an officially approved ideology of white supremacy” remains embodied in the existing legal order.

None of which should be taken to imply that Zack denies the existence of deep, persisting and tenacious racial inequality, expressed and reinforced through routine practices of violence and humiliation by police seldom held accountable for their actions. But, she says, "What many critics may correctly perceive as societywide and historically deep antiblack racism in the United States does not have to be thoroughly corrected before the immediate issue of police killings of unarmed young black men can be addressed."

She is not a political strategist; her analyses of the bogus logic by which racial profiling and police killings are rationalized are interesting, but how to translate them into action is not exactly clear. In the end, though, justice and injustice are not problems for philosophers alone.

Review of Stephen Siff, "Acid Hype: American News Media and the Psychedelic Experience"

If you can remember the 1960s, the old quip goes, you weren’t really part of them. By that standard, the most authentic participants ended up as what used to be called “acid casualties”: those who took spiritual guidance from Timothy Leary’s injunction to “turn on, tune in and drop out” and ended up stranded in some psychedelic heaven or hell. Not that they’ve forgotten everything, of course. But the memories aren’t linear, nor are they necessarily limited to the speaker’s current incarnation on this particular planet.

Fortunately Stephen Siff can draw on a more stable and reliable stratum of cultural memory in Acid Hype: American News Media and the Psychedelic Experience (University of Illinois Press). At the same time, communicating about the world as experienced through LSD or magic mushrooms was ultimately as difficult for a sober newspaper reporter, magazine editor or video documentarian as conversation tends to be for someone whose mind has been completely blown. The author, an assistant professor of journalism at Miami University in Ohio, is never less than shrewd and readable in his assessment of how various news media differed in method and attitude when covering the psychedelic beat. The slow and steady buildup of hype (a word Siff uses in a precise sense) precipitated an early phase of the culture wars -- sometimes in ways that partisans now might not expect.

Papers on experimentation with LSD were published in American medical journals as early as 1950, and reports on its effects from newspaper wire services began tickling the public interest by 1954. The following year, mass-circulation magazines were devoting articles to LSD research, followed in short order by a syndicated TV show’s broadcast of film footage showing someone under the influence. The program, Confidential File, sounds moderately sleazy (the episode in question was described as featuring “an insane man in a sensual trance”) but much of the early coverage was perfectly respectable, treating LSD as a potential source of insight into schizophrenia, or a potential expressway to the unconscious for psychoanalysts.

But the difference between rank sensationalism and science-boosting optimism may count for less, in Siff’s interpretation, than how sharply coverage of LSD broke with prevailing media trends that began coming into force in the 1920s.

After the First World War, with wounded soldiers coming back with a morphine habit, newspapers carried on panic-stricken anti-drug crusades (“The diligent dealer in deadly drugs is at your door!”) and any publication encouraging recreational drug use, or treating it as a fact of life, was sure to fall before J. Edgar Hoover’s watchful eye. Early movie audiences enjoyed the comic antics of Douglas Fairbanks Sr.’s detective character Coke Ennyday (always on the case, syringe at the ready), or in a more serious mood they could go to For His Son, D. W. Griffith’s touching story of a man’s addiction to Dopokoke, the cocaine-fueled soft drink that made his father rich. But by the time the talkies came around, the Motion Picture Production Code categorically prohibited any depiction of drug use or trafficking, even as a criminal enterprise. Siff notes that in the 20 years following the code’s establishment in 1930, “not a single major Hollywood film dealing with drug use was distributed to the public.”

Not that depictions of substance abuse were a forbidden fruit the public was craving, exactly. But the relative openness of the mid-1950s (emphasis on “relative”) allowed editors to risk publishing stories on what was, after all, serious research on a potential new wonder drug. Siff points out that general-assignment newspaper reporters attending a scientific or medical conference, unable to tell what sessions were worth covering, could feel reasonably confident that a title mentioning LSD would probably yield a story.

At the same time, writers for major newsmagazines and opinion journals were following the lead of Aldous Huxley, the novelist and late-life religious searcher, who wrote about mystical experiences he had while taking mescaline. In 1955, when the editors of Life magazine decided to commission a feature on hallucinogenic mushrooms, they turned to Wall Street banker and amateur mycologist R. Gordon Wasson. He traveled to Mexico and became, in his own words, one of “the first white men in recorded history to eat the divine mushroom” -- and if not, then surely the first to give an eyewitness report on “the archetypes, the Platonic ideals, that underlie the imperfect images of everyday life” in the pages of a major newsweekly.

Suffice it to say that by the time Timothy Leary and associates come on the scene (wandering around Harvard University in the early 1960s, with continuously dilated pupils and only the thinnest pretense of scientific research) it is rather late in Siff’s narrative. And Leary’s legendary status as psychedelic shaman/guru/huckster seems much diminished by contrast with the less exhibitionistic advocacy of LSD by Henry and Clare Boothe Luce. Beatniks and nonconformists of any type were mocked regularly in the pages of Time or Life, but the Luce publications were for many years very enthusiastic about the potential benefits of LSD. The power couple tripped frequently, and hard. (Some years ago, when I helped organize Mrs. Luce’s papers at the Library of Congress, the LSD notes were a confidence not to be breached, but now the experiments are a matter of public record.)

The hippies, in effect, seem like a late and entirely unintentional byproduct of industrial-strength hype. “During an episode of media hype,” Siff writes, “news coverage feeds on itself, as different news outlets follow and expand on one another’s stories, reacting among themselves and to real-world developments. Influence seems to flow from the larger news organizations to smaller ones, as editors at smaller or more marginal media operations look toward the decisions made by major outlets for ideas and confirmation of their own judgment.”

That is the process, broadly conceived. In Acid Hype, Siff charts the details -- especially how the feedback bounced around between news organizations, not just of different sizes, but with different journalistic cultures. Newspaper coverage initially stuck to the major talking points of LSD researchers; it tended to stress the potential wonder-drug angle, even when the evidence for it was weak. Major magazines wanted to cover the phenomenon in greater depth -- among other things, with firsthand reports on the psychedelic universe by people who’d gone there on assignment. Meanwhile, the art directors tried to figure out how to convey far-out experiences through imagery and layout -- as, in time, did TV producers. (Especially on Dragnet, if memory serves.)

Some magazine editors seem to have been put off by the religious undercurrents of psychedelic discourse. Siff exhibits a passage in a review that quotes Huxley’s The Doors of Perception but carefully removes any biblical or mystical references. But someone like Leary, who proselytized about psychedelic revolution, was eminently quotable -- plus he looked good on TV because (per the advice of Marshall McLuhan) he smiled constantly.

The same hype-induction processes that made hallucinogens seem like the next step toward improving the American way of life (or, conversely, the escape route to an alternative to it) also went into effect when the tide turned: just as dubious claims about LSD’s healing properties were reported without question (it’ll cure autism!), so were horror stories about side effects (it’ll make you stare at the sun until you go blind!).

The reaction seems to have been much faster and more intense than the gradual pro-psychedelic buildup. Siff ends his account of the period in 1969 -- oddly enough, without ever mentioning the figure who emerged into public view that year as the embodiment of LSD's presumed demons: Charles Manson. You didn't hear much about the drug's spiritual benefits after Charlie began explaining them. That was probably for the best.

In new book, faculty member urges universities to hold themselves to higher levels of accountability and inclusivity

Author discusses new book in which he argues that institutions can hold themselves to higher levels of accountability and inclusivity.
