I am sick of reading about Malcolm Gladwell’s hair.
Sure, The New Yorker writer has funny hair. It has been big. Very big. It is audacious hair, hair that dares you not to notice it; hair that has been mentioned in far too many reviews. Malcolm Gladwell’s hair is its own thing.
Which is only appropriate, since in his writing, Gladwell has always gone his own way. But he’s been doing it long enough, and so well, and has made so much money, that some folks feel it’s time to trim him down to size. That hair is now seen as uppity.
Gladwell is a mere journalist. He’s not shy, and like many children of academics, he is not intimidated by eggheads. He does none of his own primary research, and instead scours academic journals to find interesting ideas -- he collects experiments and experimenters. He is a translator and a synthesizer, and comes up with catchy, sprightly titled theories to explain what he has seen. Some have called him a parasite. He has called himself a parasite.
It seems to me there’s always been a bit of snarkiness attached to discussions of Gladwell’s work. This is often the case for books that have become commercially successful, which is something that seems particularly to stick in the collective academic craw. There is a weird hostility in the reviews of Gladwell’s books that is directed not at the big-haired guy himself who, like a puppy, nips at the heels of academics and then relishes the opportunity to render their work into fluid, transparent prose, but toward those many people who have made Gladwell famous: his readers. No one matches the caustic condescension of Richard Posner, who said, in a review of Gladwell’s Blink, that “it’s a book for people who don’t read books.”
The reviews of Outliers, Gladwell’s latest book, show that even a New Yorker writer can go too far. People are now attacking Malcolm Gladwell as a kind of brand. The critiques boil down to a few things, one of which is that he doesn’t take into account evidence that refutes his theories. In other words, he’s not doing careful scholarship. But we all know that even careful scholarship is a game of picking and choosing -- it just includes more footnotes acknowledging this. And Gladwell never pretends to be doing scholarship.
Gladwell is also accused of being too entertaining. He takes creaky academic work and breathes Frankensteinian life into it. He weaves anecdotes together, creating a tapestry that builds to an argument that seems convincing. This, some reviewers have claimed, is like perpetuating fraud on the (non-academic) reading public: because Gladwell makes it so much fun to follow him on his intellectual journey, he’s going to convince people of things that aren’t provably, academically true. He will lull the hoi polloi into thinking they’re reading something serious.
Which is, of course, the most common complaint about Gladwell: He’s not serious enough. He’s having too much fun playing with his ideas. And, really, you can’t be Serious when you’re raking in so much coin. Anyone who gets paid four million bucks for a book that mines academic work -- and not necessarily the stuff that is agreed to be Important -- is going to become a target. His speaking fees are beyond the budgets of most colleges. In this way, his career is now similar to that of David Sedaris, who can command an impressive audience and still be dissed by the literary folks. Everyone who’s anyone knows that you can’t sell a lot of books and be a serious writer. Just ask Jonathan Franzen. Or Toni Morrison.
I don’t see Gladwell as a social scientist-manqué, or a philosopher wannabe. Instead, I read him more like an essayist. I think of his books as well-written, research-packed, extended essays. Let me show you the evils of imperialism by telling you a story about the time in Burma when I was forced to shoot an elephant. Let’s look at this (bad) academic prose and think about the relationship between politics and the English language. But instead of using his own experiences, he builds on work done by others. He uses a wry, quirky approach and blithely ignores the received wisdom and pieties of academe. He doesn’t seek out the researcher who’s highly regarded within her field; he looks for people who are doing things he finds interesting.
Gladwell reminds me of the kind of student I knew in college, the nerd who takes weird and arcane courses and then rushes from the lecture hall excited about some idea the professor has mentioned in passing and goes straight to the library to pursue it himself. He stays up all night talking about it, and convincing you that even though you were in the same class, and heard the same reference, you have somehow missed something. Maybe not something big, but at least something really, really cool.
Perhaps I have more trust in readers than to believe that they can be so easily bought off by a good story. And I wish that academics, instead of pillorying Gladwell for being good at translating complicated ideas, would study the way he does it and apply some portion of his method to their own work: He makes mini trade books of monographs. Surely this is a lesson worth learning. He uses the narrative art of the magazine writer to animate ideas. He profiles theories the way Gay Talese or Joan Didion did celebrities.
The audacity Gladwell shows in his writing, connecting seemingly disparate things and working hard, yet with apparent effortlessness, to make the ideas engaging, gives me hope for the future of books. It makes me feel better to see folks buying Gladwell rather than the swimmer Michael Phelps’s memoir or vampire novels -- not that there’s anything wrong with that. Yet this same audacity is what gets Gladwell into hot water with academics. He’s not supposed to do this.
Unless you are an aged physicist, you don’t really get to write books that “purport to explain the world.” You can, of course, try to explicate tiny portions of it. Science writers like James Gleick and Jonathan Weiner can go a lot further than most scientists in terms of making arcane principles understandable to the Joe the Plumbers of the reading world, and no one gets bent out of shape. Perhaps it’s because of the assumption that scientists, with a few notable (often British) exceptions, are not supposed to be able to write books that normal people can read. Social scientists and historians are, however, expected to know what is interesting and important about their work and present it to the public. Brand-name thinkers like Susan Sontag and Martha Nussbaum can take on big ideas. But these people are experts; journalists shouldn’t try this at home.
What I love about Gladwell is that his writing is like his hair. You can see it as arrogant or scary (he writes about being stopped more frequently by cops when he had a big afro), or you can see it as playful and audacious. This is why, of course, so many reviews mention it; he has the right hair for his work.
One final, dour complaint about Gladwell has to do with his relentless cheeriness. He thinks that people are basically good, though he understands that sometimes circumstances aren’t. I can’t abide high-brow literary novelists who trash fiction that “cops out” with a happy ending. Maybe I’m hopelessly low-brow: I still love Jane Austen and Shakespeare’s comedies. The academic response to most things is generally: it’s more complicated than that. And sure, much of the time it is. But if something’s artfully crafted, I’m willing to cut the author some slack. I don’t ever expect to be thoroughly persuaded of anything; I’m characterologically skeptical and like to do the thinking on my own. Gladwell’s books invite me into a conversation. I think that’s part of the job of a good book.
For me, reading Malcolm Gladwell’s books is like watching Frank Capra movies. Just because they make you feel good and keep you entertained doesn’t mean that they’re not doing valuable work or tackling hard and real issues and ideas. Sure, someone else could have handled it differently. George Bailey might have finally committed suicide; the bank in Bedford Falls could have asked for a government bailout. But right now, maybe it’s not such a bad thing to read books that are a little more hopeful. And yes, audacious.
Rachel Toor teaches in the MFA program at Eastern Washington University. She writes a monthly column for The Chronicle of Higher Education, and her most recent book is Personal Record: A Love Affair With Running. Her Web site is www.racheltoor.com.
Last week Leon Kass, chairman of the Council of Bioethics under President Bush, took to the podium to deliver the Jefferson Lecture of the National Endowment for the Humanities -- an event I did not go to, though it was covered by one of IHE's intrepid reporters.
My reluctance to attend suggests that, without noticing it, I have come to accept Kass’s best-known idea, “the wisdom of repugnance.” There is, alas, all too little evidence I am getting any wiser with age -- but my visceral aversion to hearing a Bush appointee talk about human values is inarguable.
As you may recall, Kass wrote in the late 1990s that repugnance at biotechnological developments such as cloning is “the emotional expression of deep wisdom, beyond reason’s power fully to articulate it.” In our rising gorge, he insisted, “we intuit and feel, immediately and without argument, the violation of things that we rightfully hold dear.... Shallow are the souls that have forgotten how to shudder.”
Judged simply as an argument, this is not, let’s say, apodictically persuasive. Anyone who has ever taken an introductory anthropology course, or read Herodotus -- or gone to a different part of town -- will have learned that different groups feel disgust at different things. The affect seems to be hard-wired into us, but the occasions provoking it are varied.
Kass invoked the "wisdom of repugnance" a few years before he joined an administration that treated the willingness to torture as a great moral virtue -- meanwhile coddling bigots for whom rage at gay marriage was an appropriate response to “the violation of things that we rightfully hold dear.”
Now, as it happens, some of us do indeed feel disgust at one of these practices, and not at the other. We also suspect that Kass’s aphorism about the shallowness of souls that have forgotten how to shudder would make a splendid epigraph for the chapter in American history that has just closed.
In short, disgust is not quite so unambiguous and inarguable an expression of timeless values as its champion on the faculty of the University of Chicago has advertised. Given a choice between “deep wisdom” and “reason’s power fully to articulate,” we might do best to leave the ineffable to Oprah.
There is no serious alternative to remaining within the limits of reason. Which means argument, and indeed the valuing of argument -- however frustrating and inconclusive -- because even determining what the limits of reason themselves are tends to be very difficult.
Welcome to modernity. It’s like this pretty much all the time.
The account of Kass's speech in IHE -- and the text of it, also available online -- confirmed something that I would have been willing to wager my paycheck on, had there been a compulsive gambler around to take the bet. For I felt certain that Kass would claim, at some point, that the humanities are in bad shape because nobody reads the “great works” because everybody is too busy with the “deconstruction.”
It often seems like the culture wars are, in themselves, a particularly brainless form of mass culture. Some video game, perhaps, in which players keep shooting at the same zombies over and over, because they never change and just keep coming -- which is really good practice in case you ever have to shoot at zombies in real life, but otherwise is not particularly good exercise.
The reality is that you encounter actual deconstructionists nowadays only slightly more often than zombies. People who keep going on about them sound (to vary references a bit) like Grandpa Simpson ranting about the Beatles. Reading The New Criterion, you'd think that Derrida was still giving sold-out concerts at Che Stadium. Sadly, no.
But then it never makes any difference to point out that the center of gravity for argumentation has shifted quite a lot over the past 25 years. What matters is not actually knowing anything about the humanities in particular -- just that you dislike them in general.
The logic runs something like: “What I hate about the humanities is deconstructionism, because I have decided that everything I dislike should be called ‘deconstructionism.’ ” Q.E.D.!
Kass complained that people in the humanities fail to discuss the true, the good, and the beautiful; or the relationships between humanity, nature, and the divine; or the danger that comes from assuming that technical progress implies the growth of moral and civic virtue. Clearly this is a man who has not stopped at the new books shelf in a library since the elder George Bush was Vice President.
And so last week’s Jefferson lecture was, perhaps, an encouraging moment, in spite of everything. With it, Leon Kass was saying farewell to Washington for, with any luck, a good long while. Maybe now he can spend some time catching up with the range of work people in the humanities have actually been doing. At very least he could read some Martha Nussbaum.
Then he might even pause to reflect on his own role as hired philosopher for an administration that revived one of the interrogation techniques of the Khmer Rouge. The wisdom of repugnance begins at home.
The fall books have already started piling up. There are titles I’ve asked the publishers to send, and the ones volunteered by eager publicists, and the ceaseless influx of small books of poetry, which fill me with guilt for watching “Law & Order” reruns instead of reading them. (But verily, man cannot live by print alone.)
Before the new publishing season begins and they are lost in the flood, let me take a quick look here at a few recent titles – books I have found absorbing and rewarding, but not had a chance to discuss in this column. The list is miscellaneous, and the tip of an iceberg. I doubt they have much in common. But each title is a reminder of the fine and irreplaceable work that university presses do with no fanfare, and seldom much recognition. And let’s not even talk about profit.
The arrest of Henry Louis Gates Jr. (a.k.a. Gates-gate) has generated great heat but not much light. Various media loudmouths have been outdoing themselves in portraying the Harvard professor as some kind of wild-eyed radical. This is, of course, a matter of ignorance, robust and unashamed. Gates is in reality the most anodyne of centrists. But at least the furious fantasies he has provoked should put to rest for good any notion that the United States has lately turned into a “postracial” society.
It seems like a good moment to recommend Pulitzer Prize-winner Steven Hahn’s new book The Political Worlds of Slavery and Freedom, a compact but challenging volume published by Harvard University Press. The author is a professor of history at the University of Pennsylvania. His three chapters -- each a miniature monograph -- are based on a series of lectures at Harvard, given at Gates’s invitation.
Hahn looks at the complex way the African-American struggle for emancipation took shape both under slavery and in the wake of its abolition. This process involved the creation of institutions for self-governance, as seen in “the efforts of newly freed people to reconstitute their kinship groups, to form squads and other family-based work groups, to pool community resources, and, of course, to acquire land.”
These weren’t just social movements. They contained, argues Hahn, a political element. Hahn considers whether the activity of black Southerners during the Civil War amounted to a variety of slave revolt, and he sketches aspects of the political life of Marcus Garvey’s pan-Africanist group in the United States in the early part of the 20th century. Only the most small-minded conception of American life would assume that these are matters of interest only to black readers. In a healthy culture, this little book would be a best-seller.
A few months ago, an editor asked me to review Adina Hoffman’s biography of Taha Muhammad Ali, My Happiness Bears No Relation to Happiness: A Poet’s Life in the Palestinian Century, published by Yale University Press. To tell the truth, my heart did not initially leap at the opportunity. For I had never read any of his poetry, and rather feared that it might be full of slogans -- that, indeed, the poet’s own life might be one long slogan.
This proves that I am an idiot. A couple of sessions with his selected works revealed Ali to be a wry, ambivalent, and often understated lyricist. (In translation, at least, he seems a little bit like Edgar Lee Masters.) The figure who emerges from Hoffman’s biography is that of a quiet shopkeeper in Nazareth who carefully studied Arabic literary tradition, and also absorbed the influence of the Palestinian nationalist “resistance literature” – then created his own distinctive style: one stripped-down and unrhetorical, but sensitive as a burn.
One of the remarkable things about this biography, as indicated in my review, is that it evokes not only the political and historical context of Ali’s work, but also how his poetry took shape. Its quietness and simplicity are hard-won.
At the other extreme from Ali, perhaps, is Walt Whitman, whose poetic voice is booming, and whose persona always seems a couple of sizes too large for the North American continent. A couple of years back, Duke University Press reprinted his one and only novel: a cautionary tale of the perils of strong drink called Franklin Evans, or The Inebriate. I have somehow never gotten around to reading it, and probably never will. But it is impressive to think that Whitman grew to his familiar cosmic dimensions while stone cold sober.
His poetry certainly intoxicated the readers portrayed in Michael Robertson’s Worshipping Walt: The Whitman Disciples, published by Princeton University Press. The noun in its subtitle is no exaggeration. The readers portrayed here found in Whitman’s work something akin to a new scripture -- nearly as much as followers of Joseph Smith or Mary Baker Eddy did the Book of Mormon or Science and Health.
You can still find R.M. Bucke’s Cosmic Consciousness (1901) -- where Whitman is identified as “the best, most perfect example the world has so far had of the Cosmic Sense" -- in New Age shops. Other disciples took his “chants democratic” as hymns for a worldwide socialist commonwealth. And his invocation of manly “adhesiveness” was understood by a few readers to be a call for what later generations would term gay liberation. Whitman insisted that his homophile readers had misunderstood him, and that when not writing poetry he had been busy fathering illegitimate children all over these United States. The biographers will continue to hash that one out -- though it’s clear that his literary persona, at least, is ready to couple with anything that moves, regardless of gender.
Whitman’s work gave some of his Victorian readers a vision of the world extending far beyond the horizon of the familiar and the acceptable. No surprise that they revered him as a prophet. Robertson, a professor of English at the College of New Jersey, tells the story of his steadily expanding circle of enthusiasts (which at one point aspired to become a global movement) with due appreciation for how profound the literary experience can be, when the right book falls into the right person’s hands.
Of course there are times when reading is nothing but a guilty pleasure. So to go from the sublime to the sleazy, I have to recommend Jack Vitek’s The Godfather of Tabloid: Generoso Pope Jr. and the National Enquirer, published by the University Press of Kentucky late last year.
Not that the book itself is sleazy. The Enquirer may specialize in celebrity gossip, horrific crimes, UFO abductions, and Elvis Presley's posthumous itinerary. But that's not to say that the author -- an associate professor of English and journalism at Edgewood College, in Madison, Wisc. -- is anything but serious and measured in his approach. Vitek tackles his subject with all due awareness of its lingering cultural relevance. Pope modeled himself on newspaper tycoons such as William Randolph Hearst and Joseph Pulitzer.
Pope also happens to have had family “connections” (as the preferred expression has it) with what its members do not call the Mafia. He also spent about a year working for the Central Intelligence Agency. This makes it especially interesting to consider the mission statement Pope released when he bought a local tabloid called the New York Enquirer in 1953. “In an age darkened by imperialist tyranny and war,” it said, “the New York Enquirer will fight for the rights of man, the rights of the individual, and will champion human decency and dignity, freedom and peace.”
Any biography moving between lofty rhetoric and very low company is bound to be pretty absorbing. The Enquirer, after it went national, reached a peak circulation of 6.7 million copies per issue in the late 1970s, with Pope playing an aggressive role in crafting its distinctive strain of populist sensationalism.
In a footnote, Vitek points out that Fredric Jameson’s analysis of postmodernism somehow overlooked Pope’s role as formative influence within what Jameson calls "the degraded landscape of schlock and kitsch." Quite right -- and it is good to have this oversight finally corrected.
OK, so, into a bar walk an Anglican priest, a Muslim imam, a Jewish rabbi, and an atheist. Sounds like the setup to a punch line, right? No. That was my panel last month at the 20th anniversary of the Oxford Round Table, at the University of Oxford, England.
Apparently, a peek behind the veil of ORT is needed. Recent posts in the academic blogosphere about this invitation-only academic symposium feature adulation for the intelligentsia it attracts and castigation of Oxford for trading on its name for summer business, like some sort of pedagogical Judas.
Fact is, they’re both right. Mind, matter and merger summarize why the event both enchanted and irritated me.
Mind Over Matter
Firstly, pundits need not dismiss its scholarly girth. Formidable participants do darken the doors. My symposium, “Religion and Science After Darwin -- Effects on Christians and Muslims,” featured sessions with distinguished thinkers in physics, biology, religion and law from all the intellicrat schools you might imagine: Oxford, Harvard, Boston U., UNC-Chapel Hill, Rutgers, etc. It’s not every day you spend time with David Browning (an icon of Christian-Islamic comity), Robert Neville (23 books and counting), Amedee Turner (a member of the European Parliament when the euro was established), or the ardent atheist Richard Dawkins (The God Delusion).
Further stamps of legitimacy on the program include ORT Trustee Charles Mould, former secretary of Oxford’s 400-year-old, 11-million-volume Bodleian Library, and a 16-member advisory committee of university presidents and rectors from eight countries. Also, the manuscripts in its blind-reviewed journal, Forum on Public Policy, bear the marks of quality.
However, since the program began in 1989 with ministers of education from 20 countries, the invitation system has eroded to include mid-level researchers and engaged academics like me from teaching institutions -- from ministers of education to an educator with ministerial credentials (and a few relevant publications). Try to resist the jokes about Darwinian devolution.
The intellectual temperature was warm, not hot. This is where I’m supposed to say, “but all were meaningful contributors.” Truth is, some members of our panel were alien to the work, leaving more than one head scratching. The good news is that neither title, institution type, nor academic discipline was the indicator. Candid confrontation carried the day, based on the quality of ideas. I’m the better for hearing it all. (I’m supposed to say that, too.)
As for how aliens gather: one event organizer candidly confessed that the University of Oxford bills the ORT organization heavily for use of its facilities, and, like most universities in the modern economy, Oxford depends on summer conference “hotel” business to get by.
The ORT itself is, of course, a business (albeit a nonprofit), which explains why it folded two smaller symposia into one fumbling theme. That irked me. It was like bringing a fruitcake to a wine and cheese party. I came dressed to discuss interfaith democracy after 9/11. Others came with erudite philosophies of science.
Most organizations can’t get away with last-minute theme mergers, but the collective transfixion of a week at the world’s first English-speaking university seems to place otherwise central concerns, like the purpose of the event (!), out of mind for most participants.
Matter Over Mind: Pub and Pulpit
Oh, but the place is intoxicating, and place matters. If space inspires thought or ambition, the ORT venue should produce the most luminous luminaries on the planet. I’ll spare you predictable fawning over this medieval city, where every castle and cathedral shows such artisan care that the place is fabled as “the city of dreaming spires.” The point: ORT wouldn’t work in Albuquerque.
It’s not intention that the American Southwest lacks, but history, deep academic history, and the continuity one feels holding forth at an ancient lectern presided over by 800 years of political, scientific and religious savants.
Both pubs and pulpits have nurtured greatness here for centuries. Understated six-inch plaques scattered across the city commemorate their landmarks with a density of meaning only Oxford could afford.
To the pub: on one side of town is a tiny booth in The Eagle and Child tavern where C.S. Lewis and J.R.R. Tolkien met every Tuesday for 25 years -- “the conversations that have taken place here," its plaque reads, "have profoundly influenced the development of 20th century literature,” from The Chronicles of Narnia to The Lord of the Rings.
And to the pulpit: across town is an unassuming though well-crafted podium in a Gothic cathedral from which John Wesley preached his conversion story and launched the Methodist movement that, in part, propelled my own institution into being. There his brother Charles penned hymns now sung in every Christian church on the planet. In a word, cool.
The significance of location fits ORT, as described by the French philosopher Gaston Bachelard in The Poetics of Space: certain places reduce us to silence. They contain more than their objectivity. Sometimes you feel “inside an essential impression seeking expression.” And the recall of such spaces, as I perform it for you now, becomes not history -- “I was there” -- but a kind of poetry that memorializes moments. Bachelard says, “The great function of poetry is to shelter dreams.”
For too many academics, the dreams of significance are extinguished in a chemical bath of routine responsibilities (e.g. recommendation letters, grading, meetings). But such dreams require opportunities to perform. The University of Oxford’s space holds sufficient cachet to revive academic dreams, requiting love for elevated and sublime learning.
Mind and Matter Merger: Leaving in Tension
Alas, learning without tension is entertainment. Mind and matter merged for me during one session in the Victorian-era Oxford Union Debate Chamber -- affectionately called “the last bastion of free speech in the world.” Recently the Holocaust denier David Irving and the “sex-positive” community builder Joani Blank spun yarns there. The likes of Yasser Arafat, Desmond Tutu, and a Kennedy or two have appeared over the years. In that space all the tensions of the Oxford Round Table, real and symbolic, came together for me.
Standing at the podium was Dawkins. I’ve never been insulted with such kindness. He artfully delivered wink-and-smile sarcasm alongside bald jabs at theist stupidity, and appeared to relish the provocation. Had I not read some of his work, I would’ve thought it mere gamesmanship, superficial wordplay for positions not fully held.
Yet there’s a likability in him somehow, a most unexpected thing for me to feel as an evangelical Christian. I wished I had more time with him, but not in the way that morphed middle-aged scientists into giddy children after the Q&A, lining up hurriedly with the front flaps of their Dawkins books in one hand and an autograph pen in the other. Here was an orgy of secularism, loud and proud, baby.
Seated next to him in poetic paradox was the head-in-hand veteran vicar Brian Mountford of the centuries-old University Church of St. Mary’s, the original site of Oxford coursework and the physical and spiritual hub of a city and campus with 40 chaplains. Twice per term, in fact, the “university sermon” is delivered there, dignitaries in tow.
Not only does this priest share the platform with Dawkins, shepherding souls in a landscape of logical positivism, but imagine this: he’s also Dawkins’s neighbor. What a delicious irony! That’s better than McDonald’s and Burger King on the same corner.
Mountford reconciles this tension, in part, through self-described liberal theology. Our talk, his Spring sermons, and his book, Perfect Freedom: Why Liberal Christianity Might Be The Faith You’re Looking For, express: a “low view of the church” (it institutionalizes discipleship, stripping salvation of its freedoms); an “embrace of the secular” (the Church should not assume society is ethically less sophisticated than itself); soft judgment (“God would not condemn his creatures to eternal torment”); and the “championing of doubting Thomases on the fringe.” He sees this as being “more evangelical than the evangelicals” -- courting scoffers almost Socratically while provoking believers (“sermons send us to sleep because they are totally uncontroversial”).
But for me, a theological conservative, here strikes another strand of tension, beyond the ridiculing atheist “neighbor” we’re charged to love. Here is faith diverging between two likable people -- a theological gap Mountford once described as “chalk and cheese,” things that just don’t go well together.
Such was ORT for me: enchantment and irritation, the merger of chalk and cheese.
En route to the airport, two books rode under my arm: Dawkins’s The God Delusion and Antony Flew’s There Is a God: How the World’s Most Notorious Atheist Changed His Mind.
Agitations get me thinking. I’m the better for it, remember?
Gregg Chenoweth is vice president for academic affairs at Olivet Nazarene University and a practicing journalist for a variety of magazines and newspapers.
Submitted by David Vine on September 21, 2009 - 3:00am
Over the past two years, there has been considerable controversy over attempts by the Pentagon to recruit anthropologists and other social scientists to assist in counterinsurgency operations in Iraq, Afghanistan, and elsewhere in the “global war on terror.” Echoing the American Psychiatric Association and the American Medical Association, which banned members’ participation in torture and interrogation, anthropologists have widely criticized the use of anthropology in counterinsurgency as unethical.
Of particular concern has been the U.S. Army’s “Human Terrain Team” program under which (sometimes armed) social scientists are embedded in brigades deployed in Iraq and Afghanistan to provide cultural knowledge that assists with combat operations. Many anthropologists agree that the Human Terrain program and other counterinsurgency activities violate the American Anthropological Association’s code of ethics, which commits members to do no harm to the people with whom they work, prohibits covert research, and requires researchers to obtain informed consent and to avoid doing things that could endanger the work of future anthropologists. Many have likewise criticized the recruitment of anthropologists as an effort to forestall bringing troops home from Iraq and Afghanistan, continuing the policies that have left the United States mired in deadly, unpopular wars.
Spurred by such concerns, in October 2007, the executive board of the American Anthropological Association (AAA) called the Human Terrain program “an unacceptable application of anthropological expertise.” Between 2007 and 2008, more than 1,000 anthropologists agreed to boycott the program, signing a pledge of non-participation in counterinsurgency as part of a campaign organized by the Network of Concerned Anthropologists (I am a member of the steering committee).
Supporters of the Human Terrain program have often claimed that those opposed to working in the wars are advocating total academic disengagement from the military and a retreat to the ivory tower. This could not be further from the truth. Most opponents of the Human Terrain program, myself included, are not categorically opposed to work and engagement with the military. To the contrary, many believe that anthropologists can ethically teach soldiers in classrooms, train peacekeepers, or consult with military and other government officials about cultural, social, historical, and political-economic issues.
Indeed, the campaign against anthropological collaboration in counterinsurgency has coincided with and helped fuel a recent efflorescence of research and work on an expanding array of issues related to the military and foreign policy. Far from calling for a retreat to the ivory tower, a growing number of anthropologists are actively involved in research both with and about the U.S. and other militaries, foreign policymaking and policymakers, war, conflict, and militarization.
Inspired by anthropologists like Laura Nader, Kathleen Gough, Mina Davis Caulfield, Marshall Sahlins and Eric Wolf, anthropologists have studied topics as diverse as nuclear weapons policy, the training of foreign military personnel at the School of the Americas, the shadowy world of the global arms trade, and the harmful effects of military bases. My own research has investigated the creation of the secretive U.S. base on Britain’s Indian Ocean island Diego Garcia, the expulsion of the island’s indigenous people during development of the base, and the significance of the base for U.S. foreign policy.
As a result of this work, I recently attended a two-day meeting of anthropologists, historians, sociologists, and political scientists organized by the newly founded Eisenhower Research Project for the Critical Study of Armed Forces and Militarization. Hosted by co-directors Catherine Lutz and Aaron Belkin and project manager Christina Rowley at Brown University’s Watson Institute for International Studies, the meeting addressed subjects as diverse as U.S. military spending (which now equals or exceeds that of all the other nations of the world combined), military checkpoints in Iraq, the increasing use of remote-controlled robots and other advanced technologies in war, the military’s role in the war on drugs, the militarization of the U.S. border, the armed services’ dependence on so-called military wives and military families, and the role of Hollywood and popular culture in glorifying war.
Most importantly, the interdisciplinary group of scholars dedicated itself not just to conducting research on military issues, but also to attempting to influence national conversations and public opinion about military and foreign policy. For too long in the past anthropologists and other social scientists have indeed isolated themselves in the ivory tower, ceding policy debates to international relations and security scholars, to think tanks generally invested (intellectually or literally) in war, to arms manufacturers’ lobbyists, to pundits, politicians, and the Pentagon.
Our nation is at a critical moment in determining the role the military is going to play in the world and the shape of our relations with other nations. President Obama has indicated his desire to chart a different course in the nation’s foreign policy from that of President Bush, to make diplomacy, cooperation, and engagement the hallmarks of U.S. international relations.
And yet, while slowly trying to extricate the nation from a deadly, illegal war in Iraq, we appear ready to repeat the same mistakes of that war, and Vietnam before it, in pursuing an increasingly violent war of occupation in Afghanistan — a nation where the British and Soviet empires failed before us in their attempts to impose foreign rule. Rather than learning from these past mistakes, from the lesson that there can be no military solution to the challenge posed by the Taliban and others resisting occupation, the escalation of U.S. troops and bombing in both Afghanistan and Pakistan is an increasingly bloody diversion from the political, economic, and diplomatic initiatives that must be at the heart of any solution to violent conflict.
Given the growing crisis in Afghanistan, which threatens to derail Obama’s agenda abroad and at home, the skills and original perspectives of anthropologists and other social scientists are desperately needed to build a new direction for U.S. military and foreign policy. This will mean conducting research of direct relevance to the U.S. military, to the State Department, and to the dynamics of U.S. global relations. This will mean shedding anthropologists’ traditional hesitancy about proposing prescriptive solutions to identified problems (the bread and butter of many international relations scholars). This will mean writing not primarily for academic audiences but instead for policymakers, politicians, and the wider public.
It will mean doing so not in the pages of (generally obscure) academic journals, but in the op-ed pages of newspapers, for blogs and major web outlets, and for the likes of Foreign Affairs and Foreign Policy. And it will mean building on efforts like the Eisenhower Research Project to create a new breed of policy think tanks — think tanks staffed by a diverse group of social scientists, driven by empirical research, and frequently working in collaboration with military leaders and others in the national security bureaucracy to create new policy approaches.
The Pentagon’s efforts to recruit anthropologists for the wars in Iraq and Afghanistan represent the failure of U.S. foreign policy rather than innovation. They are a return to the sad beginnings of anthropology — the “handmaiden of empires” — when the discipline was born as a tool to assist in the rule and control of colonized peoples in Africa, Asia, and North America. The recruitment of anthropologists represents the misguided belief that victory in Iraq and Afghanistan can be achieved through better tactics — if only we could fight smarter, know more about their cultures, and embed anthropologists with the troops, then we would “win”! — rather than realizing that the real lesson of these wars is that wars of invasion and occupation should not be waged at all.
The nation must use this moment to embrace a permanent and fundamental change in our military and foreign policy. We must finally reject a foreign policy of invasion and occupation and embrace a new kind of foreign policy based around non-aggression, diplomacy, international cooperation, and the protection of human needs and human lives as the best way to ensure the security of the country and the world. With members of the military and an engaged citizenry as our partners and allies, anthropologists and other social scientists have a critical role to play in this process.
Human beings are the product of a few million years of evolution. Awareness of this is part of what it means to be modern. But most of the time this recognition remains general and vague. We get along just fine without thinking about the scale of the processes involved. We act as if a thousand years is a long time; it can be a strain to imagine the world of a few decades ago. It is hard to reckon just how thin a slice of human time is there in recorded history. That we ever developed the capacity to record things at all is strange and improbable. As recently as ten or twelve thousand years ago, our ancestors devoted most of their waking hours to finding enough calories to stay alive.
“To date,” writes Michael Tomasello, co-director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, “no animal species other than humans has been observed to have cultural behaviors that accumulate modifications and so ratchet up in complexity over time.” At some point that complexity begins to spike -- an exponential surge that comes to seem almost normal. But how is it possible?
In Why We Cooperate, just published by Boston Review Books, Tomasello gives a succinct account of his work with a research team conducting comparative studies of the behavior of human infants and our closest primate relations, especially chimpanzees.
Their findings suggest that we are distinguished, as a species, by capacities for empathy, generosity, cooperation, and a sense of fair play. Some of these tendencies are found among the great apes, but not to anything like the degree to which they manifest themselves in children from very early in their development. These distinctive traits form the bedrock of our capacity to accumulate, over time, not just wealth but complex behavior.
The new book -- based on the Tanner Lectures delivered by Tomasello at Stanford University in early 2008 -- is a lay reader's introduction to work described in The Cultural Origins of Human Cognition (Harvard University Press, 1999) and Origins of Human Communication (MIT Press, 2008). Peers who comment on his work in the “forum” section of Why We Cooperate sometimes question the degree to which these abilities are hard-wired into us -- rather than being acquired through, or at least stimulated by, nurture and communication. But they concur that Tomasello and his team have opened up fruitful lines of inquiry into the source and nature of human development.
There is evidence, Tomasello writes, “that from around their first birthdays -- when they first begin to walk and talk and become truly cultural beings -- human children are already cooperative and helpful in many, though obviously not all, situations.” Faced with an adult they have never met before, for example, an infant between the ages of 14 and 18 months will help with “everything from fetching out-of-reach objects to opening cabinet drawers when the adult’s hands are full.”
This behavior is not the work of little rational-choice theorists in diapers. Children who were consistently rewarded for their assistance were found to be less helpful in subsequent experiments. Chimpanzees, too, were found to possess some inclination toward altruism. The major distinction on this point is that small children are both better able and more willing to share information in order to be helpful -- for example, by pointing out the location of a stapler whose whereabouts in the laboratory the child knows. While apes are capable of some very limited exchanges with humans, their messages tend to be self-interested (helping convey where a tool is that will be useful in getting them food) and they do not teach each other to communicate.
As they get older, writes Tomasello, the “relatively indiscriminate cooperativeness” of human children “becomes mediated by such influences as their judgments of likely reciprocity and their concern for how others in the group judge them....” This is not a matter of self-interest alone -- of doing unto others just as generously as they do unto you. The knack for moral bookkeeping does develop, of course. But first we acquire a sense that there are general rules for how things ought to be done, and that everyone ought to abide by them.
Three-year-old children were shown a game that could be played by one person. “When a puppet later entered and announced that it, too, would play the game, but then did so in a different way,” reports Tomasello, “most of the children objected, sometimes vociferously. The children’s language when they objected demonstrated clearly that they were not just expressing their personal displeasure at a deviation. They made generic, normative declarations like, ‘It doesn’t work like that,’ ‘One can’t do that,’ and so forth.”
This is intriguing because the children’s perspective is disinterested. “It is one thing to follow a norm ... to avoid the negative consequences of not following it,” says Tomasello, “and it is quite another to legislate the norm when not involved oneself.”
In the case of the puppet experiment, I suppose “enforce” is a more appropriate word than “legislate.” But either way, it suggests the early development of a capacity to grasp the principle of a common and impersonal norm.
This both reflects and reinforces our capacity for cooperative action -- which may be the very thing that distinguished our hominid ancestors from other primates. Groups of small children “engage in all kinds of verbal and nonverbal communication for forming joint goals and attention and for coordinating their various roles in the activity,” says Tomasello, while his colleagues find nothing comparable to this range and complexity of cooperation among the great apes.
"Indeed,” he writes, “I believe that the ecological context within which these skills and motivations developed was a sort of cooperative foraging. Humans were put under some kind of selective pressure to collaborate in their gathering of food -- they became obligatory collaborators -- in a way that their closest primate relatives were not.... We could also speculate that since hunter-gatherer societies tend to be egalitarian, with bullies often ostracized or killed, humans underwent a kind of self-domestication process in which very aggressive and acquisitive individuals were weeded out by the group.”
Thanks to millennia of progress, we have reached a plateau of development where it is commonly accepted that existence is a war of all against all, and Donald Trump is taken to embody the traits that drove human evolution itself. (Otherwise he might look like the missing link with a hairpiece.)
None of this makes for optimism about our next ten thousand years or so -- or decade, for that matter. Hard as it is to wrap one’s mind around the depths of time and transformation involved in reaching this stage of civilization, it can be still more difficult to imagine how we can continue. The scale of possible aggression now -- let alone the unintended consequences of raw acquisitiveness -- goes beyond anything our primate brains are quite ready to picture.
But for all that, the research by Tomasello and his associates is at least somewhat encouraging. It suggests that collaboration, sharing, and even generosity are not late developments in human existence -- merely secondary or superfluous capacities. They are essential. They came first. And they could yet assert themselves as a basis for reorganizing life itself.
Then, perhaps, our prehistory would come to an end -- and something like a civilization worthy of human beings would begin.
In his mock documentary Take the Money and Run (1969), Woody Allen plays the ambitious but remarkably unlucky bank robber Virgil Starkwell. He never makes the FBI’s Ten Most Wanted because, after all, it all depends on who you know. But he does manage to shave some time off one of his prison sentences by volunteering for medical research. He survives the experiment. There is one side effect, however, as the narrator explains in a solemn voiceover: He is temporarily transformed into a rabbi.
This sequence came to mind while reading The Professional Guinea Pig, by Roberto Abadie, just published by Duke University Press. “An estimated 90 percent of drugs licensed before the 1970s were first tested on prisoners,” writes Abadie. “Prisoners were in many ways a perfect population for a controlled experiment. Because they had similar living conditions they provided good control groups for clinical trials, while the financial and material benefits ensured a large supply of willing and compliant volunteers.”
Only in 1980 did the Food and Drug Administration ban the use of prisoners for medical research. Their circumstances made a mockery of informed consent. (Especially in Virgil’s case. “Prisoners received one hot meal per day,” the narrator explains: “a bowl of steam.”) But the demand for experimental subjects for biomedical research had to be met somehow. And so there has emerged the new regime of power and knowledge analyzed by Abadie, a visiting scholar with the health sciences doctoral program at the City University of New York Graduate Center.
His book is an ethnographic account of the subculture of “paid volunteers” recruited to serve as subjects for pharmaceutical testing -- with a particular focus on what he calls the “professionalized” guinea pigs who derive most (or all) of their income from this work. Volunteers receive “from $1200 for three or four days in less intensive trials,” according to Abadie, “to $5000 for three or four weeks in more extensive ones.”
Actually the term “work” is somewhat problematic here. The labor is almost entirely passive. Half of it, as Woody Allen once said about life itself, is just showing up. You are weighed and your blood taken, and there might be a few other tests, along with quite a lot of boredom. (One of Abadie’s informants describes it as participation in “the mild torture economy.”) Some of the guinea pigs fall back on it as a supplement to “low-paying jobs as cooks, construction workers, house painters, or bike messengers.” For others, it is their sole source of income. They enlist for up to eight rounds of testing per year, earning “a total estimated income of $15,000 to $20,000 in exceptionally good years.”
Higher rates of pay are available to those willing to endure unpleasant procedures. Likewise, there is a premium for testing psychiatric drugs -- though the considered opinion of old-time guinea pigs is that you just don’t earn enough to make it worth letting someone mess with your brain chemistry.
Abadie’s description of the guinea-pig milieu -- based largely on interviews with a number of them living in a bohemian neighborhood in Philadelphia -- focuses on how they understand the risks involved in making a living this way, including their preferred means of recovering between rounds of exposure to “phase I” testing. (That is the term for clinical trials in which pharmaceuticals shown to have low toxicity when given to animals are tried on human subjects.) Various dietary regimens are thought to have a purifying effect. An informal network keeps participants updated on new opportunities in the human-subject market, and there used to be a zine called Guinea Pig Zero that still has a web presence.
Most of Abadie’s informants are also members of an anarchist counterculture that prides itself on remaining outside corporate capitalism. And making your living as a guinea pig is certainly different from joining the rat race. But the “mild torture economy” is well integrated into the larger and more literal economy. Testing is a necessary stage of pharmaceutical development, with some 80,000 phase I trials -- each involving 30 to 100 human subjects -- being run each year. The development of a pool of reliable but poorly paid “volunteers” (consisting mostly of young men who, as Abadie puts it, “use their bodies as ATMs to fund their lifestyles”) is one sign of the effect of deindustrialization on the labor market.
And the effect of becoming dependent on guinea-piggery as a source of income is that it creates an incentive to ignore the question of how exposure to experimental pharmaceuticals might affect you over the long run. “Beginners are more worried about risks than professionals,” notes Abadie. “Maybe this reflects the general population’s anxieties about biomedical research and its well-publicized abuses. Volunteers’ initial uneasiness focuses on the unknown effects of the drugs, but it also reflects a discomfort with a procedure they do not yet fully understand…. Some volunteers mentioned that they were somewhat concerned about developing cancer in the future.”
Not so, evidently, with those who had been through the process a few times: “Dependency on trial income, trial experiences that have not exposed them to side effects, and interactions with more experienced volunteers convinces newcomers that risks are not to be feared.” Just drink a couple of gallons of unsweetened cranberry juice and it’ll wash the corporate technoscience right out of your system….
Meanwhile, the FDA “inspects less than 1 percent of all clinical trials in this country,” writes Abadie, and paid volunteers lack the resources to challenge any abuses they may suffer.
Trials in phases II and III -- when a drug is tested on patients suffering from the condition it may help treat -- draw on a different pool of human subjects, with motivations beyond that of payment. But when the subjects are economically vulnerable, as with some of the poor AIDS patients discussed in later chapters of Abadie's study, it compounds the ethical problems facing an institutional review board trying to assess whether the research has scientific merit or is driven instead by business interests.
The IRB in this case oversees the work of a small, community-based organization, not a university (where many clinical trials are conducted), but Abadie suggests that its ambivalence is commonplace. Its members "recognize the benefits that can derive from a relationship with the industry, but at the same time they fear that prospective financial gains can influence the research. These anxieties are reflected particularly in their views of the informed-consent process ... in which volunteers are supposed to be able to evaluate risks and benefits independently of other considerations."
The major weakness of this otherwise intriguing and worrying book is that it provides no clear sense of how typical the “professionalized” guinea pigs in Philadelphia may be -- and how central such repeat-performing volunteers are to the industry employing them.
Abadie maintains that a cohort of full-time human subjects emerged after the pool of prisoners dried up 30 years ago. The needs of the pharmaceutical industry led to the formation of “a group of reliable, knowledgeable, and willing subjects who depend on participation in trials for income to support themselves.” Okay, but just how dependent is the industry on them? What portion of the population of human research subjects for pharmaceutical research consists of such full-timers?
Invocations of “the new subjectivity required by neoliberal governmentality” may have their place in defining the situation. But hard numbers would be good, too. The fact that we don’t have them is part of the problem. But then there aren’t too many dimensions of the health care industry that don’t look like problems, right now.
Every so often a thinker will earn a place in history through the force of a single really bad idea. Cesare Lombroso (1832-1909) was such a figure. Examining the physiognomy of known felons, living and dead, the pioneering Italian criminologist concluded that some people were organically predisposed to breaking the law. It was just in their nature. They were degenerates, in the strictest sense: biological throwbacks from civilized humanity to something lower on the evolutionary scale.
Various physical traits signaled the regression. This was the bright side of Lombroso’s theory, since it told you what to watch out for. Rapists tended to have abnormally round heads. Women with masculine faces and excessive body hair were a menace to society; a lack of maternal instinct made them capable of acts more vicious and depraved than male offenders. Left-handed men were closer to the state of "women and savage races," thus more prone to crime or lunacy than we law-abiding right-handers.
All of this proves less amusing given how influential Lombroso’s books remained into the early 20th century. Somebody probably went to jail for having a sloping forehead and asymmetrical ears.
A few years back, Duke University Press brought out translations of a couple of Lombroso’s works, which apart from their historical significance, are fascinating for the images the esteemed researcher used to demonstrate his argument. They are haunting, especially the photographs. The faces wear various expressions: hardened, hungry, bitter, confused, terrified. Each evokes a long story of bad choices or bad luck, or both. I’m not sentimental enough to believe that all of them, or even most, were innocent. There are some tough customers who look ready to stick to their story, no matter what. (“That guy was already dead when I got there.”) But the crimes are long forgotten. What remains now is the trace of misery, caught in the gaze of a criminologist who has reduced them to specimens.
On page 78 of William Garriott’s Policing Methamphetamine: Narcopolitics in Rural America, published by New York University Press, there is the reproduction of a poster called “A Body on Drugs.” The author, who is an assistant professor of justice studies at James Madison University, found it taped to the wall of a sheriff’s office in “Baker County” -- the name he has given to an area in West Virginia where he did ethnographic fieldwork in the mid-2000s.
Garriott calls the poster “reminiscent of the catalogs of criminals from which the 19th-century criminologist Cesare Lombroso sought to discern the distinctive features of congenital criminality.” I will return to this idea later, but first should describe the poster itself. Because it has been reduced to the dimensions of a single page in a book, the text is almost impossible to read, but you can still make out the photographs, which show the long-term effects of methamphetamine use on the body through a combination of mug shots and close-ups, plus brain scans.
All of it is ghastly. “The arms and legs had open sores,” recalls Garriott, “the hands were scabbed and bandaged, the mouth was missing teeth, the brains showed signs of malfunction, and the faces were prematurely aged.” If anything, the images may understate the impact of meth. The festering sores result from an accumulation of toxins in the addict’s body; the drug can also induce psychosis. “Cooking” meth in improvised labs, besides running the risk of explosion, generates extremely dangerous contaminants.
The social profile of crack cocaine, 20 years or so back, was black and urban, while meth’s “brand identity” tends to be white and (especially in recent years) rural. Garriott initially went to West Virginia as a cultural anthropologist to study “the treatment experiences of addicts working to overcome their addiction to meth,” he writes, “what I thought of as the ‘therapeutic trajectory of their recovery process.' ” The focus of the project shifted as Garriott noticed how often “drug problems generally, and the methamphetamine problem specifically, were framed locally as matters for the criminal justice system,” rather than as a medical issue.
To describe the relationship between addict and community, then, was impossible without assessing the role of the police. This is hardly surprising, and perhaps least of all in rural areas, where state and civil society tend to meet at the same diner and church socials. But Garriott’s analysis leaps from the ethnographic particulars to broad claims about what he calls the “narcopolitics” of meth. The term is modeled on Michel Foucault’s concept of biopolitics, which covers a host of ways the modern state seeks to monitor, classify, regulate, and control the population of human organisms within its territory. (However befuddled Lombroso’s dubious Darwinism, for example, his work is the perfect instance of a biopolitical strategy: identifying a defective and dangerous human subspecies enhances the power of the authorities over the social order. That was the plan, anyway.)
Once, the narcopolitical imperative was summed up in the slogan “War on Drugs,” which you don’t hear invoked much anymore. (To quote Detective Ellis from "The Wire": “You can’t even call this shit a war… Wars end.”) Yet the constant mobilization against illegal drugs not only continues but blurs the line between narcopolitics and the “normal” functioning of the state – including, in Garriott’s catalog, “the election of officials, the administration of justice, the practice of law enforcement and the formation of public policy (both foreign and domestic), the allocation of social services, the use of military force, the interpretation of law, and the behavior of the judiciary.”
And because the prosecution and incarceration of drug offenders is one of the few areas of governmental action with broad public support, narcopolitics serves to legitimate the state itself. Policing the availability of illegal drugs and the behavior of their users becomes a means through which the authorities establish and maintain public order -- or can at least be seen trying.
These tendencies become self-reinforcing. Drug abuse ceases to be a social problem. Rather, social problems, including violence and poverty, look like effects of criminal drug enterprises – which means resources should be channeled towards interdiction and incarceration.
With this notion of narcopolitical power -- as with just about any schema derived from Foucault’s work -- you soon get the sense of a juggernaut rolling over the landscape, flattening everything in its path, with nobody resisting because nobody can, and you’d pity the fool who tried.
In an epilogue, Garriott takes up the question of what reforms of the system are suggested by his analysis -- then admits that none really follow. I respect his candor. If you can’t change the world, might as well interpret it, not that doing so makes much difference. But there is at times a strange disconnection between his analytic framework and his descriptions of life in Baker County.
The narcopolitical imagination, by Garriott’s account, “maps” social space according to its own imperative to track and control illegal substances. The community learns to define itself in opposition to the menace of drug dealers and addicts. Social anxieties become focused around them. The preferred response is punitive. Therapeutic treatment for meth abuse is something prescribed by the legal system; it is part of a continuum, with prison at the other end. And all of this functions in a closed loop -- with the problem always finally defined as a matter of criminality, thereby reinforcing narcopolitical power.
But Garriott’s fieldwork shows a community with every reason to regard meth as a real menace – not because it is a convenient explanation for social disorder, but because every phase of its existence creates actual dangers. The author does not mention this. Cooking one pound of meth creates six pounds of toxic byproduct. Recovery from addiction is difficult and rarely lasts for very long. Nor does accumulation of narcopolitical power by the state generate confidence in its authority. Garriott notes rumors that local officials are failing to deal with the meth problem because they are somehow involved in trafficking. And while Foucault's thinking about biopower treated certain new disciplines (criminology, for instance) as modes of domination over the social field, the knowledge gained by the police and citizens clearly has the very opposite effect. Garriott quotes one officer saying, "Sometimes I wish I was more naive." The awareness that a trash bag on the side of the road might be filled with deadly chemicals from a meth lab is itself a kind of "poisonous knowledge," as the author puts it.
"A Body on Drugs," the poster mentioned earlier, is a concentrated bit of such poisonous knowledge. Garriott borrowed its title for the dissertation later revised as this book. His commentary treats the images as a contemporary narcopolitical variant of Lombroso's work, "drawing attention to a generic type of criminal and the signs by which they could be identified." Recognizing the open wounds, rotting teeth, and emaciation "made possible ... understanding both their physical appearance and their criminality as symptoms of their addiction." The poster did not say they were "born criminals," as Lombroso might. But the narcopolitical gaze was linking their biology and their criminality just as closely.
Having finished reading Policing Methamphetamine, I used a magnifying glass to examine the poster closely. You could see, for example, the little jar that one woman used to collect the imaginary bugs she felt crawling under her skin and removed with a knife. So I learned from the captions. There were some mug shots taken of people who had been arrested before becoming addicted to meth and then afterward. Garriott calls them "a concrete means of imagining the temporality of the relationship between drugs, addiction, and criminality," which is certainly one way of putting it. But in spite of prolonged squinting, I never saw any mention of criminality on the poster. That was not its point. It was about suffering.