Wikipedia came into the world 15 years ago today -- and, man, what an ugly baby. The first snapshot of it in the Internet Archive is from late March of 2001, when Wikipedia was already 10 weeks old. At that point, it claimed to have more than 3,000 pages, with an expressed hope of reaching 10,000 by the end of summer and 100,000 at some point in the not unimaginably distant future. The first entries were aspirational, at best. The one about Plato, for example, reads in its entirety: “Famous ancient Greek philosopher. Wrote that thing about the cave.”
By November -- with Wikipedia at 10 months old -- the entry on Plato was longer, if not more enlightening: you would have learned more from a good children’s encyclopedia. Over the next several months, the entry grew to a length of about 1,000 words, sometimes in classically padded freshman prose. (“Today, Plato's reputation is as easily on a par with Aristotle's. Many college students have read Plato but not Aristotle, in large part because the former's greater accessibility.”) But encouraging signs soon began to appear. A link directed the reader to supplementary pages on Platonic realism, for example. As of early 2006, when Wikipedia turned five years old, the main entry on Plato had doubled in length, with links to online editions of his writings. In addition, separate pages existed for each of the works -- often consisting of just a few sentences, but sometimes with a rough outline of the topics to be covered in a more ambitious entry somewhere down the line.
The aspirations started to look more serious. There were still times when Wikipedia seemed designed to give a copy editor nightmares -- as in 2003, when someone annotated the list of dialogues to indicate: “(1) if scholars don't generally agree Plato is the author, and (2) if scholars don't generally disagree that Plato is not the author of the work.”
Yet it is also indicative of where the site was heading that before long some volunteer stepped in to unclog that passage's syntactical plumbing. The site had plenty of room for improvement -- no denying it. On the other hand, improvements were actually happening, however unsystematically.
The site hit its initial target of 100,000 pages in early 2003 -- at which point it began to blow up like a financial bubble. There were not quite one million pages by the fifth anniversary of its founding and 3.5 million by the tenth. Growth has slowed of late, with an average of about 300,000 pages being added annually over the past five years.
I draw these figures from Dariusz Jemielniak’s Common Knowledge? An Ethnography of Wikipedia (Stanford University Press, 2014), which also points out how rapidly the pace of editorial changes to articles began to spike. Ten million edits were made during Wikipedia’s first four years. The next 10 million took four months. From 2007 on, the frequency of edits stabilized at a rate of 10 million edits per seven or eight weeks.
We could continue in this quantifying vein for a while. As with the Plato entry finding its center of gravity after a long period of wobbly steps, the metrics for Wikipedia tell a story of growth and progress. So does the format’s worldwide viability: Wikipedia is now active in 280 languages, of which 69 have at least 100,000 entries. It all still seems improbable and inexplicable to someone who recalls how little credibility the very concept once had. (“You can edit this page right now! … Write a little (or a lot) about what you know!”) If someone told you in 2002 that, in 10 years, the Encyclopædia Britannica would suspend publication of its print edition -- while one of the world’s oldest university presses would be publishing material plagiarized from Wikipedia, rather than by it -- the claim would have sounded like boosterism gone mad.
That, or the end of civilization. (Possibly both.) What’s in fact happened -- celebrate or mourn it as you will -- has been a steady normalization of Wikipedia as it has metamorphosed from gangly cultural interloper into the de facto reference work of first resort.
In large measure, the transformation came about as part of what Siva Vaidhyanathan has dubbed “the Googlization of everything.” Wikipedia entries normally appear at or near the top of the first page of the search engine’s results. Over time, the priority that the Google algorithm gives to Wikipedia has come to seem natural and practically irresistible. At this point, having a look at Wikipedia is usually quicker and easier than deciding not to (as someone once said about reading the comic strip “Nancy”).
Another sign of normalization has been the development of bibliographical norms for citing Wikipedia in scholarship. It signals that the online reference work has become a factor in knowledge production -- not necessarily as a warehouse of authoritative information but as a primary source, as raw material, subject to whatever questions and methods a discipline may bring to bear on it.
In the case of that Plato entry, the archive of changes over time would probably be of minimal interest as anything but a record of the efforts of successively better informed and more careful people. But Wikipedia’s roles as a transmitter of information and as an arena for contesting truth claims make its records a valuable source for people studying more recent matters. Someone researching the impact of the Sandy Hook Elementary School shootings, for example, would find in the Wikipedia archive a condensed documentation of how information and arguments about the event appeared in real time, both in its immediate aftermath and for years afterward.
I've been reading and writing about Wikipedia for this column for most of its lifespan, and it won't be five years before there's occasion to do so again. There's plenty more to say. But for now, it seems like Professor Wikipedia should get the last word.
Until fairly recently, I had in my files a copy of the supplement to the Sept. 19, 1995, issues of The New York Times and The Washington Post containing “Industrial Society and Its Future,” better known as the Unabomber manifesto. “Unabomber” was the moniker the Federal Bureau of Investigation gave the person or persons responsible for a string of mail bombs primarily targeting people at universities and airlines, which killed three people and maimed 23 more between 1978 and 1995.
The carnage was vicious, but it also appeared pointless, at least until the manifesto became available. It is usually characterized as neo-Luddite -- a call to halt and reverse the self-perpetuating course of technology-dominated human history, which both fosters and feeds on profound alienation. (My copy went into the recycle bin once I downloaded the text to my e-reader. So it goes.) Blowing up college professors and airline executives seemingly at random was hardly the most logical or effective way of transforming civilization. But by the mid-1990s American culture had spawned a number of strange and disturbing combinations of means and ends. “Industrial Society and Its Future” was the work of someone more intelligent than Timothy McVeigh and less manifestly delusional than the Branch Davidians or the Heaven’s Gate people. It read like a master’s thesis in anthropological theory that had, at some point, gone terribly, terribly wrong.
Publication of his treatise in a major national publication had been the Unabomber’s condition for ending the terror campaign. The FBI figured that conceding would at very least save lives and buy investigators time; it might also increase the chances of someone reading the text and having a hunch as to its authorship.
And that is just what happened -- although things very well might not have worked out that way. David Kaczynski’s reflective and resolutely unsensational memoir Every Last Tie: The Story of the Unabomber and His Family (Duke University Press) reveals how difficult it was to accept even the possibility that his older brother, Theodore, might be a terrorist. The author is the executive director of a Tibetan Buddhist monastery in Woodstock, N.Y., and (as attested in the afterword by James L. Knoll IV, a forensic psychiatrist who teaches at State University of New York Upstate Medical University in Syracuse, N.Y.) a tireless speaker and activist in the movement against the death penalty. It took weeks of agonizing discussion with his wife for David Kaczynski to reach the reluctant conclusion that the Unabomber might be his troubled older sibling -- identified in captions to family photos as “Teddy,” which somehow feels a little disconcerting.
By 1995 they had been out of touch for several years. The estrangement was not quite as bitter as the one between Ted and his parents, but the recriminations were one-sided in either case. The older sibling’s back-to-the-land yearning for self-sufficiency had metastasized over the years. Introversion and reclusiveness gave way to a seething hatred of everyone -- even of those closest to him, those able to give him unconditional love.
Especially those people, in fact. The burden of the memoirist here is not just to recount his own past but to make sense of it in the context of an act of violence otherwise utterly disconnected from his own memories. Seven years younger than his brother, the author recalls growing up happily in his shadow; they remained on affectionate terms long after Ted went off to Harvard University at age 16. “Growing up,” David writes, “I never doubted my brother’s fundamental loyalty and love or felt the slightest insecurity in his presence.” He characterizes their father as “a blue-collar intellectual” -- one who “didn’t have much formal education [but] was widely read and believed progress was possible through mankind’s rational pursuit of the greater good” -- and the description sounds like it would apply to their mother as well. “Discipline in our family was based on reason and dialogue,” the author says, “not authority and fear.”
There were worse ways to spend the 1950s, but an unfortunate turn as a Harvard undergraduate may have sent Ted Kaczynski’s retiring and cerebral personality off in a dangerously pathological direction. For three years, he served as a human guinea pig for a study on the effects of aggression and humiliation on bright college students -- one of various projects intended to add psychological weaponry to the Cold War arsenal. He survived and went on to do award-winning doctoral work in mathematics at the University of Michigan, Ann Arbor, followed by an assistant professorship at the University of California at Berkeley. But by 1971, Kaczynski, still in his twenties, left academe to cultivate a small plot of land in Montana, where he could be alone with his thoughts.
What happened over the following 25 years is presumably documented among the papers removed from Ted Kaczynski’s cabin. Every Last Tie does not refer to this material, and it’s hard to blame the author if he did not burrow through it in search of an explanation. (He’s been through enough.) What he did see were the letters full of accusation that Ted sent to their parents as his mind wandered deeper into paranoia and rage.
Eventually, when David’s long-unrequited love for a girl he’d grown up with ended in marriage, it was the end of that family connection as well: Ted disowned his brother. The memoir is made up of essays focusing in turn on each member of David Kaczynski’s nuclear family and finally on his spouse, Linda, who was the first person to suspect that the brother-in-law she’d never met might be the Unabomber.
One effect of narrating the past this way is that the book as a whole is not linear. Events are recounted as moments in the author’s relationship with each individual. The effect is to underscore precisely the thing that Ted Kaczynski could not experience, or at least not endure: the intimacy of shared lives. And although they remain estranged, David does not disown his brother for the simple reason that he cannot:
“Ted’s cruelty stigmatizes my good name; but my reputation for goodness comes at his expense. Like all contrived opposites, we reinforce one another. The worst thing that he can do to me is to deny an opportunity for reconciliation. Hope of reconciliation is something I am bound to maintain, but it costs me little -- only the sneaking sense that some part of me is missing.”
As our schedules and the weather permit, my wife and I walk dogs rescued from high-kill and overcrowded shelters and held in a foster-care facility. The walks are, in part, a way to find the dogs new homes: they wear vests or bandannas inviting people to inquire about adoption. (Anyone inclined to make an end-of-the-year donation to this worthy cause should inquire here.)
For the dogs, of course, a walk is an end unto itself. Most are raring to go, though we’ve occasionally had new arrivals from the countryside who feel overwhelmed by the sights and sounds of an urban downtown. One got about half a block out the door before he’d had enough and, refusing to budge, sat down and cowered in place. But that was a rare and extreme case. Normally it takes just a few minutes for a dog to adjust to the environment and feel drawn into it, pulling us along into the excitement of open space.
Before long we start crossing paths with attentive people who stop to admire the dog, and sometimes my wife, Rita, interests them in taking information on how to adopt. The foster facility seems to have pretty high turnover, so mission accomplished, presumably. But after the first or second expedition, that part of the walk became much less interesting to me than the moments of heightened awareness that sometimes occurred between contacts with other humans. It was an almost meditative absorption in our surroundings -- an effort to imagine the world as experienced from the other end of the leash.
In The Marriage of Heaven and Hell, William Blake asks, “How do you know but ev'ry Bird that cuts the airy way, / Is an immense world of delight, clos'd by your senses five?” A dog on the ground raises that question in an earthier way than a bird in the air, for there’s a constant reminder (at least two or three times per block) that the dog’s landscape consists of a fine-grained texture of smells that is almost entirely lost on humans. (Likewise with sounds beyond our ken.) The bird’s ecstasy was a matter of conjecture for Blake. But that an “immense world of delight” opens itself to a dog’s senses seems self-evident, even though the human imagination is closed to most of it.
My musings on dog sensibility have been a lot like the walks themselves: occasional and fairly restricted, exploring no more than could be covered by a circular route in about an hour. Colin Dayan’s With Dogs at the Edge of Life (Columbia University Press) is a much more comprehensive exploration -- the work of a mind that slips the leash of genre or narrow specialization at every opportunity.
The author, a professor of humanities and of law at Vanderbilt University, makes sharp turns and intuitive leaps that are, at times, unexpected and disconcerting. She published parts of the book in The Boston Review and other journals as essays of diverse kinds (memoir, reportage, criticism, political commentary, etc.) but with continuities and themes developing across the differences in framework and voice. Generalization seems hazardous with such a hybrid text, but here goes anyway.
Dayan refers to “those of us who believe that the distinction between human and nonhuman animals is unsustainable.” She takes the experience of toggling between human and canine awareness -- as with trying to imagine walking the dog from the four-legged perspective -- as a given. It is basic to the relationship between the two species that has developed from tens of thousands of years of cohabitation.
Humans and dogs read each other’s minds, in effect, or at least we try -- and anyone who lives with or around dogs for very long knows that a real zone of intersubjectivity emerges from the effort. A degree of anthropomorphism is probably always involved, but we get around it a little in moments of recognizing, and respecting, the dog’s own capacities.
“Dogs live on the track between the mental and the physical,” Dayan writes, “and sometimes seem to tease out a near-mystical disintegration of the bonds between them. What would it mean to become more like a dog? How might we come up against life as a sensory but not sensible experience? We all experience our dogs’ unprecedented and peculiar attentiveness. It comes across as an exuberance of a full heart. Perhaps this is what the Puritan divine Jonathan Edwards meant when he emphasized a physical rather than a moral conversion. He knew that the crux of divinity in earthbound entities lay in the heart’s ‘affections.’”
The movement within that paragraph -- between metaphysical categories and the ordinary dog owner’s intuitions, with the dismantling of dichotomies raising moral implications which then, even more sharply, plunge into the sphere of theology -- presents in miniature what the book does on a much larger scale. At the same time, Dayan’s thinking is grounded in concrete particulars, including issues around a particular variety of dog, the pit bull terrier, which appears to have become the contemporary, secular embodiment of diabolical menace. In some places pit bulls are very nearly the targets of a campaign of extermination.
Not so coincidentally, perhaps, it is African-American residents of housing projects and poor white Southerners whose pit bulls are most likely to be confiscated and destroyed. Video of the police killing poor or homeless people’s dogs, whatever the breed, seems to be its own genre on YouTube. (I am willing to take her word for it.) At the same time, an association between impoverished or collapsing cities and feral dog packs has become a commonplace in journalism, while a number of directors have used the roaming dog as a character or scenic element in recent films.
It’s tempting to say that Dayan does for dogs what Melville did for whales: tracking the social roles and symbolic frameworks built up around them and depicting them at the intersection between cosmic order and human frailty, while also giving them (dogs and whales alike) due recognition as animals with worlds of their own, which we humans impinge upon. That description may intrigue some people while doubtless putting off at least as many. So be it, but I’ll say that With Dogs at the Edge of Life was one of the most memorable books I’ve had the chance to read this year.
Mad Men Unzipped: Fans on Sex, Love, and the Sixties on TV, from the University of Iowa Press, is not the first academic book devoted to the AMC series about hard-drinking, chain-smoking, decidedly nonmonogamous advertising executives in Manhattan in the 1960s. Not by a large margin: of the 14 titles on the program listed in the Library of Congress Catalog, 10 are from scholarly presses or otherwise manifestly professorial.
Unzipped is the ninth such title. Its senior author, Karen E. Dill-Shackleford, is a professor of psychology at Fielding Graduate University -- an accredited distance-learning program described on its website as offering graduate degrees in “the fields of clinical and media psychology, educational leadership, human development, and organizational development” -- and the other three authors also have some connection to Fielding. (For particulars, see the book's Facebook page.) Identifying themselves as “a team of media psychologists” who are also “members of the Mad Men audience,” they have “followed the show and the fans’ reactions to better understand both fandom generally and the Mad Men fan phenomenon particularly.”
Previous monographs treated Mad Men in its political, historical and philosophical dimensions, and there is already at least one effort to psychoanalyze the characters. With Unzipped, Dill-Shackleford et al. focus on, in their own words, “the way people make sense of fictional stories and use what they learn to think about life” and “how the interactive world of social media allows us to contribute to the conversation.”
The authors announce their work as “cutting-edge psychological research on how fans make meaning from fictional drama.” The claim is too hyperbolic for its own good, considering that the study of fandom largely got underway with Henry Jenkins’s Textual Poachers: Television Fans and Participatory Culture (1992) and now has its own publication of record, the Journal of Fandom Studies, launched in 2013. On the first page of Mad Men Unzipped, the authors stress that they reject “the misguided stereotype of the geeky fan who has had a mental break with reality.” Fair enough, but that simply repeats the inaugural gesture of fandom research, which involved responding to William Shatner’s satirical dig at Trekkers with, “That’s not funny!” (to paraphrase very loosely).
Then again, distancing their attitude from “the misguided stereotype of the geeky fan” makes sense if we assume that the book is meant for an audience of psychology undergrads and Mad Men aficionados, rather than of initiates in fandom-studies research. In that respect, Unzipped is a good conversation starter about the relatively unproblematic condition of “being a fan” in the everyday, typical sense: someone who enjoys watching, thinking and talking about a program, whether or not he or she goes on to attend or host a theme party, write fiction based on the show’s characters, or the like.
Granted, the more ardent expressions of devotion do sometimes lead to strange and interesting subcultures. But it’s casual fans who are more common and, perhaps, more teachable -- that is, able to benefit from turning their enthusiasm for a particular show into an occasion to reflect on how and why it means something to them.
“In our digital era,” the authors of Unzipped write, “stories live in what are known as ‘transmedia spaces.’ Transmedia means that the story crosses from one medium to another (TV, blog, fan video, theater, app), playing itself out in different spaces.” That certainly has implications for media-psychology research itself -- creating “a new era of social science in action” now that “dragging college sophomores into a lab and forcing answers out of them” is no longer necessary. Fandom, even casual fandom, documents itself. The authors can survey the range of reactions to Mad Men’s characters (Pete Campbell: Man or boy?) or depictions of changing gender roles (Joan Holloway: Second-wave feminist avant la lettre?) with an abundance of blog posts, tweets and other digital records, often put out into public space before an episode ended.
The responses themselves are seldom very surprising, at least to anyone who has had a chance to discuss with another viewer the pleasures, frustrations and ambivalences of following the show’s arcs of character development and depictions of social change (not to mention their likely post-1970 fallout). There are occasional exceptions, such as the authors’ observation that “the fans had precious little to say about alcohol addiction that went beyond ‘that’s how it was in those days,’” although they did want to talk about sex addiction. Another quoted commenter pointed out, “While the writers show great complexity in their development of working women at a turning point, they do not seem to know what to do about motherhood.”
And interviews with viewers working in the advertising industry at various points over the past 50 years tended to evaluate Mad Men as an extremely accurate depiction of life in a major agency -- except for those who dismissed it as unrecognizable and soap opera-like. As with judgments of Don Draper’s character or Bert Cooper’s sanity, questions of historical realism here are in the eye of the viewer. The very nature of the evidence, and of the jury, is that no binding judgment can be made.
Media psychologists can show us that audiences bring diverse and complex emotions and presuppositions with them that imaginary characters and dramatic situations can then evoke. My belief that Sally Draper went on to join the Symbionese Liberation Army tells you something about her or about me -- possibly both. Our meaning-making capacities can and do respond to works of fictional narrative in ways that media psychologists can show and analyze.
The more interesting thing is that some narratives invite or even demand such an engagement from the public and get it. Others don’t; some don’t even try. What sets them apart from one another is a question with historical and aesthetic aspects, but it also has a component that it seems as if psychologists would want to take up.
And as a spin-off study, someone ought to do research into another matter. There are Mad Men Barbie dolls and tarot cards and many other such items -- including the Unofficial Mad Men Cookbook: Inside the Kitchens, Bars, and Restaurants of Mad Men. Why, for every such fan-oriented title, are there two aimed at an academic audience? With more to come, no doubt about it.
Five years ago this month, a consortium of major telecommunications carriers (AT&T, T-Mobile and Verizon) announced that it was developing a new application that would enable customers to pay for goods and services using their smartphones. This “mobile wallet,” as such technology is commonly called now, would make credit card and debit account information available to merchants by wireless.
Other enterprises, including banks and American Express, soon joined the partnership. The application seemed well positioned to enter the market for hyperconvenient consumerism, even to dominate it. But things did not work out that way. A demo during the plenary session of a major conference on new payment technologies in 2013 went badly. Consumers complained that the app’s “setup and payment processes were cumbersome and frustrating,” in the words of Chris Welch in The Verge. But those were minor scratches compared to the self-inflicted fatality of the app’s name: Isis. It gets worse. A gift card with the words “serve ISIS” was circulating even after the product’s name was changed to Softcard in 2014.
“Probably few consumers even knew of its existence until the media bump it received from its rebranding,” Bill Maurer notes in How Would You Like to Pay? How Technology Is Changing the Future of Money (Duke University Press). That bump was clearly not enough: Softcard shut down early this spring. At the same time, the range of mobile wallets on sale has only been increasing. The information technology research firm Gartner estimated the value of mobile payments for 2012 was $163 billion worldwide and anticipates it will reach $720 billion by 2017.
“This is definitely an ecosystem in flux,” one business and technology columnist wrote last month, “partly because there are so many players offering so many different solutions -- and so many questions about compatibility and security.”
“Ecosystem” seems an interesting choice of words in this context. Maurer, an anthropologist who is also dean of the school of social sciences at the University of California at Irvine, also uses it -- but in a much thicker sense than as a synonym, more or less, for “market.” The smartphone wallet represents only one means of mobile payment, limited mainly to the world’s more prosperous sectors. It’s in the poorer countries of the global South that mobile payment (using phones with text-messaging capabilities and maybe a little built-in flashlight) looms as a much larger part of everyday life: an economic and social link between urban and rural areas.
In Kenya, the M-Pesa mobile payment service launched in 2007, and within three years, more people were using it than had bank accounts. Over half of the country’s households had adopted it by 2011, and Maurer writes that M-Pesa “processed in that year more transactions within Kenya than Western Union had done globally.”
The contrasting fortunes of Isis/Softcard and M-Pesa (where M stands for “mobile” and “pesa” is the Swahili word for money) are striking; how well each met the demands of the people using them obviously differed significantly. But Maurer’s interest runs deeper than the great disparities between the respective societies.
We’re prone to think of money as a medium of exchange originally created to get around the vagaries of barter (e.g., it’s hard to make change for a goat) and also as a tool notoriously indifferent to how it’s used. With $10,000, you can furnish your apartment or hire a contract killer. Money itself, so understood, is both fungible and morally inert. And from that perspective, the recent technological innovations in how money can be transferred from one person or place to another are significant chiefly for whatever changes they make in the speed, ease or degree of anonymity of the exchange.
Maurer’s subtitle seems to promise speculation on how money will change, but his stress on the idea of payment (or, better, payment systems) has a decidedly retrospective component. In the abstract, the value of $10,000 in cash is the same as that of $10,000 in diamonds, bitcoins or traveler's checks. Each can be used as a form of money, for exchange.
But in practice, different kinds of social infrastructure are involved in making the transaction feasible -- or even possible -- with considerable implications for the relationships among the people involved. I have not made the experiment, but I doubt you can buy furniture using diamonds, and paying a hit man with a money order seems like a bad idea.
At some point, bitcoins might have the nearly universal acceptance that cash now does; both are fungible and, in principle, anonymous. But those qualities do not inhere in the paper or digital currency themselves: each is part of a payment system, without which it would be worthless. The same is true of credit cards, of course, or smartphones-turned-wallets.
Money of whatever sort is an “index,” the author says, of “relationships of obligation, rank, clientage, social belonging or state sanction.” Furthermore, old payment systems don’t necessarily die off; more than one can be operating in a given society at the same time. Maurer describes the interesting and intricate ways long-distance charge cards have become integrated into African economies where cash and barter also have a place. Aware of the fantastically destructive effects of the last financial crisis, he is clearly concerned that the advantages of being integrated into the global economy could be wiped out in the long term, through no fault of the continent's mobile users themselves.
In the end, How Would You Like to Pay? is of interest less for what it says about the future (the author makes no predictions -- which, given the Isis debacle, seems prudent) than for how it encourages the reader to pay attention to nuances of the present. It’s a primer of the anthropological imagination -- and a reminder that money is too important a matter to leave to the economists.
Between the Boston Marathon bombing and too many spree killings by heavily armed men with grievances to keep count, we’ve all had plenty of recent experience with 24-hour coverage of horrific events such as the Paris massacre last Friday. There seem to be two major ways to manage attention. They don’t exhaust the possibilities but definitely mark the limits of a not especially broad spectrum.
One option is to keep track of breaking developments more or less in real time -- in extreme cases, checking for updates every few minutes. You expect the worst but try to get a jump on it, somehow, by absorbing each new crumb of pertinent information as it becomes available. The opposite extreme is to make like an ostrich and find some sand. Or at least to wait for fact, rumor and guesswork to be sorted out. Only at that point does catching up make sense; until then, there’s more noise than signal.
By temperament I lean toward the first pattern: obsessive scanning. But not after Friday. Maybe the thought of the worst being yet to come was too much to handle. In any event, I opted for burying my head in a couple of recent books, starting with Michael Griffin’s Islamic State: Rewriting History, published this month by Pluto Press and distributed by the University of Chicago Press. The other, The Rise of Islamic State: Isis and the New Sunni Revolution by Patrick Cockburn, was released by Verso earlier this year. Neither can be recommended to anyone whose nerves are easily jarred. But they give a much thicker account of the group that inspired the attacks than nonspecialists can piece together from news reports over the past couple of years.
The authors seem to have turned in the manuscripts to their respective publishers around this time in 2014. At that point, the potential for the new “caliphate” to inspire terrorism beyond the Middle East was a less pressing issue than its unprecedented arrival as a force in the region.
Cockburn’s book, which incorporates his reporting on the Islamic State in Iraq and Syria for the London Review of Books, vividly conveys the speed and range of the group’s expansion and consolidation: ISIS, “as though intoxicated by its own triumphs,” proved capable of “defeating its enemies with ease even when they were more numerous and better equipped.” From one of a number of “nonstate actors” in the region, ISIS transformed into something that, beyond simply proclaiming itself as the Islamic State, effectively dissolved the border between Iraq and Syria and imposed its own religious and military authority over “an area larger than Great Britain and inhabited by some six million people -- a population larger than that of Denmark, Finland or Ireland.”
At the same time, ISIS remains, if not invulnerable to air strikes, then certainly prepared for them. It “operates as a guerrilla army,” Cockburn says, “without easily visible movements of personnel or equipment that can be targeted. Its leadership is well practiced at keeping out of sight.”
But it’s Griffin’s book that actually tells in detail the story of where ISIS came from and how it transformed over time. The author is a political analyst and commentator for BBC World and Al Jazeera. He draws almost entirely on English-language publications, in contrast to Cockburn, who quotes an array of friends, interview subjects and bits of popular culture from around the Middle East. But Griffin integrates his sources to good effect. He traces the growth of ISIS out of what had been Abu Musab al-Zarqawi’s organization Al Qaeda in Iraq -- a group that had managed to alienate both Osama bin Laden and Iraqi insurgents fighting the U.S. occupation. The death of Zarqawi in 2006 seems to have created less of a power vacuum than an opening for more capable strategists to assert themselves.
They learned to adapt to and exploit specific local and tribal concerns while building up both an effective economic infrastructure and formidable propaganda skills, taking advantage of the new-media skills of European-born jihadis who joined them. The ISIS cadre were also exceptionally lucky -- astonishingly, uncannily so -- about getting hold of new weaponry and tools. The psychological impact of the fall of Mosul in June 2014 was magnified by the sight of ISIS fighters “speeding towards outflanked enemies in hundreds of looted Humvees, bristling with assault rifles and rocket launchers.”
Plus the jihadis had “a fleet of white Toyota Tacoma pickups, double cabbed with mounted machine guns.” The vehicles, custom-made for U.S. Special Forces, were available only from a Toyota assembly plant in Texas. “How they managed to reach the frontiers of the caliphate,” Griffin says, “is anyone’s guess.”
Reading these books quickly was difficult, and the marginal notes and highlights I made along the way are evidence of how much more time it would take to grapple with them -- especially with regard to the authors’ differing understandings of the Arab Spring, and of the Syrian uprising in particular.
What the authors concur on, unsurprisingly, is the emerging status quo, with the Islamic State obliging the United States and Iran to act as allies for the foreseeable future, despite their saber-rattling impulses toward one another. The situation was paradoxical and hard to imagine as stable even before the terrorist attacks of last week. Is there even a word for how things stand now?