Books

Review of "Hell Is a Very Small Place: Voices From Solitary Confinement"

No Exit by Jean-Paul Sartre involves three characters who are condemned to spend their afterlives together in a shabby but not especially uncomfortable room -- “condemned” because it’s the Devil himself who brings them together. Evidently some kind of infernal algorithm has been used to select the group, designed to create optimal misery. Sartre’s dialogue is quite efficient: we soon learn the broad outlines of their time on Earth and how it was shaped by wishful thinking and self-deception.

We also see how quick they are to recognize one another’s vulnerabilities. Any given pair of characters could find a mutually satisfying way to exploit each other’s neuroses. But there’s always that third party to disrupt things, rubbing salt into old wounds while inflicting fresh ones.

In a moment of clarity, one of them finally recognizes that they are damned and utters Sartre’s best-known line: “Hell is other people.” Quoting this is easy, and misunderstanding it even easier. It sounds like a classical expression of self-aggrandizing misanthropy. But the figures on stage did not wander over from the pages of an Ayn Rand novel. They are not sovereign egos, imposed upon by the demands of lesser beings whose failings they repudiate with lofty scorn.

On the contrary, Sartre’s characters are driven by a desperate and insurmountable need to connect with other people. They crave intimacy, acceptance, reciprocity. They also seek to dominate, manipulate, yield to or seduce one another, which would be difficult enough if they weren’t trying to do more than one at the same time. The efforts fail, and the failures pile up. Things grow messy and frustrating for all parties involved. Hell is other people, but the torment is fueled by one’s own self.

That insight rings even more true in the wake of Hell Is a Very Small Place: Voices From Solitary Confinement (The New Press), an anthology edited by Jean Casella, James Ridgeway and Sarah Shourd. I doubt anyone meant the title as an allusion to No Exit. The activists, scholars and prisoners contributing to the book document a place much darker and more brutal than Sartre imagined -- but akin to it, in that the damned are condemned, not to lakes of fire, but to psychic torture so continuous that it seems eternal. (Casella and Ridgeway are co-founders of Solitary Watch, while Shourd is a journalist who spent 410 days in solitary confinement while imprisoned in Iran.)

A few basic points: solitary confinement initially had humane intentions. Quaker reformers in the early American republic were convinced that prisoners might benefit from a period of reckoning with their own souls, which would come readily in isolation from the evil influence of low company. If so, they would reform and return to society as productive members. More secular versions of this line of thought also caught on. Unfortunately it did not work in practice, since prisoners tended to emerge no better for the experience, when not driven insane. By the turn of the 20th century, the practice was being phased out, if not eliminated, as ineffective and dangerous.

Now, my sense from reading around in JSTOR is that, from about 1820 on, whenever the issue of imprisonment came up, the eyes of the world turned to the United States. Other countries had a similar rise and fall of confidence in solitary confinement over the years. But the practice took on a new life in America starting in the 1980s. The aim of reforming prisoners was no longer a factor. Solitary confinement -- warehousing prisoners alone in a cell for 23 to 24 hours a day, minimizing contact with one another and with the outside world -- permitted mass incarceration at reduced risk to prison guards.

In 2011, Juan E. Méndez, the United Nations Human Rights Council’s Special Rapporteur on Torture and Other Cruel, Inhuman or Degrading Treatment or Punishment, issued a report on the use of prolonged solitary confinement around the world, with “prolonged” meaning more than 15 days. Administrators and government officials rejected his request to inspect isolation units in American prisons. In his contribution to Hell Is a Very Small Place, Méndez writes that the best estimate for the number of people in solitary confinement in the United States at any given time is 80,000, “but no one knows that for sure.” The personal accounts by prisoners in the book show that confinement American-style is more than “prolonged.” It can go on for years, and in some cases, for decades.

An isolation cell is sometimes called “the Box,” and the experience of living in one for months and years on end makes it sound like being buried alive. In the chapter “How to Create Madness in Prison,” Terry Kupers describes the symptoms that appear during a long stretch. The prisoner in isolation “may feel overwhelmed by a strange sense of anxiety. The walls may seem to be moving in on him (it is stunning how many prisoners in isolated confinement independently report this experience) …. The prisoner may find himself disobeying an order or inexplicably screaming at an officer, when really all he wants is for the officer to stop and interact with him a little longer than it takes for a food tray to be slid through the slot in his cell door. Many prisoners in isolated confinement report it is extremely difficult for them to contain their mounting rage ….”

But that is far from the extreme end of the spectrum, which involves psychotic breaks, self-mutilation and suicide. In isolation, time no longer passes through the usual cycle of hours, weekdays, months. The damaged mind is left to pick at its own scabs for what might as well be an eternity.

Hell Is a Very Small Place proves fairly repetitious, though it could hardly be otherwise. Reading the book leaves one with the horrible feeling of being overpowered by routines and forces that will just keep running on sheer momentum. Last year, President Obama called for an extensive review and reform of prison conditions, and last month, he issued a ban on the solitary confinement of juveniles in the federal prison system. So that’s the good news, for however long it may last. But consider the enormous obstacle to change represented by the sunk cost of millions or billions of dollars spent to erect Supermax prisons -- let alone the businesses (and lobbyists) that depend on more of them being built.

Anyone housed in solitary for a while would have to envy the characters in No Exit. They have more room (a “box” typically runs from 6 by 9 feet to a luxurious 8 by 10) and they have each other, like it or not. Sartre’s hell is imaginary; it exists only to reveal something about the audience. The idea of burying people alive in concrete tombs degrades the society that has turned it into reality. The phrase “solitary confinement of juveniles in the federal prison system” alone is the sign of something utterly unforgivable.

Review of Edward H. Miller, "Nut Country: Right-Wing Dallas and the Birth of the Southern Strategy"

Trying to explain recent developments in the American presidential primaries to an international audience, a feature in this week’s issue of The Economist underscores aspects of the political landscape common to both the United States and Europe. “Median wages have stagnated even as incomes at the top have soared,” the magazine reminds its readers (as if they didn’t know and had nothing to do with it). “Cultural fears compound economic ones” under the combined demographic pressures of immigration and an aging citizenry.

And then there’s the loss of global supremacy. After decades of American ascent, “Europe has grown used to relative decline,” says The Economist. But the experience is unexpected and unwelcome to those Americans who assumed that the early post-Cold War system (with their country as the final, effectively irresistible superpower) represented the world’s new normal, if not the End of History. The challenges coming from Putin, ISIS and Chinese-style authoritarian state capitalism suggest otherwise.

Those tensions have come to a head in the primaries with the campaigns of Donald Trump and Bernie Sanders: newcomers to their respective parties whose rise seemed far-fetched not so long ago. To The Economist’s eyes, their traction is, if not predictable, then at least explicable: “Populist insurgencies are written into the source code of a polity that began as a revolt against a distant, high-handed elite.”

True enough, as far as it goes. But the analysis overlooks an important factor in how “anti-elite” sentiment has been channeled over the past quarter century: through “anti-elitist” tycoons. Trump is only the latest instance. Celebrity, bully-boy charisma and deep pockets have established him as a force in politics, despite an almost policy-free message that seems to take belligerence as an ideological stance. Before that, there was the more discreet largesse of the Koch brothers (among others) in funding the Tea Party, and earlier still, Ross Perot’s 1992 presidential campaign, with its folksy infomercials and simple pie charts, which drew almost a fifth of the popular vote. In short, “revolt against a distant, high-handed elite” may be written into the source code of American politics, but the billionaires have the incentives and the means to keep trying to hack it.

If anything, even Perot was a latecomer. In the opening pages of Nut Country: Right-Wing Dallas and the Birth of the Southern Strategy (University of Chicago Press), Edward H. Miller takes note of a name that’s largely faded from the public memory: H. L. Hunt, the Texas oilman. Hunt was probably the single richest individual in the world when he died in 1974. He published a mountain of what he deemed “patriotic” literature and also funded a widely syndicated radio program called Life Line. All of it furthered Hunt’s tireless crusade against liberalism, humanism, racial integration, socialized medicine, hippies, the New Deal, the United Nations and sundry other aspects of the International Communist Conspiracy, broadly defined. ("Nut country" is how John F. Kennedy described Dallas to the first lady a few hours before he was killed.)

Hunt’s output was still in circulation when I was growing up in Texas in the years after his death, and having read it has meant no end of déjà vu ever since: the terrible ideas of today are usually just the terrible ideas of yesterday, polished with a few updated topical references. Miller, an assistant teaching professor of history at Northeastern University Global, reconstructs the context and the mood that made Dallas a hub of far-right political activism between the decline of Joseph McCarthy and the rise of Barry Goldwater -- a city with 700 members of the John Birch Society. A major newspaper, The Dallas Morning News, helped spur the sales of a book called God, the Original Segregationist by running excerpts. Cold War anti-Communism mutated into a belief that the United States and the Soviet Union were in the process of being merged under the direction of the United Nations, in the course of which all reference to God would be outlawed. John F. Kennedy was riding roughshod over American liberties, bypassing Congress and establishing a totalitarian dictatorship in which, as H. L. Hunt warned, there would be “no West Point, no Annapolis, no private firearms -- no defense!”

An almost Obama-like dictatorship, then. Needless to say, these beliefs and attitudes are still with us, even if many of the people who espoused them are not.

Miller identifies two tendencies or camps within right-wing political circles in Dallas during the late 1950s and early ’60s. “Moderate conservatism” was closer to established Republican Party principles of free enterprise, unrelenting anti-Communism and the continuing need to halt and begin rolling back the changes brought by the New Deal. Meanwhile, “ultraconservatism” combined a sense of apocalyptic urgency with fear of all-pervasive subversion and conspiracy. A reader familiar with recent laments about the state of the Republican Party -- that it was once a much broader tent, with room for even the occasional liberal -- might well assume that Miller’s moderate conservatives consisted of people who liked Ike, hated Castro and otherwise leaned a bit to the right wing of moderation, as opposed to ultraconservative extremism.

That assumption is understandable but largely wrong: Miller’s moderates were much closer to his ultras than either was to, say, the Eisenhower who sent federal troops to Little Rock, Ark. (Or as someone the author quotes puts it, the range of conservative opinion in Dallas was divided between those who wanted to impeach Supreme Court Justice Earl Warren and those who wanted to hang him.)

Where the moderates and the ultras ultimately combined to create something more durable and powerful than either could be separately was in opposition to the Civil Rights movement and in the realignment of the segregationist wing of the Democratic Party. I’ll come back to Miller’s argument on this in a later column, once the primary season has progressed a bit. Suffice it to say that questions of realignment are looking less antiquarian all the time.

Essay on David Bowie

There’s a special rung of hell where the serious and the damned writhe in agony, gnashing their teeth and cursing their fate, as they watch an endless marathon of historical documentaries from basic cable networks. Their lidless eyes behold Ancient Aliens, now in its tenth season, and High Hitler, which reveals that the Führer was a dope fiend. The lineup includes at least one program about the career of each and every single condemned soul in the serial-killer department, which is a few rungs down.

In the part of the inferno where I presumably have reservations, a lot of the programming concerns the history of rock music. With each cliché, a devil pokes you, just to rub it in. The monotonous predictability of each band’s narrative arc (struggle, stardom, rehab, Hall of Fame) is just part of it, since there are also the talking-head commentaries, interspersed every few minutes, by people unable to assess any aspect of the music except through hyperbole. Each singer was the voice of the era. Every notable guitarist revolutionized the way the instrument was played -- forever. No stylistic innovation failed to change the way we think about music, influencing all that followed.

Even the devils must weary of it, after a while. It probably just makes them meaner.

Here on earth, of course, such programming can be avoided. Choose to watch Nazi UFOs -- an actual program my TiVo box insists on recording every time it runs -- and you really have no one to blame but yourself.

But David Bowie’s death earlier this month left me vulnerable to the recent rerun of a program covering most of his life and work. Viewing it felt almost obligatory: I quit keeping track of Bowie’s work in the early 1980s (a pretty common tendency among early devotees, the near-consensus being that he entered a long creative downturn around that time), so catching up on his last three decades, however sketchily, seemed like a matter of paying respects. It sounds as if his last few albums would be worth a chance, so no regrets about watching.

Beyond that, however, the program offered only the usual insight-free superlatives -- echoes of the hype that Bowie spent much of his career both inciting and dismantling. Bowie had a precocious and intensely self-conscious relationship to mass media and spectacle. He was, in a way, Andy Warhol’s most attentive student. That could easily have led Bowie down a dead end of cynicism and stranded him there, but instead it fed a body of creative activity -- in film and theater as well as music -- that fleshed out any number of Oscar Wilde’s more challenging paradoxes. (A few that especially apply to Bowie’s career: “To be premature is to be perfect.” “One should either be a work of art or wear a work of art.” “Man is least himself when he talks in his own person. Give him a mask, and he will tell you the truth.”) There must be a whole cohort of us who lived through early theoretical discussions of postmodernism and performativity while biting our tongues, thinking that an awful lot of it was just David Bowie, minus the genius.

“Genius” can be a hype word, of course, but the biggest problem with superlatives in Bowie’s case isn’t that they are clichéd but that they’re too blunt. Claim that Bowie invented rock stardom, as somebody on TV did, for example, and the statement is historically obtuse while also somehow underestimating just how catalytic an impact he had.

As noted here in a column some months ago, Bowie is not among the artists David Shumway wrote about in Rock Star: The Making of Musical Icons from Elvis to Springsteen (Johns Hopkins University Press, 2014). And yet one aspect of Bowie’s career often taken as quintessential, his tendency to change appearances and styles, actually proves to be one of the basic characteristics of the rock star’s cultural role, established well before his Thin White Duke persona rose from the ashes of Ziggy Stardust. Context undercuts the hype.

Elsewhere, in an essay for the edited collection Goth: Undead Subculture (Duke, 2007), Shumway acknowledges that Bowie did practice a kind of theatricalization that created a distinctive relationship between star and fan: the “explicit use of character, costume and makeup … moved the center of gravity from the person to the performance” in a way that seemed to abandon the rock mystique of authenticity and self-expression in favor of “disguising the self” while also reimagining it.

“His performances taught us about the constructedness of the rock star and the crafting of the rock performance,” Shumway writes. “His use of the mask revealed what Dylan’s insistence on his own authenticity and Elvis’s swagger hid.”

At the same time, Bowie’s decentered/concealed self became something audiences could and did take as a model. But rather than this being some radical innovation that transformed the way we think about rock forever (etc.), Shumway suggests that Bowie and his audience were revisiting one of the primary scenes of instruction for 20th-century culture as a whole: the cinema.

Bowie “did not appear to claim authenticity for his characters,” Shumway writes. “But screen actors do not claim authenticity for the fictional roles they play either. Because he inhabits characters, Bowie is more like a movie star than are most popular music celebrities. In both cases the issue of the star’s authenticity is not erased by the role playing, but made more complex and perhaps more intense.”

That aptly describes Bowie’s effect. He made life “more complex and perhaps more intense” -- with the sound of shattering clichés mixed into the audio track at unexpected moments. And a personal note of thanks to Shumway for breaking some, too.

[Image: Memorial to David Bowie (Getty Images)]

Book review of Hugh Pennington's "Have Bacteria Won?" (essay)

Last month came the unwelcome if not downright chilling news that the antibiotic of last resort -- the most powerful infection fighter in the medical arsenal -- is now ineffective against some new bacterial strains. If, like me, you heard that much and decided your nerves were not up to learning a lot more, then this might be a good time to click over to see what else looks interesting in the Views section menu. There’s something to be said for deliberate obliviousness on matters that you can’t control anyway.

Hugh Pennington’s Have Bacteria Won? (Polity) is aimed straight at the heart of a public anxiety that has grown over the past couple of decades. The author, an emeritus professor of bacteriology at the University of Aberdeen, is clearly a busy public figure in the United Kingdom, where he writes and comments frequently on medical news for the media. A number of recent articles in British newspapers call him a “leading food-poisoning expert,” but that is just one of Pennington’s areas of expertise. Besides contributing to the professional literature, he has served on commissions investigating disease outbreaks and writes “medico-legal expert witness reports” (he says in the new book) on a regular basis.

The fear resonating in Pennington’s title dates back to the mid-1990s. Coverage of the Ebola outbreak in Zaire in 1995 seemed to compete for attention with reports of necrotizing fasciitis (better known as “that flesh-eating disease”), which inspired such thought-provoking headlines as “Killer Bug Ate My Face.”

Pennington refers to earlier cases of food contamination that generated much press coverage -- and fair enough. But it was the ghastly pair of hypervirulent infections in the news 20 years ago that really raised the stakes of something else that medical researchers were warning us about: the widespread overuse of antibiotics. It was killing off all but the most resilient disease germs. An inadvertent process of man-made natural selection was underway, and the long-term consequences were potentially catastrophic.

But now for the good news, or the nonapocalyptic news, anyway: Pennington makes a calm assessment of the balance of forces between humanity and bacteria and, without being too Pollyannaish about it, suggests that unpanicked sobriety would be a good attitude for the public to cultivate, as well.

The history of medical advances in the industrialized world has, he argues, had unexpected and easily overlooked side effects. Now we live, on average, longer than our ancestors. But we also die for different reasons, with mortality from infection no longer being high on the list. The website of the Centers for Disease Control and Prevention makes the point sharply with a couple of charts: apart from a spike during the influenza pandemic following the First World War, death from infectious disease fell in the United States throughout most of the 20th century. Pennington’s point is that we find this trend throughout the modernized world, wherever life expectancy increased. Medical advances, including the development of antibiotics, played a role, but not in isolation. Improved sanitation and increased agricultural output were also part of it.

“There is a pattern common to rich countries,” Pennington notes. “The clinical effects of an infection become much less severe long before specific control measures or successful treatments become available. Their introduction then speeds up the decline, but from a low base. An adequate diet brings this about.”

So death from infectious disease went from being a terrible fate to something practically anomalous within two or three generations. (To repeat, we’re talking about the developed world here: both prosperity and progress impose blinders.) And when serious infectious disease becomes rare, it also becomes news. “From time to time,” Pennington says, “the media behave like a cheap refracting telescope, focusing on an object of interest but magnifying it with a good deal of aberration and fuzziness at the edges because of the poor quality of their lenses.”

Lest anyone think that the competitive shamelessness of the British tabloid press has excessively distorted Pennington’s outlook, keep in mind that CNN once had a banner headline reading, “Ebola: ‘The ISIS of Biological Agents?’” Nor does he demonize the mass media, as such. “Sometimes the journalistic telescope finds hidden things that should be public,” he writes -- giving as an example how a local newspaper identified and publicized an outbreak of infectious colitis at an understaffed and poorly run hospital in Scotland.

Have Bacteria Won? is packed with case histories of outbreaks from the past 60 or 70 years. Each is awful enough in its own right to keep the reader from feeling much comfort at their relative infrequency, and Pennington’s message certainly isn’t that disease can be eradicated. Powerful and usually quite effective techniques exist to prevent or minimize bacterial contamination of food and water, and we now have systematic ways to recognize and treat a wider range of infections than would have been imaginable not that long ago. But systems fail (he mentions several cases of defective pasteurization equipment causing large-scale outbreaks) and bacteria mutate without warning. “Each microbe has its own rules,” Pennington writes. “Evolution has seen to that.”

We enjoy some advantage, given our great big brains, especially now that we have the tools of DNA sequencing and ever-increasing computational power. "This means," Pennington writes, "that tracking microbes, understanding their evolution and finding their weaknesses gets easier, faster and cheaper every day." Given reports that the MCR-1 gene found in antibiotic-impervious bacteria can move easily between micro-organisms, any encouraging word is welcome right about now.

But Pennington's analysis also implies that the world's incredible and even obscene disparities in wealth are another vulnerability. "An adequate diet" for those who don't have it seems like something all that computational power might also be directed toward. Consider it a form of preventative medicine.

Scholarly database JSTOR sees growth in ebooks program

The scholarly database JSTOR, recognizing its role as a starting point for research, sees major growth in its ebook program.

Essay on Wikipedia's fifteenth anniversary

Wikipedia came into the world 15 years ago today -- and, man, what an ugly baby. The first snapshot of it in the Internet Archive is from late March of 2001, when Wikipedia was already 10 weeks old. At that point, it claimed to have more than 3,000 pages, with an expressed hope of reaching 10,000 by the end of summer and 100,000 at some point in the not unimaginably distant future. The first entries were aspirational, at best. The one about Plato, for example, reads in its entirety: “Famous ancient Greek philosopher. Wrote that thing about the cave.”

By November -- with Wikipedia at 10 months old -- the entry on Plato was longer, if not more enlightening: you would have learned more from a good children’s encyclopedia. Over the next several months, the entry grew to a length of about 1,000 words, sometimes in classically padded freshman prose. (“Today, Plato's reputation is as easily on a par with Aristotle's. Many college students have read Plato but not Aristotle, in large part because the former's greater accessibility.”) But encouraging signs soon began to appear. A link directed the reader to supplementary pages on Platonic realism, for example. As of early 2006, when Wikipedia turned five years old, the main entry on Plato had doubled in length, with links to online editions of his writings. In addition, separate pages existed for each of the works -- often consisting of just a few sentences, but sometimes with a rough outline of the topics to be covered in a more ambitious entry somewhere down the line.

The aspirations started to look more serious. There were still times when Wikipedia seemed designed to give a copy editor nightmares -- as in 2003, when someone annotated the list of dialogues to indicate: “(1) if scholars don't generally agree Plato is the author, and (2) if scholars don't generally disagree that Plato is not the author of the work.”

Yet it is also indicative of where the site was heading that before long some volunteer stepped in to unclog that passage's syntactical plumbing. The site had plenty of room for improvement -- no denying it. On the other hand, improvements were actually happening, however unsystematically.

The site hit its initial target of 100,000 pages in early 2003 -- at which point it began to blow up like a financial bubble. There were not quite one million pages by the fifth anniversary of its founding and 3.5 million by the tenth. Growth has slowed of late, with an average of about 300,000 pages being added annually over the past five years.

I draw these figures from Dariusz Jemielniak’s Common Knowledge? An Ethnography of Wikipedia (Stanford University Press, 2014), which also points out how rapidly the pace of editorial changes to articles began to spike. Ten million edits were made during Wikipedia’s first four years. The next 10 million took four months. From 2007 on, the frequency of edits stabilized at a rate of 10 million edits per seven or eight weeks.

We could continue in this quantifying vein for a while. As with the Plato entry finding its center of gravity after a long period of wobbly steps, the metrics for Wikipedia tell a story of growth and progress. So does the format’s worldwide viability: Wikipedia is now active in 280 languages, of which 69 have at least 100,000 entries. It all still seems improbable and inexplicable to someone who recalls how little credibility the very concept once had. (“You can edit this page right now! … Write a little (or a lot) about what you know!”) If someone told you in 2002 that, in 10 years, the Encyclopædia Britannica would suspend publication of its print edition -- while one of the world’s oldest university presses would be publishing material plagiarized from Wikipedia, rather than by it -- the claim would have sounded like boosterism gone mad.

That, or the end of civilization. (Possibly both.) What’s in fact happened -- celebrate or mourn it as you will -- has been a steady normalization of Wikipedia as it has metamorphosed from gangly cultural interloper into the de facto reference work of first resort.

In large measure, the transformation came about as part of what Siva Vaidhyanathan has dubbed “the Googlization of everything.” Wikipedia entries normally appear at or near the top of the first page of the search engine’s results. Over time, the priority that the Google algorithm gives to Wikipedia has come to seem natural and practically irresistible. At this point, having a look at Wikipedia is usually quicker and easier than deciding not to (as someone once said about reading the comic strip “Nancy”).

Another sign of normalization has been the development of bibliographical norms for citing Wikipedia in scholarship. It signals that the online reference work has become a factor in knowledge production -- not necessarily as a warehouse of authoritative information but as a primary source, as raw material, subject to whatever questions and methods a discipline may bring to bear on it.

In the case of that Plato entry, the archive of changes over time would probably be of minimal interest as anything but a record of the efforts of successively better informed and more careful people. But Wikipedia’s role as a transmitter of information and an arena for contesting truth claims makes its records a valuable source for people studying more recent matters. Someone researching the impact of the Sandy Hook Elementary School shootings, for example, would find in the Wikipedia archive a condensed documentation of how information and arguments about the event appeared in real time, both in its immediate aftermath and for years afterward.

I've been reading and writing about Wikipedia for this column for most of its lifespan, and it surely won't be another five years before there's occasion to do so again. There's plenty more to say. But for now, it seems like Professor Wikipedia should get the last word.

Is this the best acknowledgment section of a scholarly book?

Blog post about a scholar's anti-thank-you has lots of people talking.

Review of 'Every Last Tie: The Story of the Unabomber and His Family'

Until fairly recently, I had in my files a copy of the supplement to the Sept. 19, 1995, issues of The New York Times and The Washington Post containing “Industrial Society and Its Future,” better known as the Unabomber manifesto. “Unabomber” was the moniker the Federal Bureau of Investigation gave the person or persons responsible for a string of mail bombs primarily targeting people at universities and airlines, which killed three people and maimed 23 more between 1978 and 1995.

The carnage was vicious, but it also appeared pointless, at least until the manifesto became available. It is usually characterized as neo-Luddite -- a call to halt and reverse the self-perpetuating course of technology-dominated human history, which both fosters and feeds on profound alienation. (My copy went into the recycle bin once I downloaded the text to my ereader. So it goes.) Blowing up college professors and airline executives seemingly at random was hardly the most logical or effective way of transforming civilization. But by the mid-1990s American culture had spawned a number of strange and disturbing combinations of means and ends. “Industrial Society and Its Future” was the work of someone more intelligent than Timothy McVeigh and less manifestly delusional than the Branch Davidians or the Heaven’s Gate people. It read like a master’s thesis in anthropological theory that had, at some point, gone terribly, terribly wrong.

Publication of his treatise in a major national newspaper had been the Unabomber’s condition for ending the terror campaign. The FBI figured that conceding would at the very least save lives and buy investigators time; it might also increase the chances of someone reading the text and having a hunch as to its authorship.

And that is just what happened -- although things very well might not have worked out that way. David Kaczynski’s reflective and resolutely unsensational memoir Every Last Tie: The Story of the Unabomber and His Family (Duke University Press) reveals how difficult it was to accept even the possibility that his older brother, Theodore, might be a terrorist. The author is the executive director of a Tibetan Buddhist monastery in Woodstock, N.Y., and (as attested in the afterword by James L. Knoll IV, a forensic psychiatrist who teaches at State University of New York Upstate Medical University in Syracuse, N.Y.) a tireless speaker and activist in the movement against the death penalty. It took weeks of agonizing discussion with his wife for David Kaczynski to reach the reluctant conclusion that the Unabomber might be his troubled older sibling -- identified in captions to family photos as “Teddy,” which somehow feels a little disconcerting.

By 1995 they had been out of touch for several years. The estrangement was not quite as bitter as the one between Ted and his parents, but the recriminations were one-sided in either case. The older sibling’s back-to-the-land yearning for self-sufficiency had metastasized over the years. Introversion and reclusiveness gave way to a seething hatred of everyone -- even of those closest to him, those able to give him unconditional love.

Especially those people, in fact. The burden of the memoirist here is not just to recount his own past but to make sense of it in the context of an act of violence otherwise utterly disconnected from his own memories. Seven years younger than his brother, the author recalls growing up happily in his shadow; they remained on affectionate terms long after Ted went off to Harvard University at age 16. “Growing up,” David writes, “I never doubted my brother’s fundamental loyalty and love or felt the slightest insecurity in his presence.” He characterizes their father as “a blue-collar intellectual” -- one who “didn’t have much formal education [but] was widely read and believed progress was possible through mankind’s rational pursuit of the greater good” -- and the description sounds like it would apply to their mother as well. “Discipline in our family was based on reason and dialogue,” the author says, “not authority and fear.”

There were worse ways to spend the 1950s, but an unfortunate turn as a Harvard undergraduate may have sent Ted Kaczynski’s retiring and cerebral personality off in a dangerously pathological direction. For three years, he served as a human guinea pig for a study on the effects of aggression and humiliation on bright college students -- one of various projects intended to add psychological weaponry to the Cold War arsenal. He survived and went on to do award-winning doctoral work in mathematics at the University of Michigan, Ann Arbor, followed by an assistant professorship at the University of California at Berkeley. But by 1971, Kaczynski, still in his twenties, left academe to cultivate a small plot of land in Montana, where he could be alone with his thoughts.

What happened over the following 25 years is presumably documented among the papers removed from Ted Kaczynski’s cabin. Every Last Tie does not refer to this material, and it’s hard to blame the author if he did not burrow through it in search of an explanation. (He’s been through enough.) What he did see was the letters full of accusation that Ted sent to their parents as his mind wandered deeper into paranoia and rage.

Eventually, when David’s long-unrequited love for a girl he’d grown up with ended in marriage, it was the end of that family connection as well: Ted disowned his brother. The memoir is made up of essays focusing in turn on each member of David Kaczynski’s nuclear family and finally on his spouse, Linda, who was the first person to suspect that the brother-in-law she’d never met might be the Unabomber.

One effect of narrating the past this way is that the book as a whole is not linear. Events are recounted as moments in the author’s relationship with each individual. The effect is to underscore precisely the thing that Ted Kaczynski could not experience, or at least not endure: the intimacy of shared lives. And although they remain estranged, David does not disown his brother for the simple reason that he cannot:

“Ted’s cruelty stigmatizes my good name; but my reputation for goodness comes at his expense. Like all contrived opposites, we reinforce one another. The worst thing that he can do to me is to deny an opportunity for reconciliation. Hope of reconciliation is something I am bound to maintain, but it costs me little -- only the sneaking sense that some part of me is missing.”

New book of essays sheds light on what it's like to be a professor and a mom

Review of Colin Dayan, 'With Dogs at the Edge of Life'

As our schedules and the weather permit, my wife and I walk dogs rescued from high-kill and overcrowded shelters and held in a foster-care facility. The walks are, in part, a way to find the dogs new homes: they wear vests or bandannas inviting people to inquire about adoption. (Anyone inclined to make an end-of-the-year donation to this worthy cause should inquire here.)

For the dogs, of course, a walk is an end unto itself. Most are raring to go, though we’ve occasionally had new arrivals from the countryside who feel overwhelmed by the sights and sounds of an urban downtown. One got about half a block out the door before he’d had enough and, refusing to budge, sat down and cowered in place. But that was a rare and extreme case. Normally it takes just a few minutes for a dog to adjust to the environment and feel drawn into it, pulling us along into the excitement of open space.

Before long we start crossing paths with attentive people who stop to admire the dog, and sometimes my wife, Rita, interests them in taking information on how to adopt. The foster facility seems to have pretty high turnover, so mission accomplished, presumably. But after the first or second expedition, that part of the walk became much less interesting to me than the moments of heightened awareness that sometimes occurred between contacts with other humans. It was an almost meditative absorption in our surroundings -- an effort to imagine the world as experienced from the other end of the leash.

In The Marriage of Heaven and Hell, William Blake asks, “How do you know but ev'ry Bird that cuts the airy way, / Is an immense world of delight, clos'd by your senses five?” A dog on the ground raises that question in an earthier way than a bird in the air, for there’s a constant reminder (at least two or three times per block) that the dog’s landscape consists of a fine-grained texture of smells that is almost entirely lost on humans. (Likewise with sounds beyond our ken.) The bird’s ecstasy was a matter of conjecture for Blake. But that an “immense world of delight” opens itself to a dog’s senses seems self-evident, even though the human imagination is closed to most of it.

My musings on dog sensibility have been a lot like the walks themselves: occasional and fairly restricted, exploring no more than could be covered by a circular route in about an hour. Colin Dayan’s With Dogs at the Edge of Life (Columbia University Press) is a much more comprehensive exploration -- the work of a mind that slips the leash of genre or narrow specialization at every opportunity.

The author, a professor of humanities and of law at Vanderbilt University, makes sharp turns and intuitive leaps that are, at times, unexpected and disconcerting. She published parts of the book in The Boston Review and other journals as essays of diverse kinds (memoir, reportage, criticism, political commentary, etc.) but with continuities and themes developing across the differences in framework and voice. Generalization seems hazardous with such a hybrid text, but here goes anyway.

Dayan refers to “those of us who believe that the distinction between human and nonhuman animals is unsustainable.” She takes the experience of toggling between human and canine awareness -- as with trying to imagine walking the dog from the four-legged perspective -- as a given. It is basic to the relationship between the two species that has developed over tens of thousands of years of cohabitation.

Humans and dogs read each other’s minds, in effect, or at least we try -- and anyone who lives with or around dogs for very long knows that a real zone of intersubjectivity emerges from the effort. A degree of anthropomorphism is probably always involved, but we get around it a little in moments of recognizing, and respecting, the dog’s own capacities.

“Dogs live on the track between the mental and the physical,” Dayan writes, “and sometimes seem to tease out a near-mystical disintegration of the bonds between them. What would it mean to become more like a dog? How might we come up against life as a sensory but not sensible experience? We all experience our dogs’ unprecedented and peculiar attentiveness. It comes across as an exuberance of a full heart. Perhaps this is what the Puritan divine Jonathan Edwards meant when he emphasized a physical rather than a moral conversion. He knew that the crux of divinity in earthbound entities lay in the heart’s ‘affections.’”

The movement within that paragraph -- between metaphysical categories and the ordinary dog owner’s intuitions, with the dismantling of dichotomies raising moral implications which then, even more sharply, plunge into the sphere of theology -- presents in miniature what the book does on a much larger scale. At the same time, Dayan’s thinking is grounded in concrete particulars, including issues around a particular variety of dog, the pit bull terrier, which appears to have become the contemporary, secular embodiment of diabolical menace. In some places pit bulls are very nearly the targets of a campaign of extermination.

Not so coincidentally, perhaps, it is African-American residents of housing projects and poor white Southerners whose pit bulls are most likely to be confiscated and destroyed. Video of the police killing poor or homeless people’s dogs, whatever the breed, seems to be its own genre on YouTube. (I am willing to take her word for it.) At the same time, an association between impoverished or collapsing cities and feral dog packs has become a commonplace in journalism, while a number of directors have used the roaming dog as a character or scenic element in recent films.

It’s tempting to say that Dayan does for dogs what Melville did for whales: tracking the social roles and symbolic frameworks built up around them and depicting them at the intersection between cosmic order and human frailty, while also giving them (dogs and whales alike) due recognition as animals with worlds of their own, which we humans impinge upon. That description may intrigue some people while doubtless putting off at least as many. So be it, but I’ll say that With Dogs at the Edge of Life was one of the most memorable books I’ve had the chance to read this year.
