Books

Essay on Barbara Ehrenreich's 'Living With a Wild God'

Examples of atheist spiritual autobiography are not plentiful, although the idea is not as self-contradictory as it perhaps sounds. A quest story that ends without the grail being located or the ring destroyed may not satisfy most audiences, but it's a quest story even so.

The one example that comes to mind is Twelve Years in a Monastery (1897) by Joseph McCabe, an English Franciscan who spent much of his clerical career grappling with doubts that eventually won out. Twelve Years is framed mainly as a critique and exposé of clerical life, but its interest as a memoir comes in part from McCabe’s struggle to accept the moral and intellectual demands imposed by his growing skepticism. For all of its defects, monasticism offered a career in which McCabe’s talents were recognized and even, within ascetic limits, rewarded. Leaving it meant answering the call of a new vocation: He went on to write on an encyclopedic array of scientific, cultural and historical topics.

McCabe also became the translator and primary English spokesman for Ernst Haeckel, the German evolutionary theorist and advocate of pantheism, which seems to have squared easily enough with the lapsed monk’s atheism. (There may be more than a semantic difference between thinking of God and the universe as identical and believing there is no God, just universe. But if so, it is largely in the eye of the beholder.)

Barbara Ehrenreich’s background could not be more different from Joseph McCabe’s. In Living With a Wild God: A Nonbeliever’s Search for the Truth About Everything (Hachette/Twelve) she describes her working-class family as consisting of atheists, rationalists, and skeptics for at least a couple of generations back. “God is an extension of human personality,” she wrote in her journal as an adolescent, “brought into the world and enslaved as man’s glorifier.” McCabe would have had to do penance for such a thought; in Ehrenreich’s case, it was just dinner-table wisdom -- expressed with precocious verbal finesse, later honed into a sharp instrument of social criticism put to work in Nickel and Dimed, among other books.

Her memoir appeared two years ago, though I’ve only just read it, making this column more rumination than review. The usual blurb-phrases apply: it’s brilliantly written, thought-provoking, and often very funny, taking aphoristic turns that crystallize complex feelings into the fewest but most apt words. For example: “[I]f you’re not prepared to die when you’re almost 60, then I would say you’ve been falling down on your philosophical responsibilities as a grown-up human being.” Or: “As a child I had learned many things from my mother, like how to sew a buttonhole and scrub a grimy pot, but mostly I had learned that love and its expressions are entirely optional, even between a mother and child.”   

So, a recommended read. (More in the reviewerly vein is to be found here and here.) Additional plaudits for Living With a Wild God won’t count for much at this late date, while its literary ancestry might still be worth a thought. For it seems entirely possible, even likely, that Ehrenreich’s parents and grandparents in Butte, Montana would have read McCabe -- “the trained athlete of disbelief,” as H.G. Wells called him, in recognition of McCabe’s countless books and pamphlets on science, progress and the benefits of godlessness. Many of them appeared in newsprint editions circulating widely in the United States during the first half of the 20th century, with one publisher reckoning he’d sold over 2.3 million booklets by McCabe between the 1920s and the 1940s.

The inner life Ehrenreich depicts herself leading as a teenager certainly resembles the world of rationality and order that McCabe evokes, and that her family took as a given: “[E]very visible, palpable object, every rock or grain of sand, is a clue to the larger mystery of how the universe is organized and put together -- a mystery that it was our job, as thinking beings, to solve.” Ehrenreich took up the challenge as an adolescent with particular rigor. Faced with the hard questions that death raises about the value and point of life, she began taking notes in the expectation of working out the answer: “I think it is best to start out with as few as possible things which you hold to be unquestionably true and start from there.”

The problem, as Descartes discovered and Ehrenreich soon did, was that “the unquestionably true” is a vanishingly small thing to determine. You are left with “I exist” and no path to any more inclusive confidence than that. Descartes eventually posits the existence of God, but arguably bends the rules in doing so. Ehrenreich does not, and lands in the quicksand of solipsism.

The mature Ehrenreich can see how her younger self’s philosophical conflicts took shape in the midst of more mundane family problems involving alcoholism, career frustration and each parent’s set of inescapable disappointments. (After all, solipsism means never having to say you’re sorry.) But she also recognizes that the quandaries of her teenage prototype weren’t just symptoms: “Somehow, despite all the peculiarities of my gender, age, class, and family background, I had tapped into the centuries-old mainstream of Western philosophical inquiry, of old men asking over and over, one way or another, what’s really going on here?”

In pursuing answers that never quite hold together, she undergoes what sounds very much like the sort of crisis described by the saints and mystics of various traditions. First, there were repeated moments of being overwhelmed by the sheer strangeness and “there”-ness of the world itself. Then, in May 1959, a few months before leaving for college, she underwent a shattering and effectively inexpressible experience that left her earlier musings in ashes. No consciousness-altering chemicals were ingested beforehand; given the circumstances, it is easy to appreciate why the author spent the 1960s avoiding them:    

“[T]he world flamed into life,” she writes. “There were no visions, no prophetic voices or visits by totemic animals, just this blazing everywhere. Something poured into me and I poured out into it. This was not the passive beatific merger with ‘the All,’ as promised by the Eastern mystics. It was a furious encounter with a living substance that was coming at me through all things at once, and one reason for the terrible wordlessness of the experience is that you cannot observe fire really closely without becoming part of it.”

This kind of thing could not be discussed without the risk of involuntary commitment, and Ehrenreich herself refers to a medical hypothesis suggesting that ecstatic states may result when “large numbers of neurons start firing in synchrony, until key parts of the brain are swept up in a single pattern of activity, an unstoppable cascade of electrical events, beginning at the cellular level and growing to encompass the entire terrain that we experience as ‘consciousness.’”

Ehrenreich did not take McCabe’s course in reverse -- going from atheism into the waiting arms of an established faith. For that matter, she remains more or less an agnostic, at least willing to consider the possible merits of a polytheistic cosmos. "My adolescent solipsism is incidental compared to the collective solipsism our species has embraced for the last few centuries in the name of modernity and rationality," she writes, "a worldview in which there exists no consciousness or agency other than our own, where nonhuman animals are dumb mechanisms, driven by instinct, where all other deities or spirits have been eliminated in favor of the unapproachable God...." Whether a nonreligious mysticism can go beyond "modernity and rationality" without turning anti-modern and irrationalist is something we'll take up in this column on a future occasion.


University of Florida, Elsevier explore interoperability in the publishing space

U of Florida connects its institutional repository to Elsevier's ScienceDirect platform to try to increase the visibility of the university's intellectual work.

Review of Terry Eagleton's "Culture"

If ideas are tools -- “equipment for living,” so to speak -- we might well imagine culture as a heavily patched-up conceptual backpack that has been around the world a few times. It has been roughly handled along the way.

The stitches strain from the sheer quantity and variety of stuff crammed into it over the years: global culture, national culture, high culture, popular culture, classical and print and digital cultures, sub- and countercultures, along with cultures of violence, of affluence, of entitlement, of critical discourse …. It’s all in there, and much else besides. How it all fits -- what the common denominator might be -- is anyone’s guess. We could always draw on the useful clarifying distinction between: (1) culture as a category of more or less aesthetic artifacts, perhaps especially those that end up in museums and libraries, and (2) culture as the shared elements of a way of life.

The difference is, in principle, one of kind, not of quality, although assumptions about value assert themselves anyway. The first variety is sometimes called “the Matthew Arnold idea of culture,” after that Victorian worthy’s reference, in his book Culture and Anarchy, to “the best which has been thought and said.” Presumably music and painting also count, but Arnold’s emphasis on verbal expression is no accident: culture in his use of the term implies literacy.

By contrast “culture in the anthropological sense” -- as the second category is often called -- subsumes a good deal that can be found in societies without writing: beliefs about the nature of the world, ways of dressing, gender roles, assumptions about what may be eaten and what must be avoided, how emotions are expressed (or not expressed) and so on. Culture understood as a way of life includes rules and ideas that are highly complex though not necessarily transmitted through formal education. You absorb culture by osmosis, often through being born into it, and much of it goes without saying. (This raises the question of whether animals such as primates or dolphins may be said to have cultures. If not, why not? But that means digging through a whole other backpack.)

The dichotomy isn’t airtight, by any means, but it has served in recent years as a convenient pedagogical starting point: a way to get students (among others) to think about the strange ubiquity and ambiguity of culture as a label we stick on almost everything from the Code of Hammurabi to PlayStation 4, while also using it to explain quite a bit. Two people with a common background will conclude a discussion of the puzzling beliefs or behavior of a third party by agreeing, “That’s just part of their culture.” This seems more of a shrug than an explanation, really, but it implies that there isn’t much more to say.

One way to think of Terry Eagleton’s new book, Culture (Yale University Press), is as a broad catalog of the stuff that comes out when you begin unpacking the concept in its title -- arranging the contents along a spectrum rather than sorting them into two piles. In doing so, Eagleton, a distinguished professor of English literature at Lancaster University, follows closely the line of thought opened by the novelist and critic Raymond Williams, who coined the expression “culture as a whole way of life.” Williams probably derived the concept in turn, not from the anthropologists, but from T. S. Eliot. In distinguishing “culture as ordinary” (another Williams phrase) from culture as the work that artists, writers, etc. produce, the entire point was to link them: to provoke interest in how life and art communicated, so to speak.

For Williams, the operative word in “culture as a whole way of life” was, arguably, “whole”: something integral, connected and coherent, but also something that could be shattered or violated. Here, too, Eagleton is unmistakably Williams’s student. His assessment of how ideas about culture have taken shape over the past 200 years finds in them a pattern of responses to both industrialism (along with its spoiled heir, consumerism) and the French Revolution (the definitive instance of “a whole way of life” exploding, or imploding, under its own strains). “If it is the cement of the social formation,” Eagleton writes, culture “is also its potential point of fracture.”

It may be that I am overemphasizing how closely Eagleton follows Williams. If so, it is still a necessary corrective to the way Williams has slowly turned into just another name in the Cultural Studies Hall of Fame rather than a felt moral and intellectual influence. His emphasis on culture as “a whole way of life” -- expressed with unabashed love and grief for the solidarity and community he knew when growing up in a Welsh mining community -- would sound remarkably anachronistic (if not ideologically totalizing and nostalgically uncritical) to anyone whose cultural reference points are of today’s commodified, virtual and transnational varieties.

And to that extent, Eagleton’s general survey of ideas about culture comes to a sharp point -- aimed directly at how the concept functions now in a capitalist society that, he says, “relegates whole swaths of its citizenry to the scrap heap, but is exquisitely sensitive about not offending their beliefs.”

He continues, in a vein that Williams would have appreciated: “Culturally speaking, we are all to be granted equal respect, while economically speaking the gap between the clients of food banks and the clients of merchant banks looms ever larger. The cult of inclusivity helps to mask these material differences. The right to dress, worship or make love as one wishes is revered, while the right to a decent wage is denied. Culture acknowledges no hierarchies, but the educational system is riddled with them.” This may explain why culture is looking so raggedy and overburdened as a term. Pulled too tight, stretched too thin, it covers too many things that it would be difficult to face straight on.


Review of Christina Crosby, "A Body, Undone: Living on After Great Pain (A Memoir)"

Somewhere along the way, Nietzsche’s apothegm “That which does not destroy me makes me stronger” lost all the irony and ambiguity it had in context and turned into an edifying sentiment -- a motivational catchphrase, even one on the order of that poster saying, “Hang in there, baby!” with the cat clinging to a tree branch.

“Destroy” is often rendered “kill,” giving it a noirish and edgy sound. Either way, the phrase is uplifting if and only if understood figuratively, as a statement about mental resilience. For when taken literally, it is barely even half true, as a moment’s reflection reveals. A life-threatening virus can make us stronger -- i.e., immune to it in the future -- but a bullet to the brain never will. That truth would not have been lost on Nietzsche, who understood philosophy as a mode of psychology and both as rooted in physiology.

He expected the reader not just to absorb a thought but to test it, to fill in its outlines and pursue its implications -- including, I think, a contradictory thought: Whatever does not kill me might very well leave me wishing it had.

While riding her newly repaired bicycle early in the fall semester of 2003 -- pumping the pedals hard, with the strong legs of someone just entering her 50s and determined not to feel it -- Christina Crosby, a professor of English and feminist, gender and sexuality studies at Wesleyan University, got a branch caught in the front wheel. She went flying from her seat, landing on the pavement chin first, fracturing two vertebrae in her neck. The broken bone scraped her spinal cord. One indication of how fast it all happened is that reflexes to break a fall never kicked in. Her hands were not damaged at all.

“Serious neurological damage started instantly,” Crosby writes in A Body, Undone: Living On After Great Pain (NYU Press); “blood engorged the affected site, and the tissue around the lesion began to swell, causing more and more damage as the cord pressed against the broken vertebrae. I also smashed my chin into tiny pieces, tore open my lips, slashed open my nose, breaking the cartilage, and multiply fractured the orbital bone underneath my right eye.” She had been wearing wire-frame glasses and the force of the impact drove the metal into the bridge of her nose.

Crosby spent three weeks in the hospital, unconscious in intensive care for most of it, and only found out later, from her lover, Janet, that the neurosurgeons and plastic surgeons “debated who should go first.” The plastic surgeons won. It sounds as if they had proportionately the more hopeful and effective job to do -- piecing together her chin from fragments, reconstructing her mouth, removing the eyeglass frames from her flesh and leaving only a half-moon scar.

The neurological damage was much more extensive and included both paralysis and a loss of sensation from the neck down. In time, Crosby regained limited use of her hands and arms and could begin to overcome the extreme (and dangerous) atrophy that set in following the accident. She was able to return to teaching part time at Wesleyan in 2005.

The author refers to herself dictating the memoir, but it feels very much like a piece of writing -- that is, like something composed in large part through revision, through grappling with the enormous problem of communicating sequences of experience and thought that few readers will have shared. The accident occurred relatively late in her life and without warning; the contrast between her existence before and after the catastrophic event is made even starker by the fact that she cannot remember it happening. “My sense of a coherent self,” she writes, “has been so deeply affronted” that the book in large measure serves as a way to try to put the fragments back together again without minimizing the depth of the chasm she has crossed.

“You become who you are,” Crosby writes, “over the course of a life that unfolds as an ongoing interaction with objects and others, from the infant you once were, whose bodily cartography slowly emerged as you were handled by caregivers whose speech washed over you, to the grown-up you are today, drawn beyond reason to one person rather than another.”

On that last point she has been extraordinarily fortunate in the person to whom she found herself drawn: the bond she shares with Janet seems like a rope across the abyss, or more like a steel cable, perhaps. (I call her by her first name simply because the author does. The view from Janet R. Jakobsen’s side of things may be read in a thoughtful essay from 2007.) At the same time, A Body, Undone is anything but sentimental about the possibilities of growth and healing. As doctors lowered the dosage of Crosby’s painkillers, new forces imposed themselves on her daily life:

“I feel an unassuageable loneliness, because I will never be able to adequately describe the pain I suffer, nor can anyone accompany me into the realm of pain …. Pain is so singular that it evades direct description, so isolating because in your body alone. Crying, and screaming, and raging against pain are the signs of language undone. … I have no exact account of how pain changes my interaction with my students and my colleagues, but I know there are times when I don’t feel fully present. It’s not that the pain is so bad that it demands all my attention, but rather that it’s so chronic as to act like a kind of screen.”

No pseudo-Nietzschean bromides to be had here. There is also the difficult new relationship with one’s bowels when they cease to be under any control by the brain -- the discovery of a whole terra incognita beyond ordinary feelings of awkwardness and embarrassment. Crosby discusses her reality with a candor that must be experienced to be believed. And the reader is left to face the truth that one’s embodiment (and the world that goes with it) can change utterly and forever, in a heartbeat.


Review of Benedict Anderson, "Life Beyond Boundaries: A Memoir"

The folklore of Indonesia and Thailand tells of a frog who is born under half of a coconut-shell bowl and lives out his life there. In time, he draws the only sensible conclusion: the inside of the shell is the whole universe.

“The moral judgment in the image,” writes Benedict Anderson in Life Beyond Boundaries: A Memoir (Verso), “is that the frog is narrow-minded, provincial, stay-at-home, and self-satisfied for no good reason. For my part, I stayed nowhere long enough to settle down in one place, unlike the proverbial frog.”

Anderson, a professor emeritus of international studies, government and Asia studies at Cornell University, wrote major studies of the history and culture of Southeast Asia. A certain degree of cosmopolitanism went with the fieldwork. But the boundaries within a society can be patrolled just as insistently as its geographical borders -- and in the case of academic specialties, the guards inspecting passports tend to be quite unapologetically suspicious.

In that regard, Anderson was an even more remarkable citizen of the world, for his death late last year has been felt as a loss in several areas of the humanities as well as at least a couple of the social sciences. Nearly all of this reflects what someone writing in a scholarly journal once dubbed “Benedict Anderson’s pregnant phrase” -- i.e., the main title of his 1983 work Imagined Communities: Reflections on the Origin and Spread of Nationalism, which treated the mass production of books and periodicals in vernacular languages (what he called “print capitalism”) as a catalytic factor in creating a shared sense of identity and, with it, the desire for national sovereignty.

By the 1990s, people were pursuing tangents from Anderson’s argument with ever more tenuous connection to nationalism -- and still less to the specific emphasis on print capitalism. Any group formed and energized by some form of mass communication might be treated as an imagined community. Here one might do a search for “Benedict Anderson” and “World of Warcraft” to see why the author came to think of his best-known title as “a pair of words from which the vampires of banality have by now sucked almost all the blood.” Even so, Imagined Communities has shown remarkable longevity, and its landmark status is clearly international: it had been translated into more than 30 languages as of 2009, when it appeared in a Thai edition.

The reader of Life Beyond Boundaries soon understands why Anderson eventually developed mixed feelings about his “pregnant phrase” and its spawn. His sense of scholarship, and of life itself, was that it ought to be a mode of open-ended exploration, of using what you’ve learned to figure out what you could learn. Establishing a widely known line of thought must have become frustrating once it was taken to mark the only direction in which he could move. Professional interest is not the only kind of interest; what it recognizes as knowledge is no measure of the world outside the shell.

Anderson wrote the memoir by request: a Japanese colleague asked for it as a resource to show students something of the conduct of scholarship abroad and to challenge the “needlessly timid” ethos fostered by Japanese professors’ “patriarchal attitude.” Long retired -- and evidently reassured by the thought that few of his American colleagues would ever see the book -- Anderson was wry and spot-on in recounting the unfamiliar and not always agreeable experience of American academic life as he found it after emigrating to the United States from England as a graduate student in the late 1950s. For one thing, his professors looked askance at his papers, where he might indulge in a sardonic remark if so inspired, or pursue a digressive point in his footnotes.

“In a friendly way,” he writes, “my teachers warned me to stop writing like this …. It was really hard for me to accept this advice, as in previous schools I had always been told that, in writing, ‘dullness’ was the thing to be avoided at all cost.” He also underscores the paradox that the pragmatic American disinterest in “grand theory” coexisted with an academic hunger for it, renewed on a seasonal basis:

“‘Theory,’ mirroring the style of late capitalism, has obsolescence built into it, in the manner of high-end commodities. In year X students had to read and more or less revere Theory Y while sharpening their teeth on passé Theory W. Not too many years later, they were told to sharpen their teeth on passé Theory Y, admire Theory Z, and forget about Theory W.”

Lest anyone assume this refers to the situation in the humanities, it’s worth clarifying that one example he gives is the “modernization theory” that once ruled the roost in the social sciences. Similar trend riding prevails in the choice of areas for research. The antidote, he found, came from leaving the academic coconut bowl to explore Indonesia, the Philippines and Thailand:

“I began to realize something fundamental about fieldwork: that it is useless to concentrate exclusively on one’s ‘research project.’ One has to be endlessly curious about everything, sharpen one’s eyes and ears, and take notes about anything …. The experience of strangeness makes all your senses much more sensitive than normal, and your attachment to comparison grows deeper. This is also why fieldwork is so useful when you return home. You will have developed habits of observation and comparison that encourage or force you to start noticing that your own culture is just as strange ….”

Unfortunately the author does not say how his intended Japanese public responded to Life Beyond Boundaries. A lot probably depends on how well the moments of humor and reverie translated. But in English they read wonderfully, and the book is a gem.


New book for graduate students decodes key academic life terms from A-Z

New book decodes key terms for academic life, from A to Z.

Two landmark developments in the humanities this week

Prestige has its privileges. When a well-established award is announced -- as the 100th set of Pulitzer Prize winners was on Tuesday -- it tends to consume the available limelight. Anything less monumental tends to disappear into its shadow.

But a couple of developments in the humanities this week strike me as being as newsworthy as the Pulitzers. If anything, they are possibly more consequential in the long run.

For one, we have the Whiting Foundation’s new Public Engagement Fellowship, which named its first recipients on Tuesday. The fellowship ought not to be confused with the Whiting Award, which since 1985 has been given annually to 10 authors “based on early accomplishment and the promise of great work to come.” The winners receive $50,000 each, along with, presumably, the professed esteem and subdued malice of their peers.

By contrast, the Public Engagement Fellowships go to professors who have shown “a demonstrated commitment to using their scholarly expertise to reach wider audiences,” in order to fund ambitious projects “designed to have direct and significant impact on a specific public outside the academy.” There are eight scholars in the fellowship’s inaugural cohort, including, for instance, Zoë Kontes, an associate professor of classics at Kenyon College, who will spend a semester creating a podcast to explore the black market in looted artifacts.

As with the literary prize, the fellowship comes with $50,000, with $10,000 earmarked for the project’s expenses and the rest covering the recipient’s stipend. Neither the number of fellows nor the apportionment of finances is set in stone, as I learned from Daniel Reid, the foundation’s executive director, when we met last week.

He explained that after more than 40 years of funding dissertations in the humanities at elite universities, the Whiting Foundation had decided it was time to direct its attention to a relatively underserved aspect of humanities scholarship: the cultivation of new ways of making connections with the world beyond the campus. Last year, the foundation contacted administrators at 40 universities, encouraging them to nominate faculty with projects that might be appropriate for funding.

“This has been a learning process on both sides,” Reid said, “for [the foundation] in running things and for the institutions in getting a sense of what we’re looking for.” He explained that the proposals were then evaluated by a group of seven people who had considerable experience with the communication of specialized knowledge to a wide public. The names are not public, though Reid indicates that a number of them are prominent figures in scholarship, publishing and museum or gallery curation. (The need for secrecy is understandable: publicizing the names would leave the Whiting judges as vulnerable as delegates to this summer’s political conventions are starting to feel.)

For the second group of Public Engagement Fellows, the Whiting Foundation will double the number of colleges and universities it contacts in search of nominations, with the long-term goal of making the process open to all higher education institutions. In the future, the number of recipients may range from six to 10. I singled out Kontes’s podcast on the looting of antiquities as an example (not quite at random: consider me on the waiting list to subscribe) but hope the other projects stimulate interest, discussion and perhaps some healthy competition.

The other development from earlier in the week is Duke University Press’s announcement that it will be publishing an edition of the works of Stuart Hall, who can -- without exaggeration, if not without argument -- be called the founding father of cultural studies as an academic discipline, at least in Great Britain. The Wikipedia entry for Hall is surprisingly thorough, so anyone for whom the name does not signify might want to get up to speed there.

Hall is a figure in the humanities whose impact is widely recognized yet, for an American, difficult to assess -- for the simple reason that, even at the peak of his influence, his work was remarkably difficult to find. A number of his major writings seem to have been published as mimeographed papers. He published books, but not that many found their way into American bookstores. So the prospect of having his scattered and fugitive writings in an edition from a major university press is appealing.

I heard that Ken Wissoker, the press's editorial director, might have some background information on why we are getting Hall’s work in this form only now, two years after his death. He confirmed my impression in an email note and gave a little background that seems worth putting into the record: “David Morley had edited two or three volumes of Stuart’s essays for Macmillan U.K. back in the late ’80s, but my understanding is that Stuart decided against having them come out (or delayed it into not happening). The original cultural studies essays were in a lot of different places …. Xeroxes and then PDFs circulated, but it would have been very difficult to track down all the originals …. Stuart saw the work as conjunctural and didn’t want it becoming scripture. Ironically, this was only a problem in English. There are translations to Mandarin and German (and I believe Spanish and/or Portuguese).”

The first of the two titles in the Duke edition will be out this fall, and the second will be published next spring. One is a set of lectures on the intellectual foundations of cultural studies, the other the first volume of Hall’s autobiography. “The memoir will have a second volume,” Wissoker says, “that will be more of an intellectual and political summation ‘what I think now’ book.” Farther down the line there will be a volume of selected essays, and Laura Sell, Duke's publicity and advertising manager, says that a number of thematically organized collections on “politics, race, photography, black aesthetics, Marxism and post-Marxism, [and] the Caribbean” will come in due course.


Review (part 2) of Meg Leta Jones, "Ctrl+Z: The Right to Be Forgotten"

When Winston Smith discovers the blind spot in his apartment -- the niche just out of range of the telescreen, Big Brother’s combination video feed and surveillance system -- it is, George Orwell tells us, “partly the unusual geography of the room” that allows him to take the risk of writing in a diary.

Later Smith finds another room with no telescreen at all, where he and Julia create another zone of privacy: the shared kind, intimacy. It can’t last, of course, and it doesn’t, with brutal consequences for both of them. (Thoughtcrime does not pay.)

The dystopia of Orwell’s 1984 is very much the product of its era, which spanned roughly the period between Hitler’s ascension to power in 1933 and Stalin’s death 20 years later. And while the novel’s depiction of a world without privacy can still raise a reader’s hackles, its technology now looks both retrofuturist and surprisingly inefficient. The telescreens are menacing, but there’s always a chance that Big Brother’s watchers will overlook something. And look at the tools that Winston uses to carve out his own domain of personal memory and antitotalitarian sentiment: a pen and paper. The authorities manage to read his thoughts eventually, but it takes most of the novel to get to that point. Today, Winston would be destined for Room 101 before he powered down his notebook.

Last week, I noted that Meg Leta Jones’s book Ctrl+Z: The Right to Be Forgotten (NYU Press) arrives at a time when ever fewer activities or communicative exchanges occur without some form of information technology intervening. Digital traces generated along the way are gathered, analyzed, sold. And the right to privacy becomes a little more purely notional each time one’s eyes slide down the text of a user agreement on the way to clicking “accept.”

A kind of fatalism is involved, one resting on the tacit but powerful tendency to assume that technology itself defines what information will be gathered, and how, and the use to be made of it. Implied is a trade-off between privacy and various benefits -- with both the cost and the reward determined by what our devices do and require. Privacy is, in this view, a function of engineering necessities, not of political or moral decisions.

The initial, blunt challenge to technological determinism comes in Ctrl+Z’s opening chapters, where Jones, an assistant professor of communications, culture and technology at Georgetown University, contrasts how the European Union and the United States frame their policies concerning the availability of personal information online. Here personal information would include employment history, financial data and arrest records, as well as, say, material communicated via social media.

In the United States, she writes, the default attitude “permits the collection and transfer of personal information and prevents abuse through self-regulation and market forces,” while E.U. states “operate under comprehensive regimes that protect information across both the public and private sectors and are enforced by specialized data-protection agencies.”

The contrast becomes striking when “data protection” might be better described as protecting the reputation or well-being of the individual to whom the data pertains. Take the case of someone who, as a young adult, is arrested for vandalism and destruction of property and serves a jail sentence, all of which was written up in a newspaper in 1990 as well as being documented in official records. Once released, he swears off his old ways and spends the next 25 years in steady employment and overall irreproachable conduct. Then one day he wakes to find that the newspaper has digitized its archives and made them searchable via Google.

If our reformed graffiti artist lives in America, he can do little if anything about it, apart from asking the paper to take down its accurate but deeply embarrassing article. There is also a chance his conviction will be publicized on any of various websites dedicated to posting mug shots.

In a number of E.U. countries, by contrast, he could appeal to laws that forbid public reference to someone’s criminal record if it is no longer news or if the ex-con has undergone significant rehabilitation. He might also file a request with Google to remove links to sites mentioning the old transgression. In 2014, the Court of Justice of the European Union ruled that the search engine had to establish a take-down system for people who wanted personal information removed from its search results.

There are variations from country to country, but Jones finds that the E.U. “data subject” (in effect, the citizen’s digital doppelgänger) can claim a “general right to personality” -- a certain degree of dignified immunity from unwelcome attention. The American data subject, by contrast, is presumed to take the Wild West ethos of the Internet pretty much as a given, with any effort to delete information or limit its circulation being labeled, almost inevitably, as Orwellian. (Even so, a number of piecemeal efforts have been made in the United States to protect children and victims of harassment and bullying, including laws against revenge porn.)

But as Jones goes on to show, any preference for one of these frameworks over the other will soon enough be faced with the much harder matter of dealing with new and unanticipated shades of gray left out of the public/private distinction. And the other dichotomy -- between having every bit of personal data (flattering, humiliating or neither) either preserved forever in a digital archive or destined for the memory hole -- is also looking out of date. Jones’s book doesn’t predict what comes next, but it’s a great stimulant for anyone bracing themselves to think about it.


New book critiques campus censorship movement and pushes for marketplace of ideas

New book argues that students involved in campus protests over controversial speakers or ideas should instead support a marketplace of ideas in which all notions are heard and the best rise to the top.

Review of Stefan Zweig, 'Messages From a Lost World: Europe on the Brink'

Fame can be fickle and destiny, perverse -- but what are we to call how posterity has treated Stefan Zweig? In the period between the world wars, he was among the best-known authors in the world and, by some reckonings, the most translated. Jewish by birth and Austrian by nationality, Zweig was perhaps most of all Viennese by sensibility. His fiction expressed the refined cynicism about sexual mores that readers associated with Sigmund Freud and Arthur Schnitzler, and he played the role of Vienna coffeehouse man of letters to perfection.

Zweig’s cosmopolitanism was of the essence: his biographical and critical studies of European literary and historical figures spanned the Renaissance through the early 20th century and even roamed far enough abroad to include Mary Baker Eddy, the very American if otherwise sui generis founder of Christian Science. (The portrait of her in his book Mental Healers is etched in acid, but it occupies a surprisingly appropriate spot: between the accounts of Franz Mesmer and Freud.)

His books fueled the Nazi bonfires. Even apart from the Nazis’ racial obsessions, Zweig was precisely the sort of author to drive Hitler and Goebbels into paroxysms, and after years of exile he committed suicide in Brazil in 1942. His reputation did not survive the war, either, at least not among English-language readers. After four decades or so of relatively far-flung reading, I think of him as one of those authors who seem only ever to show up in parentheses and footnotes, or who occasionally get pointed to as biographers too prone to psychologizing or melodrama. Being barely remembered trumps being totally forgotten (it’s more than most of us will be, anyway), but Zweig hardly seemed like a figure poised for rediscovery when, not too long ago, the comeback began.

The essays and speeches collected in Messages From a Lost World: Europe on the Brink (Pushkin Press) form a supplement to the volume that launched the revival -- Zweig’s memoir, The World of Yesterday, which the University of Nebraska Press published in the new translation that Pushkin Press issued in 2009. (Finished just before the author’s suicide, the book first appeared in English in 1943. That earlier translation can also be had, in ebook format.) The recent NYRB Classics editions of his fiction have had a lot to do with it being more than a one-book revival, but e-publishing and print-on-demand operations account for nearly everything by Zweig in English now being available. A mixed blessing, given that some such “publishers” do little but sell you copies of material available for free from the Internet Archive or Project Gutenberg.

The World of Yesterday is less autobiography than a self-portrait of the European literati during the final years of the belle epoque -- the four decades of relative peace and prosperity on the continent that ended with World War I. The new communication technologies and modes of high-speed transport were shrinking the world, while the spread of education, scientific progress and humane cultural values would presumably continue. The earliest pieces in Messages From a Lost World contain Zweig’s musings on the spiritual impact of the war, written while it was still in progress and with no end in sight. They are the thoughts of a man trying to find his way out of what must have seemed a completely reasonable state of despair:

“Never since it came into being has the whole world been so communally seized by nervous energy. Until now a war was only an isolated flare-up in the immense organism that is humanity, a suppurating limb which could be cauterized and thus healed, whilst all the remaining limbs were free to perform their normal functions without the least hindrance …. But due to its steady conquest of the globe, humanity forged ever-closer links, so today a fever quivers within its whole organism; horrors easily traverse the entire cosmos. There is not a workshop, not an isolated farm, not a hamlet deep in a forest from which they have not torn a man so that he might launch himself into the fray, and each of these beings is intimately connected to others by myriad threads of feeling; even the most insignificant among them has breathed so much of the feverish heat, his sudden disappearance makes those that remain that much colder, more alone and empty.”

In pieces from the 1920s and early ’30s, Zweig takes it as a moral imperative to champion the cause of peace by reminding his readers and listeners that humanity could no longer afford the sort of belligerent nationalism that had led them into the Great War. Respect for the possibilities of human development should replace claims to military greatness:

“If men lived [in earlier eras] as if in the folds of a mountain, their sight limited by the peaks on either side, we today see as if from a summit all worldly happenings in the same moment, in their exact dimensions and proportions. And because we have this commanding view across the surface of the earth, we must now usher in new standards. It’s no longer a case of which country must be placed ahead of another at their expense, but how to accomplish universal movement, progress, civilization. The history of tomorrow must be a history of all humanity and the conflicts between individual countries must be seen as redundant alongside the common good of the community.”

If the world could be changed by elegantly expressed humanist sentiments, this passage, from a speech delivered in 1932, might have altered the course of history. But the way it tempted fate now looks even more bitterly ironic than it did after Hitler took office a few months later. For in spite of his lofty vantage point (“we today see as if from a summit all worldly happenings in the same moment, in their exact dimensions and proportions”) and a depth of historical vision giving him insight into Magellan, Casanova, Napoleon, Goethe, Nietzsche and Marie Antoinette (among others), Zweig was struck dumb by the course of events after 1933.

Not literally, of course; on the contrary, later selections in Messages From a Lost World show that he could turn on the spigots of eloquence all too easily. (The tendency of Zweig's prose to turn purple and kitschy has been mocked at length by the translator Michael Hofmann.) But he remained notoriously -- and to a great many people, unforgivably -- averse to speaking out against what was happening in Germany. He did say that, considering the company it put him in, having his books torched by the Nazis was an honor. Yet as a statement of solidarity against a regime that would, in time, burn people, that seems decidedly wanting.

Uncharitable people have accused Zweig of cowardice, while Hannah Arendt’s essay on him, found in Reflections on Literature and Culture (Stanford University Press, 2007), treats Zweig as an example of the cultural mandarin so determined to avoid the grubby realities of politics that he disgraced his own idealism. Whether or not that implies a lack of courage is a difficult question.

But surely it isn’t being excessively generous to wonder if Zweig’s failure was one of imagination, rather than of nerve: the inability, first of all, to grasp that Hitler was more than just another nationalist demagogue, followed by a paralysis at seeing the mad dream of the Thousand-Year Reich tearing into reality across most of Europe, with no plan to stop there. Against the horrible future, all he had left was nostalgia -- with a memory of what security and boundless optimism had been like, once.
