If ideas are tools -- “equipment for living,” so to speak -- we might well imagine the culture as a heavily patched-up conceptual backpack that has been around the world a few times. It has been roughly handled along the way.
The stitches strain from the sheer quantity and variety of stuff crammed into it over the years: global culture, national culture, high culture, popular culture, classical and print and digital cultures, sub- and countercultures, along with cultures of violence, of affluence, of entitlement, of critical discourse …. It’s all in there, and much else besides. How it all fits -- what the common denominator might be -- is anyone’s guess. We could always draw on the useful clarifying distinction between: (1) culture as a category of more or less aesthetic artifacts, perhaps especially those that end up in museums and libraries, and (2) culture as the shared elements of a way of life.
The difference is, in principle, one of kind, not of quality, although assumptions about value assert themselves anyway. The first variety is sometimes called “the Matthew Arnold idea of culture,” after that Victorian worthy’s reference, in his book Culture and Anarchy, to “the best which has been thought and said.” Presumably music and painting also count, but Arnold’s emphasis on verbal expression is no accident: culture in his use of the term implies literacy.
By contrast “culture in the anthropological sense” -- as the second category is often called -- subsumes a good deal that can be found in societies without writing: beliefs about the nature of the world, ways of dressing, gender roles, assumptions about what may be eaten and what must be avoided, how emotions are expressed (or not expressed) and so on. Culture understood as a way of life includes rules and ideas that are highly complex though not necessarily transmitted through formal education. You absorb culture by osmosis, often through being born into it, and much of it goes without saying. (This raises the question of whether animals such as primates or dolphins may be said to have cultures. If not, why not? But that means digging through a whole other backpack.)
The dichotomy isn’t airtight, by any means, but it has served in recent years as a convenient pedagogical starting point: a way to get students (among others) to think about the strange ubiquity and ambiguity of culture as a label we stick on almost everything from the Code of Hammurabi to PlayStation 4, while also using it to explain quite a bit. Two people with a common background will conclude a discussion of the puzzling beliefs or behavior of a third party by agreeing, “That’s just part of their culture.” This seems more of a shrug than an explanation, really, but it implies that there isn’t much more to say.
One way to think of Terry Eagleton’s new book, Culture (Yale University Press), is as a broad catalog of the stuff that comes out when you begin unpacking the concept in its title -- arranging the contents along a spectrum rather than sorting them into two piles. In doing so, Eagleton, a distinguished professor of English literature at Lancaster University, follows closely the line of thought opened by the novelist and critic Raymond Williams, who coined the expression “culture as a whole way of life.” Williams probably derived the concept in turn, not from the anthropologists, but from T. S. Eliot. In distinguishing “culture as ordinary” (another Williams phrase) from culture as the work that artists, writers, etc. produce, the entire point was to link them: to provoke interest in how life and art communicated, so to speak.
For Williams, the operative word in “culture as a whole way of life” was, arguably, “whole”: something integral, connected and coherent, but also something that could be shattered or violated. Here, too, Eagleton is unmistakably Williams’s student. His assessment of how ideas about culture have taken shape over the past 200 years finds in them a pattern of responses to both industrialism (along with its spoiled heir, consumerism) and the French Revolution (the definitive instance of “a whole way of life” exploding, or imploding, under its own strains). “If it is the cement of the social formation,” Eagleton writes, culture “is also its potential point of fracture.”
It may be that I am overemphasizing how closely Eagleton follows Williams. If so, it is still a necessary corrective to the way Williams has slowly turned into just another name in the Cultural Studies Hall of Fame rather than a felt moral and intellectual influence. His emphasis on culture as “a whole way of life” -- expressed with unabashed love and grief for the solidarity and community he knew when growing up in a Welsh mining community -- would sound remarkably anachronistic (if not ideologically totalizing and nostalgically uncritical) to anyone whose cultural reference points are of today’s commodified, virtual and transnational varieties.
And to that extent, Eagleton’s general survey of ideas about culture comes to a sharp point -- aimed directly at how the concept functions now in a capitalist society that, he says, “relegates whole swaths of its citizenry to the scrap heap, but is exquisitely sensitive about not offending their beliefs.”
He continues, in a vein that Williams would have appreciated: “Culturally speaking, we are all to be granted equal respect, while economically speaking the gap between the clients of food banks and the clients of merchant banks looms ever larger. The cult of inclusivity helps to mask these material differences. The right to dress, worship or make love as one wishes is revered, while the right to a decent wage is denied. Culture acknowledges no hierarchies, but the educational system is riddled with them.” This may explain why culture is looking so raggedy and overburdened as a term. Pulled too tight, stretched too thin, it covers too many things that it would be difficult to face straight on.
Somewhere along the way, Nietzsche’s apothegm “That which does not destroy me makes me stronger” lost all the irony and ambiguity it had in context and turned into an edifying sentiment -- a motivational catchphrase, even on the order of that poster saying, “Hang in there, baby!” with the cat clinging to a tree branch.
“Destroy” is often rendered “kill,” giving it a noirish and edgy sound. Either way, the phrase is uplifting if and only if understood figuratively, as a statement about mental resilience. For when taken literally, it is barely even half true, as a moment’s reflection reveals. A life-threatening virus can make us stronger -- i.e., immune to it in the future -- but a bullet to the brain never will. That truth would not have been lost on Nietzsche, who understood philosophy as a mode of psychology and both as rooted in physiology.
He expected the reader not just to absorb a thought but to test it, to fill in its outlines and pursue its implications -- including, I think, a contradictory thought: Whatever does not kill me might very well leave me wishing it had.
While riding her newly repaired bicycle early in the fall semester of 2003 -- pumping the pedals hard, with the strong legs of someone just entering her 50s and determined not to feel it -- Christina Crosby, a professor of English and feminist, gender and sexuality studies at Wesleyan University, got a branch caught in the front wheel. She went flying from her seat, landing on the pavement chin first, fracturing two vertebrae in her neck. The broken bone scraped her spinal cord. One indication of how fast it all happened is that reflexes to break a fall never kicked in. Her hands were not damaged at all.
“Serious neurological damage started instantly,” Crosby writes in A Body, Undone: Living On After Great Pain (NYU Press); “blood engorged the affected site, and the tissue around the lesion began to swell, causing more and more damage as the cord pressed against the broken vertebrae. I also smashed my chin into tiny pieces, tore open my lips, slashed open my nose, breaking the cartilage, and multiply fractured the orbital bone underneath my right eye.” She had been wearing wire-frame glasses and the force of the impact drove the metal into the bridge of her nose.
Crosby spent three weeks in the hospital, unconscious in intensive care for most of it, and only found out later, from her lover, Janet, that the neurosurgeons and plastic surgeons “debated who should go first.” The plastic surgeons won. It sounds as if they had proportionately the more hopeful and effective job to do -- piecing together her chin from fragments, reconstructing her mouth, removing the eyeglass frames from her flesh and leaving only a half-moon scar.
The neurological damage was much more extensive and included both paralysis and a loss of sensation from the neck down. In time, Crosby regained limited use of her hands and arms and could begin to overcome the extreme (and dangerous) atrophy that set in following the accident. She was able to return to teaching part time at Wesleyan in 2005.
The author refers to herself dictating the memoir, but it reads very much like a piece of writing -- that is, like something composed in large part through revision, through grappling with the enormous problem of communicating sequences of experience and thought that few readers will have shared. The accident occurred relatively late in her life and without warning; the contrast between her existence before and after the catastrophic event is made even starker by the fact that she cannot remember it happening. “My sense of a coherent self,” she writes, “has been so deeply affronted” that the book in large measure serves as a way to try to put the fragments back together again without minimizing the depth of the chasm she has crossed.
“You become who you are,” Crosby writes, “over the course of a life that unfolds as an ongoing interaction with objects and others, from the infant you once were, whose bodily cartography slowly emerged as you were handled by caregivers whose speech washed over you, to the grown-up you are today, drawn beyond reason to one person rather than another.”
On that last point she has been extraordinarily fortunate in whom she found herself drawn to: the bond she shares with Janet seems like a rope across the abyss, or more like a steel cable, perhaps. (I call her by her first name simply because the author does. The view from Janet R. Jakobsen’s side of things may be read in a thoughtful essay from 2007.) At the same time, A Body, Undone is anything but sentimental about the possibilities of growth and healing. As doctors lowered the dosage of Crosby’s painkillers, new forces imposed themselves on her daily life:
“I feel an unassuageable loneliness, because I will never be able to adequately describe the pain I suffer, nor can anyone accompany me into the realm of pain …. Pain is so singular that it evades direct description, so isolating because in your body alone. Crying, and screaming, and raging against pain are the signs of language undone. … I have no exact account of how pain changes my interaction with my students and my colleagues, but I know there are times when I don’t feel fully present. It’s not that the pain is so bad that it demands all my attention, but rather that it’s so chronic as to act like a kind of screen.”
No pseudo-Nietzschean bromides to be had here. There is also the difficult new relationship with one’s bowels when they cease to be under any control by the brain -- the discovery of a whole terra incognita beyond ordinary feelings of awkwardness and embarrassment. Crosby discusses her reality with a candor that must be experienced to be believed. And the reader is left to face the truth that one’s embodiment (and the world that goes with it) can change utterly and forever, in a heartbeat.
The folklore of Indonesia and Thailand tells of a frog who is born under half of a coconut-shell bowl and lives out his life there. In time, he draws the only sensible conclusion: the inside of the shell is the whole universe.
“The moral judgment in the image,” writes Benedict Anderson in Life Beyond Boundaries: A Memoir (Verso), “is that the frog is narrow-minded, provincial, stay-at-home, and self-satisfied for no good reason. For my part, I stayed nowhere long enough to settle down in one place, unlike the proverbial frog.”
Anderson, a professor emeritus of international studies, government and Asia studies at Cornell University, wrote major studies of the history and culture of Southeast Asia. A certain degree of cosmopolitanism went with the fieldwork. But the boundaries within a society can be patrolled just as insistently as its geographical borders -- and in the case of academic specialties, the guards inspecting passports tend to be quite unapologetically suspicious.
In that regard, Anderson was an even more remarkable citizen of the world, for his death late last year has been felt as a loss in several areas of the humanities as well as at least a couple of the social sciences. Nearly all of this reflects what someone writing in a scholarly journal once dubbed “Benedict Anderson’s pregnant phrase” -- i.e., the main title of his 1983 work Imagined Communities: Reflections on the Origin and Spread of Nationalism, which treated the mass production of books and periodicals in vernacular languages (what he called “print capitalism”) as a catalytic factor in creating a shared sense of identity and, with it, the desire for national sovereignty.
By the 1990s, people were pursuing tangents from Anderson’s argument with ever more tenuous connection to nationalism -- and still less to the specific emphasis on print capitalism. Any group formed and energized by some form of mass communication might be treated as an imagined community. Here one might do a search for “Benedict Anderson” and “World of Warcraft” to see why the author came to think of his best-known title as “a pair of words from which the vampires of banality have by now sucked almost all the blood.” Even so, Imagined Communities has shown remarkable longevity, and its landmark status is clearly international: it had been translated into more than 30 languages as of 2009, when it appeared in a Thai edition.
The reader of Life Beyond Boundaries soon understands why Anderson eventually developed mixed feelings about his “pregnant phrase” and its spawn. His sense of scholarship, and of life itself, was that it ought to be a mode of open-ended exploration, of using what you’ve learned to figure out what you could learn. Establishing a widely known line of thought must have become frustrating once it was assumed to represent the only direction in which you could move. Professional interest is not the only kind of interest; what it recognizes as knowledge is no measure of the world outside the shell.
Anderson wrote the memoir by request: a Japanese colleague asked for it as a resource to show students something of the conduct of scholarship abroad and to challenge the “needlessly timid” ethos fostered by Japanese professors’ “patriarchal attitude.” Long retired -- and evidently reassured by the thought that few of his American colleagues would ever see the book -- Anderson was wry and spot-on in recounting the unfamiliar and not always agreeable experience of American academic life as he found it after emigrating to the United States from England as a graduate student in the late 1950s. For one thing, his professors looked askance at his papers, where he might indulge in a sardonic remark if so inspired, or pursue a digressive point in his footnotes.
“In a friendly way,” he writes, “my teachers warned me to stop writing like this …. It was really hard for me to accept this advice, as in previous schools I had always been told that, in writing, ‘dullness’ was the thing to be avoided at all cost.” He also underscores the paradox that the pragmatic American disinterest in “grand theory” coexisted with an academic hunger for it, renewed on a seasonal basis:
“‘Theory,’ mirroring the style of late capitalism, has obsolescence built into it, in the manner of high-end commodities. In year X students had to read and more or less revere Theory Y while sharpening their teeth on passé Theory W. Not too many years later, they were told to sharpen their teeth on passé Theory Y, admire Theory Z, and forget about Theory W.”
Lest anyone assume this refers to the situation in the humanities, it’s worth clarifying that one example he gives is the “modernization theory” that once ruled the social sciences roost. Similar trend-wave riding also prevails in the choice of areas for research. The antidote, he found, came from leaving the academic coconut bowl to explore Indonesia, the Philippines and Thailand:
“I began to realize something fundamental about fieldwork: that it is useless to concentrate exclusively on one’s ‘research project.’ One has to be endlessly curious about everything, sharpen one’s eyes and ears, and take notes about anything …. The experience of strangeness makes all your senses much more sensitive than normal, and your attachment to comparison grows deeper. This is also why fieldwork is so useful when you return home. You will have developed habits of observation and comparison that encourage or force you to start noticing that your own culture is just as strange ….”
Unfortunately the author does not say how his intended Japanese public responded to Life Beyond Boundaries. A lot probably depends on how well the moments of humor and reverie translated. But in English they read wonderfully, and the book is a gem.
Prestige has its privileges. When a well-established award is announced -- as the 100th set of Pulitzer Prize winners was on Tuesday -- it tends to consume the available limelight; anything less monumental disappears into its shadow.
But a couple of developments in the humanities this week strike me as being as newsworthy as the Pulitzers. If anything, they are possibly more consequential in the long run.
For one, we have the Whiting Foundation’s new Public Engagement Fellowship, which named its first recipients on Tuesday. The fellowship ought not to be confused with the Whiting Award, which since 1985 has been given annually to 10 authors “based on early accomplishment and the promise of great work to come.” The winners receive $50,000 each, along with, presumably, the professed esteem and subdued malice of their peers.
By contrast, the Public Engagement Fellowships go to professors who have shown “a demonstrated commitment to using their scholarly expertise to reach wider audiences,” in order to fund “ambitious projects designed to have direct and significant impact on a specific public outside the academy.” There are eight scholars in the fellowship’s inaugural cohort, including, for instance, Zoë Kontes, an associate professor of classics at Kenyon College, who will spend a semester creating a podcast to explore the black market in looted artifacts.
As with the literary prize, the fellowship comes with $50,000, with $10,000 earmarked for the project’s expenses and the rest covering the recipient’s stipend. Neither the number of fellows nor the apportionment of finances is set in stone, as I learned from Daniel Reid, the foundation’s executive director, when we met last week.
He explained that after more than 40 years of funding dissertations in the humanities at elite universities, the Whiting Foundation had decided it was time to direct its attention to a relatively underserved aspect of humanities scholarship: the cultivation of new ways of making connections with the world beyond the campus. Last year, the foundation contacted administrators at 40 universities, encouraging them to nominate faculty with projects that might be appropriate for funding.
“This has been a learning process on both sides,” Reid said, “for [the foundation] in running things and for the institutions in getting a sense of what we’re looking for.” He explained that the proposals were then evaluated by a group of seven people who had considerable experience with the communication of specialized knowledge to a wide public. The names are not public, though Reid indicates that a number of them are prominent figures in scholarship, publishing and museum or gallery curation. (The need for secrecy is understandable: publicizing the names would leave the Whiting judges as vulnerable as delegates to this summer’s political conventions are starting to feel.)
For the second group of Public Engagement Fellows, the Whiting Foundation will double the number of colleges and universities it contacts in search of nominations, with the long-term goal of making the process open to all higher education institutions. In the future, the number of recipients may range from six to 10. I gave Kontes’s podcast on the looting of antiquities as an example (not quite at random: consider me on the waiting list to subscribe) but hope the other projects stimulate interest, discussion and perhaps some healthy competition.
The other development from earlier in the week is Duke University Press’s announcement that it will be publishing an edition of the works of Stuart Hall, who can -- without exaggeration, if not without argument -- be called the founding father of cultural studies as an academic discipline, at least in Great Britain. The Wikipedia entry for Hall is surprisingly thorough, so anyone for whom the name does not signify might want to get up to speed there.
Hall is a figure in the humanities whose impact is widely recognized yet difficult for an American to assess -- for the simple reason that, even at the peak of his influence, his work was remarkably difficult to find. A number of his major writings seem to have been published as mimeographed papers. He published books, but not that many found their way into American bookstores. So the prospect of having his scattered and fugitive writings in an edition from a major university press is appealing.
I heard that Ken Wissoker, the press's editorial director, might have some background information on why we are getting Hall’s work in this form only now, two years after his death. He confirmed my impression in an email note and gave a little background that seems worth putting into the record: “David Morley had edited two or three volumes of Stuart’s essays for Macmillan U.K. back in the late ’80s, but my understanding is that Stuart decided against having them come out (or delayed it into not happening). The original cultural studies essays were in a lot of different places …. Xeroxes and then PDFs circulated, but it would have been very difficult to track down all the originals …. Stuart saw the work as conjunctural and didn’t want it becoming scripture. Ironically, this was only a problem in English. There are translations to Mandarin and German (and I believe Spanish and/or Portuguese).”
The first of the two titles in the Duke edition will be out this fall, and the second will be published next spring. One is a set of lectures on the intellectual foundations of cultural studies, the other the first volume of Hall’s autobiography. “The memoir will have a second volume,” Wissoker says, “that will be more of an intellectual and political summation ‘what I think now’ book.” Farther down the line there will be a volume of selected essays, and Laura Sell, Duke's publicity and advertising manager, says that a number of thematically organized collections on “politics, race, photography, black aesthetics, Marxism and post-Marxism, [and] the Caribbean” will come in due course.
When Winston Smith discovers the blind spot in his apartment -- the niche just out of range of the telescreen, Big Brother’s combination video feed and surveillance system -- it is, George Orwell tells us, “partly the unusual geography of the room” that allows him to take the risk of writing in a diary.
Later Smith finds another room with no telescreen at all, where he and Julia create another zone of privacy: the shared kind, intimacy. It can’t last, of course, and it doesn’t, with brutal consequences for both of them. (Thoughtcrime does not pay.)
The dystopia of Orwell’s 1984 is very much the product of its era, which spanned roughly the period between Hitler’s ascension to power in 1933 and Stalin’s death 20 years later. And while the novel’s depiction of a world without privacy can still raise a reader’s hackles, its technology now looks both retrofuturist and surprisingly inefficient. The telescreens are menacing, but there’s always a chance that Big Brother’s watchers will overlook something. And look at the tools that Winston uses to carve out his own domain of personal memory and antitotalitarian sentiment: a pen and paper. The authorities manage to read his thoughts eventually, but it takes most of the novel to get to that point. Today, Winston would be consigned to Room 101 before he powered down his notebook.
Last week, I noted that Meg Leta Jones’s book Ctrl+Z: The Right to Be Forgotten (NYU Press) arrives at a time when ever fewer activities or communicative exchanges occur without some form of information technology intervening. Digital traces generated along the way are gathered, analyzed, sold. And the right to privacy becomes a little more purely notional each time one’s eyes slide down the text of a user agreement on the way to clicking “accept.”
A kind of fatalism is involved, one resting on the tacit but powerful tendency to assume that technology itself defines what information will be gathered, and how, and the use to be made of it. Implied is a trade-off between privacy and various benefits -- with both the cost and the reward determined by what our devices do and require. Privacy is, in this view, a function of engineering necessities, not of political or moral decisions.
The initial, blunt challenge to technological determinism comes in Ctrl+Z’s opening chapters, where Jones, an assistant professor of communications, culture and technology at Georgetown University, contrasts how the European Union and the United States frame their policies concerning the availability of personal information online. Here personal information would include employment history, financial data and arrest records, as well as, say, material communicated via social media.
In the United States, she writes, the default attitude “permits the collection and transfer of personal information and prevents abuse through self-regulation and market forces,” while E.U. states “operate under comprehensive regimes that protect information across both the public and private sectors and are enforced by specialized data-protection agencies.”
The contrast becomes striking when “data protection” might be better described as protecting the reputation or well-being of the individual to whom the data pertains. Take the case of someone who, as a young adult, is arrested for vandalism and destruction of property and serves a jail sentence, all of which was written up in a newspaper in 1990 as well as being documented in official records. Once released, he swears off his old ways and spends the next 25 years in steady employment and overall irreproachable conduct. Then he awakes one day to find that the newspaper has digitized its archives and made them searchable via Google.
If our reformed graffiti artist lives in America, he can do little if anything about it, apart from asking the paper to take down its accurate but deeply embarrassing article. There is also a chance his conviction will be publicized on any of various websites dedicated to posting mug shots.
In a number of E.U. countries, by contrast, he could appeal to laws that forbid public reference to someone’s criminal record if it is no longer news or if the ex-con has undergone significant rehabilitation. He might also file a request with Google to remove links to sites mentioning the old transgression. In 2014, the Court of Justice of the European Union ruled that the search engine had to establish a take-down system for people who wanted personal information removed from its search results.
There are variations from country to country, but Jones finds that the E.U. “data subject” (in effect, the citizen’s digital doppelgänger) can claim a “general right to personality” -- a certain degree of dignified immunity from unwelcome attention. The American data subject, by contrast, is presumed to take the Wild West ethos of the Internet pretty much as a given, with any effort to delete information or limit its circulation being labeled, almost inevitably, as Orwellian. (Even so, a number of piecemeal efforts have been made in the United States to protect children and victims of harassment and bullying, including laws against revenge porn.)
But as Jones goes on to show, any preference for one of these frameworks over the other will soon enough be faced with the much harder matter of dealing with new and unanticipated shades of gray left out of the public/private distinction. And the other dichotomy -- between having every bit of personal data (flattering, humiliating or neither) either preserved forever in a digital archive or destined for the memory hole -- is also looking out of date. Jones’s book doesn’t predict what comes next, but it’s a great stimulant for anyone bracing themselves to think about it.