On the Friday following Thanksgiving in 2009, Tiger Woods had an automobile accident. For someone who does not follow golf, the headlines that ran that weekend provided exactly as much information as it seemed necessary to have. Over the following week, I noticed a few more headlines, but they made no impression. Some part of the brain is charged with the task of filtering the torrent of signals that bombard it from the media every day. And it did its job with reasonable efficiency, at least for a while.
Some sort of frenzy was underway. It became impossible to tune this out entirely. I began to ignore it in a more deliberate way. (All due respect to the man for his talent and accomplishments, but the doings of Tiger Woods were exactly as interesting to me as mine would be to him.) There should be a word for the effort to avoid giving any attention to some kerfuffle underway in the media environment. “Fortified indifference,” perhaps. It’s like gritting your teeth, except with neurons.
But the important thing about my struggle in 2009 is that it failed. Within six weeks of the accident, I had a rough sense of the whole drama in spite of having never read a single article on the scandal, nor watched nor listened to any news broadcasts about it. The jokes, allusions, and analogies spinning off from the event made certain details inescapable. A kind of cultural saturation had occurred. Resistance was futile. The whole experience was irritating, even a little depressing, for it revealed the limits of personal autonomy in the face of an unrelenting media system, capable of imposing utterly meaningless crap on everybody’s attention, one way or another.
But perhaps that’s looking at things the wrong way. Consider the perspective offered by Orin Starn in The Passion of Tiger Woods: An Anthropologist Reports on Golf, Race, and Celebrity Scandal (Duke University Press). Starn, the chair of cultural anthropology at Duke, maintains that the events of two years back were not meaningless at all. If anything, they were supercharged with cultural significance.
The book's title alludes to the theatrical reenactments of Christ’s suffering performed at Easter during the Middle Ages, or at least to Mel Gibson’s big-screen rendition thereof. Starn interprets “Tigergate” as an early 21st-century version of the scapegoating rituals analyzed by René Girard. From what I recall of Girardian theory, the reconsolidation of social order involves the scapegoat being slaughtered, rather than paying alimony, though in some cases that may be too fine a distinction.
The scandal was certainly louder and more frenetic than the game that Woods seems to have been destined to master. The first image of him in the book shows him at the age of two, appearing on "The Mike Douglas Show" with his father. He is dressed in pint-sized golfing garb, with a little bag of clubs over his shoulder. As with a very young Michael Jackson, the performance of cuteness now reads as a bit creepy. Starn does not make the comparison, but it’s implicit, given the outcome. “This toddler was not to be one of those child prodigies who flames out under unbearable expectations,” Starn writes. “By his early thirties, he was a one-man multinational company…. Forbes magazine heralded Woods as the first athlete to earn $1 billion.”
Starn, who mentions that he is a golfer, is also a scholar of the game, which he says “has always traced the fault lines of conflict, hierarchy, and tension in America, among them the archetypal divides of race and class.” To judge by my friend Dave Zirin’s book A People’s History of Sports in the United States (The New Press), that’s true of almost any athletic pursuit, even bowling. But the salient point about Woods is that most of his career has been conducted as if no such fault lines existed. Starn presents some interesting and little-known information on how golf was integrated. But apart from his genius on the green, Woods’s “brand” has been defined by its promise of harmony: “He and his blonde-haired, blue-eyed wife, Elin Nordegren, seemed the poster couple for a shiny new postracial America with their two young children, two dogs, and the fabulous riches of Tiger’s golfing empire.”
Each of his parents had a multiracial background -- black, white, and Native American on his father’s side; Chinese, Thai, and Dutch on his mother’s. “Cablinasian,” the label Woods made up to name his blended identity, is tongue-in-cheek, but it also represents a very American tendency to mess with the established categories of racial identity by creating an ironic mask. (Ralph Ellison wrote about it in his essay “Change the Joke and Slip the Yoke.”)
But that mask flew off, so to speak, when his car hit the fire hydrant in late 2009. Starn fills out his chronicle of the scandal that followed with an examination of the conversation and vituperation that took place online, often in the comments sections of news articles -- with numerous representative samples, in all their epithet-spewing, semiliterate glory. The one-drop rule remains in full effect, it seems, even for Cablinasians.
“For all the ostensible variety of opinion,” Starn writes about the cyberchatter, “there was something limited and predictable about the complaints, stereotypes, and arguments and counterarguments, as if we were watching a movie we’d already seen many times before. Whether [coming from] the black woman aggrieved with Tiger about being with white women or the white man bitter about supposed black privilege, we already knew the lines, or at least most of them.… We are all players, like it or not, in a modern American kabuki theater of race, where our masks too often seem to be frozen into a limited set of expressions.”
Same as it ever was, then. But this is where the comparison to a scapegoating ritual falls apart. (Not that it’s developed very much in any case.) At least in Girard’s analysis, the ritual is an effort to channel and purge the conflicts within a society – reducing its tensions, restoring its sense of cohesion and unity, displacing the potential for violence by administering a homeopathic dose of it. Nothing like that can be said to have happened with Tigergate. It involved no catharsis. For that matter, it ran -- by Starn’s own account -- in exactly the opposite direction: the golfer himself symbolized harmony and success and the vision of historical violence transcended with all the sublime perfection of a hole-in-one. The furor of late 2009 negated all of that. The roar was so loud that it couldn’t be ignored, even if you plugged your ears and looked away.
The latest headlines indicate that Tiger Woods is going to play the Pebble Beach Pro-Am tournament next month, for the first time in a decade. Meanwhile, his ex-wife has purchased a mansion for $12 million and is going to tear it down. She is doing so because of termites, or so go the reports. Hard to tell what symbolic significance that may have. But under the circumstances, wiping out termites might not be her primary motivation for destroying something incredibly expensive.
In 1939, the French anthropologist Michel Leiris published a memoir called Manhood in which he undertook an inventory of his own failures, incapacities, physical defects, bad habits, and psychosexual quirks. It is a triumph of abject self-consciousness. And the subtitle, “A Journey from Childhood into the Fierce Order of Virility,” seems to heighten the cruelty of the author’s self-mockery. Leiris portrays himself as a wretched specimen: machismo’s negation.
But in fact the title was not ironic, or at least not merely ironic. It was a claim to victory. “Whoever despises himself, still respects himself as one who despises,” as Nietzsche put it. In an essay Leiris wrote when the book was reissued after World War II, he described it as an effort to turn writing into a sort of bullfight: “To expose certain obsessions of an emotional or sexual nature, to admit publicly to certain deficiencies or dismays was, for the author, the means – crude, no doubt, but which he entrusts to others, hoping to see it improved – of introducing even the shadow of the bull’s horn into a literary work.”
By that standard, Leiris made the most broodingly taciturn character in Hemingway look like a total wuss.
The comment about passing along a technique to others -- “hoping to see it improved” -- now seems cringe-making in its own way. Leiris was addressing a small audience consisting mainly of other writers. The prospect of reality TV, online confessionals, or the industrialized production of memoirs would never have crossed his mind. He hoped his literary method -- a kind of systematic violation of the author's own privacy -- would develop as others experimented with it. Instead, the delivery systems have improved. They form part of the landscape Wayne Koestenbaum surveys in Humiliation, the latest volume in Picador’s Big Ideas/Small Books series.
Koestenbaum, a poet and essayist, is a professor of English at the City University of New York Graduate Center and a visiting professor in the painting department of the Yale School of Art. The book is an assemblage of aphoristic fragments, notes on American popular culture and its cult of celebrity, and reflections on the psychological and social dynamics of humiliation – with a few glances at how writing, or even language itself, can expose the self to disgrace. It’s unsystematic, but in a good way. Just because the author never quotes Erving Goffman or William Ian Miller is no reason to think they aren’t on his mind. “I’m writing this book,” he says early on, “in order to figure out – for my own life’s sake – why humiliation is, for me, an engine, a catalyst, a cautionary tale, a numinous scene, producing sparks and showers…. Any topic, however distressing, can become an intellectual romance. Gradually approach it. Back away. Tentatively return.”
The experience of humiliation is inevitable, short of a life spent in solitary confinement, and I suppose everyone ends up dishing it out as well as taking it, sooner or later. But that does not make the topic universally interesting. The idea of reading (let alone writing) almost two hundred pages on the subject will strike many people as strange or revolting. William James distinguished between “healthy mindedness” (the temperament inclined to “settl[ing] scores with the more evil aspects of the universe by systematically declining to lay them to heart or make much of them…. or even, on occasion, by denying outright that they exist”) and “sick souls” (which “cannot so swiftly throw off the burden of the consciousness of evil, but are congenitally fated to suffer from its presence”). Koestenbaum’s readers are going to come from just one side of that divide.
But then, one of James’s points is that the sick soul tends to see things more clearly than the robust cluelessness of the healthy-minded ever permits. As a gay writer -- and one who, moreover, was taken to be a girl when he was young, and told that he looked like Woody Allen as an adult -- Koestenbaum has a kind of sonar for detecting plumes of humiliation beneath the surface of ordinary life.
He coins an expression to name “the somberness, or deadness, that appears on the human face when it has ceased to entertain the possibility that another person exists.” He calls it the Jim Crow gaze – the look in the eyes of a lynching party in group photos from the early 20th century, for example. But racial hatred is secondary to “the willingness to desubjectify the other person” – or, as Koestenbaum puts it more sharply, “to treat someone else as garbage.” What makes this gaze especially horrific is that the person wearing it can also be smiling. (The soldier giving her thumbs-up gesture while standing next to naked, hooded prisoners at Abu Ghraib.) The smile “attests to deadness ... you are humiliated by the refusal, evident in the aggressor’s eyes, to see you as sympathetic, to see you as a worthy, equal subject.”
Deliberate and violent degradation is the extreme case. But the dead-eyed look, the smirk of contempt, are common enough to make humiliation a kind of background radiation of everyday social existence, one intensified through digital communication “by virtue of its impersonality…its stealth attack.” An embarrassing moment in private becomes a humiliating experience forever if it goes viral on YouTube.
“The Internet is the highway of humiliation,” Koestenbaum writes. “Its purpose is to humiliate time, to turn information (and the pursuit of information) into humiliation.” This seems overstated, but true. The thought of Google owning everyone’s search histories is deeply unsettling. The sense of privacy may die off completely one day, but for now the mass media, and reality TV most of all, work to document its final twitches of agony. “Many forms of entertainment harbor this ungenerous wish: to humiliate the audience and to humiliate the performer, all of us lowered into the same (supposedly pleasurable) mosh pit.”
A study of humiliation containing no element of confession would be a nerveless book indeed. Koestenbaum is, like Leiris, a brave writer. The autobiographical portions of the book are unflinching, though flinch-inducing. There are certain pages here that, once read, cannot be unread, including one that involves amputee porn. No disrespect to amputees intended, and the human capacity to eroticize is probably boundless; but Koestenbaum describes a practice it never would have occurred to me was possible. In hindsight, I was completely O.K. with not knowing, but it’s too late to get the image out of my head now.
Humiliation counts on “shame’s power to undo boundaries between individuals,” which is also something creativity does. That phrase comes from Koestenbaum’s tribute to the late Eve Kosofsky Sedgwick towards the end of the book. He evokes the memory of her friendship at least as much as the importance of her foundational work in queer theory – though on reflection, I’m not so sure it makes sense to counterpose them. Sedgwick’s ideas permeate the book; she was, like Koestenbaum, also a poet; and Humiliation may owe something to A Dialogue on Love, the most intimate of her writings.
But it’s more reckless and disturbing, because the author plays on his audience's own recollections of humiliation, and even on the reader's capacity for disgust. There’s a kind of crazy grace to Koestenbaum’s writing. He moves like a matador, working the bull into ever greater rage, then stepping out of the path of danger by the narrowest possible margin, at the last possible moment, with a flourish.
From time to time, an academic organization will invite me to sit on a panel at one of its gatherings, where my role is to serve as a native informant from the tribe of the journalists – one charged with the task of explaining our bizarre customs, and of demonstrating the primitive means by which we approximate abstract thought. (Sometimes they then give me food.) It is a curious experience, full of potential for misstatement and hasty generalizations. For one thing, the tribe is quite heterogeneous. "Media" is a plural, or it should be anyway. And within any given medium, the "journalistic field" (as Pierre Bourdieu called it) is itself fissured and stratified. It is a point I try to communicate to the professors through a combination of grunts and hand gestures -- an awkward exercise, all around.
But one question-and-answer session sticks in my mind as particularly embarrassing. The audience consisted mainly of English professors. Some of them practiced media criticism and analysis in various forms. What role (the question went) did such work play in how those of us in the media understood our work?
For once, I felt no hesitation about generalizing. The short answer was simply that academic media analysis plays no part at all, at least in its theoretically articulated variants. This should not be surprising. As a guest, though, you don’t want to be rude, so I padded the answer out with polite indications that this was a matter of some regret -- and that, to be fair, some kinds of media criticism do provoke a lot of attention in journalistic circles.
But that last part was a bit of a feint. The kinds of analyses produced by people in that audience have no traction. The most subtle and cogent analysis by a rhetorician of how The Times or CNN frames its stories has all the pertinence to a reporter or editor that a spectrographic analysis of jalapeño powder would have to someone cooking chili.
This is not a function of journalistic anti-intellectualism, though there’s certainly enough of that to go around. No, it comes down to a knowledge gap -- one in which academic media critics are often at a serious disadvantage. I mean tacit knowledge. There are, for example, things one learns from the experience of interviewing people who are clearly lying to you (or otherwise trying to make you a pawn in whatever game they are playing) that cannot be reduced to either formal propositions or methodological rules.
It is not necessary to read the collected fulminations of Noam Chomsky to learn that editors have ideological blinkers and blind spots. You tend to figure that one out pretty quickly, and may grow impatient with its protracted demonstration. What you want, rather, is some good old-fashioned phronesis -- that is, the cultivated practical wisdom required to know how to handle a situation.
One Web site quotes a scholar’s description of phronesis as "a sound practical instinct for the course of events, an almost indefinable hunch that anticipates the future by remembering the past and thus judges the present correctly." Start showing us how to get some of that, and I guarantee that folks will stand around the newsroom, debating your endnotes.
All of this is a roundabout way of framing the virtues of Danny Schechter’s The Death of Media, as well as its limitations. It is a new title in the Melville Manifestoes series published by Melville House, an independent press mentioned here on Tuesday. Schechter, one of the first producers for CNN and a winner of two Emmys for his work on the ABC program "20/20," has been a Nieman Fellow in Journalism at Harvard University. He is also the author of a book called The More You Watch, the Less You Know (1999), which I haven’t read -- though reportedly it did upset Bill O’Reilly, which seems like recommendation enough.
Schechter, then, is someone who brings tacit knowledge aplenty to the work of commenting on the state of the media. Last year, in his documentary WMD: Weapons of Mass Deception, he did more than reconstruct how the print and electronic media alike fell into line with the administration’s justifications for war. In that, he drew in part on a piece of scholarly research that certainly does deserve the closest and most shame-faced attention from the entire journalistic profession, the study Media Coverage of Weapons of Mass Destruction, by Susan D. Moeller, an associate professor of journalism at the University of Maryland at College Park. (The full text is available online.)
But Schechter went a step further -- zeroing in on moments when reporters and editors worried aloud that changes in the mass media were eroding the difference between practicing journalism and providing coverage. That distinction is not a very subtle one, but it’s largely missing from the conceptual universe of, say, cultural studies.
"Providing coverage" is rather like what Woody Allen said about life: Most of it is just showing up. The cameras record what is happening, or the reporter takes down what was said -- and presto, an event is "covered." The quantity of tacit knowledge so mobilized is not large.
By contrast, any effort to "practice journalism" involves (among other things) asking questions, following hunches, noticing the anomalous, and persisting until someone accidentally says something meaningful. There is more to it than providing stenography to power. It involves certain cognitive skills -- plus a sense of professional responsibility.
In his manifesto, Schechter runs through the familiar and depressing statistics showing a decline of public confidence in the mainstream media, an increasing ratio of "infotainment" to hard news, and steady downsizing of reporting staff at news organizations.
One public-opinion poll conducted for the Pew Center found that "70 percent of the people asked expressed dissatisfaction with the news media." And the same figure emerged from a survey of people working in the news media: about 70 percent, as Schechter puts it, "feel the same way as their customers." He quotes Hunter S. Thompson’s evocative characterization of the television industry as "a cruel and shallow money trench, a long plastic hallway where thieves and pimps run free, and good men die like dogs. There’s also a negative side."
To all of this, Schechter offers the alternative of ... uh, Wikipedia?
Well, "citizen journalism" anyway -- through which "the ideas, observations, and energy of ordinary people" will serve as "not only a way of democratizing the media but also enlivening it." He points to "the meteoric growth of the blogosphere and the emergence of thousands of video activists," plus the contribution of scholars to "first rate publishing projects," including "a new online, non-commercial encyclopedia that taps the expertise of researchers and writers worldwide."
Well, it’s probably not fair to judge the possibilities for citizen journalism by the actual state of public-access cable TV -- or any given Wikipedia entry written by a follower of Lyndon LaRouche. (Besides, is either all that much worse than MSNBC?) But something is missing from Schechter’s optimistic scenario, in any case.
It is now much easier to publish and broadcast than ever before. In other words, the power to cover an event or a topic has increased. But the skills necessary to foster meaningful discussion are not programmed into the software. They have to be cultivated.
That's where people from academe come in. The most substantial interventions in shaping mass media probably won't come from conference papers and journal articles, but from the classroom -- by giving the future citizen journalist access, not just to technology, but to cognitive tools.
The curtain rises on a domestic scene -- though not, the audience soon learns, a tranquil one. It is the apartment of the philosopher Louis Althusser and his wife Hélène Rytman, on an evening in November, a quarter century ago. The play in question, which opened last month in Paris, is called The Caïman. That’s an old bit of university slang referring to Althusser's job as the “director of studies” -- an instructor who helps students prepare for the final exam at the École Normale Supérieure, part of what might be called the French Ivy League.
The caïman whose apartment the audience has entered was, in his prime, one of the “master thinkers” of the day. In the mid-1960s, Althusser conducted an incredibly influential seminar that unleashed structuralist Marxism on the world. He played a somewhat pestiferous role within the French Communist Party, where he was spokesman for Mao-minded student radicals. And he served as tutor and advisor for generations of philosophers-in-training.
At Althusser’s funeral in 1990, Jacques Derrida recalled how, “beginning in 1952 ... the caïman received in his office the young student I then was.” One of the biographers of Michel Foucault (another of his pupils) describes Althusser as an aloof and mysterious figure, but also one known for his gentleness and tact. When a student turned in an essay, Althusser wrote his comments on a separate sheet of paper -- feeling that there would be something humiliating about defacing the original with his criticisms.
But everyone in the audience knows how Althusser’s evening at home with his wife in November 1980 will end. How could they not? And even if you know the story, it is still horrifying to read Althusser’s own account of it. In a memoir that appeared posthumously, he recalls coming out of a groggy state the next morning, and finding himself massaging Hélène’s neck, just as he had countless times in the course of their long marriage.
“Suddenly, I was terror-struck,” he wrote. “Her eyes stared interminably, and I noticed the tip of her tongue was showing between her teeth and lips, strange and still.” He ran to the École, screaming, “I’ve strangled Hélène!”
He was whisked away for psychiatric evaluation, which can’t have taken long: Althusser’s entire career had been conducted between spells of hospitalization for manic-depression. In one autobiographical fragment from the late 1970s -- presumably written while on a manic high -- he brags about sneaking aboard a nuclear submarine and taking it for a joy-ride when no one was looking. If ever there were reason to question legal guilt on grounds of insanity, the murder of Hélène Rytman would seem to qualify.
He underwent a long spell of psychiatric incarceration -- a plunge, as he later wrote, back into the darkness from which he had awakened that morning. In the late 1980s, after he was released, the philosopher could be seen wandering in the streets, announcing “I am the great Althusser!” to startled pedestrians.
It became the stuff of legend. In the early 1980s, as a student at the University of Texas at Austin, I heard what turns out to have been an apocryphal account of that morning. A small crowd of Althusser’s students, it was said, routinely gathered outside his apartment to greet him each day. When he emerged, disheveled and shrieking that he was a murderer, everyone laughed and clapped their hands. They thought (so the story went) that Althusser was clowning around.
That rumor probably says more about American attitudes towards French thinkers than it does about Althusser himself, of course. The murder has become a standard reference in some of the lesser skirmishes of the culture wars – with Hélène Rytman’s fate a sort of morbid punch-line.
Althusser’s philosophical work took as its starting point the need to question, and ultimately to dissolve, any notion that social structures and historical changes are the result of some basic human essence. Somewhat like Foucault, at least in this regard, he regarded the idea of “man” as a kind of myth. Instead, Althusser conceived of history as “a process without a subject” – something operating in ways not quite available to consciousness. Economic and linguistic structures interacted to “articulate” the various levels of life and experience.
Althusser called this perspective “theoretical anti-humanism.” And for anyone who loathes such thinking, the standard quip is that he practiced his anti-humanism at home.
That strikes me as being neither funny nor fair. At the risk of sounding like a pretty old-fashioned bourgeois humanist, I think you have to treat his ideas as ... well, ideas. Not necessarily as good ones, of course. (In his seminar, Althusser and his students undertook a laborious and ultimately preposterous effort to figure out when and how Marx became a Marxist, only to conclude that only a few of his works really qualified.) But however you judge his writings, they make sense as part of a conversation that started long before Althusser entered the room -- one that will continue long after we are all dead.
One way to see his “theoretical anti-humanism,” for example, is as a retort to Jean-Paul Sartre’s “Existentialism is a Humanism” -- the lecture that drew standing-room-only crowds in 1945, at just about the time Althusser was resuming an academic career interrupted by the war. (The Germans held him as a POW for most of it.) It was the breeziest of Sartre’s introductions to his basic themes: We are free – deep down, and for good. That freedom may be unbearable at times. But it never goes away. No matter what, each individual is always radically responsible for whatever action and meaning is possible in a given circumstance.
“Man,” Sartre told his listeners, “is nothing else but what he makes of himself.” But that “nothing” is, after all, everything. “There is no universe other than a human universe, a universe of human subjectivity.”
For Althusser, this is all completely off track. It rests on the idea that individuals are atoms who create their own meaning – and that somehow then link up to form a society. A very different conception is evident in “Ideology and Ideological State Apparatuses,” a paper from 1970 that is about as close to a smash-hit, era-defining performance as Althusser ever got. Which is to say, not that close at all. But extracts are available in The Norton Anthology of Theory and Criticism, and passages have turned up in countless thousands of course packets in lit-crit and cultural studies, over the years.
For Althusser, it’s exactly backwards to start from the individual as a basic unit capable, through its own imagination and endeavor, of creating a world of meaning. On the contrary, the starting point is a society seeking to reproduce itself over time, not just by producing material goods (that too) but by imposing and enforcing order.
The police, military, and penal systems have an obvious role. Althusser calls them the Repressive State Apparatuses. But he’s much more interested in what he calls the Ideological State Apparatuses – the complex array of religious institutions, legal processes, communication systems, schools, etc. that surround us. And, in effect, create us. They give us the tools to make sense of the world. Most of all, the ISAs convey what the social order demands of us. And for anyone who doesn’t go along ... well, that’s when the Repressive State Apparatuses might just step in to bring you into line.
Why has this idea been so appealing to so many academics -- and for such a long time? Well, at the time, it tended to confirm the sense that you could effect radical social change via “the long march through the institutions.” By challenging how the Ideological State Apparatuses operated, it might be possible to shift the whole culture’s center of gravity. And Althusser placed special emphasis on educational institutions as among the most important ISAs in capitalist society.
Such was the theory. In practice, of course, the social order tends to push back -- and not necessarily through repression. A handful of non-academic activists became interested in Althusser for a while; perhaps some still are. But for the most part, his work ended up as a fairly nonthreatening commodity within the grand supermarket of American academic life.
The brand is so well-established, in fact, that the thinker’s later misfortunes are often dismissed with a quick change of subject. The effect is sometimes bizarre.
In 1996, Columbia University Press issued a volume by Althusser called Writings on Psychoanalysis: Freud and Lacan. Surely an appropriate occasion for some thoughtful essays on how the theorist’s own experience of mental illness might have come into play in his work, right? Evidently not: The book contains only a few very perfunctory references to “temporary insanity” and psychiatric care. Presumably Althusser’s editors will be more forthcoming next summer, with the publication by Verso of Philosophy of the Encounter: Later Writings, 1978-1987. The catalog text for the book refers to it as “his most prolific period.” But it was also one when much of his writing was done while hospitalized.
Is it possible to say anything about his work and his illness that doesn’t amount to a roundabout denunciation of Althusser? I think perhaps it is.
On one level, his theory about the Ideological State Apparatuses looks ... maybe not optimistic, exactly, but like a guide to transforming things. From this point of view, each individual is a point of convergence among several ISAs. In other words, each of us has assimilated various codes and rules about how things are supposed to be. And if there are movements underway challenging how the different ISAs operate, that might have a cumulative effect. If, say, feminists and gay rights activists are transforming the rules about how gender is constructed, that creates new ways of life. (Though not necessarily a social revolution, as Althusser wanted. Capitalism is plenty flexible if there’s a buck to be extracted.)
But that notion of the individual as the intersection of rules and messages also has a melancholy side. It somewhat resembles the experience of depression. If a person suffering from depression is aware of anything, it is this: The self is a product of established patterns ... fixed structures ... forces in the outside world that are definitive, and sometimes crushing.
Any Sartrean talk of “radical freedom” makes no sense whatever to anyone in that condition – which is, rather, a state of radical loss. And as the German poet Hans Magnus Enzensberger puts it in a recent essay, the most extreme “radical loser” may find his only transcendence in an act of violence.
“He can explode at any moment,” writes Enzensberger. “This is the only solution to his problem that he can imagine: a worsening of the evil conditions under which he suffers.... At last, he is master over life and death.”
Is that what happened in Althusser’s apartment, 25 years ago? That, or something like it.
It’s been a hard season, marked by a preponderance of headlines announcing the end of a great many things. One of the most instructive entries appeared in the January 4 edition of The New York Times: an op-ed titled “The End of the Financial World As We Know It,” by Michael Lewis and David Einhorn. There’s a longstanding tradition in the humanities of such pronouncements. In November, we heard that irony is dead. Just one week after the Times reported on Joan Didion’s announcement in a talk at the New York Public Library, however, the Sunday “Arts & Leisure” section of the paper ran a story on page 1 announcing that Liza (Minnelli) is back (again), so we can only conclude that, while Didion is inarguably one of the leading lights of irony, the reports of its demise were greatly exaggerated.
The novel has been pronounced dead so many times (in more than 50 percent of the cases, the actual wording was “Le roman est mort”) that the phrase “the death of the novel” has its own entry on Wikipedia.
Now while I would be the first to agree that irony is, if not dead, certainly sleeping in my literature classes, the novel is in fact undead: 11 out of 12 students in Honors Creative Writing confessed, when questioned closely, that they had in fact read Twilight. Eleven out of 11 blamed this reading choice on their roommates, the student excuse being the one genre that no one has ever pronounced even remotely near death.
It was only a matter of time before the topic of literary studies itself became caught up in the contagion of pronouncements of the demise of one thing or another, and thus The Chronicle of Higher Education of December 19 featured not one but three essays under the general heading of “What Ails Literary Studies.” At least they’re not dead, although ails is somewhat disturbing, with its connotations of some obscure 19th-century illness involving headaches and quarantine.
Stanley Fish offered a much deadlier view in "The Last Professor," an entry on Frank Donoghue’s new book, The Last Professors: The Corporate University and the Fate of the Humanities. According to Donoghue -- a former student of Fish -- we are beyond even “crisis” mode, for any “vision of restored stability is a delusion.” This has, in fact, been evident to many English faculty for quite a while; we are, after all, quite good at analysis. It turns out that Fish is the last Humanities professor, which is a bit of a disappointment, since I just received a promotion at my college, a situation that has moved me to reflect long and hard on the theories of the Marx Brothers (Groucho and Karl).
And so back to the business of business. Alongside another article on Wall Street by Michael Lewis, this one with the Arthur-Miller-like title of “After the Fall,” in the December 2008/January 2009 issue of Condé Nast Portfolio: Investing Survival Guide 2009, I found “The End of Hubris.” And it doesn’t matter which definition the author, Leslie Bennetts, was thinking of — the commonly assumed prideful “attitude” or Aristotle’s act of violence — this is one declaration that we can all raise our glasses in a toast to. And it’s surely heartening that while the success of The Twilight Saga has suggested the death of literary style, current reflections on business assure us that, at least, metaphor and allusion — like irony and, alas, hubris — are alive and nowhere near ending.
Of course, while reading all of these prognostications, it’s impossible not to think of another ending: that of the Cheney/Bush regime. The end of the damage that pair inflicted will be much longer in coming, but it’s a start. And then who knows: perhaps, in our lifetime, we’ll even see the end of the vampire novel. In the meantime, let’s declare moratoriums on jargon-laden college mission statements and the instant-comment feature of online news sites; let’s keep alive summer reading programs for high school and college students — and the meditative model of the life of the mind. If change is coming (and it must), there is still the important — essential — work to be done in Humanities service courses. Without the reminder of the life of the mind, we are truly dead.
Carolyn F. Segal
Carolyn F. Segal is professor of English at Cedar Crest College.