From time to time, an academic organization will invite me to sit on a panel at one of its gatherings, where my role is to serve as a native informant from the tribe of the journalists – one charged with the task of explaining our bizarre customs, and of demonstrating the primitive means by which we approximate abstract thought. (Sometimes they then give me food.) It is a curious experience, full of potential for misstatement and hasty generalizations. For one thing, the tribe is quite heterogeneous. "Media" is a plural, or it should be anyway. And within any given medium, the "journalistic field" (as Pierre Bourdieu called it) is itself fissured and stratified. It is a point I try to communicate to the professors through a combination of grunts and hand gestures -- an awkward exercise, all around.
But one question-and-answer session sticks in the mind as particularly embarrassing. The audience consisted mainly of English professors. Some of them practiced various forms of media criticism and analysis. What role (the question went) did such work play in how those of us in the media understood our work?
For once, I felt no hesitation about generalizing. The short answer was simply that academic media analysis plays no part at all, at least in its theoretically articulated variants. This should not be surprising. As a guest, though, you don’t want to be rude, so I padded the answer out with polite indications that this was a matter of some regret -- and that, to be fair, some kinds of media criticism do provoke a lot of attention in journalistic circles.
But that last part was a bit of a feint. The kinds of analyses produced by people in that audience have no traction. The most subtle and cogent analysis by a rhetorician of how The Times or CNN frames its stories has all the pertinence to a reporter or editor that a spectrographic analysis of jalapeno powder would to someone cooking chili.
This is not a function of journalistic anti-intellectualism, though there’s certainly enough of that to go around. No, it comes down to a knowledge gap -- one in which academic media critics are often at a serious disadvantage. I mean tacit knowledge. There are, for example, things one learns from the experience of interviewing people who are clearly lying to you (or otherwise trying to make you a pawn in whatever game they are playing) that cannot be reduced to either formal propositions or methodological rules.
It is not necessary to read the collected fulminations of Noam Chomsky to learn that editors have ideological blinkers and blindspots. You tend to figure that one out pretty quickly, and may grow impatient with its protracted demonstration. What you want, rather, is some good old-fashioned phronesis -- that is, the cultivated practical wisdom required to know how to handle a situation.
One Web site quotes a scholar’s description of phronesis as "a sound practical instinct for the course of events, an almost indefinable hunch that anticipates the future by remembering the past and thus judges the present correctly." Start showing us how to get some of that, and I guarantee that folks will stand around the newsroom, debating your endnotes.
All of this is a roundabout way of framing the virtues of Danny Schechter’s The Death of Media, as well as its limitations. It is a new title in the Melville Manifestoes series published by Melville House, an independent press mentioned here on Tuesday. Schechter, one of the first producers for CNN and a winner of two Emmys for his work on the ABC program "20/20," has been a Nieman Fellow in Journalism at Harvard University. He is also the author of a book called The More You Watch, the Less You Know (1999), which I haven’t read -- though reportedly it did upset Bill O’Reilly, which seems like recommendation enough.
Schechter, then, is someone who brings tacit knowledge aplenty to the work of commenting on the state of the media. Last year, in his documentary WMD: Weapons of Mass Deception, he did more than reconstruct how the print and electronic media alike fell into line with the administration’s justifications for war. In that, he drew in part on a piece of scholarly research that certainly does deserve the closest and most shame-faced attention by the entire journalistic profession, the study Media Coverage of Weapons of Mass Destruction, by Susan D. Moeller, an associate professor of journalism at the University of Maryland at College Park. (The full text is available online.)
But Schechter went a step further -- zeroing in on moments when reporters and editors worried aloud that changes in the mass media were eroding the difference between practicing journalism and providing coverage. That distinction is not a very subtle one, but it’s largely missing from the conceptual universe of, say, cultural studies.
"Providing coverage" is rather like what Woody Allen said about life: Most of it is just showing up. The cameras record what is happening, or the reporter takes down what was said -- and presto, an event is "covered." The quantity of tacit knowledge so mobilized is not large.
By contrast, any effort to "practice journalism" involves (among other things) asking questions, following hunches, noticing the anomalous, and persisting until someone accidentally says something meaningful. There is more to it than providing stenography to power. It involves certain cognitive skills -- plus a sense of professional responsibility.
In his manifesto, Schechter runs through the familiar and depressing statistics showing a decline of public confidence in the mainstream media, increasing percentages of "infotainment" to hard news, and steady downsizing of reporting staff at news organizations.
One public-opinion poll conducted for the Pew Center found that 70 percent of the people asked expressed dissatisfaction with the news media. And the same figure emerged from a survey of people working in the news media: about 70 percent, as Schechter puts it, "feel the same way as their customers." He quotes Hunter S. Thompson’s evocative characterization of the television industry as "a cruel and shallow money trench, a long plastic hallway where thieves and pimps run free, and good men die like dogs. There’s also a negative side."
To all of this, Schechter offers the alternative of ... uh, Wikipedia?
Well, "citizen journalism" anyway -- through which "the ideas, observations, and energy of ordinary people" will serve as "not only a way of democratizing the media but also enlivening it." He points to "the meteoric growth of the blogosphere and the emergence of thousands of video activists," plus the contribution of scholars to "first rate publishing projects," including "a new online, non-commercial encyclopedia that taps the expertise of researchers and writers worldwide."
Well, it’s probably not fair to judge the possibilities for citizen journalism by the actual state of public-access cable TV -- or any given Wikipedia entry written by a follower of Lyndon LaRouche. (Besides, is either all that much worse than MSNBC?) But something is missing from Schechter’s optimistic scenario, in any case.
It is now much easier to publish and broadcast than ever before. In other words, the power to cover an event or a topic has increased. But the skills necessary to foster meaningful discussion are not programmed into the software. They have to be cultivated.
That's where people from academe come in. The most substantial interventions in shaping mass media probably won't come from conference papers and journal articles, but in the classroom -- by giving the future citizen journalist access, not just to technology, but to cognitive tools.
The curtain rises on a domestic scene -- though not, the audience soon learns, a tranquil one. It is the apartment of the philosopher Louis Althusser and his wife Hélène Rytman, on an evening in November, a quarter century ago. The play in question, which opened last month in Paris, is called The Caïman. That’s an old bit of university slang referring to Althusser's job as the “director of studies” -- an instructor who helps students prepare for the final exam at the École Normale Supérieure, part of what might be called the French Ivy League.
The caïman whose apartment the audience has entered was, in his prime, one of the “master thinkers” of the day. In the mid-1960s, Althusser conducted an incredibly influential seminar that unleashed structuralist Marxism on the world. He played a somewhat pestiferous role within the French Communist Party, where he was spokesman for Mao-minded student radicals. And he served as tutor and advisor for generations of philosophers-in-training.
At Althusser’s funeral in 1990, Jacques Derrida recalled how, “beginning in 1952 ... the caïman received in his office the young student I then was.” One of the biographers of Michel Foucault (another of his pupils) describes Althusser as an aloof and mysterious figure, but also one known for his gentleness and tact. When a student turned in an essay, Althusser wrote his comments on a separate sheet of paper -- feeling that there would be something humiliating about defacing the original with his criticisms.
But everyone in the audience knows how Althusser’s evening at home with his wife in November 1980 will end. How could they not? And even if you know the story, it is still horrifying to read Althusser’s own account of it. In a memoir that appeared posthumously, he recalls coming out of a groggy state the next morning, and finding himself massaging Hélène’s neck, just as he had countless times in the course of their long marriage.
“Suddenly, I was terror-struck,” he wrote. “Her eyes stared interminably, and I noticed the tip of her tongue was showing between her teeth and lips, strange and still.” He ran to the École, screaming, “I’ve strangled Hélène!”
He was whisked away for psychiatric evaluation, which can’t have taken long: Althusser’s entire career had been conducted between spells of hospitalization for manic-depression. In one autobiographical fragment from the late 1970s -- presumably written while on a manic high -- he brags about sneaking aboard a nuclear submarine and taking it for a joy-ride when no one was looking. If ever there were reason to question legal guilt on grounds of insanity, the murder of Hélène Rytman would seem to qualify.
He underwent a long spell of psychiatric incarceration -- a plunge, as he later wrote, back into the darkness from which he had awakened that morning. In the late 1980s, after he was released, the philosopher could be seen wandering in the streets, announcing “I am the great Althusser!” to startled pedestrians.
It became the stuff of legend. In the early 1980s, as a student at the University of Texas at Austin, I heard what turns out to have been an apocryphal account of that morning. A small crowd of Althusser’s students, it was said, routinely gathered outside his apartment to greet him each day. When he emerged, disheveled and shrieking that he was a murderer, everyone laughed and clapped their hands. They thought (so the story went) that Althusser was clowning around.
That rumor probably says more about American attitudes towards French thinkers than it does about Althusser himself, of course. The murder has become a standard reference in some of the lesser skirmishes of the culture wars – with Hélène Rytman’s fate a sort of morbid punch-line.
Althusser’s philosophical work took as its starting point the need to question, and ultimately to dissolve, any notion that social structures and historical changes are the result of some basic human essence. Somewhat like Foucault, at least in this regard, he regarded the idea of “man” as a kind of myth. Instead, Althusser conceived of history as “a process without a subject” – something operating in ways not quite available to consciousness. Various economic and linguistic structures interacted to “articulate” the different levels of life and experience.
Althusser called this perspective “theoretical anti-humanism.” And for anyone who loathes such thinking, the standard quip is that he practiced his anti-humanism at home.
That strikes me as being neither funny nor fair. At the risk of sounding like a pretty old-fashioned bourgeois humanist, I think you have to treat his ideas as ... well, ideas. Not necessarily as good ones, of course. (In his seminar, Althusser and his students undertook a laborious and ultimately preposterous effort to figure out when and how Marx became a Marxist, only to conclude that only a few of his works really qualified.) But however you judge his writings, they make sense as part of a conversation that started long before Althusser entered the room -- one that will continue long after we are all dead.
One way to see his “theoretical anti-humanism,” for example, is as a retort to Jean-Paul Sartre’s “Existentialism is a Humanism” -- the lecture that drew standing-room-only crowds in 1945, at just about the time Althusser was resuming an academic career interrupted by the war. (The Germans held him as a POW for most of it.) It was the breeziest of Sartre’s introductions to his basic themes: We are free – deep down, and for good. That freedom may be unbearable at times. But it never goes away. No matter what, each individual is always radically responsible for whatever action and meaning is possible in a given circumstance.
“Man,” Sartre told his listeners, “is nothing else but what he makes of himself.” But that “nothing” is, after all, everything. “There is no universe other than a human universe, a universe of human subjectivity.”
For Althusser, this is all completely off track. It rests on the idea that individuals are atoms who create their own meaning – and that somehow then link up to form a society. A very different conception is evident in “Ideology and Ideological State Apparatuses,” a paper from 1970 that is about as close to a smash-hit, era-defining performance as Althusser ever got. Which is to say, not that close at all. But extracts are available in The Norton Anthology of Theory and Criticism, and passages have turned up in countless thousands of course packets in lit-crit and cultural studies, over the years.
For Althusser, it’s exactly backwards to start from the individual as a basic unit capable, through its own imagination and endeavor, of creating a world of meaning. On the contrary, there are societies that seek to reproduce themselves over time, not just by producing material goods (that too) but through imposing and enforcing order.
The police, military, and penal systems have an obvious role. Althusser calls them the Repressive State Apparatuses. But he’s much more interested in what he calls the Ideological State Apparatuses – the complex array of religious institutions, legal processes, communication systems, schools, etc. that surround us. And, in effect, create us. They give us the tools to make sense of the world. Most of all, the ISAs convey what the social order demands of us. And for anyone who doesn’t go along ... well, that’s when the Repressive State Apparatuses might just step in to keep you in line.
Why has this idea been so appealing to so many academics -- and for such a long time? Well, at the time, it tended to confirm the sense that you could effect radical social change via “the long march through the institutions.” By challenging how the Ideological State Apparatuses operated, it might be possible to shift the whole culture’s center of gravity. And Althusser placed special emphasis on educational institutions as among the most important ISAs in capitalist society.
Such was the theory. In practice, of course, the social order tends to push back -- and not necessarily through repression. A handful of non-academic activists became interested in Althusser for a while; perhaps some still are. But for the most part, his work ended up as a fairly nonthreatening commodity within the grand supermarket of American academic life.
The brand is so well-established, in fact, that the thinker’s later misfortunes are often dismissed with a quick change of subject. The effect is sometimes bizarre.
In 1996, Columbia University Press issued a volume by Althusser called Writings on Psychoanalysis: Freud and Lacan. Surely an appropriate occasion for some thoughtful essays on how the theorist’s own experience of mental illness might have come into play in his work, right? Evidently not: The book contains only a few very perfunctory references to “temporary insanity” and psychiatric care. Presumably Althusser’s editors will be more forthcoming next summer, with the publication by Verso of Philosophy of the Encounter: Later Writings, 1978-1987. The catalog text for the book refers to it as “his most prolific period.” But it was also one when much of his writing was done while hospitalized.
Is it possible to say anything about his work and his illness that doesn’t amount to a roundabout denunciation of Althusser? I think perhaps it is.
On one level, his theory about the Ideological State Apparatuses looks ... maybe not optimistic, exactly, but like a guide to transforming things. From this point of view, each individual is a point of convergence among several ISAs. In other words, each of us has assimilated various codes and rules about how things are supposed to be. And if there are movements underway challenging how the different ISAs operate, that might have a cumulative effect. If, say, feminists and gay rights activists are transforming the rules about how gender is constructed, that creates new ways of life. (Though not necessarily a social revolution, as Althusser wanted. Capitalism is plenty flexible if there’s a buck to be extracted.)
But that notion of the individual as the intersection of rules and messages also has a melancholy side. It somewhat resembles the experience of depression. If a person suffering from depression is aware of anything, it is this: The self is a product of established patterns ... fixed structures ... forces in the outside world that are definitive, and sometimes crushing.
Any Sartrean talk of “radical freedom” makes no sense whatever to anyone in that condition – which is, rather, a state of radical loss. And as the German poet Hans Magnus Enzensberger puts it in a recent essay, the most extreme “radical loser” may find the only transcendence in an act of violence.
“He can explode at any moment,” writes Enzensberger. “This is the only solution to his problem that he can imagine: a worsening of the evil conditions under which he suffers.... At last, he is master over life and death.”
Is that what happened in Althusser’s apartment, 25 years ago? That, or something like it.
It’s been a hard season, marked by a preponderance of headlines announcing the end of a great many things. One of the most instructive entries appeared in the January 4 edition of The New York Times: an op-ed titled “The End of the Financial World As We Know It,” by Michael Lewis and David Einhorn. There’s a longstanding tradition in the humanities of such pronouncements. In November, we heard that irony is dead. Just one week after the Times reported on Joan Didion’s announcement in a talk at the New York Public Library, however, the Sunday “Arts & Leisure” section of the paper ran a story on page 1 announcing that Liza (Minnelli) is back (again). We can only conclude that, while Didion is inarguably one of the leading lights of irony, the reports of its demise were greatly exaggerated.
The novel has been pronounced dead so many times (in more than 50 percent of the cases, the actual wording was “Le roman est mort”) that the phrase “the death of the novel” has its own entry on Wikipedia.
Now while I would be the first to agree that irony is, if not dead, certainly sleeping in my literature classes, the novel is in fact undead: 11 out of 12 students in Honors Creative Writing confessed, when questioned closely, that they had in fact read Twilight. Eleven out of 11 blamed this reading choice on their roommates, the student excuse being the one genre that no one has ever pronounced even remotely near death.
It was only a matter of time before the topic of literary studies itself became caught up in the contagion of pronouncements of the demise of one thing or another, and thus The Chronicle of Higher Education of December 19 featured not one but three essays under the general heading of “What Ails Literary Studies.” At least they’re not dead, although ails is somewhat disturbing, with its connotations of some obscure 19th century illness involving headaches and quarantine.
Stanley Fish offered a much deadlier view in "The Last Professor," an entry on Frank Donoghue’s new book, The Last Professors: The Corporate University and the Fate of the Humanities. According to Donoghue -- a former student of Fish -- we are beyond even “crisis” mode, for any “vision of restored stability is a delusion.” This has, in fact, been evident to many English faculty for quite a while; we are, after all, quite good at analysis. It turns out that Fish is the last Humanities professor, which is a bit of a disappointment, since I just received a promotion at my college, a situation that has moved me to reflect long and hard on the theories of the Marx Brothers (Groucho and Karl).
And so back to the business of business. Alongside another article on Wall Street by Michael Lewis, this one with the Arthur-Miller-like title of “After the Fall,” in the December 2008/January 2009 issue of Conde Nast Portfolio: Investing Survival Guide 2009, I found “The End of Hubris.” And it doesn’t matter which definition the author, Leslie Bennetts, was thinking of — the commonly assumed prideful “attitude” or Aristotle’s act of violence — this is one declaration that we can all raise our glasses in a toast to. And it’s surely heartening that while the success of The Twilight Saga has suggested the death of literary style, current reflections on business assure us that, at least, metaphor and allusion — like irony and, alas, hubris -- are alive and nowhere near ending.
Of course, while reading all of these prognostications, it’s impossible not to think of another ending: that of the Cheney/Bush regime. The end of the damages that pair inflicted will be much longer in coming, but it’s a start. And then who knows: perhaps, in our lifetime, we’ll even see the end of the vampire novel. In the meantime, let’s declare moratoriums on jargon-laden college mission statements and the instant-comment feature of online news sites; let’s keep alive summer reading programs for high school and college students — and the meditative model of the life of the mind. If change is coming (and it must), there is still the important — essential — work to be done in Humanities service courses. Without the reminder of the life of the mind, we are truly dead.
Carolyn F. Segal
Carolyn F. Segal is professor of English at Cedar Crest College.