As a graduate student, I devoured the Slovenian philosopher and cultural critic Slavoj Žižek’s essays, articles, and books. I did so in part because I found his analyses of the ideological and subjective mechanisms underpinning the functioning of contemporary capitalism generally compelling, and more specifically relevant to my own work. I admired his theoretical acumen, but also what seemed like the sheer breadth of his knowledge.
Žižek, in other words, seemed to me at the time to know almost everything, and he was able to use that knowledge for theoretical gains. Although I, too, wanted to do what Žižek did, I was also painfully aware of my shortcomings. I would never know as much as he seemed to know, meaning that my contribution to “scholarship” would likely remain forever slight. Žižek was a superhuman genius, one of those rare individuals who could do it all; me, I was -- and remain -- a mere mortal.
Last week Žižek was accused of plagiarism for an article he originally published in 2006 in the journal Critical Inquiry. The article, “A Plea for a Return to Différance (with a Minor Pro Domo Sua),” discusses Kevin MacDonald’s The Culture of Critique: An Evolutionary Analysis of Jewish Involvement in Twentieth-Century Intellectual and Political Movements.
As numerous blogs and media outlets, including Newsweek, have discussed, the conservative blogger Steve Sailer first called attention to the article on July 9, noting that “it’s striking how much more opaque Žižek’s prose suddenly becomes when he switches to elucidating what are, presumably, his own ideas, such as they are." Later that day, a blogger gave more teeth to Sailer’s claim, posting a side-by-side comparison of Žižek’s article with Stanley Hornbeck’s review of MacDonald’s book, which had appeared in 1999 in The American Renaissance, a far-right publication known for featuring and advocating for overtly racist views.
As someone who has followed and admired Žižek’s work, I was initially disappointed, to say the least. Finding out that one of your favorite authors has plagiarized is the intellectual equivalent of learning of the infidelity of one’s partner. I also, however, wasn’t surprised. Four years out of my graduate program and now in a full-time faculty position, my views of scholarship and its production are less naïve than they once were. To be blunt: it’s simply impossible for someone who keeps Žižek’s schedule, with its various appointments and rigorous international lecturing, to singlehandedly read and research broadly and publish as much as he does. Whether in the form of research assistants or, as appears to be the case in this instance, plagiarism, the actual production of scholarship often depends on others, whose work often remains largely unacknowledged.
Žižek himself seems to indicate as much in his response to the allegations. In response to a request from the website Critical Theory (not really known for its love of Žižek, it is worth pointing out) for comment on the allegation, Žižek expressed “regret” over the incident, but explained it as follows: “When I was writing the text on Derrida which contains the problematic passages, a friend told me about Kevin MacDonald’s theories, and I asked him to send me a brief resume. The friend send [sic] it to me, assuring me that I can use it freely since it merely resumes another’s line of thought. Consequently, I did just that – and I sincerely apologize for not knowing that my friend’s resume was largely borrowed from Stanley Hornbeck’s review of MacDonald’s book.”
If my Facebook feed is any indication, many have not found Žižek’s response satisfying -- and I’ve seen more than a few make the comparison to what students say when they’re accused of plagiarism. It’s unsatisfying because it strikes us as dishonest and unconvincing, something that someone says after being caught because there is nothing better to say.
My question is why we find that response unsatisfying. Putting aside Žižek’s intentions and “what really happened” (which we can’t, of course, know), I would suggest that underneath a lot of the dissatisfaction among fans and critics are unrealistic expectations of what it means to be a scholar and to produce scholarly work. What is unsatisfying in Žižek’s response, in other words, is not the lame passing of the buck to his “friend” but the fact that Žižek relied rather heavily in this instance -- and likely in numerous others -- on the work of someone else.
“That’s what citations are for,” someone will say -- “that’s the whole problem with what happened here.” True, Žižek should have cited his source here, and given his reliance, probably should have just put quotation marks around the whole section in question. But it’s worth pointing out that even if Žižek had done so, he would have been criticized for relying too heavily on “secondary sources” for his argument. In other words, “real” scholarship places a value on uniqueness and novelty, which requires a careful balance when it comes to citation practices. Cite too much, and your work is derivative; cite too little, and you get accused of not knowing the literature, sloppiness, or in some cases plagiarism.
Yet this balance often conceals the fact that we rely on “secondary sources” all the time without necessarily citing them (who hasn’t looked up something on Wikipedia -- without citing it -- to get some bearings?). Conversely, we often pad our arguments with citations to things we haven’t read well or at all -- not only because that’s what’s expected but also because doing so allows us to cover our arguments with a supposed mastery of a literature that is virtually impossible for any one person to master. Whoever says that he or she hasn’t done as much either is lying or hasn’t published.
Despite all the talk in the humanities over the past few decades about the death of the author, the inexistence of the subject, the collective production of knowledge, intertextuality, networks of information, and so on, our publication practices and expectations haven’t caught up.
In practice, our notions of scholarship continue to assume an autonomous, substantial ego who is the author of his or her works; when that ego does acknowledge its debts to others, it does so only by citing other autonomous, substantial egos. “Theft” is a good critical concept that helps to destabilize power structures and explain the production of subjectivity -- until, that is, someone steals ideas from someone else.
All of this is not really to defend Žižek, nor is it to suggest replacing current scholarly conventions with an “anything goes” approach. In raising questions surrounding the accusations against Žižek and his response, I’m not necessarily advocating for plagiarism.
I am, rather, saying that the whole affair raises issues in how we understand the production of scholarship. We’re all mere mortals, so perhaps it would be best to lower our expectations with regard to what we do, really acknowledge our debt to others, and allow practice to catch up with theory.
That applies especially, I think, to the “star academics” who shape current discussions and fields, like Žižek. Despite the charges of plagiarism, I still admire and find value in his work, so I’ll continue to read what he has to say. That’s not to say that I’m not disappointed -- I, like other admirers, am, so I might take what he has to say with a few more grains of salt. But such disappointment is a good reminder that we’re all mere mortals, Žižek included.
Hollis Phelps is an assistant professor of religion at the University of Mount Olive. He is the author of Alain Badiou: Between Theology and Anti-theology (Acumen 2013).
In December, the journal Brain Connectivity published a paper called "Short- and Long-Term Effects of a Novel on Connectivity in the Brain," based on a study conducted at Emory University. The researchers did MRI scans of the brains of 21 undergraduate students over a period of days before, during, and after they read a best-selling historical page-turner called Pompeii over the course of nine evenings. A significant increase of activity in "the left angular supramarginal gyri and right posterior temporal gyri" occurred during the novel-reading phase of the experiment, which fell off rapidly once they finished the book -- the gyri being, the report explained, regions of the brain "associated with perspective taking and story comprehension."
Not a big surprise; you'd figure as much. But the researchers also found that an elevated level of activity continued in the bilateral somatosensory cortex for some time after the subjects were done reading. In the novel, a young Roman aqueduct engineer visiting the ancient vacation resort of Pompeii "soon discovers that there are powerful forces at work -- both natural and man-made -- threatening to destroy him." Presumably the readers had identified with the protagonist, and parts of their brains were still running away from the volcano for up to five days after they finished the book.
So one might construe the findings, anyway. The authors are more cautious. But they raise the question of whether the experience of reading novels "is sufficiently powerful to cause a detectable reorganization of cortical networks" -- what they call a "hybrid mentalizing-narrative network configuration." Or to put it another way, a long-term rearrangement of the mind's furniture.
It isn't a work of fiction, and I am but a solitary reader without so much as access to an electroencephalograph, but A Philosophy of Walking by Frédéric Gros, a French best-seller from 2011 just published in English by Verso, seems to have been setting up its own "hybrid mentalizing-narrative network configuration" within my head over the past few days. Maybe it's the weather. After so many months of cold weather and leaden skies, Gros's evocation of the pleasures of being outside, moving freely, in no particular hurry, stirs something deep within.
The author, a professor of philosophy at the University of Paris, has, among other things, edited volumes in the posthumous edition of Michel Foucault's lectures at the College de France. But the authority Gros brings to his reflections on walking comes only in part from knowing the lives and writings of ambulatory thinkers across the millennia, beginning in ancient Greece. He is a scholar but also a connoisseur -- someone who has hiked and wandered enough in his time, over a sufficient variety of terrains, to know at first hand the range of moods (ecstasy, monotony, exhaustion) that go with long walks.
It is a work of advocacy, and of propaganda against sedentary thinking. The first of Gros's biographical essays is on Nietzsche, who took up walking in the open air while suffering from migraine headaches, eyestrain, and late-night vomiting spasms. It did not cure him, but it did transform him. He might be the one spending time at health resorts, but it was contemporary intellectual life that manifested invalidism.
"We do not belong to those who have ideas only among books, when stimulated by books," Nietzsche wrote. "It is our habit to think outdoors -- walking, leaping, climbing, dancing, preferably on lonely mountains or near the sea where even the trails become thoughtful. Our first questions about the value of a book, of a human being, or of a musical composition, are: Can they walk? Even more, can they dance?" Long, solitary hikes such as those taken by Nietzsche -- and also by Rousseau, the subject of another essay -- are only one mode of philosophical pedestrianism. The precisely timed constitutional that Kant took each day, so regular that his neighbors could set their watches by it, has gone down in history as an example of his extreme rigor (one easily recognized even by the layman who can't tell his a posteriori from his elbow). Gros adds a telling detail to this otherwise commonplace biographical fact: Kant took care to walk at a measured, even pace, since he was profoundly averse to sweating.
At the other extreme was the ancient philosophical school known as the Cynics, with its cultivation of an almost violent indifference to comfort and propriety. The Cynics were homeless vagrants on principle. They denied themselves, as much as possible, every luxury, or even convenience, taken for granted by their fellow Greeks.
That included footwear: "They had done so much walking," Gros says, "that they hardly needed shoes or even sandals, the soles of their feet being much like leather." When the Cynics showed up in a town square, their constant exposure to nature's elements gave a jagged edge to the harangues in which they attacked commonplace ideas and values. Gros sees walking, then, as the foundation of the Cynics' philosophical method:
"Philosophers of the type one might call sedentary enjoy contrasting the appearance with the essence of things. Behind the curtain of tangible sights, behind the veil of visibilities, they try to identify what is pure and essential, hoping perhaps to display, above the colors of the world, the glittering, timeless transparency of their own thought…. The Cynic cut through that classic opposition. He was not out to seek or reconstruct some truth behind appearances. He would flush it out from the radical nature of immanence: just below the world's images, he was searching for what supported them. The elemental: nothing true but sun, wind, earth and sky; their truth residing in their unsurpassable vigor."
Walking is not a sport, Gros takes care to emphasize. You don't need any equipment (not even shoes, for an old-school Cynic) nor is any instruction required. The skill set is extremely limited and mastered by most people in infancy. Its practice is noncompetitive.
But in a paradox that gives the book much of its force, we don't all do it equally well. It's not just that some of us are clumsy or susceptible to blisters. Gros contrasts the experience of a group of people talking to one another while marching their way through a walking tour (an example of goal-driven and efficiency-minded behavior) and the unhurried pace of someone for whom the walk has become an end in itself, a point of access to the sublimely ordinary. And so he has been able to give the matter a lot of thought:
"Basically, walking is always the same, putting one foot in front of the other. But the secret of that monotony is that it constitutes a remedy for boredom. Boredom is immobility of the body confronted with emptiness of mind. The repetitiveness of walking eliminates boredom, for, with the body active, the mind is no longer affected by its lassitude, no longer drawn from its inertia the vague vertigo in an endless spiral.… The body's monotonous duty liberates thought. While walking, one is not obliged to think, to think this or that. During that continuous but automatic effort of the body, the mind is placed at one's disposal. It is then that thoughts can arise, surface or take shape."
As for the clumsiness and blisters, I hope they will disappear soon. It's the practice of walking, not reading about it, that makes all the difference. But no book has rewired my bilateral somatosensory cortex so thoroughly in a long while.
In some fields, the ground tone of reviews is normally subtle and soothing: the sound of logs being gently rolled. Poets are especially prone to giving one another gold stars for effort. Journals devoted to contemporary art have cultivated a dialect in which even the syntax is oblique. The reviewers’ judgments (if that is what they are) resist paraphrase.
Philosophy is another matter. Extremely critical and even brutal book reviews, while hardly the norm, are at least an occupational hazard. The drubbing may be delivered in measured terms and an even tone. But the really memorable assessments nip near the jugular vein, when not ripping it wide open. Consider, for instance, this classic, from a notice appearing in the venerable British journal Mind, in 1921: “[The author’s] method of exegesis consists, in fact, of a combination of the suppressio veri with the suggestio falsi, both, of course, practised in the absolute good faith which comes from propagandist enthusiasm unchecked by any infusion of historical sense.... It is because Mr. Urwick's book is one long dogmatising without knowledge that I feel bound to put it on record that of all bad books on Plato his is the very worst.”
In a subsequent issue, Mr. Urwick called it “a delightfully abusive review,” which in the science of fistics is called “taking it on the chin.”
An enterprising publisher could put together an anthology of such colorful passages of philosophers shellacking one another’s work. The anthologist would need to limit the scope to reviews from professional journals; invective from blogs or general-interest publications would render the volume unwieldy. Another challenge would be limiting how much space it devotes to reviews by Colin McGinn.
As aggressive in the stately pages of The Philosophical Review as when writing for The New York Review of Books, McGinn set a new benchmark for philosophical savaging in 2007 with his comments on Ted Honderich's On Consciousness. But his published opinion of the book (“the full gamut from the mediocre to the ludicrous to the merely bad.... painful to read, poorly thought out, and uninformed”) is by no means as rough as it could have been. “The review that appears here is not as I originally wrote it,” reads McGinn’s note accompanying the piece in The Philosophical Review. “The editors asked me to 'soften the tone' of the original; I have done so, though against my better judgment.”
The men were once colleagues at University College London, as it somehow proves unsurprising to learn, and they have a long history of personal and intellectual hostilities, during which Honderich gave as good as he got. (Honderich is now professor emeritus of philosophy of mind and logic. McGinn was professor of philosophy at the University of Miami until his recent resignation.)
The review, and its prehistory and aftermath, inspired an interesting and unusual paper in the Journal of Consciousness Studies. It elucidates the technical questions at issue while also bringing the distance between gossip and intellectual history to an all-time low. McGinn is prolific, and I have not kept up with his work in the two years since writing about The Meaning of Disgust in this column. But in the meantime, certain of his writings have generated even more attention and commentary than his philippic against Honderich did.
The prose in question took the form of email and text messages to a research assistant in which he allegedly told her he “had a hand job imagining you giving me a hand job.” (Here is one account of the matter, behind a paywall, though you can also find a PDF of the article for free at this site; and here is another.) I attach the word “allegedly” per protocol, but McGinn has defended his use of the expression “hand job” as part of the philosophical banter around his work on a theory “that ostension and prehension are connected and that the mind is a ‘grasping organ,’ ” as the abstract for a lecture he gave last year puts it. Hence, most forms of human activity are -- ultimately, in a certain sense -- “hand jobs.” He has issued a manifesto.
The new issue of Harper’s magazine reprints, under the title “Out on a Limb,” a blog post by McGinn from June 2013 in which he explains: “I have in fact written a whole book about the hand, Prehension, in which its ubiquity is noted and celebrated… I have given a semester-long seminar discussing the hand and locutions related to it. I now tend to use ‘hand job’ in the capacious sense just outlined, sometimes with humorous intent…. Academics like riddles and word games.”
Some more than others, clearly. McGinn then considers the complexity of the speech-act of one professional glassblower asking another, “Will you do a blow job for me while I eat my sandwich?” The argument here is that nothing he did should be regarded as sexual harassment of a graduate student, and the real victim here is McGinn himself: “One has a duty to take all aspects of the speech situation into account and not indulge in rash paraphrases. And one should also not underestimate the sophistication of the speaker.”
Nor overestimate the usefulness of sophistication as a shovel, once one has dug oneself into a hole and needs to get back out. McGinn subsequently thought the better of this little essay and deleted it from his blog, but the Harper’s “Readings” section preserves it for posterity. Life would be much simpler if good judgment weren’t so tardy at times.
The whole matter might reveal its true philosophical depths once Prehension is available. But Amazon doesn’t list it as forthcoming, and lately McGinn’s name seems to come up most often in discussions of sexual harassment, or of the tendency of philosophy as a discipline to resemble the Little Rascals’ treehouse.
But a recent review in Mind (the journal that gave Urwick such delight in 1921) might shift attention away from McGinn’s alleged peccadillos and the hazards of paraphrase. Arguably it raises the bar higher than even his critique of Honderich did. It starts out with relatively understated and rather donnish clucking over the author’s transgression of specialist boundaries. By the end, the gloves are off:
“As was said of the Sokal hoax, there is simply no way to do justice to the cringe-inducing nature of this text without quoting it in its entirety. But, in a nutshell, Basic Structures of Reality is an impressively inept contribution to philosophy of physics, and one exemplifying everything that can possibly go wrong with metaphysics: it is mind-numbingly repetitive, toe-curlingly pretentious, and amateurish in the extreme regarding the incorporation of physical fact. With work this grim, the only interesting questions one can raise concern not the content directly but the conditions that made it possible; and in this connection, one might be tempted to present the book as further evidence of the lack of engagement of metaphysicians with real science — something that has lately been subject to lively discussion (and I myself have slung some of the mud). But I would insist that to use this work to make a general point about the discipline would in fact be entirely unfair...
“For all the epistemic faux-modesty that this book purports to defend, the image that persists while grinding through its pages is of an individual ludicrously fancying themselves as uniquely positioned to solve the big questions for us, from scratch and unassisted, as if none of the rest of us working in the field have had anything worth a damn to contribute. It will however be clear by now that I take the reality to be substantially different. For me, then, the one pertinent question this work raises is why all of this went unrecognized: this book, after all, issues not from one of the many spurious publishing houses currently trolling graduate students, but Oxford University Press — a press whose stated aim is to ‘publish works that further Oxford University’s objective of excellence in research, scholarship, and education’. So why did they publish this?”
The reviewer ventures an explanation: The author of the offending volume “is a ‘big name’; and if that is sufficient for getting work this farcical in print with [Oxford University Press], then shame on our field as a whole.” The book could well provoke a worthwhile discussion, though sadly one focused on concerns rather different from those he himself had in mind.
I came across this takedown within about an hour of reading the blog post reprinted by Harper’s. The author is Colin McGinn -- the author of the book in question, that is, Basic Structures of Reality: Essays in Meta-Physics (Oxford, 2011). The reviewer is Kerry McKenzie, a postdoctoral fellow in philosophy at the University of Western Ontario. The piece in Mind is only her second review-essay, but I’d say it’s one for the anthology. (Note: This essay has been updated from an earlier version to correct Kerry McKenzie's current institution.)
Philosophers have lives; saints have legends. No miracles are associated with Simone Weil (1909-43), and a glance at reference books on philosophy finds her listed alongside Jean-Paul Sartre and Simone de Beauvoir, her classmates at the École normale supérieure. But she looks odd in their company -- in any company, really. Frail but indomitable, she was in the world but not quite of it.
French intellectuals of her generation wrote essays on Marxism and the Spanish Civil War, while she worked in a factory and went to the front as part of an antifascist militia. She had an extraordinarily acute (some would say morbid) awareness of the depths and the extent of human suffering; it made comfort seem like complicity. Combine that sensitivity with her conviction that our world is the diminished or faulty image of a realm in which truth and justice are real and absolute – a Platonic notion, flecked perhaps with Gnostic elements -- and you have someone with a vocation, rather than a career.
Weil died at the age of 34, under what seem to have been suspiciously beatific circumstances: she succumbed to tuberculosis after months of refusing to eat more than the rations available to her French compatriots under German occupation. Her collected writings, which run to several stout volumes, range from pacifist essays and interventions in trade-union debates to reflections on atheism and mysticism (not entirely antithetical terms, in her experience) and studies of classical Greek literature and philosophy. Most of this work remained unpublished during her lifetime, apart from scattered essays in journals of no wide circulation.
At a couple of points in Julia Haslett’s film “An Encounter with Simone Weil,” the camera focuses on a few lines of a manuscript, the words in French and Greek. Her handwriting appears small, precise, and highly concentrated – like Weil herself, by all accounts. “She was apparently unacquainted with doubt,” wrote Raymond Aron, a philosopher and political journalist who knew her in the 1920s, “and, although her opinions might change, they were always thoroughly categorical.” His choice of words may allude to one of Weil’s nicknames from their student days: “The Categorical Imperative in Skirts.”
“An Encounter with Simone Weil,” billed on its Facebook promotional page as “a documentary by Julia Haslett,” made the rounds of film festivals in 2011, and in the meantime the director (a visiting associate professor of cinema and comparative literature at the University of Iowa) has screened it at two dozen colleges and universities throughout the United States. It will be available on DVD and various digital platforms sometime in the next few months. Upon request, the director sent me a screener, indicating that she had just finished work on the French language version. The film is already available in Italian, with the German and Japanese translations due out this fall, and a Korean version in the works.
“Encounter” is at least as much a personal essay as a biographical portrait. Haslett’s fascination began when she came across a quotation from one of Weil’s letters: “Attention is the rarest and purest form of generosity.” It was an ideal point of entry, given how often Weil’s aphorisms sound like one of Pascal’s Pensées, and also how much moral and intellectual significance the word “attention” turns out to have in her work.
But it also carried a strong personal connotation. In the voiceover Haslett tells us of her father’s suicide when she was young, and in a video clip her brother discusses his struggle with anxiety and depression. “My father's death taught me that if I don't pay attention, someone might die," she says -- an enormous burden to have to carry.
Although Weil cannot be said to lighten the load, she at least understood the stakes. “The capacity to give one's attention to a sufferer is a very rare and difficult thing,” she says in another passage that Haslett quotes. “It is almost a miracle; it is a miracle.”
In short order she read Francine Du Plessix Gray’s biography of Weil (published by Penguin in 2001) in a single sitting. "Here was this brilliant, deeply ethical young woman speaking truth to power, putting her body on the line for her convictions, and providing such an incisive critique of political and economic power,” she told me in an e-mail exchange. “And yet I'd never read her. That despite studying philosophy, religion, and history at Swarthmore College -- an institution that embodies the Weil-like values of rigorous intellectual inquiry and a deeply held commitment to social justice.”
In recounting Weil’s period as a left-wing militant in the early years of the Great Depression, “Encounter” shows Haslett going over film footage of mass demonstrations in the archives of the French Communist Party – searching, almost desperately, to catch a glimpse of Weil in the crowd, but with no luck. She visits the apartment building in New York where Weil lived for a few months in 1942, and interviews the (very) few remaining people who knew Weil, as well as one of the editors preparing her collected works.
The effort to establish a connection with the philosopher even extends to having Soraya Broukhim (an actress with some resemblance to Weil) read enough of Weil's work to improvise responses in a mock interview. The sequence is odd. By the end, the situation has become unmistakably awkward for both parties. When Broukhim complains that Haslett seems to want answers she can’t give -- and for a moment it’s not quite clear whether she’s doing so in character, as Weil, or in real life -- the effect is strangely revealing.
To call Haslett’s quest a kind of pilgrimage would be tempting, if not for the most striking thing about the film: its emphasis on Weil as a secular figure and an activist. Most people who become interested in Weil do so through the theological side of her work. Even her appeal for nonbelievers, such as Albert Camus, comes in large measure from an awareness of her "tortured prowling outside the doors of the Catholic Church, like a starving wild animal,” to borrow the poet Kenneth Rexroth’s apt characterization.
Weil’s spiritual writings “are certainly the reason she gets studied in this country,” Haslett acknowledged by e-mail. “For example, many of the annual meetings of the American Weil Society are held at theological seminaries and most participants are religious scholars or at least people of faith.” But for the director, “it was the way she combined such an incisive critique of power and her willingness to sacrifice everything to tell the truth as she knew it (by directly experiencing that about which she wrote) that drew me in so completely.… She became a guide for me through the very dark first decade of this century, when our politicians were dispensing with the truth and our media wasn't holding them accountable.”
The film gives due attention to Weil's religious passion, but suggests that her mystical turn came after deep disillusionment with radical politics. Haslett seems largely uninterested in the very difficult matter of Weil’s relationship to Judaism. (Her parents were so completely assimilated into French secular culture that they neglected to mention this element of her identity until she was about 10 years old). And the director treads lightly around the topic of Weil’s mental health, although she does briefly consider the possibility that the final period of self-denial may have been a kind of suicide by self-starvation.
Haslett makes her resistance to certain aspects of her subject’s life and thought explicit, saying in one voiceover that she felt “betrayed by Weil’s turn toward God.” This does not detract from the value of the film in the least.
On the contrary, ambivalence and discomfort are essential to any meaningful encounter with Weil. To borrow a remark by T.S. Eliot, whose grounds for admiration were as different from Haslett’s as they could be: “I cannot conceive of anybody agreeing with all of her views, or of not disagreeing violently with some of them. But agreement and rejection are secondary: what matters is to make contact with a great soul.”
In a memorable scene from the first season of "Breaking Bad" (AMC), the protagonist sits down to do some moral bookkeeping of a fairly literal variety. He is a 50-year-old high-school chemistry teacher named Walter White. A recent trip to the doctor to check on a nagging cough has left him with a diagnosis of advanced lung cancer, giving him, at most, a couple of years to live. If you've seen the show (and maybe even if you haven't, since it has received extremely good press and won more awards than I feel like counting), you know that Walter has decided on a hazardous way to provide for his family after his death. He applies his lab skills to the production of crystal methamphetamine.
The stuff he “cooks” (as the term of art goes) is exceptionally pure and powerful. The connoisseurs love it. If he can turn a profit of $737,000 in the time he has left, Walt will leave a nest egg for his wife and children and die in peace. As a middle-class family man, Walt lacks any direct knowledge of the marketing side of the meth business, and would prefer to keep it that way. His connection to the underworld is a former student named Jesse Pinkman, memorable chiefly for his bad grades. But Jesse is a gangsta wannabe, as well as a meth head, and nowhere near as street-savvy as he thinks or the job requires.
And so it comes to pass that Walter finds himself facing an unforeseen problem involving a well-connected figure from the meth supply chain -- a fellow who goes by the street name of Krazy-8. It's a long story how he got there, but Krazy-8 ends up shackled by the neck to a pole in Jesse's basement, and he is understandably, even homicidally, unhappy. Walt must now decide between two options: let Krazy-8 live or kill him.
Being the rational sort, Walt tabulates the arguments on each side. The column headed "Let him live" fills up quickly, if redundantly: "It's the moral thing to do. Judeo-Christian principles. You are not a murderer. He may listen to reason. Post-traumatic stress. Won't be able to live with yourself. Murder is wrong!"
Under “Kill him,” the camera reveals just one entry: “He’ll kill your entire family if you let him go.” So much for weighing the alternatives.
In his method -- and ultimately in his actions -- Walt proves to be a consequentialist, as J.C. Donhauser points out in "If Walt's Breaking Bad, Maybe We Are Too," one of the essays in Breaking Bad and Philosophy: Badder Living Through Chemistry (Open Court). Most viewers will have surmised as much, even if they don't have a name for it. But there is more than one metric for judging costs and benefits, and so more than one species of consequentialist. Donhauser -- an assistant instructor of philosophy at the State University of New York at Buffalo and a lecturer at Buffalo State University -- uses examples from other episodes to consider the options. There's act consequentialism, for one (the realized effect of an act determines whether it is good or bad, even if the consequences are unintended or unforeseeable), which is distinct from rule consequentialism ("actions are better or worse, not in relation to their actual consequences, but in proportion to how far afield they fall from a rule that would be best for most people if everyone followed it").
As for Walt, he belongs in the ranks of the agent-centered consequentialists, who “judge actions based on their consequences” but “also argue that the most important consequences are for the person carrying out the actions that produce those consequences.”
Each stance has its limitations -- quite as much as deontology does. Deontology insists that consequences are irrelevant, since an act can be judged moral if and only if it could be universalized. Murder is immoral, then, because "if everyone did it, there'd be no one around for you to murder then! The same goes for stealing, as there'd be nothing left to steal." So Jeffrey E. Stephenson put it, with tongue in cheek, in "Walter White's American Vice." Ditto for lying, since a society in which everyone lied constantly would be even more irrational than the one we live in.
Walt's list of arguments for letting Krazy-8 live is not deontological by any means -- although "He may listen to reason" rests on a similar conviction that clarity and rationality are not just worthy aspirations but realizable possibilities as well. Despite his nickname and his criminal vocation, Krazy-8 is a well-spoken and seemingly pragmatic individual, with strong family ties of a sort that Walt can respect. And Walt very nearly reaches a decision on that basis.
On the other hand, not every consequence can be put in brackets while you seek the universally right thing to do. And “He’ll kill your entire family if you let him go” is a pretty good example of that. Under the circumstances, even a deontologist would probably find a way to think of murder as obligatory.
By now, it seems as if every genre, blockbuster, videogame, superhero, hit program, or teen trend has been covered by at least one book in this niche, or will be in the foreseeable future. I picture them being produced in something akin to Walt's methamphetamine superlab -- with the important exception that Walt's product is famously consistent in quality. The popcult philosophy collections that I've sampled over the years tend to be pretty uneven, even within the same volume. The one constant is that most of the essays are clearly didactic. The implied reader for these books almost always seems to be an undergraduate, with popular culture as the candy coating on the philosophical vitamins otherwise missing from the educational diet. There is jocularity aplenty. In this volume, for example, a comparison of Breaking Bad and Augustine's Confessions includes the information that the saint-to-be "had a rep for hooking up with the MILFs of Carthage" -- not unlike Peter Abelard, "a famous playa before his lover's father and brother… cut off his junk and sent him packin.'"
Well, you do what you must to keep the students' attention. With any luck, these books will be the philosophical equivalent of a gateway drug, leading some readers to try the harder stuff.
But there must be more ways to go about it than by reducing every pop-culture phenomenon to a pretext for introducing well-established topics and thinkers. Another constituency for these books is the fan base for whatever cultural commodity gets yoked to philosophy in their titles. It was as a devotee of the show (one who has seen every episode of the first four seasons at least twice) that I bought Breaking Bad and Philosophy in the first place. And the striking thing about the program is that it's all about how decisions, consequences, and responsibility (or the lack of it) get mixed up in ways that no schema can account for very well. That is undoubtedly part of its appeal.
I’ll end by recommending one essay from the book that will reward the attention of anyone who follows the show closely. Titled “Macbeth on Ice,” it is by Ray Bossert, a visiting assistant professor of English at Franklin and Marshall College. He compares "Breaking Bad" and the Scottish play by reference to Aristotle's Poetics, to surprisingly appropriate effect.
In Aristotle's analysis, the hero in classical tragedy is responsible for his actions and ultimately their victim. His character is admirable and doomed because of some flaw -- excessive pride, for example. That's the one Macbeth and Walter White share. The hero's motives and decisions are transformed as this flaw grows more prominent. It leads him to "incidents arousing pity and fear" in the audience, says Aristotle: "Such incidents have the very greatest effect on the mind when they occur unexpectedly and at the same time in consequence of one another; they arouse more awe than if they happened accidentally and by chance."
In Walt’s case, as his involvement in the meth business deepens, we see that his insistence that everything he does is out of love for his family is a kind of self-deception. More and more evidence of his rage and resentment accumulates. He feels trapped by his family, and his pride has been wounded too many times in his 50 years. As events unfold, Walt feels increasingly confident and powerful, and his running cost-benefit analysis leaves ever more collateral damage.
We believe in the character, writes Bossert, “because, in our own thoughts, we, too, resent being limited to a single role on life’s stage. We pity Walter White, and fear that we might make similar mistakes because we’re like him.” This seems exactly right. Bossert makes no predictions about how Breaking Bad will end (it is now counting down its last 16 episodes, 8 this summer and 8 in 2013) nor will I. But Walt has enormous potential in the pity and fear department, and the stage is sure to be covered with bodies before the curtain falls – even more than it already is.
Call it philosophical synesthesia: the work of certain thinkers comes with a soundtrack. With Leibniz, it's something baroque played on a harpsichord -- the monads somehow both crisply distinct and perfectly harmonizing. Despite Nietzsche's tortured personal relationship with Wagner, the mood music for his work is actually by Richard Strauss. In the case of Jean-Paul Sartre's writings, or at least some of them, it's jazz: bebop in particular, and usually Charlie Parker, although it was Dizzy Gillespie who wore what became known as "existentialist" eyeglasses. And medieval scholastic philosophy resonates with Gregorian chant. Having never managed to read Thomas Aquinas without getting a headache, I find that it's the Monty Python version.
Such linkages are, of course, all in my head -- the product of historical context and chains of association, to say nothing of personal eccentricity. But sometimes the connection between philosophy and music is much closer than that. It exists not just in the mind’s ear but in the thinker’s fingers as well, in ways that François Noudelmann explores with great finesse in The Philosopher’s Touch: Sartre, Nietzsche, and Barthes at the Piano (Columbia University Press).
The disciplinary guard dogs may snarl at Noudelmann for listing Barthes, a literary critic and semiologist, as a philosopher. The Philosopher's Touch also ignores the principle best summed up by Martin Heidegger ("Horst Wessel Lied"): "Regarding the personality of a philosopher, our only interest is that he was born at a certain time, that he worked, and that he died." Biography, by this reasoning, is a distraction from serious thought, or, worse, a contaminant.
But then Noudelmann (a professor of philosophy at l’Université Paris VIII who has also taught at Johns Hopkins and New York Universities) has published a number of studies of Sartre, who violated the distinction between philosophy and biography constantly. Following Sartre’s example on that score is a dicey enterprise -- always in danger of reducing ideas to historical circumstances, or of overinterpreting personal trivia.
The Philosopher’s Touch runs that risk three times, taking as its starting point the one habit its protagonists had in common: Each played the piano almost every day of his adult life. Sartre gave it up only as a septuagenarian, when his health and eyesight failed. But even Nietzsche’s descent into madness couldn’t stop him from playing (and, it seems, playing well).
All of them wrote about music, and each published at least one book that was explicitly autobiographical. But they seldom mentioned their own musicianship in public and never made it the focus of a book or an essay. Barthes happily accepted the offer to appear on a radio program where the guest host got to spin his favorite recordings. But the tapes he made at home of his own performances were never for public consumption. He was an unabashed amateur, and recording himself was just a way to get better.
Early on, a conductor rejected one of Nietzsche’s compositions in brutally humiliating terms, asking if he meant it as a joke. But he went on playing and composing anyway, leaving behind about 70 works, including, strange to say, a mass.
As for Sartre, he admitted to daydreams of becoming a jazz pianist. “We might be even more surprised by this secret ambition,” Noudelmann says, “when we realize that Sartre did not play jazz! Perhaps this was due to a certain difficulty of rhythm encountered in jazz, which is so difficult for classical players to grasp. Sight-reading a score does not suffice.” It don’t mean a thing if it ain’t got that swing.
These seemingly minor or incidental details about the thinkers' private devotion to the keyboard give Noudelmann an entrée to a set of otherwise readily overlooked problems concerning both art -- particularly the high-modernist sort -- and time.
In their critical writings, Sartre and Barthes always seemed especially interested in the more challenging sorts of experimentation (Beckett, serialism, Calder, the nouveau roman, etc.), while Nietzsche was, at first anyway, the philosophical herald of Wagner's genius as the future of art. But seated at their own keyboards, they made choices seemingly at odds with the sensibility to be found in their published work. Sartre played Chopin. A lot. So did Nietzsche. (Surprising, because Chopin puts into sound what unrequited love feels like, while it seems like Nietzsche and Sartre are made of sterner stuff.) Nietzsche also loved Bizet's Carmen. His copy of the score "is covered with annotations, testifying to his intense appropriation of the opera to the piano." Barthes liked Chopin but found him too hard to play, and shifted his loyalties to Schumann -- becoming the sort of devotee who feels he has a uniquely intense connection with an artist. "Although he claims that Schumann's music is, through some intrinsic quality, made for being played rather than listened to," writes Noudelmann, "his arguments can be reduced to saying that this music involves the body that plays it."
Such ardor is at the other extreme from the modernist perspective for which music is the ideal model of “pure art, removed from meaning and feeling,” creating, Noudelmann writes, “a perfect form and a perfect time, which follow only their own laws.... Such supposed purity requires an exclusive relation between the music and a listener who is removed from the conditions of the music’s performance.”
But Barthes's passion for Schumann (or Sartre's for Chopin, or Nietzsche's for Bizet) involves more than relief at escaping severe music for something more Romantic and melodious. The familiarity of certain compositions; the fact that they fall within the limits of the player's ability, or give it enough of a challenge to be stimulating; the way a passage inspires particular moods or echoes them -- all of this is part of the reality that playing music "is entirely different from listening to it or commenting on it." That sounds obvious, but it is something even a bad performer sometimes understands better than a good critic.
“Leaving behind the discourse of knowledge and mastery,” Noudelmann writes, “they maintained, without relent and throughout the whole of their existence, a tacit relation to music. Their playing was full of habits they had cultivated since childhood and discoveries they had made in the evolution of their tastes and passions.” More is involved than sound.
The skills required to play music are stored, quite literally, in the body. It's appropriate that Nietzsche, Sartre, and Barthes all wrote, at some length, about both the body and memory. Noudelmann could have belabored that point at terrific length and high volume, like a La Monte Young performance in which musicians play two or three notes continuously for several days. Instead, he improvises with skill in essays that pique the reader's interest rather than bludgeoning it. And on that note, I must now go do terrible things to a Gibson electric guitar.