“If you spend much time in libraries,” the late Northrop Frye wrote at the start of an essay from 1959, “you will probably have seen long rows of dark green books with gold lettering, published by Macmillan and bearing the name of Frazer.” These were the collected works of the Victorian classicist and anthropologist Sir James Frazer, author of The Golden Bough (15 volumes) and a great deal else besides.
Frye’s remarks -- originally delivered as a talk on the Canadian Broadcasting Corporation’s radio network -- were aimed at a much broader public than would have read his then-recent book Anatomy of Criticism, which made its author the most-cited name in Anglophone literary studies until at least the early 1980s. (Frye was professor emeritus of English at Victoria College, University of Toronto, when he died in 1991.) He told listeners that it would require “a great many months of hard work, without distractions, to read completely through Frazer.”
And the dedicated person making the effort probably wouldn’t be an anthropologist. The discipline’s textbooks “were respectful enough about him as a pioneer,” Frye wrote, “but it would have taken a Geiger counter to find much influence of The Golden Bough in them.”
And yet Frazer’s ideas about myth and ritual and his comparative approach to the analysis of symbolism held an abiding fascination for other readers -- in part through the echoes of them audible in T. S. Eliot’s “The Waste Land,” but also thanks to Frazer’s good sense in preparing an abridged edition of The Golden Bough in one stout volume that it was entirely possible to finish reading in no more than a year.
If you spend much time in libraries these days -- wandering the stacks, that is, rather than sitting at a terminal -- you might have seen other long rows of dark green books with gold lettering, published by the University of Toronto Press and bearing the name of Frye.
The resemblance between The Collected Works of Northrop Frye (in 30 volumes) and the Frazerian monolith is almost certainly intentional, though presumably not the questions such a parallel invites: What do we do with a pioneer whose role is acknowledged and honored, but whose work may be several degrees of separation away from where much of the contemporary intellectual action is? Who visits the monument now? And in search of what?
Part of the answer may be found in Essays on Northrop Frye: Word and Spirit, a new collection of studies by Robert D. Denham, professor emeritus of English at Roanoke College. The publisher named on the title page is Iron Mountain Press of Emory, Va., which appears not to have a website; the listing for the book on Amazon indicates that it is available through CreateSpace Independent Publishing Platform, a print-on-demand service.
Denham has written or edited more than 30 books by or about Frye, including several volumes of notebooks, diaries, letters and works of fiction in the Collected Works, for which he also prepared the definitive edition of Anatomy of Criticism. The second of the three sections in Word and Spirit (as I prefer to call the new book) consists of essays on the Anatomy, examining Frye’s ideas about rhetoric and the imagination and brandishing them in the face of dismissive remarks by Frederick Crews and Tzvetan Todorov.
Frye’s relative decline as a force to be reckoned with in literary theory was already evident toward the end of his life; at this point the defense of Frygian doctrine may seem like a hopelessly arrière-garde action. (“Frygian” is the preferred term, by the way, at least among the Frygians themselves.) But the waning of his influence at the research-university seminar level is only part of the story, and by no means the most interesting part. The continuing pedagogical value of the Anatomy is suggested by how many of Frye’s ideas and taxonomies have made their way into Advanced Placement training materials. Anyone trying to find their way around William Blake’s poetic universe can still do no better than to start with Frye’s first book, Fearful Symmetry (1947). Before going to see a Shakespeare play on stage, I’ve found it worthwhile to check what Frye had to say about it. Bloggers periodically report reading the Anatomy, or Frye’s two books about the Bible and literature, and having their minds blown.
Northrop Frye is the rare case of a literary theorist whose critical prose continues to be read with interest and profit by people who are not engaged in producing more of the stuff. In the talk on Frazer, he noted that The Golden Bough appealed to artists, poets and “students of certain aspects of religion” -- which seems, on the whole, like a fair guess at the makeup of Frye’s own posthumous constituency.
What’s been lacking is the single-volume, one-stop survey of the Frygian landscape. The Collected Works have complicated things -- not just by being vast and intimidating (and too expensive for most individuals to afford) but by adding thousands of pages of unpublished material to the already imposing mass of Frye’s work.
Denham is as responsible for adding new turns to the labyrinth as anyone. He is the scholar dedicated enough to have solved the riddle of the great man’s handwriting. Most of the lectures and papers in Essays on Northrop Frye: Word and Spirit draw on the private papers, which are of considerably more than biographical interest. Frye used his notebooks to think out loud and to explain himself to himself, working out the links among the work he’d published and things he wanted to write.
They reveal elements of his inner life that remained unstated, or at most implicit, in Frye’s public writings -- for example, his studies in Buddhist and Hindu thought. He also explored the whole gamut of esoteric and mystical writings from the Corpus Hermeticum and Nicholas of Cusa (respectable) to Madame Blavatsky and Aleister Crowley (shady but undeniably fascinating) to titles such as The Aquarian Conspiracy and Cosmic Trigger: The Final Secret of the Illuminati (“kook books,” as Frye called them). Connections existed between this material and his scholarship (you can’t study Blake or Yeats for long without picking up some Gnosticism and theosophy), but Frye also needed to understand his own religious beliefs and occasional experiences of the ineffable. He was interested in the cosmological side of the literary imagination, but also compelled to figure out his own place in the cosmos.
The drives were mutually reinforcing. But references to these interests in his published work were few and far between, and often enough too oblique to notice. With Denham’s close knowledge of Frye’s writings (scholarly and subterranean alike), Word and Spirit seems like the book that has been needed for some time -- the thread that can take readers into the depths of the Frygian labyrinth. So on those grounds, I can recommend it -- without guaranteeing you’ll find the way back out again.
You can’t judge a book by its neologisms, but the coinages appearing in the first chapter or two of Carl Cederström and André Spicer’s The Wellness Syndrome (Polity) serve as pretty reliable landmarks for the ground its argument covers. We might start with “orthorexia,” which spell-check regards with suspicion, unlike “anorexia,” its older and better-established cousin.
Where the anorexic person avoids food as much as possible, the orthorexic is fixated on eating correctly -- that is, in accord with a strict and punitive understanding of what’s healthy to eat, and in what quantities, as well as what must be avoided as the culinary equivalent of a toxic landfill. It is a sensible attitude turned pathological by anxiety. And in the authors’ interpretation, that anxiety is socially driven: the product of “biomorality,” meaning “the moral demand to be happy and healthy,” as expressed in countless ways in a culture that makes chefs celebrities while stigmatizing the poor for eating junk food.
But diet is only one bailiwick for “wantologists,” somewhat better known as “life coaches,” whose mission it is to “help you figure out what you really want” in life. Cederström is an assistant professor of organizational theory at Stockholm University, while Spicer is a professor of organizational behavior at City University, London. I take it from their account that the wantological professions (there are certification programs) extend beyond one-on-one consulting to include the market in self-improvement and motivational goods and services such as books, workshops and so on. The goal in each case is the combination of physical fitness and positive mental attitude that amounts to an “ideal performance state” for the contemporary employee.
“A recent survey by RAND,” we learn, “found that just over half of U.S. employers with more than 50 staff offer some kind of workplace wellness program,” while 70 percent of companies in the Fortune 200 do so. “In total, U.S. employers spend about $6 billion a year on such programs,” which “are often tied up with employees’ health insurance.”
“Know Yourself, Control Yourself, Improve Yourself” reads one of the chapter subheads, as if to list the slogans from some Orwellian Ministry of Wellness. But where Big Brother ruled through the repression of desire and personal identity, the cultural regime defined by what the authors call “the wellness command” makes every possible concession to individuality and contentment. Indeed, it demands them. Every aspect of life becomes “an opportunity to optimize pleasure and become more productive,” and the experts warn that faking it won’t help: the satisfaction and self-realization must be authentic. We are all the captains of our fates and masters of our souls. Failure to stay healthy and happy -- and flexible enough to adapt to whatever circumstances the labor market may throw at you -- is ultimately a personal and moral failure. So you’d better get some life coaching if you know what’s good for you, and maybe especially if you don’t.
“What is crucial is not what you have achieved,” write Cederström and Spicer, “but what you can become. What counts is your potential self, not your actual self.” The titular syndrome refers to the cumulative strain of trying to respond to all the wellness commands, which are numerous, conflicting and changeable -- a perfect recipe for chronic anxiety, of which an obsession with eating correctly seems like an exemplary symptom. On first reading, I took “orthorexia” to be the authors’ own addition to the language (like “the insourcing of responsibility” and “authenticrat,” per the tendencies described a moment ago) but in fact it turns out to be an unofficial diagnosis in the running for future lists of psychiatric disorders.
The Wellness Syndrome offers, by turns, both a recognizable survey of recent cultural trends and a collage of insights drawn from more original works of social analysis and theory. Much of it will seem more than a little familiar to readers already acquainted with Christopher Lasch’s The Culture of Narcissism, Eve Chiapello and Luc Boltanski’s The New Spirit of Capitalism, Slavoj Zizek’s sundry discussions of the contemporary superego, or any given book by Zygmunt Bauman or Barbara Ehrenreich published in the past twenty years. These works are duly cited but the ideas not pushed in any new direction. The common principle subtending them all is that cynicism about institutions or the possibility of large-scale social change creates a privatized, moralistic ideology that traps people into punitive introspection or the fine-tuning of lifestyles. Unfortunately much of The Wellness Syndrome reads as if such trends began under the administrations of Bill Clinton and Tony Blair.
Alas, no. They were already visible 40 years ago as baby boomers began signing up for weekend explorations in self-discovery with unlicensed therapists who yelled insults at them and wouldn’t let them use the bathroom. Nothing in the new book points to any means or agency capable of changing things in any fundamental way, or even of imagining such a change. Social scientists aren’t obliged to be prophets and, of course, they seldom do a very good job when they try; at best they describe and analyze change once it’s discernible, not before. But after seven or eight years of shocks and aftershocks from a global financial crisis, it’s time for books that do more than put new labels on decades-old problems.
In coining the word utopia, Thomas More was making a pun. The villain of Wolf Hall was, in real life, a learned man who wrote for people who could recognize a joke in Greek when he made one. The island republic of social perfection depicted in his most famous book was a good place (eu-topia), obviously. But it existed only in the imagination: it was also, literally, no place (ou-topia).
Alternating currents of optimism and skepticism crackle in the space between syllables. The ambivalence vanishes with “dystopia,” which, like dysentery (“bad bowels”), has nothing to recommend it. But there is more to dystopia than has been encoded in its etymology. The word usually implies utopia’s evil twin: a social order of perfect oppression, designed to bring the greatest misery to the greatest number.
The places Kate Brown writes about in Dispatches From Dystopia: Histories of Places Not Yet Forgotten (University of Chicago Press) are not all examples of hell on earth, by any means, but each bears the scars of some catastrophe that the visitor is bound to know about before arriving: the ghost town of Chernobyl, for example, or the basement of a hotel in Seattle full of the belongings of Japanese-American residents relocated to internment camps during World War II. The author introduces herself as “a professional disaster tourist,” though her day job is as a professor of history at the University of Maryland, Baltimore County. Her two previous books grew out of research on Russia and Ukraine during the Soviet era. Dispatches From Dystopia pursues many of the same interests while also working reflexively to consider the genres available for writing about place and memory: professional historiography, of course, but also personal narrative and travel writing.
“Many writers presume that the site of action is a given,” she notes, “as if places were neutral containers of human interaction rather than dynamic places in their own right.” At the same time, scholarly prose is often written from the vantage point of the proverbial “man from nowhere.” Make that “person from nowhere,” rather -- anyway, a voice that, while not omniscient, remains as rigorous and impersonal as possible.
“In their quest to explore the human condition,” Brown writes, “historians can hide behind their subjects, using them as a scrim on which to project their own sentiments and feelings. Let me put that another way: in my quest to explore the human condition, I have hidden behind my subjects, using them as a scrim on which to project my own sentiments and feelings. The third-person voice is a very comfortable one in which to reside. Permanently. The intimacy of the first person takes down borders between the author and the subject, borders that are considered by many to be healthy in a profession that is situated between the social sciences and the humanities.”
Such intimacy brings the potential for extreme embarrassment. Brown prefaces the lines just quoted by saying that her hands are sweating as she writes them. Her early ventures into first-person scholarship met with resistance, expressed in well-meant warnings such as, “You won’t get a job with that dissertation” and “Other scholars will assign you, but not cite you.” Which is understandable, because other risks besides personal and professional awkwardness can follow from experimentation of the kind Brown undertakes. The existence of “borders between the author and the subject” at least reduces the dangers of twee memoir -- and also of prolonged metaepistemic inquiry (how can the knower know the knower, much less the known?) that scorches the earth with tedium.
So for the first several pages of Dispatches From Dystopia I braced myself, only to find that Brown is the rare case of someone who can incorporate a number of registers of narrative and reflection within the same piece of writing, shifting among them with grace and quiet confidence. Her essays might be called position papers: topographical surveys of historical sites, with the mapmaker’s own itinerary sketched in.
The trips to erstwhile Soviet republics are not, she makes clear, a search for roots. A product of “the industrial heartland of the United States at a time when it was the world’s most prosperous and powerful country,” she is unaware of any German, Jewish or Slavic branches to her family tree: “I could hardly have been born farther from rural, famished, collectivized, heavily politicized, bombed and terrorized Right Bank Ukraine” -- the subject of her first book -- “a place that stands in my mind as the epicenter of 20th-century misery.”
But another essay suggests the advantages of this presumed naïveté. People she met granted the author a place in post-Soviet society “as an honorary child…. If I accepted this role passively, relinquishing my status as an autonomous adult and the critical rationality of a researcher, they often let me in, if fleetingly, for a closer look. By becoming childlike -- susceptible, disabled and dependent -- I became a temporary member of their community, which in the Soviet Union was defined by an understanding of biological vulnerability, mutual interdependence and obligation.”
Other expeditions require different personae. Her trip to what’s left of the city of Chernobyl elicits another kind of identification with people who have been there. Expecting a scene from the opening days of the Gorbachev era -- irradiated but frozen in time -- she finds that everything that can be sold has been hauled off to market: “Even the knobs on the kitchen cabinets were gone. Even the time capsule schoolchildren buried in the 1970s had been looted. (I know because I was hoping to dig it up and loot it myself.)”
Brown’s first-person reflections are embedded in narratives and place descriptions that are more intricate and varied than a reviewer can even begin to suggest, and certain issues and motifs link the essays in ways that would probably reward a second reading. Each piece, like the volume as a whole, is an example of nonfiction that uses the first person, rather than just indulges it. The learned essay and the personal essay are different creatures and attempts to create a hybrid are often problematic at best. But Dispatches From Dystopia proves it can be done.
My ears have been burning: Michael Eric Dyson’s philippic directed at Cornel West, published a few days ago at the website of The New Republic, echoes much of my grumbling and gnashing of teeth in this column back in late 2009, following the publication of Brother West, an “as told to” autobiography. Dyson now calls that volume “an embarrassing farrago of scholarly aspiration and breathless self-congratulation” -- quite an astute characterization, if I say so myself.
The New Republic article is the most public and substantial (or at least sustained) phase of a conflict that began late in President Obama’s first term. Until then, the West-Dyson relationship was close -- practically symbiotic. A professor of philosophy at Union Theological Seminary, West is also an emeritus professor at Princeton University, where in the early 1990s he served on the dissertation committee for Dyson, who is a professor of sociology at Georgetown University. In 1995 -- when a string of articles appearing in The Atlantic, The New Yorker and other high-profile venues identified them as members of a new cohort of black public intellectuals -- West and Dyson still had what was clearly a mentor-protégé relationship, and their dialogue tended to be, as Adolph Reed Jr. put it in a blistering essay at the time, “a publicist’s delight, a hyperbolically log-rolling love fest.”
The mutual-admiration arrangement lasted until sometime near the end of the first Obama administration, when West turned up the heat on his criticisms of the president as (among other things) a “black mascot of Wall Street oligarchs” and “the head of the American killing machine.” A number of black liberals took issue with West’s hard left turn. But it was Dyson’s defenses of the president that seemed especially to rankle West. In August 2013, West singled out Dyson by name as one of the people “who’ve really prostituted themselves intellectually in a very ugly and vicious way.”
Similar pleasantries followed. Dyson’s response was muted until earlier this month, when he made some not very subtle allusions to West at a meeting of the National Action Network, the civil rights organization founded by Al Sharpton. “Be honest and humble in genuine terms,” Dyson said, “not the public performance of humility masquerading a huge ego. No amount of hair can cover that.” His more expansive remarks in print run to more than 9,000 words, accompanied by a drawing in which West appears to have a very bad case of dandruff.
One assessment now making the rounds is that it’s a lamentable case of the white establishment turning two formidable African-American minds against one another when otherwise they might be uniting against all that merits ruthless critique. I doubt a more inane judgment is possible. A pretty thoroughgoing ignorance of African-American intellectual history would be required to assume that black thinkers can’t or won’t do battle without there being some Caucasian fight promoter involved. Richard Wright never entirely recovered from James Baldwin’s essay “Everybody’s Protest Novel.” The great but long-neglected black sociologist Oliver C. Cox was scathing about the work of his colleague E. Franklin Frazier.
Such conflicts can be psychobabbled into meaninglessness, of course. Cox’s remarks were attributed to jealousy (Frazier became the first African-American president of the American Sociological Association in 1948, the same year Cox published his overlooked masterpiece Class, Caste, and Race) while Baldwin’s critique of Wright seems like a perfect example of the Oedipal conflict between authors that Harold Bloom calls “the anxiety of influence.” And yes, the ego will take its revenge, given a chance. But real differences in understanding of American society or the role of the artist were involved in those disputes. Those who profess to favor a vigorous intellectual life, and yet deprecate polemic, want crops without plowing up the ground.
But in moving from Baldwin/Wright and Cox/Frazier to Dyson/West, we descend a hundred miles in conceptual altitude. The earlier debates are still interesting to revisit, while the sooner we forget this one, the better. For at issue here are not ideas or principles but questions of demeanor and attributions of motive. It is the way celebrities feud.
My complaint of a few years ago was that Brother West treated intellect as little more than grounds to earn a backstage pass to meet famous people. It was frustrating and dismaying, and the passing of time has not made anything better: I find myself in the awkward and disagreeable position of agreeing with West’s opinions about Obama (and so concurring with Dave Zirin’s criticism of the New Republic article) while growing even more disappointed with West’s sense of priorities.
He hasn’t returned to philosophy or social analysis. He appears content with what I’ve come to think of as “that speech Cornel West always gives.” It is a set list of standard references, sparkling and variously arranged, like the bits of colored glass in a kaleidoscope:
“Coltrane and Chekhov, Foucault and Funkadelic. Du Bois wore a three-piece suit like this one. Structural inequality; the Panthers sold their paper near Yale when I was a student there; applause-winning mention of Larry Summers and/or Spike Lee. Quotation(s) from my dear brother _______ [famous philosopher or performer]. Nihilism is bad, bluesman of the mind; keep hope alive.”
It is never the same speech, yet it is always the same speech. In it are occasional riffs from West’s early writings, but they go undeveloped. Circa 1990, the prospect of seeing him work out the deep links between Chekhov and Coltrane was intriguing. Now it’s just a shiny piece of glass, pretty enough but not going anywhere.
Dyson’s essay is for the most part a chronicle of a friendship betrayed, but it does make a telling point. The issue is West’s constant references to the Judeo-Christian idea of prophecy, understood not as prognostication but as advocacy for justice and righteousness. The word “prophetic” appears in a number of West’s titles, in ways that suggest it applies to the author himself, or at least the book. But he has never offered “detailed comparative analyses of prophets in Judaism, Christianity, Islam or Zoroastrianism,” Dyson says. “…He hasn’t explored the differences between social and political prophecy, examined the fruitful connections between the biblical gift of prophecy and its cultural determinants, or linked his understanding of prophecy to secular expressions of the prophetic urge found in New Left radicalism, for example….”
Dyson considers the vagueness all too convenient. It leaves West free to put on the prophetic mantle when and how he sees fit -- to issue warnings and denunciations while never clarifying the grounds for his claim to assume that role. In challenging this blind spot, Dyson also challenges the authority upon which West’s discourse rests.
As rhetorical strategy goes, it’s a shrewd move. Dyson targets something more fundamental than West’s political stance, and something harder to hit than a side-of-the-barn-sized ego. It will be interesting to see if West takes up the challenge. His students at Union Theological Seminary ought to press him on it.
At the same time, grounding their disagreement within the terms of their shared religious faith leaves open the possibility of reconciliation. Dyson’s other point about prophecy is that the prophet’s inspiration coexists with human fallibility. All of the most pointed jabs at West -- his vanity, appetite for media attention and intellectually lightweight work -- are also reflexive. “West’s off-the-cuff riffs and rants,” Dyson says, “spoken into a microphone and later transcribed to page, lack the discipline of the written word.” Coming from the man who published Debating Race With Michael Eric Dyson, a collection of transcripts from his television appearances, let’s hope this was meant as self-critical.
From mutual admiration to mutual recrimination -- to mutual forgiveness? Who knows? The next move is West’s. Five years ago, I hoped, against all odds, that Brother West might count as hitting rock bottom.
Alas, no. West’s activities since then have included a cameo appearance on a situation comedy. He also offered himself as the bait to lure thousands of fans into attending his “dialogue” with a Maoist cult leader whose grandiosity and verbosity did not lend themselves well to conversation, as such.
So, to repeat: I agree with a very large portion of what West says, but only his worst enemy could feel much enthusiasm for the use he makes of his time.
Probably the best-known fact about The Higher Learning in America by Thorstein Veblen (1857-1929) is that the author’s original subtitle for it was “A Study in Total Depravity.” By the time the book finally appeared in print in 1918, the wording had been changed to “A Memorandum on the Conduct of Universities by Business Men,” which gives the reader a clearer sense of the contents, albeit at a considerable loss in piquancy.
The “memorandum” nonetheless displayed Veblen’s knack for turning a phrase that twisted the knife. He attacked the “bootless meddling” of governing boards and the “skilled malpractice and malversation” of the presidents they appointed. These “captains of erudition” (a play on the then-recent expression “captains of industry”) understood the value of a dollar and of publicity, but not much else. To their way of thinking, good public relations meant “tawdry, spectacular pageantry and a straining after showy magnitude.” And worse, they molded higher education in their own likeness.
“The school becomes primarily a bureaucratic organization,” writes Veblen, “and the first and unremitting duties of the staff are those of official management and accountancy. The further qualifications requisite to the members of the academic staff will be such as make for vendibility, volubility, tactical effrontery [and] conspicuous conformity to the popular taste in all matters of opinion, usage and conventions.” The cumulative, long-term effect on the life of the mind? “A substitution of salesmanlike proficiency -- a balancing of bargains in staple credits -- in the place of scientific capacity and addiction to study.”
Veblen was more than a satirist and scold, brimming over with vitriol and bile. That final expression, for example -- “addiction to study” -- could only have been coined by someone who had experienced what it names, and his critique of the university includes a serious effort to understand its nature and history. But Veblen’s grievances against the higher education of his day were both substantial and doggedly pursued, and even the most analytical portions seem driven by sublimated anger.
In the 1890s and early 1900s, he was a professor at the University of Chicago and then at Stanford University, but in each case he left under a cloud of scandal. Besides his religious disbelief and his acerbic (if not misanthropic) disposition, there were the rumors about his animal magnetism -- which, it was said, irresistibly pulled colleagues’ wives into bed with him.
In fact, the rumors were spread by his first wife, who brought them to the notice of the presidents at Chicago and Stanford. They confronted Veblen, who declined to respond -- something his first and most influential biographer, Joseph Dorfman, took as an admission of guilt. As far as I can tell, contemporary Veblen scholarship rejects that judgment entirely, treating the charges of Don Juan-ism as fallout from the dissolution of a marriage that (in a telling detail about the level of estrangement here) seems never to have been consummated.
Be that as it may, the image that the literary and intellectual historian Daniel Aaron depicted in an essay from 1947 has continued to color how Veblen is read. “Irascible, dour and sardonic,” Aaron wrote, “living precariously along the fringes of the American university world he anatomized so mercilessly, Veblen remained during his lifetime a kind of academic rogue, admired by an increasing number of discriminating disciples but never winning the kudos handed out to his less able but more circumspect colleagues.”
Nearly all of whom were soon utterly forgotten, of course, but not Veblen, whose grievances -- whether about “conspicuous consumption” in society at large or “nugatory intrigue and vacant pedantry” within the groves of academe -- retain a certain vigor and bite. The opening pages of the new edition of The Higher Learning in America from Johns Hopkins University Press call it “an appropriate way to mark the centennial of Veblen’s great book,” and most of the back cover is taken up with comments by historians and critics of higher education, noting how disconcertingly timely it still seems.
The editor, Richard F. Teichgraeber III (a professor of history at Tulane University), has prepared what’s bound to remain the standard edition of the text for a long time to come. His extensive yet unobtrusive notes “identify -- when identification proved possible -- events, institutions, persons and publications alluded to or mentioned,” and he glosses the literary quotations and biblical references embedded in Veblen’s wild and sometimes woolly prose. The timeline of Veblen’s life and the recommended-readings list benefit from the past three decades of Veblen scholarship; in contrast, Dorfman’s biography from 1934 often looks like a target after a busy day at the shooting range. But the text’s apparatus limits itself to presenting the positive side of revisionist efforts rather than continuing to fire away.
For his own part, Teichgraeber, in his introductory essay, presents The Higher Learning in America as a more policy-minded work than a reader with little sense of context -- beyond knowing about that abandoned subtitle -- is likely to imagine going in.
Veblen started writing an essay on the university in 1904 and continued revising and expanding it for another dozen years, despite the reactions of colleagues and publishers, who were discouraging or appalled. In the preface drafted in 1916, he admits that circumstances “made it seem the part of insight and sobriety… to defer publication, until the color of an irrelevant personal equation should again have had time to fade into the background.” Veblen kept the discussion of institutional problems and academic politics on a level of generality that avoided naming names or describing his own troubles. But the note of personal frustration was audible even so, and readers at the time could hear it. (As, indeed, readers can now, though they'll usually need the annotation to fill in the details.)
But Veblen was not the only figure turning a critical eye on the higher education of his day. European models of graduate study and the research university, combined with the proliferation of land-grant colleges, inspired running public debates over academic freedom, curriculum reform, funding and so on. Teichgraeber points out that the whole genre of commentary even had a name to distinguish it: “the professors’ literature of protest.”
Veblen indicates that The Higher Learning in America was written in response to this “bulk of printed matter,” but without quoting it or identifying whom he’s answering. Perhaps he wanted to stop short of antagonizing people he hadn’t already made enemies of, or causing trouble for anyone he agreed with. But our editor and annotator knows his way around “the professors’ literature of protest” and can make reasonable surmises about what articles and authors Veblen had in mind.
Between those sotto voce arguments and the biographical details, we can finally put his spleen in context. The Hopkins edition makes the best case possible for The Higher Learning in America as a serious contribution to institutional critique. At the same time, it’s the book in which Veblen refers to the corporate university as an “abomination of desolation.” Even without an annotation guiding you to the Book of Daniel, it’s easy to recognize that as something “often thought but ne’er so well expressed.”
When assessing scholarly books, pleasure is not normally a factor, any more than flavor is in judging medicines. Calling a monograph enjoyable is, after all, at best an expression of personal judgment. At worst, it’s a breach of the “professional professorial asceticism” that Pierre Bourdieu identified as definitive for Homo academicus.
But what the hell. A columnist has no other protocol to meet than deadline, so let the enthusiasm roll: Lothar Müller’s White Magic: The Age of Paper (Polity) is the most enjoyable scholarly book I’ve read in a while, despite my initial suspicion that it would be just one more example of the rather hackneyed genre of middlebrow cultural histories with titles like Salsa: The Condiment That Changed Everything.
White Magic is, just to be clear, a serious work in the field of media studies. Müller, a professor of general and comparative literature at the Free University of Berlin, follows through on the implications of the Canadian historian Harold Innis’s work in a more cogent and coherent way than Innis’s best-known follower, Marshall McLuhan, ever did. The bibliography is broad and dense, and the text moves between economic and technological history and literary works in ways that shed light in all directions.
That said: what a great read! It is a book to warm up the brain on a day of mental fog. It’s possible to open up White Magic at random and find a piece of historical information or analysis that is interesting and suggestive in its own right, elaborated in prose that develops its points clearly, with none of the anxious tics (“unfortunately there is not space here to examine...”) that come from straining to establish authority without having the confidence to exercise it.
First published in Germany three years ago, and now available in English for the first time, White Magic continues the drawn-out effort to understand the changes in publishing, and in society at large, wrought by digital communication. Besides his academic position, Müller is editor of the features section of the Süddeutsche Zeitung, a newspaper. That’s certainly one way to experience the ongoing epochal shift of recent years up close and personally.
But White Magic isn’t a defense of print media, or even a eulogy. It challenges the perspectives embedded in the familiar grand narrative of an age of print, dawning with the invention of movable type, that has entered its twilight with the advent of the digital age. Perhaps the best way to introduce Müller’s point is to consider our presuppositions about Gutenberg’s innovation.
The familiar story is that his press made it possible to produce, at much greater speed and in far larger quantity, texts that in earlier centuries would have been copied by hand onto papyrus, parchment or vellum. From scroll to codex to bound volume, there was a continuity in the history of the book -- changes in format tending to make books more durable, with Gutenberg introducing the catalytic factor of mass production.
And very often the story then continues by recounting the intense, even convulsive impact of all that speedy production of writing in bulk: journalism, pamphleteering, the Protestant reformation, etc.
But imprinting ink on a surface with movable type required that the material it was printed on possess certain qualities (especially standard dimensions and consistent smoothness, but also resilience under pressure from metal type) and that it be available reliably and in great bulk. To put it another way, Gutenberg’s invention depended on a still earlier invention, paper, which was itself a mass-produced commodity, turned out in protofactories that represented sizable investments as well as wide distribution networks.
How paper manufacture was invented in China, perfected by the Arabs and eventually adopted throughout Europe is an exemplary piece of transnational history -- and given Inside Higher Ed’s audience, it’s worth noting the huge impact on university budgets, almost from the moment there was such a thing. “To free itself from dependence on paper dealers from Lombardy,” Müller writes, the University of Paris “successfully petitioned the king in 1354 for the right to run paper mills” of its own, operated by craftsmen “who had the status of university employees.”
That was well after Italian universities found a way around the costly reproduction of textbooks, circa 1200, by authorizing the transcription of parchment books as paper editions that “would be split into smaller pieces by book dealers or stationers, who would rent out the pieces to students.” Then the students would make their own copies or hire a scribe to do it for them.
Paper was a dynamic commodity. The supply created its own demands, accelerating if not creating bureaucracy and postal networks even before the printing press came on the scene. Müller’s chronicle of these developments and their cumulative impact is rich in detail but surprisingly brisk in the telling.
The significance of this history, the author explains, comes from the fact that “paper was never on its own; it always sought a symbiosis with other media.” We often talk about communication technologies in ways that stress conflict, forced obsolescence, the replacement of one medium by another.
“But media history also encompasses effects of resonance amplification and the symbiosis and feedback between media which have not become technologically integrated but instead react to and cooperate with one another as distinct, separate spheres.”
In Müller’s interpretation, paper stands as “a virtuoso of substitution... insinuating itself into existing patterns and routines” -- very much like digital communication itself, so often taken to be paper’s antithesis.
The translator, Jessica Spengler, has made the unusual choice to leave the Teutonic sprawl of Müller’s paragraphs intact, rather than breaking them up into pieces of less formidable size. A few run for three pages or more, and even the shorter ones sometimes read like miniature essays. While expansive, though, the paragraphs are lean. (Müller seems to have ignored Walter Benjamin’s tongue-in-cheek advice to academic authors: “Everything that is known a priori about an object is to be consolidated by an abundance of examples.... A number of opponents all sharing the same argument should each be refuted individually.”) White Magic is a remarkably concentrated book; that, I think, is why it will likely prove a re-readable one.
In 2012, Jessica L. Beyer received the Association of Internet Researchers award for her dissertation, “Youth and the Generation of Political Consciousness Online,” which has now been published as Expect Us: Online Communities and Political Mobilization (Oxford University Press).
The author, now a research scientist at the Information School at the University of Washington, spent several years monitoring and in some cases participating in a number of online communities which, though non-political, sometimes engaged in political discussion. Her analysis focuses on four sites. In two cases, the political concern led to offline activity, including the creation of parties that have won elections. At the other two sites, the conversation never made the leap to mobilization. Beyer’s study is a series of ethnographies of the miniature social orders emerging at the sites, in search of the factors that generated or inhibited activism.
“Once I had chosen to study social sites,” Beyer explains in a long postscript on research methodology, “I had also chosen to study young people.” There’s an implicit “of course!” hovering over the remark -- and fair enough, given that she did her digital fieldwork in the late ‘00s. But social sites have greyed somewhat in the meantime. Beyer’s perspective on “the generation of political consciousness online” may well apply to a broader demographic by now.
One of the sites in question enabled file-sharing, primarily of music and video, while two others were devoted to online gaming. The driving interest of a fourth cohort, the group known as Anonymous, seems harder to identify, though Beyer pins it down as well as seems possible by calling it “the nihilistic pursuit of entertainment, referred to as ‘lulz.’” Major sources of lulz (an idiom derived from an acronym: it’s the plural of LOL) include trolling, hoaxing, hacking, and “breaking s[tuff]."
The readerly appeal of ethnography usually comes from its attention to the details of everyday behavior and interaction taken for granted within a subculture. And that’s certainly true in the case of Anonymous, which -- like the Droogs in A Clockwork Orange -- has its own tightly self-encapsulating argot and code of conduct. The file-sharing and online-gaming communities also have specialized lingos and accepted norms, just as a stamp-collecting club might develop. But with Anonymous the markers of in-group status are much more sharply defined. Beyer understands this peculiarity to be a function, in part, of the design of the discussion forums that gave rise to Anonymous. Participants are never identified, even by a pseudonym, and venues do not have archives.
Because distinct identity is obliterated, “users assert their membership status in different ways,” writes Beyer. “To signal they are community members, users must use an extremely dense lexicon; show familiarity with community jokes and stories (signaling knowledge in a very particular way); articulate community values both directly and in the ways in which they frame conversations; and adhere to community norms of anonymity in all interactions, even when telling personal stories (e.g. ‘my math teacher is so stupid….’). Because of these norms of behavior, although the space is technically ‘anonymous,’ outsiders are easily spotted.”
While they provide optimal conditions for digital hooliganism, these norms would also seem to make political mobilization impossible -- or, for that matter, completely irrelevant. (Misanthropic individualism tends to preclude any idea of the common good.)
But in 2008, the Church of Scientology forced a number of websites to take down the leaked video of a giddy Tom Cruise discussing his super-powers, and Anonymous responded with a campaign of attacks on its sites, accompanied by a memorable video of its own declaring war on Scientology. Faced with an angry swarm of unidentified and unidentifiable hackers, Scientology’s longtime strategy of litigation against its opponents was of no use. Members of Anonymous then joined forces with longtime critics of Scientology, many of them ex-members, to launch a worldwide series of protests outside its buildings which have continued, on and off, ever since.
Likewise, Pirate Bay, the file-sharing entity originally based in Sweden, took on the motion-picture and recording industries through street protests as well as its online activity. In 2006, it spawned a Pirate Party calling for the abolition of copyrights and patents and respect for privacy. By the end of the decade it was the fourth largest party in Sweden (with, Beyer notes, “the largest youth membership as well as the largest youth organization in Sweden”) and held two seats in the European Parliament. There are now Pirate Parties in at least 40 countries, with candidates elected to hundreds of offices at various levels of government, riding waves of discontent with intellectual property laws and surveillance.
Pirate Bay and the Pirate Parties share an ethos while remaining distinct. File-sharers can be anonymous, but not electoral candidates. While the original site administrators gave the political movement some direction, legal actions attempting to shut down Pirate Bay forced it to build anonymity into its very structure: it operates through a network of servers dispersed over an unknown range of countries, with no individual or group knowing more than a little of the system.
So anonymity, however counterintuitive this may seem, was a major factor in enabling the communities around two sites to move towards real-world activism. By contrast, the other two formations Beyer studied -- the game World of Warcraft and an online discussion-board system called the Imagine Gaming Network -- required users to register and regulated their speech and behavior in ways that, she says, “undermine[d] collective group mobilization.”
Her account of how the different sites’ layouts conditioned the degree of participants’ visible identity reveals a number of interesting contrasts -- particularly between World of Warcraft, in which creation of an identity is part of the game, and the milieu of Anonymous, in which doing so is effectively impossible. On the gaming sites, in Beyer’s analysis, people are able to form smaller groups defined by shared interests or beliefs, but they never reach the critical mass needed for mobilization in the offline world.
Perhaps, but other differences bear mentioning. Both WoW and IGN.com are commercial enterprises which exist strictly for entertainment. Individuals drawn to Anonymous or file-sharing through Pirate Bay are looking for entertainment too, of course. But they do so in ways that violate – or at best skirt – legal norms.
A gathering of stamp collectors might well include members also interested in international affairs. But no matter how passionate their discussion may become, they aren’t likely to mobilize the club on non-philatelic matters. I suspect the gaming sites resemble the stamp collectors: they aren’t engaged in anything that challenges the powers that be -- while Anonymous and the Pirates are, and wave a flag while doing it. Beyer’s case studies are interesting, but her findings are not entirely unexpected.
Some years ago I met a woman who owned a large calico cat bearing a certain resemblance to Queen Victoria: stout, regal, disapproving. She had enjoyed her mistress’s undivided attention as a kitten; the second cat, who joined them a few years later, proved easy to dominate. But the large male primate who began coming to the apartment on some evenings was another matter. I appeared incapable of taking a hint, and she was not amused.
Before long I was spending most evenings there. Oaf though I may have been, I did get a feeling of being disdained, at best, and could imagine the older cat taking the woman aside to say, through unhappy looks and feline telepathy, “This guy has got to go.” If things ever reached that point, the woman held her ground -- and reader, I married her. (The cat with less seniority had in the meantime grown fond of me, which may have helped.)
The situation de-escalated before reaching the stage depicted in Octave Tassaert’s Die eifersüchtige Katze (The Jealous Cat), one of the paintings reproduced in Jealousy (Yale University Press) by Peter Toohey, a professor of classics at the University of Calgary. The author roams across several cultures, media, and disciplines in his investigation of the green-eyed passion. In literature, jealousy tends to resemble a kind of madness, and it usually becomes part of the daily news only after escalating into lethal violence. But Tassaert’s canvas presents the emotion in one of its more comic expressions.
Painted circa 1860, “The Jealous Cat” depicts a love triangle of sorts. We see a woman sprawled on her bed in dishabille -- with her lover in, let’s say, close proximity, still clothed for the most part but with his pants below his knees. (A coat hangs on a nearby chair, not draped over the back but thrown on it at an angle suggesting haste.) At first glance he appears to be standing. But the angle of his legs and the way one arm seems to be swinging upward -- and the startled expression on his face as he looks over his shoulder -- all suggest he has just bolted upright. Just behind him, and a little lower, you see the creature giving the painting its title: a jealous cat, stretching up to sink both claws into the man’s exposed buttocks.
“They’re obviously more tempting than the uninspiring ball of string left by the chair,” Toohey deadpans. The painting itself is humorous, but it raises perennial questions about emotion: Do animals have them, or is that just anthropomorphizing? And if they do experience feelings that in a human would be understood as emotion, how similar to ours are they?
Animals’ inability to self-report their mental states makes any answer more or less unverifiable, and we are in the same position regarding the emotions of the human child in its first few years. What we have with both nonverbal animals and preverbal infants is behavior that looks and sounds like what we associate with happiness, excitement, fear, and perhaps one or two other emotions. But is jealousy among them? The experience of it can be raw and overwhelming, but it responds to a situation that is fairly complex. “The foundation stone of jealousy,” writes Toohey, “is triangular”: the product of a situation “usually [involving] two people and some form of possession, animate or inanimate.” The classic form -- “the clichéd sine qua non of the jealous situation,” as the author puts it -- is the romantic triangle: the jealous party’s claim on the significant other is violated, or at least menaced, by a rival.
Whether or not its brain can process all the elements in play, the jealous feline in Octave Tassaert’s painting has at least determined the fastest and most efficient way to disrupt the situation. A desire to hurt the rival may not be noble, but it’s understandable and reasonably straightforward, especially when the rival is standing right there.
Human beings are prone to making things more complicated. The desire for retribution can target the beloved as well as the rival, and even become more intense – sometimes to really horrifying extremes. The author cites one case that sounds like the brainchild of an exploitation-movie director trying to outdo the competition: A British man who spent a week beating, strangling, and threatening his girlfriend also tried to fill her ears and eyes with quick-sealing putty. In handing down a prison sentence, the magistrate told him: “You are almost insanely jealous.” Almost?!
Explosions of jealousy -- even of sexual jealousy, by all accounts the most excruciating sort -- usually stop short of mayhem. Toohey notes that there is just enough of a stigma around jealousy to limit how openly we feel comfortable expressing it. At the same time, jealousy is a persistent enough force to make subduing it hellishly difficult, and also irresistible as raw material for art and literature. In Othello, the work most indelibly identified with the experience of jealousy, Shakespeare treats it as a passion that, once ignited, feeds itself, with imagination as the fuel -- even when the grounds for it are entirely false.
Toohey writes of the moment when an individual sees or hears something that ignites the emotion. Even when based in rock-solid fact – with no Iago whispering baseless insinuations – the suffering of the jealous person comes mostly from scenes and conversations running in an obsessive loop within the mind. One of the most interesting chapters of Jealousy considers how literary and artistic works present our eyes and ears as the organs that make us vulnerable to the suspicion then elaborated upon within the brain’s theater.
Perhaps that accounts for the bizarre revenge taken by the “almost insanely jealous” man mentioned earlier. And perhaps imagination is the factor distinguishing human jealousy from whatever it is animals feel when faced with rivalry. Our motives are more complex, and our memories are longer. That gives us an evolutionary advantage. But it also opens up wide vistas of potential misery, where the jealous mind is condemned to wander in circles.