Some weeks back, a publishing house in Spain announced that it would be issuing a deluxe facsimile edition of the enigmatic and sui generis volume best known as the Voynich manuscript, in a print run limited to 898 copies, selling at 7,000-8,000 euros each. That’s the equivalent of $7,400 to $8,400 -- a price tag guaranteed to separate the true bibliomaniac from the common run of book collectors.
But then Beinecke 408 (as the volume is also known, from its catalog reference in Yale University’s rare books collection) tends to cast a spell over those who contemplate it for very long. Running to about 200 parchment pages -- or closer to 240, if you count a number of long, folded-up sheets as multiple pages -- it is abundantly illustrated with drawings of plants that have somehow eluded the attention of botanists, surrounded by copious text in an unknown alphabet. It looks like what you’d get from throwing Roman, Greek and Arabic script into a blender along with a few alchemical symbols. At a certain point the artwork takes a noticeable turn: the plants are accompanied by miniature drawings of naked women sitting on the leaves, emerging from broken stems or bathing in pools. Those slightly Rubenesque figures also show up in what appear to be a number of astronomical or astrological charts. A final section of the book consists of page after page of closely written text, with starlike symbols in the margin that seem to indicate section or paragraph divisions.
It sounds like something H. P. Lovecraft and Jorge Luis Borges might have concocted to pull a prank on Umberto Eco. But mere description of the Voynich manuscript little prepares you for the experience of turning its pages, even in the considerably less expensive hardback just published by Yale University Press. The editor, Raymond Clemens, is curator of early books and manuscripts at the university’s Beinecke Rare Book & Manuscript Library. The color reproductions of each page are made at the size of the original, with the ink or paint used by the illustrator at times bleeding through slightly behind the text and artwork on the other side of the parchment. The thing that strikes the eye most about the writing is how concentrated it looks: printed in a crisp, precise hand by someone who, especially in the final pages, seems determined to make good use of the space without sacrificing readability.
Which is, of course, the maddening thing about the book -- almost literally so, at times. The effort to figure out what it says has tested, and defeated, the mental powers of numerous researchers over the past century, beginning not long after the bookseller Wilfrid M. Voynich acquired it in 1912. (The Yale edition includes a detailed biographical article on Voynich, who seems to have escaped from the pages of a novel by Dostoyevsky before settling in London and moving, eventually, to New York. The label “bookseller” is too narrow by far. One telling detail: his father-in-law was George Boole.)
The first scholar to throw himself into solving the riddle was William Romaine Newbold, professor of moral and intellectual philosophy at the University of Pennsylvania, whose The Cipher of Roger Bacon was posthumously assembled from his notes and manuscripts and published in 1928. Its title reflects the earliest known attribution of the work: a letter from 1665 or ’66 reports that the book had been owned by Rudolph II -- the Holy Roman Emperor, patron of Johannes Kepler and alchemy enthusiast -- who believed the author to be the 13th-century English scientist and monk Roger Bacon. (Not to be confused with Francis Bacon, also an English scientist, born 300 years after the monk’s prime.)
Knowing that Roger Bacon was a pioneer in the study of optics and had experimented with lenses, Newbold boldly combined fact and speculation to argue that the Voynich manuscript reported Bacon’s discoveries using the microscope (spermatozoa, for example) and the telescope (the Andromeda galaxy). Furthermore, the hieroglyphs in the mysterious text actually consisted of much smaller letters -- combined in a code of great sophistication -- which were only visible with a microscope.
Quod erat demonstrandum, sort of. Admirers of Roger Bacon found the interpretations plausible, anyway. But in 1931, the medievalist journal Speculum published a long and devastating assessment of Newbold’s methodology, which concluded that the code system he’d deduced was so vague and arbitrary that the messages he unearthed were “not discoveries of secrets hidden by Roger Bacon but the products of his own intense enthusiasm and his learned and ingenious subconsciousness.”
That judgment surely inspired caution among subsequent Voynich analysts. I found one paper, published in Science in 1945, claiming to have determined that the manuscript was written by a 16th-century astrologer and herbalist known to have had a particular interest in women’s illnesses. The researcher ends his report by insisting that it was not the product of “a learned and ingenious subconscious.” Be that as it may, the author also felt that “present war conditions” made it “undesirable to publish, at this time, the details of the key.”
That note of hesitation foreshadows one of the many interesting points made in the generally excellent short essays accompanying the Voynich manuscript in the Yale edition: “The extent to which the problems it poses have been a matter of professional as well as amateur interest is reflected in the fact that the best book-length introduction to this ‘elegant enigma’ was written by a government cryptologist … and published in-house by the U.S. National Security Agency.” The monograph (now in the public domain and available to download) indicates that the NSA had already played host to quite a bit of hard-core Voynich inquiry by the late 1970s, and who knows how much computational power has been directed at cracking it since then.
The Yale University Press edition ventures no theory of who created the Voynich manuscript or what it says. A chapter reporting on examination and tests of the material indicates that the parchment can be dated to roughly 1420, give or take a couple of decades, while multispectral imaging reveals the erased signature -- now invisible to the naked eye -- of a pharmacist who died in 1622, using the noble title he received in 1608. That may not remove all possibility of a hoax, but it would seem to backdate it by a few centuries. The enigma, like the book itself, has proven nothing if not durable; this handsome and (relatively) affordable edition will serve to spread its fascination around.
In his autobiography, Benjamin Franklin describes how, as a striving young man in Philadelphia, he practiced a quite literal variety of moral bookkeeping. Having determined 13 virtues he ought to cultivate (temperance, frugality, chastity, etc.), he listed them on a table or grid, with the seven days of the week as its horizontal element. At night, before bed, he would make a mark for each time he had succumbed to a vice that day, in the row for the virtue so compromised.
A dot in the ledger was a blot on his character. Franklin explicitly states that his goal was moral perfection; the 13th virtue on his list was humility, almost as an afterthought. But without claiming to have achieved perfection, Franklin reports that his self-monitoring began to show results. Seeing fewer markings on the page from week to week provided a form of positive reinforcement that made Franklin, as he put it in his late 70s, “a better and a happier man than I otherwise should have been had I not attempted it.”
Franklin’s feedback system was a prototype of the 21st-century phenomenon analyzed by Deborah Lupton in The Quantified Self (Polity), a study of how digital self-tracking is insinuating itself into every nook and cranny of human experience. (The author is a research professor in communication at the University of Canberra in Australia.) A device or application is available now for just about any activity or biological function you can think of (if not, just wait), generating a continuous flow of data. It’s possible to keep track of not only what you eat but where you eat it, at what time and how much ground was covered in walking to and from the restaurant, assuming you did.
In principle, the particulars of your digestive and excretory processes could also be monitored and stored: Lupton mentions “ingestible digital tablets that send wireless signals from inside the body to a patch worn on the arm.” She does not elaborate, but a little follow-up shows that their potential medical value is to provide “an objective measure of medication adherence and physiologic response.” Wearable devices can keep track of alcohol consumption (as revealed by sweat), as well as every exertion and benefit from a fitness routine. Sensor-equipped beds can monitor your sleep patterns and body temperature, not to mention “sounds and thrusting motions” possibly occurring there.
Self-tracking in the digital mode yields data about the individual characterized by harder-edged objectivity than even the most brutally honest self-assessment might allow. For Franklin, the path to self-improvement involved translating the moral evaluation of his own behavior into an externalized, graphic record; it was an experiment with the possibility of increasing personal discipline through enhanced self-awareness. The tools and practices that Lupton discusses -- the examples cited above are just a small selection -- expand upon Franklin’s sense of the self as something to be quantified, controlled and optimized. The important difference lies in how comprehensive and automated the contemporary methods are (many of the apps and devices can run in the background of everyday life, unnoticed most of the time), as well as how much more strongly they imply a technocratic sense of the world.
“The body is represented as a machine,” writes Lupton, “that generates data requiring scientific modes of analysis and contains imperceptible flows and ebbs of data that need to be identified, captured and harnessed so that they may be made visible to the observer.” But not only the body: other forms of self-tracking are available to monitor (and potentially to control) productivity, mood and social interaction. One device, “worn like a brooch … listens to conversations with and around the wearer and lights up when the conversation refers to topics that the user has listed in the associated app.”
Along with the ability to monitor and control various dimensions of an individual’s existence, there is likely to come the expectation or obligation to do so. On this point, Lupton’s use of the idea of self-reflexivity (as developed by the social theorists Zygmunt Bauman, Ulrich Beck and Anthony Giddens) proves more compelling than her somewhat perfunctory and obligatory references to Michel Foucault on “technologies of self” or Christopher Lasch on “the culture of narcissism.” The digitally enhanced, self-monitoring 21st-century citizen must meet the challenge of continuously “seeking information and making choices about one’s life in a context in which traditional patterns and frameworks that once structured the life course have largely dissolved … Because [people] must do so, their life courses have become much more open, but also much more subject to threats and uncertainties,” especially “in a political context of the developed world -- that of neoliberalism -- that champions self-responsibility, the market economy and competition and where the state is increasingly withdrawing from offering economic support to citizens.”
In such a context, high-tech self-tracking can provide access to exact, objective self-knowledge about health, productivity, status (there are apps that keep track of your standing in the world of social media) and so on. Know thyself -- and control thy destiny! Or so it would seem, if not for a host of issues around who has ownership, use or control of the digital clouds that shadow us. Lupton points to a recent case in which lawyers won damages in a personal-injury suit using data from a physical fitness monitor: the victim’s numbers from before and after the accident were concrete testimony to its effect. Conversely, it is not difficult to imagine such data being subpoenaed and used against someone.
The unintended consequences may also take the form of changed social mores: “Illness, emotional distress, lack of happiness or lack of productivity in the workplace come to be represented primarily as failures of self-control or efficiency on the part of individuals, and therefore as requiring greater or more effective individual efforts -- including perhaps self-tracking regimens of increased intensity -- to produce a ‘better self.’” Advanced technology may offer innovative ways to dig ourselves out of the hole, with the usual level of success.
Lupton is not opposed to self-tracking any more than she is a celebrant of it, in the manner of a loopy technovisionary prophet who announces, “Data will become integral with our sensory, biological self. And as we get more and more connected, our feeling of being tied into one body will also fade, as we become data creatures, bodiless, angelized.” (I will avoid naming the source of that quotation and simply express hope that it was meant to be a parody of Timothy Leary.) Instead, The Quantified Self is a careful, evenhanded survey of a trend that is on the cusp of seeming so ubiquitous that we’ll soon forget how utterly specific the problems associated with this aspect of our sci-fi future are to the wealthy countries, and how incomprehensible they must seem to the rest of the planet.
One detail from Sir Thomas More’s Utopia stuck with me after reading it long ago, and it’s come to mind with some regularity over the past few months: on More’s imaginary island, anyone who aspired to high office was judged to be, for that very reason, unfit to hold it.
This fall happens to be the book’s quincentennial. More sent the manuscript to his friend Erasmus in September 1516, and it was in print by the end of the year. That the anniversary coincides with an exceptionally nasty and spirit-blighting American presidential election seems providential, as if to confirm that the Utopians were definitely on to something.
Apart from the systemic ban on political ambition, my only other recollection of Utopia was that it was a bit dull. The sole thing that kept me going was the adolescent conviction (long since abandoned) that starting to read a classic implied a commitment to finishing it, come what may. So when I returned to the book recently, it was without fond associations -- and no expectation at all of laughing, since its satirical quality had gone right over my head.
The title is a pun in Greek: More’s ideal society is a good place (eu-topia) that’s also no place (u-topia). The play on words, while minimally hilarious, hints that the author is working in the same ironic vein as Erasmus had just a few years earlier in The Praise of Folly. There, everything people treat as important, dignified or admirable is shown to be evidence of human foolishness at work. More’s detailed picture of a happy, harmonious, prosperous country serves to highlight the corruption and irrationality of the social and political system 500 years ago -- with every reason to think things would only get worse.
Utopia opens with a reference to Henry VIII, then reigning as “the unconquered King of England, a prince adorned with all the virtues that become a great monarch,” which certainly seems prudent. (Henry did eventually have the author executed, but not for his literary efforts.) The narrator and a friend are joined by one Raphael Hythloday, a learned and widely traveled gentleman, who has some experience with royal failings. Those occupying the throne tend to be “more set on acquiring new kingdoms, right or wrong, than on governing well those they possess,” for example. Influence on the court comes from “only those for whom the prince has much personal favour, whom by their fawning and flatteries they endeavour to fix to their own interests.” His complaints are broad enough to limit how much offense they might give to any particular sovereign.
The narrator and his friend try to persuade Hythloday that his wisdom and experience should be put to use in changing the system from within -- that is, by becoming a courtier. He refuses on the grounds that any reforms he might propose would meet with “proud, morose and absurd judgments” by those with a vested interest in the status quo.
Things are better organized in Utopia, a land somewhere beyond the equator where Hythloday lived for five years. His listeners prevail upon him to describe the place -- and so he does, at some length. The prolonged explanatory monologue became a standard element of utopian fiction; in this, the genre’s foundational work, it fills the remaining two-thirds of the book.
It’s a communist manifesto, minus any process of historical change in getting there. In Utopia there is no private property, no poverty and very few laws. The inhabitants exchange houses every 10 years and dress in simple, standardized clothes. They are industrious and work at the jobs for which they are suited by talent and temperament. Money is not used except in one emergency situation we’ll consider shortly. The Utopians are pagans but well behaved. “One of their most ancient laws,” we’re told, is “that no man ought to be punished for his religion.” Before being married, a couple sees each other naked at a public ceremony; this may be shocking to Christendom but it prevents unwelcome surprises.
Whether More was advocating the policies and arrangements that his traveler described -- or even considered them realizable or desirable -- has been a matter for much subtle argument. (Given More’s subsequent persecution of Protestants, the religious pluralism in Utopia was never more than a thought experiment.) But what struck me while rereading the book was More’s consistent sense that social inequality and moral viciousness are as linked as chicken and egg.
“Pride, that plague of human nature,” says Hythloday, “… does not measure happiness so much by its own conveniences, as by the miseries of others; and would not be satisfied with being thought a goddess, if none were left that were miserable, over whom she might insult. [Pride] thinks its own happiness shines the brighter, by comparing it with the misfortunes of other persons; that by displaying its own wealth, they may feel their poverty the more sensibly.”
So keeping in mind that More himself was a lawyer, and a successful one, there’s a moral and satirical reason why Utopia has no attorneys: the inhabitants “consider them as a sort of people whose profession it is to disguise matters and to wrest the laws, and, therefore, they think it is much better that every man should plead his own cause …. After the parties have laid open the merits of the cause, without those artifices which lawyers are apt to suggest, the judge examines the whole matter, and supports the simplicity of such well-meaning persons, whom otherwise crafty men would be sure to run down …”
The Utopian policy regarding money allows More to score an especially sharp jab at pride and privilege. The Utopians accept that it’s necessary to keep a certain amount of gold and silver on hand, says Hythloday, in case they need it when dealing with other countries. But since they themselves judge the value of metals by their use, they have a much higher regard for iron. Rather than just pile up the gold in storage, however, they use it to make chamber pots and chains for criminals undergoing punishment. Likewise, they make practical use of jewels by giving them to small children as playthings.
A group of visiting dignitaries once wanted to overawe the Utopians with their power and wealth. And so they made their grand entrance, dressed to impress: “The ambassadors themselves, who were of the nobility of their country, were in cloth-of-gold, and adorned with massy chains, earrings and rings of gold; their caps were covered with bracelets set full of pearls and other gems -- in a word, they were set out with all those things that among the Utopians were either the badges of slavery, the marks of infamy or the playthings of children.”
More also ran diplomatic missions for England. He was on one to the Netherlands in 1515 when he started writing what became Utopia. The image of an ambassador decked out in fancy handcuffs and wearing, say, a solid-gold toilet seat around his neck is surprisingly broad for a writer of More’s learning and station; he clearly had mixed feelings about his own political role. But after 500 years, it’s still reasonably funny, and it puts the trappings of political ambition in a suitably critical perspective.
This is a story about a story. A story that might be worth millions of dollars. It’s also a cautionary tale for academics who dream of writing best-selling books.
One day in early 1999, I found myself awaiting the retrieval of books in the main reading room of the Jefferson Building at the Library of Congress. I was completing the research for my doctorate in history at Georgetown University. As I passed the time by strolling through the alcoves circling the giant room, my eye caught the spines of a group of slim volumes resting on a shelf. They were a series of oral histories compiled by the LA84 Foundation, an organization assembled by the 1984 Los Angeles Olympic Committee that was tasked with, among other things, providing scholars and students with historical materials related to the Olympic Games. One volume contained an interview with Gordon Adam, a member of the University of Washington’s gold medal-winning crew team in Berlin in 1936. Having been a collegiate oarsman, I started reading Adam’s story.
It was riveting. Adam grew up poor on a farm in the Pacific Northwest. He worked at a salmon-canning factory in Alaska to make enough money for college and then enrolled at the University of Washington in the depths of the Depression. He decided to try rowing -- at the time a major intercollegiate sport -- and in his oral interview, he told a marvelous tale of how he and his teammates topped Eastern Ivy League competition for the right to represent the United States in Berlin. He recalled traveling to Europe, his impressions of Nazi Germany and seeing Hitler at the opening ceremony. He finished by describing his crew’s stirring come-from-behind victory over Italian and German crews in a very tight race.
Although that oral history had nothing to do with my dissertation or research, I knew I’d stumbled upon a great story. Global in scope, cinematic in its drama, this story -- I felt strongly -- would sell. I copied the oral history on the library Xerox machine, tucked it away in a file, and told myself someday I would research and compose a book on it.
Life moved on. I finished the dissertation, accepted a visiting assistant professor position and gained a tenure-track job. I focused on securing tenure by publishing my scholarship on radio and journalism history in top journals. I also worked on improving my teaching and agreed to enough service commitments to fill up my time. All the while, however, I kept gathering material on that 1936 crew team. I “collected string,” as they say in journalism.
The University of Washington put me in touch with surviving members of the crew, some of whom I interviewed, and I discovered the original CBS recording of the race broadcast at the Paley Center for the Media. I contacted Dan Raley, one of the last sports editors of the Seattle Post-Intelligencer, for more information on the crew, and he generously shared his materials and thoughts.
Then I received tenure, and I started to seriously pursue the book. I wrote a book proposal, a sample chapter, a magazine-length version of the story and even a 750-word op-ed about this incredible Olympic moment. I never considered composing the story as a dry academic or scholarly tome. Nor did I have literary pretensions. Rather, I wanted to bring the story alive and engage the public journalistically, reporting the facts interspersed with the words and voices of the Olympians themselves. The story, it appeared to me, required little embellishment.
Crickets. I pitched the material everywhere. I had actually started pitching it as a book proposal and magazine article even before I got tenure, using the news peg of the 2006 and 2008 Olympic Games. I tailored my approaches to every kind of outlet as precisely as possible. For example, I pitched the story to the Chronicle Review, emphasizing how impoverished Depression-era college kids used intercollegiate athletics to learn about the world. But my magazine article pitches were either rejected or ignored by Slate, The Atlantic, The New Republic, Sports Illustrated, Smithsonian, the New York Times Magazine and elsewhere. In fact, my version of the story was continually and consistently turned down as an extended essay, a newspaper column and a book proposal by numerous editors, literary agents and publishers. I got almost no feedback. I stopped counting rejections when they passed 60.
Still, I refused to give up. In 2012, with the London Olympics on the horizon, I again pitched the story everywhere. Josh Levin, an editor at Slate, liked it. He published it as “Six Minutes in Berlin” and made it the centerpiece of the magazine’s 2012 London Olympic coverage. The article exploded on the web, lasting four days as Slate’s most-read feature, generating a long comment thread and thousands of social media recommendations.
Emails flooded in. Literary agents who had previously rejected the book proposal now inquired whether I would be interested in representation. One major publisher that explicitly prohibits the submission of unsolicited manuscripts wrote and asked me to send the book manuscript.
Then, just as quickly, silence. It turned out that, one year before, Viking Press had inked an enormous deal with an author named Daniel James Brown to write the story of the 1936 crew. That book, titled The Boys in the Boat, came out in 2013 and remains near the top of the New York Times nonfiction paperback best-seller list as of this writing. Brown’s book tells the story of Joe Rantz, one of the rowers, and the significant investment by Brown’s publisher in the success of The Boys in the Boat made others reluctant to take on a competing project.
A prominent New York literary agent told me as much over the telephone. The publishing industry, he explained, was under enormous economic stress. The book trade was getting slaughtered, he said, and big publishing had essentially evolved into a cartel (my word, not his). Publishers simply could not compete with one another by bidding for the same stories, and once Brown got his contract, any chance I had of publishing “Six Minutes in Berlin” as a book had evaporated. No major publisher would waste its time or resources undercutting another major publisher’s list.
That was bad enough. But then The Boys in the Boat came out, and much to my surprise, my interviews with two oarsmen were cited by Brown. I only shared transcripts with the oarsmen themselves -- coxswain Bob Moch and Jim McMillin -- both of whom had died in 2005, before Brown met Joe Rantz or began his research. Somehow copies -- the only ones I shared -- ended up in Brown’s hands. He and his publishers undoubtedly knew I was working on my own book because of the publication of “Six Minutes in Berlin” in Slate in 2012. The interviews proved remarkably illustrative of occurrences in the boat during both the national championship and the Olympic gold medal races, and it disturbed me greatly to see information I had collected published elsewhere without my permission.
The Boys in the Boat is a good book, but it’s not history. It’s not the book I would have written. It’s peppered with inaccuracies and embellishments. One of the reasons my manuscript took so long to compose was that I possess a doctorate in history, and verifying information by cross-referencing sources requires an enormous amount of time. In other words: accuracy matters. I’m not bothered by the slight copy-editing errors in The Boys in the Boat that are endemic to any manuscript -- such as when victory in the first Intercollegiate Rowing Association regatta is inaccurately credited to Cornell University rather than Columbia University.
But I am disturbed that inaccuracy and embellishment are apparently acceptable when writing history for popular audiences. For example, Brown offers this dramatic opening to the race broadcast: “At 9:15 a.m., the voice of NBC’s commentator, Bill Slater, began to crackle over KOMO’s airwaves in Seattle, relayed from Berlin.” But according to NBC’s records in the Library of Congress and numerous other sources, Bill Slater was in London that evening preparing to cover the next day’s White City track meet featuring Jesse Owens. The rowing final in Berlin actually started at 9:02 a.m. Seattle time, and anybody tuning in to NBC would have missed it because the network widely publicized the wrong starting time in newspapers around America. Only those people tuning to CBS would have caught this Olympic exclusive. Yet these facts don’t deter Brown. “NBC’s Bill Slater was screaming over KOMO’s airwaves in Seattle,” he informs his audience at a particularly dramatic moment in the race narrative. This did not occur.
A lot of ink has been spilled recently about the need for academics to write for wider audiences. Much of the criticism presumes that academics prefer to write and speak in impenetrable rhetoric designed to limit communication to only people initiated in the cloistered world of scholarly interchange. I don’t doubt that this problem exists. But many critics have no idea how many scholars -- like myself -- have attempted to write for wider audiences but found ourselves blocked by gatekeepers in the publishing industry. Although I’ve published numerous essays and newspaper columns for wide public readership, and I believe my book proposal proved my ability to deliver clear, serviceable -- and even engaging -- prose, no publisher took a gamble on this first-time author coming out of academe.
This story, however, might have a happy ending. Although Daniel James Brown has a best seller and the revenues from his movie deal for The Boys in the Boat, I continued pursuing my project. I reshaped my manuscript to more closely align with academic standards and fit the constraints of scholarly publication. I then sent it out to academic publishers.
Obviously, Brown’s best seller significantly damaged the trade market for “Six Minutes in Berlin.” But the University of Illinois Press responded positively to the parts of my manuscript about Olympic broadcasting. No single volume exists on the birth of global sports broadcasting as developed by Nazi radio authorities. If I were willing to interweave this larger story about global telecommunication history into the narrative of the rowers, who gained brief national celebrity from their victory, they told me they would be interested. But I needed to satisfy peer reviewers and severely limit the word count. The first peer reviews proved encouraging, and a contract was signed. Six Minutes in Berlin: Broadcast Spectacle and Rowing Gold at the Nazi Olympics will be published this month.
But I won’t make a million dollars.
EDITOR’S NOTE: Inside Higher Ed reached out to the publisher and author of The Boys in the Boat for a response to this piece, and they had no comment.
Michael J. Socolow is an associate professor of communication and journalism at the University of Maine. Six Minutes in Berlin: Broadcast Spectacle and Rowing Gold at the Nazi Olympics will be published this month by University of Illinois Press.
In late August, residents of Greenville, S.C., began reporting to police that one or more clowns had been observed attempting to lure children into a wooded area. It was an odd moment in a year that had already seen more than its share.
Since then, reports of sinister-clown activity (e.g., threats, assaults, the brandishing of knives and standing in place while waving slowly in a menacing manner) have gone viral throughout the United States, with a few now coming in from elsewhere in the world. Professional clowns are distressed by the damage to their reputation, and Ronald McDonald has gone on sabbatical for an indefinite period.
Like many anomalous phenomena -- UFOs, for example, or appearances by Elvis or Bigfoot -- clown sightings tend to come in waves. The recent spate of them has been unusual in both its geographical range and its emotional intensity -- although I suspect that coulrophobia is in fact the normal, even default, emotional response to clowns in any context. A study of children’s response to hospital decorations conducted by researchers from the School of Nursing and Midwifery at the University of Sheffield in England found that “clowns are universally disliked by children. Some found them frightening and unknowable.” And over the past 30 years or so, a strain of pop-culture iconography has tapped into that basic anxiety and amplified it with a series of overtly horrific clowns.
Some of the recently reported incidents involved people wearing commercially produced horror-clown masks. Whatever deep psychological wellsprings may have driven the clown sightings of previous years, the current cycle is, at least in part, a performance of mass hysteria -- an acting out of uncanniness and anxiety, with some individuals playing the menacing part in an almost standardized way.
Trying to make sense of this funny business, I did a search of my digital archive of journal articles, conference papers and whatnot in hopes of finding a paper -- by a folklorist, maybe, or possibly a psychoanalyst -- that might help elucidate the clown question. The most interesting material to turn up was by the late Orrin E. Klapp (1915-1997), a sociologist, whose first book was Heroes, Villains and Fools: The Changing American Character (1962).
Sections of it originally appeared as journal articles; a few of them made passing reference to clowns and clowning. But in these pieces, Klapp is interested in something more general: the range of fairly informal labels or categories we use to characterize people in the course of ordinary life. Examples he gives are “underdog,” “champ,” “bully,” “Robin Hood,” “simpleton,” “crackpot,” “cheat,” “liar” and “big shot.” (“Clown” is one of them, of course, but let’s not get ahead of ourselves.)
What intrigues Klapp about such labels is that they reflect, but also enforce, prevailing values and social norms. Some express a severe judgment (“traitor”) while others are relatively inconsequential (“butterfingers”). New labels or epithets emerge from time to time as others fall out of use; they are part of the flux of everyday life. But Klapp argues that the labels implying particularly strong judgments fall into three general categories that do not change much with time: the hero, the villain and the fool.
“The most perfect examples of heroes,” Klapp writes in one paper, “are to be found in legendary or mythical personages who represent in a superhumanly exaggerated way the things the group admires most.” Villains are “idealized figures of evil, who tend to countermoral actions as a result of an inherently malicious will,” prone to “creating a crisis from which society is saved by a hero, who arrives to restore order to the world.”
The contrast between hero and villain is clear and sharp, but not exhaustive. “If the villain opposes the hero by exaggerated evil traits,” writes Klapp, “the fool does so by his weaknesses, his métier being failure and fiasco rather than success. Though an offender against decorum and good taste, he is too stupid or ineffectual to be taken seriously. His pranks are ridiculed rather than severely punished.”
These three almost archetypal figures are seldom encountered in their purest form outside of fairy tales or superhero comic books. But most of the labels applied to people in the course of ordinary life can, in Klapp’s view, be subsumed under them. (The underdog is a kind of hero; the traitor a form of villain; the fanatic a variety of fool.) The symbolic figures and the everyday labels alike “help in the preservation of values” and “nourish and maintain certain socially necessary sentiments” -- such as “admiration of courage and self-sacrifice, hatred of vice, contempt for folly” and so forth.
Preservation of consensual values and the proper nourishment of socially necessary sentiments were major concerns of American sociologists of the Eisenhower era -- and Klapp’s framework was, in that respect, both normative and normal. But there’s more to his argument than that. He worried that mass media and propaganda techniques could exploit or corrupt those sentiments: Klapp’s papers on villainy and vilification in American culture concern, in part, the then recent success of Joseph McCarthy. He also deserves credit for paying attention to the significant ideological baggage carried by ordinary language.
The clown, in his schema, definitely falls under the heading of the fool -- but with a difference. As someone deliberately accepting the role, inducing ridicule rather than just succumbing to it, the clown exemplifies what Klapp calls the paradoxical status of the fool as “both depreciated and valued: it is at the same time despised and tolerated, ridiculed and enjoyed, degraded and privileged … He also acts as a cathartic symbol for aggressions in the form of wit. He takes liberties with rank; and as butt or scapegoat receives indignities which in real life would be mortal insult or conflict creating.”
Klapp draws close to an insight into a type of clown he doesn’t seem to have recognized: the menacing kind, in Greenville or elsewhere. For the clown, on these terms, has reason to want revenge, to wreak havoc as much as the villain does. (Here one also thinks of a certain political figure with an orange face, unnatural hair and a strange combination of extreme self-centeredness with no discernible self-awareness.) The stock of widely accepted heroic figures may be at an all-time minimum, while neither clowns nor villains are in short supply, and it’s getting harder to tell them apart.
Across almost a century of American social and political change, W. E. B. Du Bois was the pre-eminent African-American author and thinker, bar none. He was born three years after the end of the Civil War and died just one day before the March on Washington in 1963. He was the first black scholar to receive a Ph.D. from Harvard University. The German sociologist Max Weber admired his book The Souls of Black Folk (1903) and tried to arrange its translation. And his place as founding editor of the National Association for the Advancement of Colored People's magazine, The Crisis, gave him not just an agenda-setting role in the history of the civil rights movement but also an international influence.
W. E. B. Du Bois: Revolutionary Across the Color Line by Bill V. Mullen (published by Pluto Press, with distribution in the United States by the University of Chicago Press) serves as a timely introduction to this impressive and somewhat imposing figure, while also reframing Du Bois’s life and work beyond the boundaries of the American context. Mullen is a professor of English and American studies at Purdue University and the author of two previous studies of Du Bois: Afro-Orientalism (University of Minnesota Press, 2004) and Un-American: W. E. B. Du Bois and the Century of World Revolution (Temple University Press, 2015). I interviewed him by email about his most recent book.
Q: Du Bois said that the problem of the 20th century was the problem of the color line. We heard a lot about the United States becoming a “postracial” society when President Obama was first elected on the assumption that the problem had been solved, which is not a perspective often championed these days. What do you think counts as the most pertinent aspect of Du Bois’s legacy now, after eight years of an African-American president and several years of civic unrest on a scale we haven't seen for decades?
A: I think the most pertinent aspect of Du Bois’s legacy to today’s protest movements -- against police violence, for Black Lives Matter and the movement for Palestinian civil rights, for example -- was his insistence that only mass protest could bring about meaningful social change. Du Bois was eventually weaned away from the idea that capitalism and racism could be reformed from above. His view of democracy was that it was a living thing animated by ordinary people engaged in self-activity for equality.
All of the major social justice organizations he was involved with -- the Pan-African movement, the Socialist Party, the NAACP, the Peace Information Center against atomic weapons, the Communist Party -- were interracial or international movements that challenged institutions of power and authority. An especially relevant example to our time is the work Du Bois did to create the “We Charge Genocide” petition delivered to the United Nations in 1951. He wrote the first drafts of that petition, which charged the U.S. state with disproportionately causing black death through poverty, poor schooling, social and police violence. After Trayvon Martin was killed in 2012, a group of young Chicago activists formed the group We Charge Genocide to document police shootings of African-Americans in Chicago and to honor that earlier effort. Du Bois’s legacy to our time was made very real and direct in that moment.
Q: You write that biographers and scholars have neglected or underestimated the significance of Du Bois’s long-term political development, and at one point, you suggest there’s a tendency to overemphasize his early book The Souls of Black Folk (1903) almost as if that’s his single major work. David Levering Lewis’s two-volume biography of Du Bois seems very broad in scope and deep in detail, so I’m wondering if there are particular discussions of Du Bois, or perspectives on him, that you’re challenging.
A: There are two parts to this exclusion tendency. Levering Lewis’s biography of Du Bois is magnificent. But he dedicates only 16 out of almost 1,400 pages to the last eight years of Du Bois’s life. In that time, Du Bois traveled to the Soviet Union and China, joined the Communist Party, published his autobiography in the Soviet Union, and moved to Ghana. The effect of downplaying those events is to diminish them as late-in-life mistakes of someone who has taken a bad political turn or has simply lost his bearings in old age. I argue instead that those culminating events of Du Bois’s life can only be explained by tracing them back to points of origin far earlier. I dedicate a whole chapter to Du Bois’s writings on Asia, for example, which begin in 1905, because they explain why he later supported Maoism so strongly and why he said in the 1940s that the future of the world depended upon events in Asia.
Second, there is still a tendency to ignore Du Bois’s lifelong interest in Marxism so that he remains an avuncular “race man” figure for scholars in the academy. To give an example, Du Bois wrote a 300-page manuscript called “Russia and America” in 1950. His publisher, Henry Giroux, wouldn’t bring it out during the Cold War, saying it was too pro-Soviet and anti-American. To this day, it has never been published. I spend a good deal of time talking about the book because it explains better than any other single Du Bois text why he sympathized with the Russian revolution. The book is also important for showing how Du Bois saw the Russian revolution as a sequel to African-American self-emancipation from slavery, an event he called an “experiment of Marxism.” My tendency then is to show that Marxism was always central to Du Bois’s political development -- not a detour, diversion or mistake.
Q: Arguably Du Bois’s life and work are too large, too far-flung, even for Paul Gilroy’s notion of the “Black Atlantic,” since the Indian independence struggle (among other Asian developments) was so important for him. You discuss him as a “transnational” figure. Please say more on that.
A: Du Bois was most accurately described as an internationalist. His worldview was framed by 19th-century nationalisms, the Pan-Africanist movement, Communist internationalism and the anticolonial movement of the 20th century. His political orientation was to see in all directions simultaneously the interdependence of the advanced and underdeveloped worlds, as well as the historical movements of people between nations and territories. He called Japan’s defeat of Russia in their 1905 war the first “crossing of the color line” in world history, and India’s independence in 1947 the greatest event of the 20th century. He first used his famous coinage “The problem of the 20th century is the problem of the color line” in the 1900 Pan-African Congress address to refer to the relationship of nonwhite peoples across the world to their colonial masters.
Intellectually, his influences ran from Hegel to Alexander Crummell, Bismarck to Nehru. His 1928 anticolonial novel, Dark Princess, is a rewriting of Shakespeare’s A Midsummer Night’s Dream. For me, communism and socialism provided the intellectual synthesis of this global perspective: he understood what the Communist International called “world revolution” as the drawing together of modern humanity into a single project, or totality, of global unity and emancipation. That is the main theme of my book, and the through line for my account of his lifelong political development.
Q: Would publishing the manuscript of Du Bois’s “Russia and America” be worthwhile now? It's certainly odd to think of a book-length work by a figure of such significance languishing in the archives.
A: “Russia and America” should absolutely be published. Vaughn Rasberry’s important new book, Race and the Totalitarian Century, also puts “Russia and America” at the center of Du Bois’s Cold War writing. The problem is the Du Bois scholarship industry. Most Du Bois scholars haven’t read the manuscript and therefore don’t understand its importance. Others who have read it dismiss it because Du Bois is full throated in his praise of the Soviet Union at a time when many of Stalinism’s worst errors were becoming well-known.
In other words, the manuscript still lives in the shadow of Cold War thinking that should be long past by now. Too many scholars would prefer to preserve a hagiographic image of Du Bois as a benign humanist or saint rather than comprehend both the depth of his commitment to Communism and the reasons he oftentimes looked past problems with Stalin’s Russia. It’s a kind of “Don’t ask, don’t tell” approach to scholarship, which does a disservice to students and scholars who want to comprehend Du Bois and socialism in the 20th century -- problems and all.
Q: Your book follows a difficult line with respect to some of Du Bois’s political commitments. You seem understanding, or at least nonpolemical, with regard to his support for the regimes of Stalin and Mao, but a number of remarks make clear you reject those politics. How do you manage to balance those perspectives?
A: Du Bois’s political evaluations of Stalin’s Russia and Mao’s China were consistent with those of many of the people whom we consider to be the most important radicals of the 20th century, including the majority of anticolonial leaders from Asia and Africa. His strong desire for decolonization led him to trust the Soviet Union and China and their promises of aid to that project well past the time their revolutions had become corrupted. To be for world revolution and decolonization in the 20th century, in other words, was to sign up for Communist internationalism with all of its faults. Du Bois signed up early and never fully recanted.
On the other hand, he misapprehended the meaning of Marxism and socialism in ways that we should not forgive or forget. He confused state capitalism -- Stalin’s system of socialism in one country and bureaucratic rule from above -- with the real meaning of socialism as working-class self-emancipation. His thin understanding of Japanese and Chinese history caused him to perceive Japanese imperialism and expansionism in China as a viable alternative to capitalism for nonwhite workers of the world. Du Bois was both brilliant and fallible.
But he was always, as I try to make clear, vying to find a way that ordinary people could fashion their own liberation and self-emancipation. He found this match of political will and human self-activity in his most brilliant book, Black Reconstruction in America (1935). If he had written nothing else in his life, Black Reconstruction would have cemented his place as one of the most original scholars and political theorists of human freedom. So his life and his work demand a judicious and balanced approach that is well grounded in the theories of revolution and human liberation he was trying to advance. I try to provide that approach, and as you say, walk that line, in my book.
Q: Du Bois’s early worldview reflects a belief in elite leadership -- “the talented tenth.” Your book stresses his move toward a more democratic perspective, an emphasis on agency and power from below. But isn’t there a lot of continuity in his thinking? Aren’t traces of the young Du Bois who admired Bismarck still discernible in the octogenarian who wrote a glowing tribute following Stalin’s death in 1953?
A: There are two kinds of continuity in Du Bois’s political thought across the course of his long life. One is the quest cited above for human emancipation carried out by ordinary people. In 1956, only seven years before his death, Du Bois wrote an essay in tribute to one of his heroes, the socialist militant labor leader Eugene Debs. At a time in which he was well aware of problems in the socialist models of both Stalin’s Russia and Mao’s China, Du Bois wrote, “A state socialism planned by the rich for their own survival is quite possible, but it is far from the state where the rule rests in the hands of those who produce wealth and services and whose aim is the welfare of the mass of the people.” That is the Du Bois who fought for what we can call “socialism from below.”
On the other hand, Du Bois never quite gave up the idea that a “great man” -- a Bismarck or a Stalin -- could redirect human history. The socialist William Gorman put this very well in an essay in the 1950s. About Du Bois’s defense of Stalinism, Gorman wrote, “There he can find embodied … in his life work in regard to the negroes: the conception of the talented tenth and the urge toward international revolt. Stalinism … approaches and manipulates the masses like an elite convinced of their backwardness and incapacity; hence the necessity to dictate, plan and administer for them from the heights of superior knowledge and wisdom.”
My final assessment is that Du Bois was a contradictory figure, but one who made the struggle for black freedom central to the 20th-century struggle for human emancipation in all its forms. We should not blame Du Bois that history didn’t solve the problem of the color line. We should celebrate the fact that he was one of the few people in American history to try to use every tool at his disposal to develop a theory and practice of human emancipation. He was a dangerous figure in the very best and most radical sense of that word.
Until Michel Foucault mentioned him in passing in the first volume of his History of Sexuality (1976), the Viennese physician Heinrich Kaan’s role as the pioneer in medical research on paraphilias seems to have gone unnoticed. The title would have gone by default to Richard Krafft-Ebing, who published the first edition of his encyclopedic Psychopathia Sexualis in 1886. And the long disappearance of Kaan into that work’s shadow is even more unjust given that he was the first to use the title, more than 40 years earlier. (Kaan goes unnamed in the English rendering of Krafft-Ebing’s 12th edition -- whether the omission is the author’s or the translator’s I don’t know.)
As a remedy to that neglect, Cornell University Press has published Heinrich Kaan’s “Psychopathia Sexualis” (1844): A Classic Text in the History of Sexuality, edited by Benjamin Kahan, an assistant professor of English and women’s and gender studies at Louisiana State University, in a translation by Melissa Haynes, a classicist at Bucknell University. Judging it “too dangerous to hand over to the general public” until “its utility and integrity can be proven,” Kaan wrote his treatise in Latin, but he hoped that it would meet with sufficient professional approval that he could arrange to have it “translated into a vernacular language such as French.”
The appendix contains reviews from medical journals of the day, which are decidedly mixed. One of Kaan’s peers vents his irritation that “people continue to belabor themselves and others” by writing in a dead language that is inadequate for modern purposes “even when it is masterfully employed!” The reviewer then strongly implies that Kaan is “among those who must still struggle with vocabulary and syntax” and “would do best to simply avoid it altogether.” Another critic praises it as “creditable to the author,” unlike most publications “on the revolting subjects of which it treats.”
Understandably, then, no clamor for a translation was heard in Kaan’s own day. “As far as I am aware,” Foucault said during his course of lectures for 1974-75 at the Collège de France, “it is the first treatise of psychiatry to speak only of sexual pathology but the last to speak of sexuality in Latin.” (Presumably Foucault meant that it was the last monograph to be composed solely in that language: Krafft-Ebing switched from German to Latin whenever it was necessary to describe deviant sexual behavior in potentially salacious detail.)
The liminal status of the first Psychopathia Sexualis -- its position near the end of a centuries-old mode of scholarly discourse and at the inauguration of a new disciplinary organization of knowledge -- renders Kaan’s project interesting now in ways that it couldn’t be for its contemporary audience. The book’s structure and method now look peculiar. Kaan announces at the start that he was driven by “a desire to collect case studies, to examine them and from them to deduce general principles, and then to apply to them every kind of theoretical and practical knowledge and, thus, to derive from them rules useful to physicians.” But unlike Krafft-Ebing, much less Sigmund Freud, the author keeps those case histories (and his “deductions” from them) mostly to himself.
Instead, Kaan moves directly to a high level of generalization: plants, animals and humans alike are distinguished from the inorganic world by “the vital force [vis vitalis] by means of which the organism comes into being, is nourished and sustained.” This vital force subsists through two modes of reproduction, internal and external, corresponding to an organism’s nutrition and propagation, respectively. Kaan then gives an overview of the comparative anatomy of the sexual organs of plants, animals and (finally) humans.
What’s striking here -- especially given the text is written in a language with liturgical and theological associations -- is that Kaan begins and remains on a strictly naturalistic level of description and explanation. In discussing the stages of human sexual maturation, he notes that puberty “begins around the twelfth year in girls and the fourteenth in boys, at which age the Old Testament laws allow for marriage” -- but this, like Kaan’s few other scriptural citations, is given as historical background rather than divine revelation. He expresses a definite belief in “the absolute necessity for monogamy and marriage” without trying to demonstrate its necessity.
Insofar as customs in such matters differ around the world, Kaan implies that such differences can be explained as the product of variations in the intensity of the libido -- which are, in turn, the function of environmental, biological and psychological factors. The hotter the climate, the darker the skin and the closer to the land, as he posits it, the stronger the sexual drive.
The source of nutrition is also important: erotic gratification is experienced “most vigorously among cannibals, less so among carnivores and flesh eaters, and least of all among vegetarians.” Here we can only lament the author’s failure to disclose his research methods.
Kaan establishes (to his own satisfaction, at least) a scientific basis for taking the monogamous, heterosexual, procreative couple as normative. But medical experience has taught him that deviations are alarmingly frequent, even among European noncannibals. His treatise takes the initial steps toward understanding the range and etiology of sexual disorders and, ultimately, curing them. And in a way the title is his first contribution to the cause: he uses the expression “psychopathia sexualis” to subsume a few practices and preferences under a common heading.
“The types of these aberrations are numerous enough,” he writes, “but the most common are onanism or masturbation, the love of boys (paiderastia), lesbian love, the violation of cadavers, sex with animals, and the satisfaction of lust with statues.” He defines lesbianism as “an aberration that consists in the satisfaction of the sexual drive either between men or between women by means of tribadism, or rubbing” -- which, as definitions go, seems at once very broad and surprisingly unimaginative. Kaan does not elaborate on the statue kink, but Krafft-Ebing gives a number of examples.
The most remarkable thing about Kaan’s catalog is how brief and undetailed it is (even compared to Krafft-Ebing’s, less than half a century later). Furthermore, “these types of deviation are merely one and the same thing, and they cross into one another.” Having identified autoerotic activity as one form of psychopathia sexualis, Kaan soon informs the reader that it is not just the first on his list but the matrix of all the rest. Not that everyone who masturbates will go gay or interfere with public sculpture, to be sure, but it is a dangerous practice and should be discouraged in children. Among the available modalities of treatment, Kaan especially recommends very cold water.
For reasons cultural historians continue to debate, masturbation was a topic of fierce public concern for more than a century before Kaan’s treatise and for just as long afterward. Self-satisfaction had been condemned on religious grounds before that, of course, but without generating anything like the alarm over its terrible effects on mind, body and soul that began in the early 18th century. One of Kaan’s reviewers grumbled about how he had added to what was already an enormous and very repetitious literature on the subject.
His Psychopathia Sexualis is far from the most hyperbolic or obsessive example of such discourse, but the 21st-century reader cannot help feeling that each medical warning -- every injunction to parents, teachers and other responsible adults to watch for and prevent autoerotic activity -- must have created the very disturbances it was supposed to prevent.
At the same time, the original Psychopathia Sexualis does more than repeat the old “thou shalt not” in nonreligious terms. As Foucault pointed out in his lectures, Kaan’s work had some important implications. It treated human sexuality as entirely explicable within nature -- with nonprocreative forms being, in effect, the accidental effect of a natural force being redirected via the brain: sexual deviations are caused by masturbation, which is, in turn, an activity engaging the imagination (i.e., an organic capacity of our species). Kind of obvious once you think about it, but not until then, and it was Kaan who, pardon the expression, mastered this domain.