New book challenges the idea that professors don't care about teaching

Research from the University of Washington shows professors to be self-critical about their teaching and constantly struggling to improve it.

Flat World's shift in gears and what it means for open textbook publishing

Flat World Knowledge will no longer publish versions of its textbooks at no charge. How big a setback does the company's change represent for the 'open' movement?

An academic tries unsuccessfully to publish a book in the popular press (essay)

This is a story about a story. A story that might be worth millions of dollars. It’s also a cautionary tale for academics who dream of writing best-selling books.

One day in early 1999, I found myself awaiting the retrieval of books in the main reading room of the Jefferson Building at the Library of Congress. I was completing the research for my doctorate in history at Georgetown University. As I passed the time by strolling through the alcoves circling the giant room, my eye caught the spines of a group of slim volumes resting on a shelf. They were a series of oral histories compiled by the LA84 Foundation, an organization assembled by the 1984 Los Angeles Olympic Committee and tasked with, among other things, providing scholars and students with historical materials related to the Olympic Games. One volume contained an interview with Gordon Adam, a member of the University of Washington’s gold medal-winning crew team in Berlin in 1936. Having been a collegiate oarsman, I started reading Adam’s story.

It was riveting. Adam grew up poor on a farm in the Pacific Northwest. He worked at a salmon-canning factory in Alaska to make enough money for college and then enrolled at the University of Washington in the depths of the Depression. He decided to try rowing -- at the time a major intercollegiate sport -- and in his oral interview, he told a marvelous tale of how he and his teammates topped Eastern Ivy League competition for the right to represent the United States in Berlin. He recalled the voyage to Europe, his impressions of Nazi Germany and the sight of Hitler at the opening ceremony. He finished by describing his crew’s stirring come-from-behind victory over Italian and German crews in a very tight race.

Although that oral history had nothing to do with my dissertation or research, I knew I’d stumbled upon a great story. Global in scope, cinematic in its drama, this story -- I felt strongly -- would sell. I copied the oral history on the library Xerox machine, tucked it away in a file, and told myself someday I would research and compose a book on it.

Life moved on. I finished the dissertation, accepted a visiting assistant professor position and gained a tenure-track job. I focused on securing tenure by publishing my scholarship on radio and journalism history in top journals. I also worked on improving my teaching and agreed to enough service commitments to fill up my time. All the while, however, I kept gathering material on that 1936 crew team. I “collected string,” as they say in journalism.

The University of Washington put me in touch with surviving members of the crew, some of whom I interviewed, and I discovered the original CBS recording of the race broadcast at the Paley Center for Media. I contacted Dan Raley, one of the last sports editors of the Seattle Post-Intelligencer, for more information on the crew, and he generously shared his materials and thoughts.

Then I received tenure, and I started to seriously pursue the book. I wrote a book proposal, a sample chapter, a magazine-length version of the story and even a 750-word op-ed about this incredible Olympic moment. I never considered composing the story as a dry academic or scholarly tome. Nor did I have literary pretension. Rather, I wanted to bring the story alive and engage the public journalistically, reporting the facts interspersed with the words and voices of the Olympians themselves. The story, it appeared to me, required little embellishment.

Crickets. I pitched the material everywhere. I actually had started pitching it as a book proposal and magazine article even before I got tenure, using the news peg of the 2006 and 2008 Olympic Games. I tailored my approaches to every kind of outlet as precisely as possible. For example, I pitched the story to the Chronicle Review, emphasizing how impoverished Depression-era college kids used intercollegiate athletics to learn about the world. But my magazine article pitches were either rejected or ignored by Slate, The Atlantic, The New Republic, Sports Illustrated, Smithsonian, the New York Times Magazine and elsewhere. In fact, my version of the story was continually and consistently turned down as an extended essay, a newspaper column and a book proposal by numerous editors, literary agents and publishers. I got almost no feedback. I stopped counting rejections when they passed 60.

Still, I refused to give up. In 2012, with the London Olympics on the horizon, I again pitched the story everywhere. Josh Levin, an editor at Slate, liked it. He published it as “Six Minutes in Berlin” and made it the centerpiece of the magazine’s 2012 London Olympic coverage. The article exploded on the web, lasting four days as Slate’s most-read feature, generating a long comment thread and thousands of social media recommendations.

Emails flooded in. Literary agents who had previously rejected the book proposal now inquired whether I would be interested in representation. One major publisher that explicitly prohibits the submission of unsolicited manuscripts wrote and asked me to send the book manuscript.

Then, just as quickly, silence. It turned out that, one year before, Viking Press had inked an enormous deal with an author named Daniel James Brown to write the story of the 1936 crew. That book, titled The Boys in the Boat, came out in 2013 and remains near the top of the New York Times nonfiction paperback best-seller list as of this writing. Brown’s book tells the story of Joe Rantz, one of the rowers, and the significant investment by Brown’s publisher in the success of The Boys in the Boat made others reluctant to take on a competing project.

An important New York literary agent told me as much over the telephone. The publishing industry, he explained, was under enormous economic stress. The book trade was getting slaughtered, he said, and big publishing had essentially evolved into a cartel (my word, not his). Publishers simply could not afford to bid against one another for the same stories, and once Brown got his contract, any chance I had of publishing “Six Minutes in Berlin” as a book had evaporated. No major publisher would waste its time or resources undercutting another major publisher’s list.

That was bad enough. But then The Boys in the Boat came out, and much to my surprise, my interviews with two oarsmen were cited by Brown. I had shared transcripts only with the oarsmen themselves -- coxswain Bob Moch and Jim McMillin -- both of whom had died in 2005, before Brown met Joe Rantz or began his research. Somehow copies -- the only ones I shared -- ended up in Brown’s hands. He and his publishers undoubtedly knew I was working on my own book because of the publication of “Six Minutes in Berlin” in Slate in 2012. The interviews proved remarkably illustrative of occurrences in the boat during both the national championship and the Olympic gold medal races, and it disturbed me greatly to see information I had collected published elsewhere without my permission.

The Boys in the Boat is a good book, but it’s not history. It’s not the book I would have written. It’s peppered with inaccuracies and embellishments. One of the reasons my manuscript took so long to compose was that I possess a doctorate in history, and verifying information by cross-referencing sources requires an enormous amount of time. In other words: accuracy matters. I’m not bothered by the slight copy-editing errors that pop up in The Boys in the Boat that are endemic to any manuscript -- such as when Cornell University, not Columbia University, is inaccurately credited with victory in the first Intercollegiate Rowing Association regatta.

But I am disturbed that inaccuracy and embellishment are apparently acceptable when writing history for popular audiences. For example, Brown offers this dramatic opening to the race broadcast: “At 9:15 a.m., the voice of NBC’s commentator, Bill Slater, began to crackle over KOMO’s airwaves in Seattle, relayed from Berlin.” But according to NBC’s records in the Library of Congress and numerous other sources, Bill Slater was in London that evening preparing to cover the next day’s White City track meet featuring Jesse Owens. The rowing final in Berlin actually started at 9:02 a.m. Seattle time, and anybody tuning in to NBC would have missed it because the network widely publicized the wrong starting time in newspapers around America. Only those people tuning to CBS would have caught this Olympic exclusive. Yet these facts don’t deter Brown. “NBC’s Bill Slater was screaming over KOMO’s airwaves in Seattle,” he informs his audience at a particularly dramatic moment in the race narrative. This did not occur.

A lot of ink has been spilled recently about the need for academics to write for wider audiences. Much of the criticism presumes that academics prefer to write and speak in impenetrable rhetoric designed to limit communication to only people initiated in the cloistered world of scholarly interchange. I don’t doubt that this problem exists. But many critics have no idea how many scholars -- like myself -- have attempted to write for wider audiences but found ourselves blocked by gatekeepers in the publishing industry. Although I’ve published numerous essays and newspaper columns for wide public readership, and I believe my book proposal proved my ability to deliver clear, serviceable -- and even engaging -- prose, no publisher took a gamble on this first-time author coming out of academe.

This story, however, might have a happy ending. Although Daniel James Brown has a best seller and the revenues from his movie deal for The Boys in the Boat, I continued pursuing my project. I reshaped my manuscript to more closely align with academic standards and fit the constraints of scholarly publication. I then sent it out to academic publishers.

Obviously, Brown’s best seller significantly damaged the trade market for “Six Minutes in Berlin.” But the University of Illinois Press responded positively to the parts of my manuscript about Olympic broadcasting. No single volume exists on the birth of global sports broadcasting as developed by Nazi radio authorities. If I were willing to interweave this larger story about global telecommunication history into the narrative of the rowers, who gained brief national celebrity from their victory, they told me they would be interested. But I needed to satisfy peer reviewers and severely limit the word count. The first peer reviews proved encouraging, and a contract was signed. Six Minutes in Berlin: Broadcast Spectacle and Rowing Gold at the Nazi Olympics will be published this month.

But I won’t make a million dollars.

EDITOR’S NOTE: Inside Higher Ed reached out to the publisher and author of The Boys in the Boat for a response to this piece, and they had no comment.

Michael J. Socolow is an associate professor of communication and journalism at the University of Maine. Six Minutes in Berlin: Broadcast Spectacle and Rowing Gold at the Nazi Olympics will be published this month by University of Illinois Press.

Image caption: Rowing teams race to the finish line at the 1936 Berlin Olympics. (Image source: Getty Images)

Essay on wave of sinister clown sightings and Orrin E. Klapp's 'Heroes, Villains and Fools'

In late August, residents of Greenville, S.C., began reporting to police that one or more clowns had been observed attempting to lure children into a wooded area. It was an odd moment in a year that had already seen more than its share.

Since then, reports of sinister-clown activity (e.g., threats, assaults, the brandishing of knives and standing in place while waving slowly in a menacing manner) have gone viral throughout the United States, with a few now coming in from elsewhere in the world. Professional clowns are distressed by the damage to their reputation, and Ronald McDonald has gone on sabbatical for an indefinite period.

Like many anomalous phenomena -- UFOs, for example, or appearances by Elvis or Bigfoot -- clown sightings tend to come in waves. The recent spate of them has been unusual in both its geographical range and its emotional intensity -- although I suspect that coulrophobia is in fact the normal, even default, emotional response to clowns in any context. A study of children’s response to hospital decorations conducted by researchers from the School of Nursing and Midwifery at the University of Sheffield in England found that “clowns are universally disliked by children. Some found them frightening and unknowable.” And over the past 30 years or so, a strain of pop-culture iconography has tapped into that basic anxiety and amplified it with a series of overtly horrific clowns.

Some of the recently reported incidents involved people wearing commercially produced horror-clown masks. Whatever deep psychological wellsprings may have driven the clown sightings of previous years, the current cycle is, at least in part, a performance of mass hysteria -- an acting out of uncanniness and anxiety, with some individuals playing the menacing part in an almost standardized way.

Trying to make sense of this funny business, I did a search of my digital archive of journal articles, conference papers and whatnot in hopes of finding a paper -- by a folklorist, maybe, or possibly a psychoanalyst -- that might help elucidate the clown question. The most interesting material to turn up was by the late Orrin E. Klapp (1915-1997), a sociologist, whose first book was Heroes, Villains and Fools: The Changing American Character (1962).

Sections of it originally appeared as journal articles; a few of them made passing reference to clowns and clowning. But in these pieces, Klapp is interested in something more general: the range of fairly informal labels or categories we use to characterize people in the course of ordinary life. Examples he gives are “underdog,” “champ,” “bully,” “Robin Hood,” “simpleton,” “crackpot,” “cheat,” “liar” and “big shot.” (“Clown” is one of them, of course, but let’s not get ahead of ourselves.)

What intrigues Klapp about such labels is that they reflect, but also enforce, prevailing values and social norms. Some express a severe judgment (“traitor”) while others are relatively inconsequential (“butterfingers”). New labels or epithets emerge from time to time as others fall out of use; they are part of the flux of everyday life. But Klapp argues that the labels implying particularly strong judgments fall into three general categories that do not change much with time: the hero, the villain and the fool.

“The most perfect examples of heroes,” Klapp writes in one paper, “are to be found in legendary or mythical personages who represent in a superhumanly exaggerated way the things the group admires most.” Villains are “idealized figures of evil, who tend to countermoral actions as a result of an inherently malicious will,” prone to “creating a crisis from which society is saved by a hero, who arrives to restore order to the world.”

The contrast between hero and villain is clear and sharp, but not exhaustive. “If the villain opposes the hero by exaggerated evil traits,” writes Klapp, “the fool does so by his weaknesses, his métier being failure and fiasco rather than success. Though an offender against decorum and good taste, he is too stupid or ineffectual to be taken seriously. His pranks are ridiculed rather than severely punished.”

These three almost archetypal figures are seldom encountered in their purest form outside of fairy tales or superhero comic books. But most of the labels applied to people in the course of ordinary life can, in Klapp’s view, be subsumed under them. (The underdog is a kind of hero; the traitor a form of villain; the fanatic a variety of fool.) The symbolic figures and the everyday labels alike “help in the preservation of values” and “nourish and maintain certain socially necessary sentiments” -- such as “admiration of courage and self-sacrifice, hatred of vice, contempt for folly” and so forth.

Preservation of consensual values and the proper nourishment of socially necessary sentiments were major concerns of American sociologists of the Eisenhower era -- and Klapp’s framework was, in that respect, both normative and normal. But there’s more to his argument than that. He worried that mass media and propaganda techniques could exploit or corrupt those sentiments: Klapp’s papers on villainy and vilification in American culture concern, in part, the then recent success of Joseph McCarthy. He also deserves credit for paying attention to the significant ideological baggage carried by ordinary language.

The clown, in his schema, definitely falls under the heading of the fool -- but with a difference. As someone deliberately accepting the role, inducing ridicule rather than just succumbing to it, the clown exemplifies what Klapp calls the paradoxical status of the fool as “both depreciated and valued: it is at the same time despised and tolerated, ridiculed and enjoyed, degraded and privileged … He also acts as a cathartic symbol for aggressions in the form of wit. He takes liberties with rank; and as butt or scapegoat receives indignities which in real life would be mortal insult or conflict creating.”

Klapp draws close to an insight into a type of clown he doesn’t seem to have recognized: the menacing kind, in Greenville or elsewhere. For the clown, on these terms, has reason to want revenge, to wreak havoc as much as the villain does. (Here one also thinks of a certain political figure with an orange face, unnatural hair and a strange combination of extreme self-centeredness with no discernable self-awareness.) The stock of widely accepted heroic figures may be at an all-time minimum, while neither clowns nor villains are in short supply, and it’s getting harder to tell them apart.


Interview with Bill V. Mullen on his book 'W.E.B. Du Bois: Revolutionary Across the Color Line'

Across almost a century of American social and political change, W. E. B. Du Bois was the pre-eminent African-American author and thinker, bar none. He was born three years after the end of the Civil War and died just one day before the March on Washington in 1963. He was the first black scholar to receive a Ph.D. from Harvard University. The German sociologist Max Weber admired his book The Souls of Black Folk (1903) and tried to arrange its translation. And his place as founding editor of the National Association for the Advancement of Colored People's magazine, The Crisis, gave him not just an agenda-setting role in the history of the civil rights movement but also an international influence.

W. E. B. Du Bois: Revolutionary Across the Color Line by Bill V. Mullen (published by Pluto Press, with distribution in the United States by the University of Chicago Press) serves as a timely introduction to this impressive and somewhat imposing figure, while also reframing Du Bois’s life and work beyond the boundaries of the American context. Mullen is a professor of English and American studies at Purdue University and the author of two previous studies of Du Bois: Afro-Orientalism (University of Minnesota Press, 2004) and Un-American: W. E. B. Du Bois and the Century of World Revolution (Temple University Press, 2015). I interviewed him by email about his most recent book.

Q: Du Bois said that the problem of the 20th century was the problem of the color line. We heard a lot about the United States becoming a “postracial” society when President Obama was first elected, on the assumption that the problem had been solved -- a perspective not often championed these days. What do you think counts as the most pertinent aspect of Du Bois’s legacy now, after eight years of an African-American president and several years of civic unrest on a scale we haven't seen for decades?

A: I think the most pertinent aspect of Du Bois’s legacy to today’s protest movements -- against police violence, for Black Lives Matter and the movement for Palestinian civil rights, for example -- was his insistence that only mass protest could bring about meaningful social change. Du Bois was eventually weaned away from the idea that capitalism and racism could be reformed from above. His view of democracy was that it was a living thing animated by ordinary people engaged in self-activity for equality.

All of the major social justice organizations he was involved with -- the Pan-African movement, the Socialist Party, the NAACP, the Peace Information Center against atomic weapons, the Communist Party -- were interracial or international movements that challenged institutions of power and authority. An especially relevant example to our time is the work Du Bois did to create the “We Charge Genocide” petition delivered to the United Nations in 1951. He wrote the first drafts of that petition, which charged the U.S. state with disproportionately causing black death through poverty, poor schooling, social and police violence. After Trayvon Martin was killed in 2012, a group of young Chicago activists formed the group We Charge Genocide to document police shootings of African-Americans in Chicago and to honor that earlier effort. Du Bois’s legacy to our time was made very real and direct in that moment.

Q: You write that biographers and scholars have neglected or underestimated the significance of Du Bois’s long-term political development, and at one point, you suggest there’s a tendency to overemphasize his early book The Souls of Black Folk (1903) almost as if that’s his single major work. David Levering Lewis’s two-volume biography of Du Bois seems very broad in scope and deep in detail, so I’m wondering if there are particular discussions of Du Bois, or perspectives on him, that you’re challenging.

A: There are two parts to this exclusion tendency. Levering Lewis’s biography of Du Bois is magnificent. But he dedicates only 16 out of almost 1,400 pages to the last eight years of Du Bois’s life. In that time, Du Bois traveled to the Soviet Union and China, joined the Communist Party, published his autobiography in the Soviet Union, and moved to Ghana. The effect of downplaying those events is to diminish them as late-in-life mistakes of someone who has taken a bad political turn or has simply lost his bearings in old age. I argue instead that those culminating events of Du Bois’s life can only be explained by tracing them back to points of origin far earlier. I dedicate a whole chapter to Du Bois’s writings on Asia, for example, which begin in 1905, because they explain why he later supported Maoism so strongly and why he said in the 1940s that the future of the world depended upon events in Asia.

Second, there is still a tendency to ignore Du Bois’s lifelong interest in Marxism so that he remains an avuncular “race man” figure for scholars in the academy. To give an example, Du Bois wrote a 300-page manuscript called “Russia and America” in 1950. His publisher wouldn’t bring it out during the Cold War, saying it was too pro-Soviet and anti-American. To this day, it has never been published. I spend a good deal of time talking about the book because it explains better than any other single Du Bois text why he sympathized with the Russian revolution. The book is also important for showing how Du Bois saw the Russian revolution as a sequel to African-American self-emancipation from slavery, an event he called an “experiment of Marxism.” My tendency then is to show that Marxism was always central to Du Bois’s political development -- not a detour, diversion or mistake.

Q: Arguably Du Bois’s life and work are too large, too far-flung, even for Paul Gilroy’s notion of the “Black Atlantic,” since the Indian independence struggle (among other Asian developments) was so important for him. You discuss him as a “transnational” figure. Please say more on that.

A: Du Bois was most accurately described as an internationalist. His worldview was framed by 19th-century nationalisms, the Pan-Africanist movement, Communist internationalism and the anticolonial movement of the 20th century. His political orientation was to see in all directions simultaneously the interdependence of the advanced and underdeveloped worlds, as well as the historical movements of people between nations and territories. He called Japan’s defeat of Russia in their 1905 war the first “crossing of the color line” in world history, and India’s independence in 1947 the greatest event of the 20th century. He first used his famous coinage “The problem of the 20th century is the problem of the color line” in the 1900 Pan-African Congress address to refer to the relationship of nonwhite peoples across the world to their colonial masters.

Intellectually, his influences ran from Hegel to Alexander Crummell, Bismarck to Nehru. His 1928 anticolonial novel, Dark Princess, is a rewriting of Shakespeare’s A Midsummer Night’s Dream. For me, communism and socialism provided the intellectual synthesis of this global perspective: he understood what the Communist International called “world revolution” as the drawing together of modern humanity into a single project, or totality, of global unity and emancipation. That is the main theme of my book, and the through line for my account of his lifelong political development.

Q: Would publishing the manuscript of Du Bois’s “Russia and America” be worthwhile now? It's certainly odd to think of a book-length work by a figure of such significance languishing in the archives.

A: “Russia and America” should absolutely be published. Vaughn Rasberry’s important new book, Race and the Totalitarian Century, also puts “Russia and America” at the center of Du Bois’s Cold War writing. The problem is the Du Bois scholarship industry. Most Du Bois scholars haven’t read the manuscript and therefore don’t understand its importance. Others who have read it dismiss it because Du Bois is full throated in his praise of the Soviet Union at a time when many of Stalinism’s worst errors were becoming well-known.

In other words, the manuscript still lives in the shadow of Cold War thinking that should be long past by now. Too many scholars would prefer to preserve a hagiographic image of Du Bois as a benign humanist or saint rather than comprehend both the depth of his commitment to Communism and the reasons he oftentimes looked past problems with Stalin’s Russia. It’s a kind of “Don’t ask, don’t tell” approach to scholarship, which does a disservice to students and scholars who want to comprehend Du Bois and socialism in the 20th century -- problems and all.

Q: Your book follows a difficult line with respect to some of Du Bois’s political commitments. You seem understanding, or at least nonpolemical, with regard to his support for the regimes of Stalin and Mao, but a number of remarks make clear you reject those politics. How do you manage to balance those perspectives?

A: Du Bois’s political evaluations of Stalin’s Russia and Mao’s China were consistent with those of many of the people whom we consider to be the most important radicals of the 20th century, including the majority of anticolonial leaders from Asia and Africa. His strong desire for decolonization led him to trust the Soviet Union and China and their promises of aid to that project well past the time their revolutions had become corrupted. To be for world revolution and decolonization in the 20th century, in other words, was to sign up for Communist internationalism with all of its faults. Du Bois signed up early and never fully recanted.

On the other hand, he misapprehended the meaning of Marxism and socialism in ways that we should not forgive or forget. He confused state capitalism -- Stalin’s system of socialism in one country and bureaucratic rule from above -- with the real meaning of socialism as working-class self-emancipation. His thin understanding of Japanese and Chinese history caused him to perceive Japanese imperialism and expansionism in China as a viable alternative to capitalism for nonwhite workers of the world. Du Bois was both brilliant and fallible.

But he was always, as I try to make clear, vying to find a way that ordinary people could fashion their own liberation and self-emancipation. He found this match of political will and human self-activity in his most brilliant book, Black Reconstruction in America (1935). If he had written nothing else in his life, Black Reconstruction would have cemented his place as one of the most original scholars and political theorists of human freedom. So his life and his work demand a judicious and balanced approach that is well grounded in the theories of revolution and human liberation he was trying to advance. I try to provide that approach, and as you say, walk that line, in my book.

Q: Du Bois’s early worldview reflects a belief in elite leadership -- “the talented tenth.” Your book stresses his move toward a more democratic perspective, an emphasis on agency and power from below. But isn’t there a lot of continuity in his thinking? Aren’t traces of the young Du Bois who admired Bismarck still discernable in the octogenarian who wrote a glowing tribute following Stalin’s death in 1953?

A: There are two kinds of continuity in Du Bois’s political thought across the course of his long life. One is the quest cited above for human emancipation carried out by ordinary people. In 1956, only seven years before his death, Du Bois wrote an essay in tribute to one of his heroes, the socialist militant labor leader Eugene Debs. At a time in which he was well aware of problems in the socialist models of both Stalin’s Russia and Mao’s China, Du Bois wrote, “A state socialism planned by the rich for their own survival is quite possible, but it is far from the state where the rule rests in the hands of those who produce wealth and services and whose aim is the welfare of the mass of the people.” That is the Du Bois who fought for what we can call “socialism from below.”

On the other hand, Du Bois never quite gave up the idea that a “great man” -- a Bismarck or a Stalin -- could redirect human history. The socialist William Gorman put this very well in an essay in the 1950s. About Du Bois’s defense of Stalinism, Gorman wrote, “There he can find embodied … in his life work in regard to the negroes: the conception of the talented tenth and the urge toward international revolt. Stalinism … approaches and manipulates the masses like an elite convinced of their backwardness and incapacity; hence the necessity to dictate, plan and administer for them from the heights of superior knowledge and wisdom.”

My final assessment is that Du Bois was a contradictory figure, but one who made the struggle for black freedom central to the 20th-century struggle for human emancipation in all its forms. We should not blame Du Bois that history didn’t solve the problem of the color line. We should celebrate the fact that he was one of the few people in American history to try to use every tool at his disposal to develop a theory and practice of human emancipation. He was a dangerous figure in the very best and most radical sense of that word.


Review of ‘Heinrich Kaan’s “Psychopathie Sexualis” (1844): A Classic Text in the History of Sexuality’

Until Michel Foucault mentioned him in passing in the first volume of his History of Sexuality (1976), the Viennese physician Heinrich Kaan’s role as the pioneer in medical research on paraphilias seems to have gone unnoticed. The title would have gone by default to Richard Krafft-Ebing, who published the first edition of his encyclopedic Psychopathia Sexualis in 1886. And the long disappearance of Kaan into that work’s shadow is even more unjust given that he was the first to use the title, more than 40 years earlier. (Kaan goes unnamed in the English rendering of Krafft-Ebing’s 12th edition -- whether the omission is the author’s or the translator’s I don’t know.)

As remedy to that neglect, Cornell University Press has published Heinrich Kaan’s “Psychopathia Sexualis” (1844): A Classic Text in the History of Sexuality, edited by Benjamin Kahan, an assistant professor of English and women’s and gender studies at Louisiana State University, in a translation by Melissa Haynes, a classicist at Bucknell University. Judging it “too dangerous to hand over to the general public” until “its utility and integrity can be proven,” Kaan wrote his treatise in Latin, but he hoped that it would meet with sufficient professional approval that he could arrange to have it “translated into a vernacular language such as French.”

The volume also includes reviews from medical journals of the day, which are decidedly mixed. One of Kaan’s peers vents his irritation that “people continue to belabor themselves and others” by writing in a dead language that is inadequate for modern purposes “even when it is masterfully employed!” The reviewer then strongly implies that Kaan is “among those who must still struggle with vocabulary and syntax” and “would do best to simply avoid it altogether.” Another critic praises it as “creditable to the author,” unlike most publications “on the revolting subjects of which it treats.”

Understandably, then, no clamor for a translation was heard in Kaan’s own day. “As far as I am aware,” Foucault said during his course of lectures for 1974-75 at the Collège de France, “it is the first treatise of psychiatry to speak only of sexual pathology but the last to speak of sexuality in Latin.” (Presumably Foucault meant that it was the last monograph to be composed solely in that language: Krafft-Ebing switched from German to Latin whenever it was necessary to describe deviant sexual behavior in potentially salacious detail.)

The liminal status of the first Psychopathia Sexualis -- its position near the end of a centuries-old mode of scholarly discourse and at the inauguration of a new disciplinary organization of knowledge -- renders Kaan’s project interesting now in ways that it couldn’t be for its contemporary audience. The book’s structure and method now look peculiar. Kaan announces at the start that he was driven by “a desire to collect case studies, to examine them and from them to deduce general principles, and then to apply to them every kind of theoretical and practical knowledge and, thus, to derive from them rules useful to physicians.” But unlike Krafft-Ebing, much less Sigmund Freud, the author keeps those case histories (and his “deductions” from them) mostly to himself.

Instead, Kaan moves directly to a high level of generalization: plants, animals and humans alike are distinguished from the inorganic world by “the vital force [vis vitalis] by means of which the organism comes into being, is nourished and sustained.” This vital force subsists through two modes of reproduction, internal and external, corresponding to an organism’s nutrition and propagation, respectively. Kaan then gives an overview of the comparative anatomy of the sexual organs of plants, animals and (finally) humans.

What’s striking here -- especially given the text is written in a language with liturgical and theological associations -- is that Kaan begins and remains on a strictly naturalistic level of description and explanation. In discussing the stages of human sexual maturation, he notes that puberty “begins around the twelfth year in girls and the fourteenth in boys, at which age the Old Testament laws allow for marriage” -- but this, like Kaan’s few other scriptural citations, is given as historical background rather than divine revelation. He expresses a definite belief in “the absolute necessity for monogamy and marriage” without trying to demonstrate its necessity.

Insofar as customs in such matters differ around the world, Kaan implies that such differences can be explained as the product of variations in the intensity of the libido -- which are, in turn, a function of environmental, biological and psychological factors. The hotter the climate, the darker the skin and the closer to the land, as he posits it, the stronger the sexual drive.

The source of nutrition is also important: erotic gratification is experienced “most vigorously among cannibals, less so among carnivores and flesh eaters, and least of all among vegetarians.” Here we can only lament the author’s failure to disclose his research methods.

Kaan establishes (to his own satisfaction, at least) a scientific basis for taking the monogamous, heterosexual, procreative couple as normative. But medical experience has taught him that deviations are alarmingly frequent, even among European noncannibals. His treatise takes the initial steps toward understanding the range and etiology of sexual disorders and, ultimately, curing them. And in a way the title is his first contribution to the cause: he uses the expression “psychopathia sexualis” to subsume a few practices and preferences under a common heading.

“The types of these aberrations are numerous enough,” he writes, “but the most common are onanism or masturbation, the love of boys (paiderastia), lesbian love, the violation of cadavers, sex with animals, and the satisfaction of lust with statues.” He defines lesbianism as “an aberration that consists in the satisfaction of the sexual drive either between men or between women by means of tribadism, or rubbing” -- which, as definitions go, seems at once very broad and surprisingly unimaginative. Kaan does not elaborate on the statue kink, but Krafft-Ebing gives a number of examples.

The most remarkable thing about Kaan’s catalog is how brief and undetailed it is (even compared to Krafft-Ebing’s, less than half a century later). Furthermore, “these types of deviation are merely one and the same thing, and they cross into one another.” Having identified autoerotic activity as one form of psychopathia sexualis, Kaan soon informs the reader that it is not just the first on his list but the matrix of all the rest. Not that everyone who masturbates will go gay or interfere with public sculpture, to be sure, but it is a dangerous practice and should be discouraged in children. Among the available modalities of treatment, Kaan especially recommends very cold water.

For reasons cultural historians continue to debate, masturbation was a topic of fierce public concern for more than a century before Kaan’s treatise and for just as long afterward. Self-satisfaction had been condemned on religious grounds before that, of course, but without generating anything like the alarm over its terrible effects on mind, body and soul that began in the early 18th century. One of Kaan’s reviewers grumbled about how he had added to what was already an enormous and very repetitious literature on the subject.

His Psychopathia Sexualis is far from the most hyperbolic or obsessive example of such discourse, but the 21st-century reader cannot help feeling that each medical warning -- every injunction to parents, teachers and other responsible adults to watch for and prevent autoerotic activity -- must have created the very disturbances it was supposed to prevent.

At the same time, the original Psychopathia Sexualis does more than repeat the old “thou shalt not” in nonreligious terms. As Foucault pointed out in his lectures, Kaan’s work had some important implications. It treated human sexuality as entirely explicable within nature -- with nonprocreative forms being, in effect, the accidental result of a natural force redirected via the brain: sexual deviations are caused by masturbation, which is, in turn, an activity engaging the imagination (i.e., an organic capacity of our species). Kind of obvious once you think about it, but not until then, and it was Kaan who, pardon the expression, mastered this domain.


In new book, scholars make the case for value of diversity in higher education and society generally

In new collection of essays, scholars make the case for diversity as essential to higher education and society generally.

Review of Lucas Graves's "Deciding What’s True: The Rise of Political Fact-Checking in American Journalism"

“Everyone is entitled to his own opinions,” the sociologist and politico Daniel Patrick Moynihan said, “but not to his own facts.” He may have been improving upon a similar if less trenchant remark (“ …but no man has a right to be wrong in his facts”) attributed to the financier Bernard Baruch.

Until sitting down to write this I did not know about Baruch’s version. A certainty that my eagle-eyed editor would inquire about the source obliged me to vet the attribution to Moynihan; she requires more than my vague recollection of having read it somewhere. In checking my facts, she bolsters my conscience, enforcing Moynihan’s (and Baruch’s) point about accountability.

Lucas Graves, an assistant professor of journalism and mass communication at the University of Wisconsin at Madison, uses the expression “internal fact-checking” to describe this kind of preventative, behind-the-scenes work. It tries “to eliminate untruth, not call attention to it” by catching and correcting mistakes in an article before it goes to press. In Deciding What’s True: The Rise of Political Fact-Checking in American Journalism (Columbia University Press), Graves traces how internal fact-checking morphed into something almost antithetical: the very public evaluation of factual assertions made by politicians and other figures in the news.

News organizations such as PolitiFact and The Washington Post’s Fact Checker -- to name only the most nationally prominent -- intervene so frequently in American public discourse now that it seems counterintuitive to think they’ve only recently become a force in the world. Until the last two or three presidential election cycles, scrutiny of a candidate’s claims tended to be episodic and ad hoc -- and often enough conducted by the opposing campaign, bringing its own biases to the process. To the ethos of newspaper editors and reporters circa 1950, the idea of confirming or debunking a public figure’s statements of fact seemed perilously close to an expression of opinion, to be avoided at the risk of compromising one’s reputation for objectivity. Reporting that a fact was in dispute might be acceptable in some cases, but making a judgment call on it was best left to the pundits and thumb suckers.

The title Deciding What’s True is clearly an homage to Deciding What’s News by Herbert J. Gans, a classic study of newsroom culture, and Graves followed in his predecessor’s participant-observer footsteps by working for two major fact-checking organizations between 2010 and 2012. The book thus benefits from having two vantage points: the historical and sociological perspective available from media-studies scholarship, plus close ethnographic observation of how major fact-checking stories are discovered, investigated and debated in-house before being sent out to make their mark on the world.

His most striking insight, it seems to me, is how specific, self-defined and virtually self-contained the world of professionalized fact-checking tends to be. The naïve observer might think of fact-checking organizations as being akin to media watchdog groups such as the Media Research Center on the right and Media Matters for America on the left, with PolitiFact falling somewhere in between. But in reality the fact-checking milieu sees itself as unrelated to the partisan watchdog groups: it doesn’t work with or quote them, and Graves recounts one fact-checker saying he almost decided to kill an investigation when he saw that Media Matters was already interested in it. Likewise, fact-checking journalists see a bright line between their work and blogging.

This is not just a matter of professional amour propre. The major fact-checking organizations have ties to established media institutions, including journalism schools, and retain a belief (which watchdogs and bloggers alike tend to reject) in old newsroom ideals of objectivity, impartiality and conscientious reporting.

The ’00s put confidence in those ideals under enormous strain from a number of catastrophically bad judgment calls (reporting war propaganda and Wall Street shilling without due diligence) as well as cases of plagiarism or outright fabrication in major news publications. Compounding those problems, even inducing some of them, was the growing array of new media competing for public attention while also driving up the pace of the news cycle.

In an email exchange with Graves, I indicated that PolitiFact, Fact Checker and so on seemed like a response, in part, to the 24-hour news cycle that emerged around the time of the first Gulf War and intensified still more once the internet started to permeate everyday life. Rumors, misinformation and bogus statistics could spread faster, and farther, than ever before.

Graves agreed, but added, “Another way to think about that is that the traditional model of objectivity, for all of its flaws, made some sense in a world where professional journalists acted as gatekeepers and could effectively police the borders of political discourse. Then wild rumors about the president’s birthplace didn’t have to be debunked because they could be denied coverage altogether. But the opening up of political discourse after the 1960s and the fragmentation of the media beginning in the 1990s -- both healthy developments in many ways, and both with echoes in the 19th century -- also effectively spelled the end of the journalist as gatekeeper. And especially with the rise of the internet, that fragmentation calls for a more critical style of political reporting that’s willing to directly challenge false claims.”

In principle, at least, systematic and high-quality fact-checking ought to make politicians and other public figures more careful about the claims they make while giving the public a running lesson in critical thought at the same time. At times, Deciding What’s True seems to encourage that hope. But I’ve been reading the book between rounds of binge-watching campaign coverage, and it is not an experience to recommend. The idea that fact-checking can impose some kind of restraint on a candidate, or influence public response, seems utterly negated by the candidacy of Donald Trump. His well-documented dishonesty -- his talent for lying outright and brazenly, without restraint or regard for evidence, even after the facts have been pointed out repeatedly -- makes no dent in his level of support. This seems really strange.

“This is a large and complicated question,” Graves responded, “and people who study journalism and political communication are trying to approach it in many different ways. But one answer is that there have always been fairly wide slices of the American electorate that are deeply suspicious of establishment discourse, sometimes with good reason. If you listen to Trump supporters in interviews, they seem to accept that he doesn’t have the grasp of policy that other politicians do, and they don’t necessarily believe everything he says or subscribe to all of his views. He seems to say whatever he thinks and embrace a common-sense approach that many people find appealing. Beyond that, none of us makes political calculations in the detached, rational way that political theorists sometimes imagine.

“And at the same time, fact-checking has made a difference,” Graves continued. “It arguably has helped to solidify the ceiling over Trump’s support, giving ammunition to both voters and politicians who say they’ll never back him. And it has had a tremendous influence on coverage of his campaign, with front-page articles on Trump’s extraordinary disregard for the facts and constant references to his falsehoods even in straight news reports. We have nothing to compare this race to, and it’s impossible to say where this thing would be if more journalists had stuck to the traditional ‘he said, she said’ formula.”


Review of Teddy Wayne, 'Loner: A Novel'

Teddy Wayne's Loner: A Novel (Simon & Schuster) is the second book I've read in as many weeks narrated by a manipulative and highly verbal straight white man possessing a degree of upward social mobility as well as the impulse to see how much emotional damage he can inflict on others. Journalistic custom requires three instances to spot a trend, but reader, I do not have it in me to endure any more such company. (Anyway, both narratives resonate with Aaron James's political and philosophical musings, discussed here earlier in the month.)

The other volume was Diary of an Oxygen Thief, an anonymous and purportedly autobiographical work that "went from self-published obscurity to best-sellerdom," as reported in Publishers Weekly this summer. Loner is set at Harvard University, more or less in the present day, while Diary roams between Ireland and the United States as the narrator works as an advertising art director around the turn of the millennium. Despite considerable differences, the books follow broadly comparable narrative arcs. Romantic entanglements between characters (not just the hooking-up part but the emotional upheaval sometimes accompanying it) generally turn out to be misunderstandings at best. Often enough the disasters are intentional.

Neither author seems to be aware of, or responding to, the other's work, but they seem to be mapping similar terrain. And the fairly positive reception for Loner and Diary of an Oxygen Thief suggests that readers find something recognizable about the emotional landscape they depict. To discuss the similarities without giving away significant plot turns means being carefully vague at times. Ultimately it is the narrator's attitude or verbal demeanor that sticks with the reader more than the events recounted.

David Federman, the narrator of Loner, arrives at Harvard as a freshman with an acute sense of his own middling background as well as seemingly perfect confidence in his prospects as a member of an elite. Entitlement and embarrassment do not make for a stable combination, however, and it becomes increasingly volatile once he becomes aware of Veronica Morgan Wells, the figure he addresses in the second person from that point on: "[It] was obvious, from your clothes, your body language, the impervious confidence you projected, as if any affront would bounce off you like a battleship deflecting a BB pellet: you came from money …. It wasn’t just your financial capital that set you apart; it was your worldliness, your taste, your social capital. What my respectable, professional parents had deprived me of by their conventional ambitions and absence of imagination."

Not an unprecedented situation, of course, as the narrator himself realizes. But any similarity between Veronica Morgan Wells and Daisy Fay Buchanan is slight compared to the fact that Jay Gatsby, whatever else you might call him, wasn't a stalker. David Federman's unreliability as a narrator is shown chiefly in the fact that he thinks Veronica accepts his carefully planned coincidental meetings at face value and that his effort to ingratiate himself is working. The campaign has its comic aspects. All of it unfolds against a background of campus sex codes, feminist cultural-studies seminars and expressions of concern about social inequality.

But David's increasingly fetishistic obsession with her, and his willingness to use another female character sexually as a means to gaining access to Veronica, grows very uncomfortable to witness from the inside. He goes from callow virgin to budding young psychopath very rapidly and without missing a step. He even manages to incorporate some of the campus sex code into his strategy.

The unnamed narrator of Diary of an Oxygen Thief is much less preoccupied with social status, or at least less overtly so, and his introspection never leaves the reader with much understanding of what drives his malevolence toward women. His sadism is purely emotional but well practiced. In ending things, he follows a scorched-earth policy:

“‘This is what I look like when I’m pretending to listen to your boring conversation.’ I froze my sweetest expression, my innocent blue eyes widening in pseudo-interest, the same expression I’d used on teachers. … ‘This is what I look like when I’m pretending to be in love with you …. I’m going to dismantle us tonight. And there’s nothing you can do about it. You’ll have to sit there and listen while I wrench the U from the S. You’ll question your own judgment. Maybe you’ll never really trust yourself again. I hope so. Because if I don’t want you, and believe me I don’t, then I don’t want you being happy with someone else when there’s any doubt that I might get another girl.’”

What makes it considerably nastier is that the narrator treats this not as a way to get out of a relationship but as the whole point of it -- a moment when the self-loathing that he otherwise numbs with alcohol can be off-loaded on the woman he's maneuvered into position to endure it.

At a crucial moment in each book, the axis pivots to reveal just how limited and self-deluded the narrator is about his sense of control over others and over himself. The manipulation rebounds on him, but not as revenge only. The reader is left in a position to see that his seemingly pathological mind games can also be understood as having a certain logic: "Though Hollywood would have us believe that all we seek in romantic relationships is love," one character says, "it is just one of several exchangeable commodities, along with sex, money, status, validation, services and so on." An exchange, furthermore, in which one side can only win at the other's expense. Failure to understand that is a guarantee of losing.

I'm not going to argue with anyone else's sense of these things: people who reach such bleak conclusions probably have grounds for doing so. Still, it would be good to think that readers aren't responding to these two page-turners simply as confirmation of their own experience, but in the spirit of facing a worst-case scenario in order to find the nerve to try again.


Essay on Edgar Cayce, sociology of religion, terahertz waves and 'Repo Man'

Around this time 20 years ago, I met an elderly gentleman who’d had what sounded like an exceptionally interesting and unusual dissertation-writing experience. A couple of recent coincidences brought the encounter to mind and so inspired this little causerie.

His name was Harmon Bro, and he was in his late 70s when we met. He’d spent the better part of 50 years as an ordained minister and Jungian psychotherapist. If anyone ever looked the part of a Jungian archetype, it was Harmon, who personified the Wise Old Man. In 1955, the University of Chicago Divinity School awarded him a Ph.D. after accepting a doctoral thesis called “The Charisma of the Seer: A Study in the Phenomenology of Religious Leadership.”

It was based in part on work Harmon did in his early 20s as an assistant to Edgar Cayce, “the sleeping prophet.” Despite minimal education, Cayce, it is said, could give long, extemporaneous discourses in response to questions posed to him while he was in a trance state. Among these “readings” were medically sophisticated diagnoses of people miles or continents away, as well as detailed accounts of ancient history and predictions of the future.

Cayce died in 1945, but he left a vast mass of transcripts of his “readings.” By the 1960s, publishers were mining them to produce a seemingly endless series of paperback books extolling Cayce’s powers. Insofar as the New Age can be said to have founding figures, he was one of them.

Harmon was clearly a believer in Cayce’s miraculous powers. I was not (and am not) but have always enjoyed the legends by and about him. As a schoolboy, for example, he would put a textbook under his pillow and absorb its contents while asleep. He graduated (so to speak) to the Akashic Records -- an ethereal library documenting life on Atlantis and in ancient Egypt, and much else besides. He could also see into the future, but the track record is not impressive: China did not convert to Christianity in 1968, nor did Armageddon arrive in 1999. Cayce also predicted that an earthquake in the 1960s would cause California to sink into the Pacific Ocean. It remains attached to the continental United States as of this writing.

Harmon didn’t take skepticism as a threat or an insult, and anyway I preferred listening to arguing. He stressed how very improbable Cayce had been as a subject for serious scholarly attention in the 1950s -- at the University of Chicago, no less. It took three or four tries to get his topic approved; by the time the dissertation was finished and accepted, it felt like every faculty member concerned with the history and psychology of religion had weighed in on it. He happily lent me a copy (when anyone expresses interest in a decades-old dissertation, its author will usually have one of two responses: pleasure or horror), and from reading it, I could see that the scrutiny had been all for the best. It obliged him to practice a kind of methodological agnosticism about Cayce’s powers, and he demonstrated a solid grounding in the social-scientific literature on religion -- in particular, Max Weber’s work on prophetic charisma.

But by 1996, Harmon Bro was not at all happy with the institutions routinizing that charisma. The man he’d known and studied had an ethical message -- “love thy neighbor as thyself,” more or less. The New Age ethos amounted to “love thyself and improve thy karma.” You didn’t have to share his worldview to see his point.

The timing was fortunate: we grew acquainted during what proved to be the final year of Harmon Bro’s life. His obituary in the Chicago Tribune in 1997 made no reference to Cayce, but looking it up just now leaves me with a definite feeling of synchronicity: Harmon died on Sept. 13, which is also the date I’m finishing this piece. A message from Harmon, via the cosmic unconscious?

Probably not, although it was another and even more far-flung coincidence that reminded me of him in the first place. On Friday, the journal Nature Communications published a paper called “Terahertz time-gated spectral imaging for content extraction through layered structures,” which the science-news website EurekAlert kindly translates into laymanese as “Researchers prototype system for reading closed books.” Not by putting them under a pillow and sleeping on them, alas, but it’s impressive even so.

Researchers at the Massachusetts Institute of Technology and the Georgia Institute of Technology collaborated in developing a system that uses bursts of terahertz radiation (“the band of electromagnetic radiation between microwaves and infrared light,” says EurekAlert) to create images of the surfaces of individual pieces of paper in a stack. Ink in a printed letter absorbs the radiation differently from the blank page around it; the contrast between the signals reflecting back is fed into an algorithm that identifies the letter on the page. The prototype can “read” the surfaces of up to nine pages in a pile; with more work, reading at greater depths seems possible. The story quotes one of the researchers as saying, “The Metropolitan Museum in New York showed a lot of interest in this, because they want to, for example, look into some antique books that they don’t even want to touch.” The signal-sorting algorithm may yet enable spambots to defeat captchas. (Which arguably represents grounds for halting research right away, though that is unlikely.)

The train of association between breaking technological news from last week and the memory of one of the more generous and unusual people to cross my path is admittedly twisty and random. On the other hand, reading by terahertz radiation seems like another example of Clarke’s Third Law: “Any sufficiently advanced technology is indistinguishable from magic.”

Image caption: Edgar Cayce

