In late August, residents of Greenville, S.C., began reporting to police that one or more clowns had been observed attempting to lure children into a wooded area. It was an odd moment in a year that had already seen more than its share.
Since then, reports of sinister-clown activity (e.g., threats, assaults, the brandishing of knives and the practice of standing in place while waving slowly in a menacing manner) have gone viral throughout the United States, with a few now coming in from elsewhere in the world. Professional clowns are distressed by the damage to their reputation, and Ronald McDonald has gone on sabbatical for an indefinite period.
Like many anomalous phenomena -- UFOs, for example, or appearances by Elvis or Bigfoot -- clown sightings tend to come in waves. The recent spate of them has been unusual in both its geographical range and its emotional intensity -- although I suspect that coulrophobia is in fact the normal, even default, emotional response to clowns in any context. A study of children’s response to hospital decorations conducted by researchers from the School of Nursing and Midwifery at the University of Sheffield in England found that “clowns are universally disliked by children. Some found them frightening and unknowable.” And over the past 30 years or so, a strain of pop-culture iconography has tapped into that basic anxiety and amplified it with a series of overtly horrific clowns.
Some of the recently reported incidents involved people wearing commercially produced horror-clown masks. Whatever deep psychological wellsprings may have driven the clown sightings of previous years, the current cycle is, at least in part, a performance of mass hysteria -- an acting out of uncanniness and anxiety, with some individuals playing the menacing part in an almost standardized way.
Trying to make sense of this funny business, I did a search of my digital archive of journal articles, conference papers and whatnot in hopes of finding a paper -- by a folklorist, maybe, or possibly a psychoanalyst -- that might help elucidate the clown question. The most interesting material to turn up was by the late Orrin E. Klapp (1915-1997), a sociologist, whose first book was Heroes, Villains and Fools: The Changing American Character (1962).
Sections of it originally appeared as journal articles; a few of them made passing reference to clowns and clowning. But in these pieces, Klapp is interested in something more general: the range of fairly informal labels or categories we use to characterize people in the course of ordinary life. Examples he gives are “underdog,” “champ,” “bully,” “Robin Hood,” “simpleton,” “crackpot,” “cheat,” “liar” and “big shot.” (“Clown” is one of them, of course, but let’s not get ahead of ourselves.)
What intrigues Klapp about such labels is that they reflect, but also enforce, prevailing values and social norms. Some express a severe judgment (“traitor”) while others are relatively inconsequential (“butterfingers”). New labels or epithets emerge from time to time as others fall out of use; they are part of the flux of everyday life. But Klapp argues that the labels implying particularly strong judgments fall into three general categories that do not change much with time: the hero, the villain and the fool.
“The most perfect examples of heroes,” Klapp writes in one paper, “are to be found in legendary or mythical personages who represent in a superhumanly exaggerated way the things the group admires most.” Villains are “idealized figures of evil, who tend to countermoral actions as a result of an inherently malicious will,” prone to “creating a crisis from which society is saved by a hero, who arrives to restore order to the world.”
The contrast between hero and villain is clear and sharp, but not exhaustive. “If the villain opposes the hero by exaggerated evil traits,” writes Klapp, “the fool does so by his weaknesses, his métier being failure and fiasco rather than success. Though an offender against decorum and good taste, he is too stupid or ineffectual to be taken seriously. His pranks are ridiculed rather than severely punished.”
These three almost archetypal figures are seldom encountered in their purest form outside of fairy tales or superhero comic books. But most of the labels applied to people in the course of ordinary life can, in Klapp’s view, be subsumed under them. (The underdog is a kind of hero; the traitor a form of villain; the fanatic a variety of fool.) The symbolic figures and the everyday labels alike “help in the preservation of values” and “nourish and maintain certain socially necessary sentiments” -- such as “admiration of courage and self-sacrifice, hatred of vice, contempt for folly” and so forth.
Preservation of consensual values and the proper nourishment of socially necessary sentiments were major concerns of American sociologists of the Eisenhower era -- and Klapp’s framework was, in that respect, both normative and normal. But there’s more to his argument than that. He worried that mass media and propaganda techniques could exploit or corrupt those sentiments: Klapp’s papers on villainy and vilification in American culture concern, in part, the then recent success of Joseph McCarthy. He also deserves credit for paying attention to the significant ideological baggage carried by ordinary language.
The clown, in his schema, definitely falls under the heading of the fool -- but with a difference. As someone deliberately accepting the role, inducing ridicule rather than just succumbing to it, the clown exemplifies what Klapp calls the paradoxical status of the fool as “both depreciated and valued: it is at the same time despised and tolerated, ridiculed and enjoyed, degraded and privileged … He also acts as a cathartic symbol for aggressions in the form of wit. He takes liberties with rank; and as butt or scapegoat receives indignities which in real life would be mortal insult or conflict creating.”
Klapp draws close to an insight into a type of clown he doesn’t seem to have recognized: the menacing kind, in Greenville or elsewhere. For the clown, on these terms, has reason to want revenge, to wreak havoc as much as the villain does. (Here one also thinks of a certain political figure with an orange face, unnatural hair and a strange combination of extreme self-centeredness with no discernible self-awareness.) The stock of widely accepted heroic figures may be at an all-time minimum, while neither clowns nor villains are in short supply, and it’s getting harder to tell them apart.
There is no holy book in which God tells us what libraries should be. Over the centuries, the contours of library services and collections have instead been mediated by humans, including founders, funders, managers and -- surprise, surprise -- users. That’s the conclusion I came to after researching and writing Part of Our Lives: A People’s History of the American Public Library. In it, I trace the history of this ubiquitous institution, largely by listening to the voices of those who have used libraries since the mid-19th century, to identify reasons why it has been loved for generations.
As I analyzed the data, I was surprised at how quickly those reasons organized into three broad categories. People have loved their libraries for: (1) the useful information they made accessible, (2) the transformative potential of commonplace reading they circulated and (3) the public spaces they provided. Examples abound.
While sitting at a Cincinnati public library desk in 1867, Thomas Edison compiled a bibliography on electricity. “Many times Edison would get excused from duty under pretense of being too sick to work,” a colleague later recalled, “and invariably strike a beeline for the library, where he would spend the entire day and evening reading … such works on electricity as were to be had.”
In 1971, 10-year-old Barack Obama returned to Honolulu from Jakarta. “The first place I wanted to be was in a library,” he said years later. “One Saturday … with the help of a raspy-voiced old librarian who appreciated my seriousness, I found a book on East Africa.” Obama wanted information about Kenya, birthplace of his father, a Luo tribe member. “The Luo raised cattle and lived in mud huts and ate cornmeal and yams and something called millet,” the book noted. “Their traditional costume was a leather thong across the crotch.” Shocked by what he read, Obama “left the book open-faced on a table and walked out.”
The Transformative Potential of Reading
After her father died in 1963, 9-year-old Sonia Sotomayor buried herself in reading at her Bronx library and the apartment she shared with her mother and brother. “Nancy Drew had a powerful hold on my imagination,” she remembered. “Every night, when I’d finished reading and got into bed and closed my eyes, I would continue the story, with me in Nancy’s shoes, until I fell asleep.” Her mind, she noted, “worked in ways very similar” to Nancy’s. “I was a keen observer and listener. I picked up on clues. I figured things out logically, and I enjoyed puzzles. I loved the clear, focused feeling that came when I concentrated on solving a problem and everything else faded out.”
In 1984, President Ronald Reagan wrote the daughter-in-law of Harold Bell Wright, whose best-selling religious novel That Printer of Udell’s Reagan read as an adolescent in Dixon, Ill. Shortly after reading the book, he declared himself saved and was baptized. The novel’s protagonist, Reagan wrote Wright’s daughter-in-law 60 years later, served as a role model that shaped his life. It’s likely the copy of That Printer of Udell’s Reagan read came from the Dixon Public Library, which he visited twice weekly in the early 1920s, often reading on the library’s front steps.
Library as Place
In the 1930s at the Atlanta Public Library’s African-American branch, one of the few public places where blacks felt safe and welcome, 10-year-old Martin Luther King Jr. came to the library several times weekly. Director Annie Watters later recalled their interactions. “He would walk up to the desk and … look me straight in the eye.” “Hello, Martin Luther,” she would say, always calling him by his first and middle names. “What’s on your mind?” “Oh, nothing, particularly.” For Watters, that was the cue King had learned a new “big word,” and between them they had a conversation in which King used the word repeatedly. Another game involved poetry. Again, King would stand by the desk, waiting. “What’s on your mind, Martin Luther?” Watters asked. “For I dipped into the future, far as the human eye could see,” he responded. Watters recognized the Tennyson poem, and finished the verse: “Saw the vision of the world, and all the wonder that would be.”
In 2005, The Washington Post carried an article by Eric Wee on a District of Columbia branch library in one of Washington’s poorest neighborhoods. In it, Wee reported that every Tuesday night a homeless man named Conrad Cheek entered the library and set up his chessboard on a table in the children’s room. Wee immediately noticed a transformation. “No more ignored pleas” for this homeless man, “no averted glances. During the next hour, people will look him in the eye. They’ll listen to his words. In this down-at-the-heels library he’s the teacher.” Among his students was 9-year-old Ali Osman, whose mother explained that her son’s confidence had soared after playing with Conrad, that he was now bragging to friends about being a chess player. “We owe it all to Mr. Conrad,” she said.
Information access, the transformative power of commonplace reading, library as place -- all three combine to explain why people have valued their public libraries for the past 160 years. By harnessing the literatures on information access, commonplace reading and public spaces to analyze the historical roles of American public libraries, Part of Our Lives shows that from their origins they have contributed to their host communities in multiple ways.
They have been places of performance where users displayed moral progress and achievement. They have functioned as centripetal forces to craft a sense of community among disparate populations and to build trust among their multicultural elements. They have acted as key players not only in increasing literacy (tens of thousands of immigrants learned English by reading printed materials from their public libraries) but also in constructing group identity through the stories and places they provided. And public libraries have also started neighborhood conversations, welcomed the recently arrived into their midst, and served as community anchors.
A Limited Focus
I could only come to those conclusions, however, by tapping deeply into non-library and information studies literature that addresses reading and place. For most of its history, LIS has focused instead on what in the 18th century was called “useful knowledge,” in the 19th and 20th was called “best reading,” and in the late 20th morphed into “information.” That narrowed focus has given particular meaning to phrases like “information access,” “information literacy” and “information community” that not only tend to exaggerate the role of LIS in the larger world of “information” (see, for example, how little attention LIS gets in James W. Cortada’s All The Facts: A History of Information in the United States Since 1870), but also dominate -- and limit -- the profession’s thinking.
Take library education, for example. As professional education programs evolved from “library schools” into “schools of information” in the last 30 years, most have focused on “information” as defined by the professional discourse they inherited, and then incorporated into that discourse analysis of the storage and retrieval properties of developing communications technologies. In the process, however, they decentered the library as a subject for instruction and research. Thus, when the 17 “I-schools” (12 ALA-accredited) met for the first time in 2005, none had core courses analyzing reading and place from the “library in the life of the user” perspective that I took in Part of Our Lives.
That’s unfortunate, because my historical research suggests that not knowing more about the reading and places libraries of all types provide greatly limits our ability to understand what libraries actually mean to their host communities. My research has demonstrated that generations of users have valued the public library as a place by voluntarily visiting it again and again for multiple reasons, many of which had nothing to do with information access.
Although I-school curricula emphasize services leading to the kinds of information Thomas Edison and Barack Obama found useful, they undervalue the impacts of information products that guided the lives of Ronald Reagan and Sonia Sotomayor, and they overlook the importance of library as place so evident in the experiences of Martin Luther King Jr., Conrad Cheek and Ali Osman.
If Part of Our Lives proves that reading and place have been as important to the American public library (and other types of libraries) as information access, then not having a core course in either at ALA-accredited programs is the equivalent of an American Bar Association-accredited law school without a core course on the Constitution or civil procedure. Unless organizations like the Association for Library and Information Science Education and the American Library Association, as well as the ALA Committee on Accreditation, insist that reading and place are essential parts of librarianship’s “domain” that must be taught at the core level, LIS education programs will continue to manifest limitations.
Such limitations are also evident in prognostications. In BiblioTech: Why Libraries Matter More Than Ever in the Age of Google, John Palfrey rightfully contends that library digitization can equalize access to education, jobs and information, but he worries that “bad nostalgia” for services like commonplace reading and traditional library programs will interfere with future planning. In a January 2016 Wall Street Journal article, Steve Barker lamented that because of emerging technologies “the role for librarians and public libraries is shrinking.” “Don’t mourn the loss of libraries,” John McTernan argued in a March 2016 Telegraph article. “The internet has made them obsolete.”
Ironically, unlike LIS educators and researchers, library practitioners intuitively seem to recognize the value of reading and place. The American library press abounds in reports of popular programs. Kathleen de la Peña McCook devotes much attention to library as place in her two editions (2004 and 2011) of Introduction to Public Librarianship. ALA initiated a “Libraries Transform” campaign last year to increase awareness of the multiple roles America’s academic, school and public libraries play in their host communities. Then there’s the “Project Outcome” initiative ALA’s Public Library Association (PLA) recently crafted to measure public library impacts, the report Public Libraries: A Vital Space for Family Engagement released in August by the Harvard Family Research Project and the PLA that calls on libraries to increase efforts to engage families in children’s learning, and the three-year study entitled “Bringing Home Early Literacy: Determining the Impact of Library Programming on Parent Behavior” that the Institute of Museum and Library Services is funding.
And regarding “library as place,” academic librarians especially have shown leadership in recent years by converting spaces freed up by print collections -- now digitized and accessible online -- into group study areas that students use for a variety of class-related purposes. The sociability that reading has fostered for generations among students is much in evidence in these places. Many college and university libraries have also installed coffee bars. The collective effect of these actions (sometimes referred to as the “information commons” movement) is obvious at my home institution, Florida State University, where students now call the main Robert M. Strozier Library “Club Stroz.” In recent years, turnstile counts have spiked.
In all of those efforts, however, it has been researchers outside the profession and already overworked library practitioners who have taken the initiative. Where is the LIS research community? Why aren’t members of that community conducting longitudinal studies evaluating library activities like the impact of summer reading programs on student reading levels as they move from one grade to another? Where’s the LIS research that identifies the community effects of programs like film festivals, book clubs, children’s story hours, English as a second language classes, literacy tutoring, art exhibits and musical presentations that thousands of public libraries have routinely been hosting for generations? Where are the LIS researchers to perform similar evaluation studies on the multiple community effects of library reading and library as place in all types of libraries across the country and over time that take into account demographic variables like race, age, gender, sexual orientation and class?
Data generated by such research would not only benefit librarians struggling to define mission statements and justify budgets to city managers, council members and school and college administrators (many of whom are convinced the internet has made libraries “obsolete”), it would also help librarians identify which programs and services are providing greatest benefit to their communities and thus deserve additional resources.
Part of Our Lives shows that, over the generations, library users learned many things in multiple ways through the useful information libraries made accessible, the commonplace reading materials they circulated and the public spaces they provided. But until LIS educators teach library reading and library as place in their professional programs at the core level, and until LIS researchers ask questions about what users learn from their interaction with libraries and determine how that learning fits into their everyday lives, both are addressing only a fraction of what libraries actually do for their patrons. And both are falling short of their profession’s needs.
Wayne A. Wiegand is F. William Summers Professor Emeritus of Library and Information Studies at Florida State University and author of Part of Our Lives: A People’s History of the American Public Library (Oxford University Press, 2015). Between January and May 2017, he will be Distinguished Visiting Scholar at the Library of Congress’s John W. Kluge Center.
This month will mark the one-year anniversary of the death of Thomas Leland Berger. Some readers of Inside Higher Ed might have known Berger while he lived -- he was a fairly well-known scholar and teacher of English Renaissance literature, active in both the Shakespeare Association of America and the Malone Society. He was renowned for his brilliance (one of his former students, the novelist and short-story writer Lorrie Moore, celebrated his intellect and enthusiasm for literature in a New York Times essay a decade ago) as well as his wit. (He remains one of the funniest people I have ever met.)
Certainly, his intelligence and his cleverness were quite memorable. But I don’t think that’s why I continue to think about Berger today, one year after his death and 17 years after my own graduation from St. Lawrence University, where he taught for decades.
I suppose I should start with my first memory of Berger, who made quite an impression on me when I was a surly 17-year-old high school senior touring colleges throughout New York and New England. My parents and I had come to St. Lawrence University one fall Saturday for a prospective student day -- campus tours, food in the dining hall and meetings with students and faculty members. I didn’t know his name at the time, but Berger was representing the English department alongside representatives from, as I recall, the theater and sociology departments in a discussion of those fields. The idea was that they would tell those of us who had indicated potential interest in those majors just what course work was entailed and what we could expect should we enroll at St. Lawrence. It’s not the most exciting way for an academic to spend his Saturday, but if Berger was annoyed or inconvenienced, he didn’t let on.
Toward the end of the discussion, the person from the admissions office who was coordinating things asked the three faculty members to tell us and -- importantly, I think -- our parents just what sort of valuable, real-world (which probably most of us would take to mean “marketable” or “job-related”) things we would learn in their classes. I don’t recall what the other two professors said. Something about critical thinking and learning how to learn and stuff like that, I’m sure. But when it was his turn to speak, Berger seemed thoughtful, and he answered rather slowly and deliberately.
“All sorts of things,” he said. “For instance, my Shakespeare students are currently reading Antony and Cleopatra, and I think the most important life lesson you can get from that is, if you are in charge of a massive army, and all of your generals tell you to do one thing but your girlfriend thinks you should do something else ….” Here he paused, then leaned forward conspiratorially, and said, “Listen to your generals.”
That wasn’t the only reason I wound up attending St. Lawrence, but his comment did distinguish that prospective student day from others, which are often rather scripted and not particularly distinctive. Here was a college, I knew, where at least one professor knew how to capture your attention.
So that lesson in military strategy was the first lesson Berger taught me -- the first of many. I would go on to take several classes with him, and I learned a great deal about literature, early modern London and, of course, comic timing. Those were all valuable lessons. He was a challenging professor, to be sure -- his quiz questions were unbelievably specific, demanding that we pay careful attention to the minutiae in every BBC production of Shakespeare’s plays. In fact, when one classmate complained after one such quiz, “Dr. Berger, your quizzes are too hard,” he put his hands in his pockets, cocked his head thoughtfully and replied, “Or, perhaps you’re just too …. Nah. It must be the quiz.”
As demanding as he was, though, students overwhelmingly seemed to love him. His classes were always full, and among the English majors, at least, there was a general agreement that Berger was probably the smartest man in the world. He didn’t work from notes -- he simply came in, opened the textbook to the play under discussion and then typically sat down in front of the class, crossed his legs and began to hold court. I’d never known anyone with so much knowledge in his head, ready to be shared so informally, without pretension or even apparent effort. It made us all want to work even harder -- a good grade from such an obviously intelligent man might mean that we were intelligent, too.
But as I said, he was funny, too. There was the time, toward the end of one semester, when he came into class with the manila envelope and fistful of pencils that indicated we would be filling out course evaluations. He put the evaluation materials on the table at the front of the class, then turned to the chalkboard, where he wrote out the word “dickhead.”
“I figure most of you probably need to know how it’s spelled,” he explained.
The educator in me knows that there probably were students in the class who thought Berger was a dickhead -- no one professor can be loved by all students, surely -- but I couldn’t imagine how someone could think such a thing. Still, I appreciated his suggestion that the student who might think such a thing probably wasn’t very good at spelling, as well as his posture of not caring a whit one way or another. He seemed cooler than cool.
Speak What We Feel
I benefited tremendously from Berger’s instruction as a student, and I doubt I would have gone on to graduate school had he not made being smart look so damn cool. Still, I think he saved his wisest words for our conversations when I wasn’t enrolled in higher education. When I was diagnosed with Hodgkin’s disease shortly after Christmas my senior year, I wound up withdrawing from the university. Even if I wasn’t taking classes with him, however, I found that I still wasn’t finished learning from Berger.
I had been in touch with a few friends after the diagnosis, but I hadn’t contacted too many faculty members because, honestly, I didn’t think they would care. I was just one of the hundreds of students they taught over the course of their careers, after all.
So when my mother called down the stairs one morning to tell me I had a phone call, I did not expect to hear Tom Berger’s voice on the other end. But there it was. At a time in my life when I was more scared and lonely than I had ever been, Berger called to see if I was OK. That’s when I realized he was more than a clever person -- more than a wise person, even. He was a very, very good person who had the compassion to reach out to me when I didn’t even realize that I needed to hear from him.
I wish I remembered more of that conversation than I do. I can tell you that later, when I went back to campus to visit while going through chemotherapy, I sat in his office and told him I didn’t think I’d ever want to write about the experience of being ill, as several of my friends and former professors had encouraged me to do. I didn’t want to write something trite or clichéd, I told him -- which was true but wasn’t the biggest issue. The biggest issue, which I think he understood, was that I didn’t want to live with cancer -- even thoughts of cancer -- any longer than I had to. So, I insisted, I would never write about having cancer.
He neither encouraged nor discouraged me in this idea, as I sat across from him in his book-stuffed university office, but he did give me one piece of advice that I still take to heart every day. He said, simply, “You don’t want to be defined by your worst experience.”
So smart. So obviously true. But also, kind of hard to do, sometimes.
If you’re anything like me, maybe you tend to dwell on your pain, or on worst-case scenarios, or on the suffering that is inflicted on all of us over the course of our lives. But if you’re like me, and had a wise mentor caution you not to let such things define you, then maybe you’re also able to remind yourself of your blessings, too. The pets who seem to intuit when you are sad and come over to offer comfort. A cold beer on a hot afternoon. The friends who laugh with you. The spouse who supports you. The family that loves you. The teachers who inspired you.
I was invited to speak at a celebration of Tom Berger’s life earlier this year at the Blackfriars Playhouse, home of the American Shakespeare Center, where I shared an earlier, truncated version of this essay. And a couple of weeks ago, his widow and kids (who are all older than I am) sent me a gift -- a coffee mug with a Tom Berger quote, “I wouldn’t say no to a cup of joe,” and a very nice card thanking me for sharing my memories at the celebration.
To be honest, the gift made me feel a little guilty. I don’t think I was a particularly memorable student in Tom Berger’s career. While I ran into him at conferences in the years after I graduated, and he vaguely knew my wife, who is also an early modern scholar, we did not stay in particularly close touch. I was always glad to see him, and I think he was always glad to see me. But in the end, I was just one of the thousands of students he interacted with over the course of his amazing career. I happened to have come to the family’s attention because I published something about him after he died, and they invited me to join them in sharing memories, but really, I think just about anyone who studied with the man probably had stories to share.
As Hamlet said, “He was a man, take him for all in all, I shall not look upon his like again.” And if I can have even a fraction of the impact on the students that I work with as Tom Berger had on his students, I feel like I will have lived a remarkable life.
William Bradley is an essayist and writing center coordinator at Heidelberg University in Tiffin, Ohio. His book Fractals was recently released by Lavender Ink.
Across almost a century of American social and political change, W. E. B. Du Bois was the pre-eminent African-American author and thinker, bar none. He was born three years after the end of the Civil War and died just one day before the March on Washington in 1963. He was the first black scholar to receive a Ph.D. from Harvard University. The German sociologist Max Weber admired his book The Souls of Black Folk (1903) and tried to arrange its translation. And his place as founding editor of the National Association for the Advancement of Colored People's magazine, The Crisis, gave him not just an agenda-setting role in the history of the civil rights movement but also an international influence.
W. E. B. Du Bois: Revolutionary Across the Color Line by Bill V. Mullen (published by Pluto Press, with distribution in the United States by the University of Chicago Press) serves as a timely introduction to this impressive and somewhat imposing figure, while also reframing Du Bois’s life and work beyond the boundaries of the American context. Mullen is a professor of English and American studies at Purdue University and the author of two previous studies of Du Bois: Afro-Orientalism (University of Minnesota Press, 2004) and Un-American: W. E. B. Du Bois and the Century of World Revolution (Temple University Press, 2015). I interviewed him by email about his most recent book.
Q: Du Bois said that the problem of the 20th century was the problem of the color line. We heard a lot about the United States becoming a “postracial” society when President Obama was first elected on the assumption that the problem had been solved, which is not a perspective often championed these days. What do you think counts as the most pertinent aspect of Du Bois’s legacy now, after eight years of an African-American president and several of civic unrest on a scale we haven't seen for decades?
A: I think the most pertinent aspect of Du Bois’s legacy to today’s protest movements -- against police violence, for Black Lives Matter and the movement for Palestinian civil rights, for example -- was his insistence that only mass protest could bring about meaningful social change. Du Bois was eventually weaned away from the idea that capitalism and racism could be reformed from above. His view of democracy was that it was a living thing animated by ordinary people engaged in self-activity for equality.
All of the major social justice organizations he was involved with -- the Pan-African movement, the Socialist Party, the NAACP, the Peace Information Center against atomic weapons, the Communist Party -- were interracial or international movements that challenged institutions of power and authority. An especially relevant example to our time is the work Du Bois did to create the “We Charge Genocide” petition delivered to the United Nations in 1951. He wrote the first drafts of that petition, which charged the U.S. state with disproportionately causing black death through poverty, poor schooling, social and police violence. After Trayvon Martin was killed in 2012, a group of young Chicago activists formed the group We Charge Genocide to document police shootings of African-Americans in Chicago and to honor that earlier effort. Du Bois’s legacy to our time was made very real and direct in that moment.
Q: You write that biographers and scholars have neglected or underestimated the significance of Du Bois’s long-term political development, and at one point, you suggest there’s a tendency to overemphasize his early book The Souls of Black Folk (1903) almost as if that’s his single major work. David Levering Lewis’s two-volume biography of Du Bois seems very broad in scope and deep in detail, so I’m wondering if there are particular discussions of Du Bois, or perspectives on him, that you’re challenging.
A: There are two parts to this exclusion tendency. Levering Lewis’s biography of Du Bois is magnificent. But he dedicates only 16 out of almost 1,400 pages to the last eight years of Du Bois’s life. In that time, Du Bois traveled to the Soviet Union and China, joined the Communist Party, published his autobiography in the Soviet Union, and moved to Ghana. The effect of downplaying those events is to diminish them as late-in-life mistakes of someone who has taken a bad political turn or has simply lost his bearings in old age. I argue instead that those culminating events of Du Bois’s life can only be explained by tracing them back to points of origin far earlier. I dedicate a whole chapter to Du Bois’s writings on Asia, for example, which begin in 1905, because they explain why he later supported Maoism so strongly and why he said in the 1940s that the future of the world depended upon events in Asia.
Second, there is still a tendency to ignore Du Bois’s lifelong interest in Marxism so that he remains an avuncular “race man” figure for scholars in the academy. To give an example, Du Bois wrote a 300-page manuscript called “Russia and America” in 1950. His publisher, Harcourt, Brace, wouldn’t bring it out during the Cold War, saying it was too pro-Soviet and anti-American. To this day, it has never been published. I spend a good deal of time talking about the book because it explains better than any other single Du Bois text why he sympathized with the Russian revolution. The book is also important for showing how Du Bois saw the Russian revolution as a sequel to African-American self-emancipation from slavery, an event he called an “experiment of Marxism.” My aim, then, is to show that Marxism was always central to Du Bois’s political development -- not a detour, diversion or mistake.
Q: Arguably Du Bois’s life and work are too large, too far-flung, even for Paul Gilroy’s notion of the “Black Atlantic,” since the Indian independence struggle (among other Asian developments) was so important for him. You discuss him as a “transnational” figure. Please say more on that.
A: Du Bois was most accurately described as an internationalist. His worldview was framed by 19th-century nationalisms, the Pan-Africanist movement, Communist internationalism and the anticolonial movement of the 20th century. His political orientation was to see in all directions simultaneously the interdependence of the advanced and underdeveloped worlds, as well as the historical movements of people between nations and territories. He called Japan’s defeat of Russia in their 1905 war the first “crossing of the color line” in world history, and India’s independence in 1947 the greatest event of the 20th century. He first used his famous coinage “The problem of the 20th century is the problem of the color line” in the 1900 Pan-African Congress address to refer to the relationship of nonwhite peoples across the world to their colonial masters.
Intellectually, his influences ran from Hegel to Alexander Crummell, Bismarck to Nehru. His 1928 anticolonial novel, Dark Princess, is a rewriting of Shakespeare’s A Midsummer Night’s Dream. For me, communism and socialism provided the intellectual synthesis of this global perspective: he understood what the Communist International called “world revolution” as the drawing together of modern humanity into a single project, or totality, of global unity and emancipation. That is the main theme of my book, and the through line for my account of his lifelong political development.
Q: Would publishing the manuscript of Du Bois’s “Russia and America” be worthwhile now? It's certainly odd to think of a book-length work by a figure of such significance languishing in the archives.
A: “Russia and America” should absolutely be published. Vaughn Rasberry’s important new book, Race and the Totalitarian Century, also puts “Russia and America” at the center of Du Bois’s Cold War writing. The problem is the Du Bois scholarship industry. Most Du Bois scholars haven’t read the manuscript and therefore don’t understand its importance. Others who have read it dismiss it because Du Bois is full-throated in his praise of the Soviet Union at a time when many of Stalinism’s worst errors were becoming well-known.
In other words, the manuscript still lives in the shadow of Cold War thinking that should be long past by now. Too many scholars would prefer to preserve a hagiographic image of Du Bois as a benign humanist or saint rather than comprehend both the depth of his commitment to Communism and the reasons he oftentimes looked past problems with Stalin’s Russia. It’s a kind of “Don’t ask, don’t tell” approach to scholarship, which does a disservice to students and scholars who want to comprehend Du Bois and socialism in the 20th century -- problems and all.
Q: Your book follows a difficult line with respect to some of Du Bois’s political commitments. You seem understanding, or at least nonpolemical, with regard to his support for the regimes of Stalin and Mao, but a number of remarks make clear you reject those politics. How do you manage to balance those perspectives?
A: Du Bois’s political evaluations of Stalin’s Russia and Mao’s China were consistent with those of many of the people whom we consider to be the most important radicals of the 20th century, including the majority of anticolonial leaders from Asia and Africa. His strong desire for decolonization led him to trust the Soviet Union and China and their promises of aid to that project well past the time their revolutions had become corrupted. To be for world revolution and decolonization in the 20th century, in other words, was to sign up for Communist internationalism with all of its faults. Du Bois signed up early and never fully recanted.
On the other hand, he misapprehended the meaning of Marxism and socialism in ways that we should not forgive or forget. He confused state capitalism -- Stalin’s system of socialism in one country and bureaucratic rule from above -- with the real meaning of socialism as working-class self-emancipation. His thin understanding of Japanese and Chinese history caused him to perceive Japanese imperialism and expansionism in China as a viable alternative to capitalism for nonwhite workers of the world. Du Bois was both brilliant and fallible.
But he was always, as I try to make clear, vying to find a way that ordinary people could fashion their own liberation and self-emancipation. He found this match of political will and human self-activity in his most brilliant book, Black Reconstruction in America (1935). If he had written nothing else in his life, Black Reconstruction would have cemented his place as one of the most original scholars and political theorists of human freedom. So his life and his work demand a judicious and balanced approach that is well grounded in the theories of revolution and human liberation he was trying to advance. I try to provide that approach, and as you say, walk that line, in my book.
Q: Du Bois’s early worldview reflects a belief in elite leadership -- “the talented tenth.” Your book stresses his move toward a more democratic perspective, an emphasis on agency and power from below. But isn’t there a lot of continuity in his thinking? Aren’t traces of the young Du Bois who admired Bismarck still discernible in the octogenarian who wrote a glowing tribute following Stalin’s death in 1953?
A: There are two kinds of continuity in Du Bois’s political thought across the course of his long life. One is the quest cited above for human emancipation carried out by ordinary people. In 1956, only seven years before his death, Du Bois wrote an essay in tribute to one of his heroes, the socialist militant labor leader Eugene Debs. At a time in which he was well aware of problems in the socialist models of both Stalin’s Russia and Mao’s China, Du Bois wrote, “A state socialism planned by the rich for their own survival is quite possible, but it is far from the state where the rule rests in the hands of those who produce wealth and services and whose aim is the welfare of the mass of the people.” That is the Du Bois who fought for what we can call “socialism from below.”
On the other hand, Du Bois never quite gave up the idea that a “great man” -- a Bismarck or a Stalin -- could redirect human history. The socialist William Gorman put this very well in an essay in the 1950s. About Du Bois’s defense of Stalinism, Gorman wrote, “There he can find embodied … in his life work in regard to the Negroes: the conception of the talented tenth and the urge toward international revolt. Stalinism … approaches and manipulates the masses like an elite convinced of their backwardness and incapacity; hence the necessity to dictate, plan and administer for them from the heights of superior knowledge and wisdom.”
My final assessment is that Du Bois was a contradictory figure, but one who made the struggle for black freedom central to the 20th-century struggle for human emancipation in all its forms. We should not blame Du Bois that history didn’t solve the problem of the color line. We should celebrate the fact that he was one of the few people in American history to try to use every tool at his disposal to develop a theory and practice of human emancipation. He was a dangerous figure in the very best and most radical sense of that word.