I fell in love with him first over the phone. A man older than my father, a man I hadn’t yet met in person, a man about whom I knew little except that he was kind to me and that he was different in obvious and profound ways from the people I encountered every day.
In 1984 I had just started a full-time job as an editorial assistant. I was fortunate to be working for an editor who saw it as part of her mission to educate me. Susan would invite me to sit in on her hectoring conversations with authors, where she would tell them, harsh and didactic in both her tone and her language, exactly what was wrong with their thinking. She would explain to me how manuscripts get shaped into books, what the expectations of a reader are and how the author can’t afford to frustrate them. She talked a lot about fairness. She was the first proud Reaganite I ever thought was smart.
A part of my job, after I’d finished typing up correspondence, preparing projects to go into production, trafficking copy and marketing materials, and answering the phone, was to line up readers to report on manuscripts. This was just as Oxford was starting to publish scholarly books, and it was an easy way to build a list. Peer review counted more than an editor’s good judgment and took less time. Susan had only to look quickly at the first few pages and decide whether or not a project was worthy of being “sent out.” She’d give me a list of names, or tell me to call one person (usually an author) and ask for suggestions. This was one of my least favorite parts of the job. You’d have to make a zillion phone calls, and leave hundreds of messages. Sometimes academics were nice, but they were rarely in their offices. They could be mean or pompous, and would sometimes lecture me on the manuscript they hadn’t yet read. I imagine that now, with e-mail, things are a whole lot easier.
Sometimes I’d strike out so many times I would end up with a reader who wouldn’t be a natural extension of the scholars on the original list, which is probably how it happened, because now I can’t believe that Susan would ever have asked me to call him as a reader. He was surprised at being asked to do something for Oxford. But he talked to me in a way that other readers -- busy, name-brand academics -- didn’t.
He wondered what it was like to do my job. (I loved it. Really? he said. You love your work?) Where had I come from? (College -- I’d started working at Oxford one day a week before I’d even graduated. He asked which college, and then didn’t say much.) Where did I live? (Manhattan. He’d grown up, he said, in Brooklyn. His accent, in fact, reminded me of my grandparents who still lived there.) Where had I grown up? (In the boonies of upstate New York.) My parents’ jobs? (My father was a bitter, third-rate academic at a state university.) Future plans? (No plans -- I took this job instead of going for my dream, working on a dude ranch in Wyoming. Too bad, he said, that my dream hadn’t quite worked out. Yet.)
I can’t remember what manuscript I’d sent him, or what he sent back in his reader’s report. I do remember that Susan was surprised to see a report from him and to hear that we’d had a long talk. And I remember that not long after, he called to say that he was going to be in New York City and wanted to come by the office. Come by, I said.
He wasn’t anything like what I expected. He was tall, very tall, and athletic-looking, with a strong face, and salty dark hair. He was handsome in ways that I didn’t think a sixty-year-old man could ever be.
We sat in my office and chatted. He told me about his kids -- there were two, both older than me. Mostly, though, he wanted to know about me.
How much is there for a 22-year-old girl to say about herself? I talked about my work. I talked about how this job was like being in graduate school. I got to learn not only about publishing, but about a whole bunch of academic fields I hadn’t gone deep into in college, where I read mostly English literature and a little bit of criticism, and dabbled in philosophy. Now I was immersed in political science, sociology and, when Susan could sneak it in, history. Oxford was then divided -- rancorously -- by traditional disciplinary lines and editorial jealousies, and fears of “poaching” buzzed like flies in the hallways.
We editorial assistants had our own sources of strife. Some had to work for bosses who were, if I’m to be honest, bat-shit crazy. Others were chained to their desks, barely allowed to leave the building for lunch. Most didn’t get the kind of author contact I was allowed, because most editors didn’t share their jobs as fully as Susan did. I got to go to important meetings while my friends were typing. I went to lunches at fancy expense-account restaurants as long as I got the Xeroxing done.
I told him about the opportunities being handed to me, the fact that I didn’t even know, most of the time, when I was talking to people who were famous in their fields. I was paid to read, I told him, to learn. It was a great job.
Later, he would confess that he told his students about me -- that I was one of the few people he knew who was truly happy with her work. Now I don’t know whether to believe this; I suspect he said it to make me feel good and that he knew that someday I would realize that things were not so simple.
We began having lunches. I flexed a fledgling expense account to take him out when he came to New York, and I made time to see him on the occasions when I got to travel to Boston. I would always ask what he was working on, acting like a big girl editor. After my first year at OUP I left Susan to work for the American history editor, so this would have made sense, but it didn’t interest him in the least. His work was different from most of Oxford’s list; in many ways, I later realized, we were the mainstream he was reacting against.
Once he called me up to say that he had written a play about Emma Goldman; it was being produced in New York, directed by his son. He gave me the information and I said I would go. He told me to introduce myself to his son. I went to the play -- enraptured by the fact of knowing the playwright -- but was too shy to track down his son, who was tall and handsome like his father.
One day, over lunch in Boston, I grilled him and made him tell me his story. It’s a familiar one, by now: He was the son of Jewish immigrants, and began his career as a rabble-rouser at age 17. At that point I had moved to Brooklyn, and he talked about the Brooklyn Navy Yard, about meeting Roslyn, his wife. He talked about joining the Army to fight on the side of good against evil. “I was a bombardier,” he said. He said it twice, as if he could hardly believe it. We were eating lunch, maybe at Legal Sea Foods in Cambridge, and I knew he was telling me something important. He told me about the box he’d stuffed all his army belongings into -- his medals, his papers -- and that he’d written “Never again” across the top.
He told me about teaching at Spelman College and his work as part of the civil rights movement. He told me about getting fired.
And then he talked about Boston University and about John Silber. He said that he taught the biggest course at the university and that he wasn’t allowed any teaching assistants. The president had offered him some, he said, if he cut down his class size. Way down. He was on the eve of retirement, but said that he wanted to stick around just to irritate Silber.
At that point, I hadn’t read A People’s History of the United States. I knew Howard Zinn only as a professor who had read a manuscript for me and become, unexpectedly, my friend. And then I read him, and fell in love with him in myriad other ways. For his bravery. For his lucidity. And of course, for the generosity and authenticity of his vision. I have met labor historians who have no truck with laborers; defenders of social justice who say racist and sexist and plain old bigoted things after a cocktail or two.
There are many others who can talk about how Howard Zinn changed not only their lives, but the world. I am well aware of my good fortune. When I needed the figure of a good father, someone wise and kind, challenging and encouraging, I did a slipshod job at work and called him to read a manuscript.
Rachel Toor teaches creative writing in the MFA program at Eastern Washington University, in Spokane.
We seem to be in the midst of a religious revival. At least that seems true within higher education, and especially within our own field of American history. According to a recent report from the American Historical Association (and written about in Inside Higher Ed), religion now tops the list of specializations that historians claim as their own.
This renaissance bodes well for a discipline that has more or less forgotten about, or tended to marginalize, religion, especially in its treatment of modern America (typically defined as anything after 1865). Religion is everywhere around us, and religious historians have written about it in compelling and exciting ways, yet within mainstream historiography it has largely been left behind. In a sense, religion is everywhere in modern American history, but nowhere in modern American historiography.
To illustrate the point, Jon Butler’s 2004 article for the Journal of American History analyzed American history textbooks to see if religion was present in their pages. He found that religion was omnipresent in the telling of early American history (before 1865), but after the Civil War it appeared only episodically, "as a jack-in-the-box," popping up "colorfully" here and there, then disappearing, "momentary, idiosyncratic thrustings up of impulses from a more distant American past or as foils for a more persistent secular history."
Butler is not alone in noticing this shortcoming. Robert Orsi, an eminent historian of American everyday religious practice, recently suggested that historians have failed to grasp what has been going on within the subdiscipline of religious history. And David A. Hollinger, an intellectual historian and the incoming president of the Organization of American Historians, has urged historians to study religion, not for the sake of advocacy, but because of the extensive gap between intellectuals and the rest of the population. His articles have carried such urgent titles as "Jesus Matters in the USA" and "Why Is There So Much Christianity in the United States?"
There are several possible explanations for this everywhere/nowhere disjunction: the rise of social history, the disconnect in professed religious beliefs between academics and other Americans, the fact that widespread recognition of America’s religious pluralism has forced our institutions to become increasingly secular, and more.
But we would like to suggest some ways in which religious historians have attempted to fuse their stories into the mainstream narrative. A good example to begin with (because its timing accords perfectly with religion’s historiographical absence) might be Edward Blum’s award-winning book Reforging the White Republic: Race, Religion, and American Nationalism, 1865-1898 (2005). When explaining the demise of the cause of equality that was so prominent in the Civil War, Blum lays blame directly on American religious institutions and their leaders. Blum shows that many if not most of the narratives of reconciliation that emerged in the 1870s embraced Christian images of reunion, a Messianic notion of coming together again and working on the great American project, all at the expense of African Americans. And Northern ministers led the way. The capstone moment of D. W. Griffith’s Birth of a Nation (1915), which presents the triumph of the Klan as the "birth of the nation," shows Jesus’ face hovering over the scene. It was Protestant ministers and activists, Blum shows, who midwifed the end of Reconstruction and the rise of a united, white Christian America, aggressively on the prowl for territorial conquests.
During the Progressive Era of the early 20th century, even as many American institutions were secularizing, religion marked many aspects of social life. Clifford Putney’s study of recreational and professional sports from 1880 to 1920 put a Muscular Christianity, as he titled his 2001 book, at the center of Victorian manhood. A revitalized and reformed Protestantism, based in no small part on setting itself apart from new immigrants, helped to re-create the notion of Victorian manhood. Religion was the key. William J. Baker has updated this story for our own times in Playing with God: Religion and Modern Sport (2007), which affirms Putney’s timing: muscular Christianity emerged out of a turn-of-the-twentieth-century need to redefine manhood in the Industrial Age.
For the interwar years, Matthew Avery Sutton’s new biography, Aimee Semple McPherson and the Resurrection of Christian America (2007), portrays McPherson’s deep influence in the period (and also, therefore, conservative Protestantism’s deep influence). Sutton argues that McPherson was among the first in the modern era to unite conservative Christianity, American nationalism, and a political sensibility favoring the fantastic. Although she has been derided as a sexualized simpleton who quickly faded from the scene, Sutton portrays her as the forefather (mother?) of today’s Religious Right. Her style of publicly and personally sensational politics created a model that would be picked up several decades later by the likes of Jimmy Swaggart, Pat Robertson, and Jim and Tammy Faye Bakker. In her day, she was everywhere, but today, she hardly appears at all in the mainstream narrative of American history, this despite her importance in laying the groundwork for the rise of the Religious Right.
During the post-World War II period, the United States experienced a religious revival of sorts as well, although one that was unusual in American history because it was a revival not just for Protestants, but for Roman Catholics and Jews. Catholics and Jews took advantage of the anti-fascist rhetoric of World War II and the Cold War in order to combat any lingering connections between Protestantism and American nationalism. Instead, they articulated the idea that the state should be neutral in handling religious affairs, whether it be in the U.S. Census or in the realm of public education. In this way, religion sits at the root of today’s multicultural struggles, where differences are to be recognized and even championed, but never prioritized by the state. Interestingly, these ways of managing pluralism were worked out when religious groups, not racial, ethnic, or gendered ones, were the primary provocateurs.
There are many more examples of these acts of incorporation. But despite all this recent work, our general thesis that religion has been everywhere in history but nowhere in historiography has two major exceptions: in historical works on the civil rights movement and the religious right. When it comes to civil rights historiography, religious interpretations have vitally influenced scholarship; indeed, those who downplay the influence of religion tend to be the “heretics,” rather than the other way around. Meanwhile, we now have a small library of books on contemporary figures of the Religious Right, from Jerry Falwell to James Dobson to Phyllis Schlafly.
Noting these two exceptions raises important questions. For example, since these are two groups that have been historically racialized and/or marginalized, does that make it “safer” to incorporate religion more centrally into their intellectual trajectories? And to what degree do they influence the mainstream narrative? In other words, when we move from the mainstream to the margins, does it become safer to introduce religion as a central actor in people’s lives? And if so, will that scholarship focusing on the margins find its way into the mainstream narratives? The almost complete absence of religion from David M. Kennedy’s Freedom from Fear and James Patterson’s Grand Expectations, the two Oxford History of the United States volumes covering the period from 1932 to 1974, provides just cause for such reflection.
Meanwhile, religion continues to influence and shape Americans’ lives. The much-publicized "U.S. Religious Landscape Survey" (2008) from the Pew Forum on Religion and Public Life provides some startling data. First, the survey found that almost 1 out of every 10 Americans is an ex-Catholic. For the past 100 years Catholics have made up about 25 percent of the population, and they still do, but that stability in recent years is explicable only by the large numbers of immigrants, mostly Hispanic, who have come to the United States since 1965, when the country loosened its immigration laws. Second, by the standards of population, the United States is still not “Abrahamic” or even “Judeo-Christian,” if it ever was. Jews make up 1.7 percent of the population, while no other non-Christian religion constitutes more than 1 percent. Meanwhile, nearly 80 percent of Americans consider themselves to be some variety of Christian. And finally, the largest growth area in people’s religious identification lies in the category of “uncommitted,” now amounting to about 14 percent of the population, according to the Pew survey. That figure varies vastly by region: "uncommitted" makes up a sizable portion of the Pacific Northwest, but barely registers as a religious alternative in the Deep South.
Other highlights from the Pew survey include the fact that there are more Buddhists than Muslims in America, that there are almost as many atheists as Jews (and more agnostics), and that more than a quarter (28 percent) of all Americans have left the grand faith tradition into which they were born, while nearly half (44 percent) have left the faith of their birth or switched denominations at some point in their lives. The survey thus emphasizes that the structure of faith in America is an amorphous thing, constantly changing, influencing people’s lives in new and dynamic and important ways. And religious historians have been busy tracing religion’s dynamism in modern American history.
If only more historians would care. Perhaps our discipline’s “religious revival” will help make it so.
Kevin M. Schultz and Paul Harvey
Kevin M. Schultz is assistant professor of history at the University of Illinois at Chicago. Paul Harvey is associate professor of history at the University of Colorado at Colorado Springs. This essay is adapted from a forthcoming article in the Journal of the American Academy of Religion.
In two weeks, the National Book Critics Circle will vote on this year’s awards, and so, of late, I am reading until my eyes bleed. Well, not literally. At least, not yet. But it is a constant reminder of one's limits -- especially the limits of the brain's plasticity. The ability to absorb new impressions is not boundless.
But one passage in Edmund White’s City Boy: My Life in New York During the 1960s and ‘70s (a finalist in the memoir category, published by Bloomsbury) did leave a trace, and it seems worth passing along. The author is a prominent gay novelist who was a founding member of the New York Institute for the Humanities. One generation’s gossip is the next one’s cultural history, and White has recorded plenty that others might prefer to forget. City Boy will be remembered in particular for its chapter on Susan Sontag. White says that it is unfortunate she did not win the Nobel Prize, because then she would have been nicer to people.
But the lines that have stayed with me appear earlier in the book, as White reflects on the cultural shift underway in New York during the 1960s. The old order of modernist high seriousness was not quite over; the new era of Pop Art and Sontag's "new sensibility" had barely begun.
White stood on the fault line:
"I still idolized difficult modern poets such as Ezra Pound and Wallace Stevens," he writes, "and I listened with uncomprehending seriousness to the music of Schoenberg. Later I would learn to pick and choose my idiosyncratic way through the ranks of canonical writers, composer, artists, and filmmakers, but in my twenties I still had an unquestioning admiration for the Great -- who were Great precisely because they were Great. Only later would I begin to see the selling of high art as just one more form of commercialism. In my twenties if even a tenth reading of Mallarmé failed to yield up its treasures, the fault was mine, not his. If my eyes swooned shut while I read The Sweet Cheat Gone, Proust's pacing was never called into question, just my intelligence and dedication and sensitivity. And I still entertain those sacralizing preconceptions about high art. I still admire what is difficult, though I now recognize it's a 'period' taste and that my generation was the last to give a damn. Though we were atheists, we were, strangely enough, preparing ourselves for God's great Quiz Show; we had to know everything because we were convinced we would be tested on it -- in our next life."
This is a bit overstated. Young writers at a blog like The New Inquiry share something of that " 'period' taste," for example. Here and there, it seems, "sacralizing preconceptions about high art" have survived, despite inhospitable circumstances.
White's comments caught my bloodshot eye because I had been thinking about Arthur C. Danto's short book Andy Warhol, published late last year by Yale University Press. (It is not among the finalists for the NBCC award in criticism, which now looks, to my bloodshot eye, like an unfortunate oversight.)
It was in his article “The Artworld,” published in The Journal of Philosophy in 1964, that Danto singled out for attention the stack of Brillo boxes that Warhol had produced in his studio and displayed in a gallery in New York. Danto maintained that this was a decisive event in aesthetic history: a moment when questions about what constituted a piece of art (mimesis? beauty? uniqueness?) were posed in a new way. Danto, who is now professor emeritus of philosophy at Columbia University, has never backed down from this position. He has subsequently called Warhol “the nearest thing to a philosophical genius the history of art has produced.”
It is easy to imagine Warhol's response to this, assuming he ever saw The Journal of Philosophy: “Wow. That’s really great.”
Danto's assessment must be distinguished from other expressions of enthusiasm for Warhol's work at the time. One critic assumed that Warhol's affectlessness was inspired by a profound appreciation for Brecht’s alienation effect; others saw his paintings as a radical challenge to consumerism and mass uniformity.
This was pretty wide of the mark. The evidence suggests that Warhol’s work was far more celebratory than critical. He painted Campbell’s soup cans because he ate Campbell’s soup. He created giant images based on sensational news photos of car crashes and acts of violence -- but this was not a complaint about cultural rubbernecking. Warhol just put it into a new context (the art gallery) where people would otherwise pretend it did not exist.
“He represented the world that Americans lived in,” writes Danto in his book, “by holding up a mirror to it, so that they could see themselves in its reflection. It was a world that was largely predictable through its repetitions, one day like another, but that orderliness could be dashed to pieces by crashes and outbreaks that are our nightmares: accidents and unforeseen dangers that make the evening news and then, except for those immediately affected by them, get replaced by other horrors that the newspapers are glad to illustrate with images of torn bodies and shattered lives.... In his own way, Andy did for American society what Norman Rockwell had done.”
It seems like an anomalous take on an artist whose body of work also includes films in which drag queens inject themselves with amphetamines. But I think Danto is on to something. In Warhol, he finds an artistic figure who fused conceptual experimentation with unabashed mimeticism. His work portrays a recognizable world. And Warhol’s sensibility would never think to change or challenge any of it.
Chance favors the prepared mind. While writing this column, I happened to look over a few issues of The Rag, one of the original underground newspapers of the 1960s, published in Austin by students at the University of Texas. (It lasted until 1977.) The second issue, dated October 17, 1966, has a lead article about the struggles of the Sexual Freedom League. The back cover announces that the Thirteenth Floor Elevators had just recorded their first album in Dallas the week before. And inside, there is a discussion of Andy Warhol’s cinema by one Thorne Dreyer, who is identified, on the masthead, not as the Rag’s editor but as its “funnel.”
The article opens with an account of a recent showing of the 35-minute Warhol film “Blow Job” at another university. The titular action is all off-screen. Warhol's camera records only the facial expressions of the recipient. Well before the happy ending, a member of the audience stood up and yelled, “We came to get a blow job and we ended up getting screwed.” (This anecdote seems to have passed into the Warhol lore. I have seen it repeated in various places, though Danto instead mentions the viewers who began singing “He shall never come” to the tune of the civil-rights anthem.)
Dreyer goes on to discuss the recent screening at UT of another Warhol film, which consisted of members of the artist's entourage hanging out and acting silly. The reviewer calls it “mediocrity for mediocrity’s sake.” He then provides an interpretation of Warhol that I copy into the digital record for its interest as an example of the contemporary response to his desacralizing efforts -- and for its utterly un-Danto-esque assessment of the artist's philosophical implications.
“Warhol’s message is nihilism," writes Dreyer. "Man in his social relations, when analyzed in the light of pure objectivity and cold intellectualism, is ridiculous (not absurd). And existence is chaos. But what is this ‘objectivity’? How does one obtain it? By not editing his film and thus creating ‘real time’? By boring the viewer into some sort of ‘realization’? But then, is not ‘objectivity’ just as arbitrary and artificial a category as any other? Warhol suggests there is a void. He fills it with emptiness. At least he is pure. He doesn’t cloud the issue with aesthetics.”
And so the piece ends. I doubt a copy ever reached Warhol. It is not hard to imagine how he would have responded, though: “It gives me something to do.” The line between nihilism and affirmation could be awfully thin when Warhol drew it.
Morris Dickstein's Dancing in the Dark: A Cultural History of the Great Depression, published last year by W.W. Norton, is one of five finalists for the National Book Critics Circle award in criticism. The author, a professor of English and theater at CUNY Graduate Center, has written and edited numerous other works of literary and cultural analysis. His explorations of American literature, films, and music of the "long decade" between 1929 and 1941 seem to be written in an almost classical mode -- as if he were simultaneously channeling the major figures assayed in his Double Agent: The Critic and Society (Oxford, 1992).
My short discussion of Dancing in the Dark recently appeared at the website of the National Book Critics Circle. This was written as part of my duties as a member of the NBCC board, but doing so was no burden; this is a book to inspire enthusiasm. So without further ado, here follows the transcript of an e-mail interview with its author. The winners of the NBCC awards will be announced during a ceremony at the New School University in New York City on Thursday night.
Q: What was it like to find that a project you'd been working on for years suddenly turned out to be all too timely?
A: I actually began working on it in the Reagan era, when it was timely, since Reagan set about to upend the New Deal consensus in many ways, especially about the role of government in our lives and about our need, beautifully articulated in FDR's second Inaugural address, to take collective responsibility for each other, and especially for the worst off among us. Reagan gave a license to self-seeking that was a throwback to the Harding-Coolidge era, and he also dealt a severe and lasting blow to unions when he broke the air traffic controllers' strike.
The book was also timely because every year, almost every month of the 1980s brought the 50th anniversary of some New Deal program. For me, however, many other projects large and small intervened, and the book seemed to grow less timely by the year. In the go-go years of the 1990s, the '30s seemed like ancient history. But as I was finishing the book in 2008 the economy tanked, and suddenly the Depression was on everyone's lips.
My wonderful editor at Norton, Bob Weil, said "I would have bought this book even without the financial meltdown." I reminded him that he did -- two months before. But I'm sure this timeliness is responsible for much of the attention the book has received, including an amazing number of reviews and pretty healthy sales.
Q: Your book is not a work of social or political history. But I'm not entirely persuaded that your subtitle is quite right to call it a cultural history, either. A cultural historian of the Depression would have to pay a lot more attention to radio, for one thing. It feels very much more like an interconnected set of essays in literary and cinematic criticism, written with a close attention to form, but also with an old-fashioned willingness to assess value. Is that pigeonholing you wrongly?
A: The book is a hybrid of cultural history and criticism, in proportions that are entirely my own. I wanted to explore how the arts illuminate the Depression and how the Depression illuminates the work done in the arts, even work that appears to have little or no reference to it, and has usually been seen as escapist. I also aimed to uncover the patterns that link or contrast many different kinds of artists: Mike Gold and Henry Roth, Walker Evans and Margaret Bourke-White, Steinbeck and Faulkner, Steinbeck and Nathanael West, Busby Berkeley and Leni Riefenstahl.
I grouped them around four broad cultural themes, and I actually considered subtitling the book "Cultural Themes from the Great Depression." You'd agree that that would have been unduly modest, besides being commercially obtuse.
Why should cultural history have to cover everything, instead of offering an abundant variety of case studies, chosen because they were representative but also because they mattered strongly to me as a critic, and hence I might have something fresh to say about them? What usually passes for cultural history tends to be panoramic but superficial. Jacques Barzun, a pioneer cultural historian, once said that his friend and colleague Lionel Trilling often urged him to dig a little deeper, to pause over some of his many examples. He certainly did that in his book on Berlioz. That's where the critic comes in to deepen and thicken the work of the historian.
The British critic F.R. Leavis, often wrongly seen as a New Critic because of his focus on the texture of a writer's language, actually began writing in the Marxist decade of the 1930s as a sociological critic. Look at the early volumes of his magazine Scrutiny. But he also wrote several essays complaining that social historians tended to use literature merely for documentation, whereas the only real way to "use" literature, to bare its social meanings, was to get deeply inside it, as any subtle and patient reader does. That demands attention to form and language as well as surface content. It also involves critical judgment to assess where the work works, where its intentions are effectively realized and where it actually moves us.
This is not a thumbs-up, thumbs-down approach but a sensitivity to where the writer's or artist's imagination comes alive. If you think, as I do, that the arts offer a unique access to the mind and heart of an era, to the way people really thought and felt, then you need to pay a good deal of attention to form and also to exercise your best critical judgment. It requires that indefinable thing called taste as well as some knowledge of the history of the arts. For me the arts, including a large swath of popular culture -- music, movies, photography, theater, and design -- served as a way to lay bare the inner life of the era. I could have subtitled the book "The Inner History of the Great Depression," but that would have been even more controversial, besides making people think it was a book about depression.
Q: Your discussion of Let Us Now Praise Famous Men, the book that James Agee wrote to accompany Walker Evans's photographs of tenant farmers, seems very judicious. I admire this book a lot, but think of it as a defective masterpiece, and you clarify why that seems like a fitting assessment of it. But it was also striking to see your comment on teaching the book: "When I've assigned it to undergraduates, the results have been disastrous." Would you say more about that? Not about Agee's text, necessarily, but about what you mean by the results being disastrous.
A: Since the cliche about the literature and the visual arts of the 30s is that they were folky and naturalistic, or else overtly political, it was important for me to highlight the modernist currents that carried over from the 20s. Such experiments took on a different meaning during the Depression. Agee's writing is Faulknerian, the book is long and digressive, its structure is elusive, and the whole thing can drive you crazy at times. All this proved maddening to undergraduate readers. It took me a while to realize that this was his fault as much as theirs. Much of 30s writing -- Steinbeck, for example -- is more straightforward, since it's rooted in journalism, influenced by Hemingway, and its social criticism is right on the surface.
This is not true of either Walker Evans's pictures, which are far more detached, or Agee's prose, at the other extreme -- so tortured and self-conscious, and so written. Yet it's a genuine 30s book, since it raises serious questions about the motives of social documentary and the ethical demands of reporting on lives lived in poverty and deprivation.
For Agee, paradoxically, this was both guilt-inducing and spiritually exalting. Neither of these emotions comes naturally to readers today, young or old; this was also the case for readers in 1941, when the book first appeared. The worst of the Depression was over, so the book must have seemed like a throwback, despite the timeless appeal of the photographs, which are among the best ever taken by an American. But they lacked the drama and narrative punch that readers had come to expect from photojournalism, thanks to movie newsreels and Life magazine. The book was stillborn till it was republished in 1960.
Q: Were there other experiences in the classroom that deepened your understanding of this period -- or of particular texts, films, etc. -- in ways that were decisive for your book?
A: In my experience, the cultural work of the 1930s teaches very well, especially the films. With the movie industry's rapid adjustment to the technology of sound, that decade saw the consolidation of the classic Hollywood style in relation to mass taste, including strikingly different studio styles, the development of the star system, and the cultivation of genre as the keystone of industrial production. Many of the stars and genres from that period remain hugely attractive today. The luster of Bogart, Cooper, Hepburn, Davis, Fonda, Cagney, Stewart, Stanwyck, Gable, Colbert, Lombard, and many others remains surprisingly undimmed.
Screwball comedies and gangster movies still go over very well with students, partly because they're hard-boiled, cynical, and unsentimental, yet also subtly romanticized. The best social problem dramas, from I Am a Fugitive from a Chain Gang to The Grapes of Wrath, have a strong visceral appeal, besides teaching students essential things about the times. I lean to movies that touch obliquely on the Depression, including Busby Berkeley musicals, Frank Capra comedies, and historical epics about other hard times, such as Gone with the Wind, which I often ask students to compare with The Grapes of Wrath. Repeatedly teaching such movies helped create an agenda for the book by showing me what worked and how it still gripped people, which pointed the way to how it might have gripped many Americans back then.
By and large I avoid concentrating on inferior stuff simply as evidence of the times. I have no patience for it and no feeling for it. To do good work, I need to write about things that turn me on. The lasting quality of certain films, books, songs, and photographs is an indication of the many levels on which they work, and also of how much more they have to tell us. Multi-layered books like Call It Sleep, As I Lay Dying, Miss Lonelyhearts, The Day of the Locust, Tender Is the Night, Their Eyes Were Watching God, and Let Us Now Praise Famous Men tell us more about the period than its transient best-sellers, even though they were not truly appreciated until much later. Some books that were sensationally popular then, such as Native Son and The Grapes of Wrath, along with many of the musical standards, remain just as meaningful today. Other books, among them the Studs Lonigan and U.S.A. trilogies, are simply too cumbersome to teach, besides seeming somewhat dated in their naturalistic styles.
One of my aims in the book was to create a living canon of works from the 1930s, not simply an archaeological dig to expose the buried and forgotten layers of a distant culture. Finally, I wanted to show what art could contribute to a society in the throes of a social and economic crisis, a theme that took on unexpected resonance in the wake of our own financial meltdown.
Q: History doesn't repeat itself but it does rhyme, as Mark Twain is supposed to have said. Have any of the works from the Great Depression seemed to you to "rhyme" somehow with the experience of the past 18 months? I've occasionally thought that the title of Edmund Wilson's book of reportage, "The American Jitters," feels quite fitting now....
A: "Jitters" is exactly the right word. Instead of the kind of dramatic crisis that lasted for years during the Depression, a low-level anxiety hangs over the country, what with talk of a "jobless recovery," a concern about structural shifts in the economy, with little sign of improvement in the housing sector, and a resentful sense that only the banks have truly bounced back, largely at our own expense. Despite the populist backlash that has fed Republican hopes, the current economic fears are an improvement over the grim mood of the first six months of the recession, a pervasive dread that we were sliding inexorably into Depression 2.0.
That mood was closely parallel to the early years of the Great Depression. Both were set off by a banking crisis; both witnessed a plague of foreclosures on homes and farms; both revealed terrific flaws in the regulatory system and required serious federal intervention; above all, in both periods there was a huge crisis of confidence, economically in the collapse of the credit markets, psychologically in the widespread fear for the future, especially our children's future. It now becomes clear that the Obama administration did not or could not take full advantage of those first few months of crisis. Like FDR, the president tried hard to serve as Cheerleader in Chief, with some success. Along with the rescue of the banks, the passage of the stimulus bill was probably the single most important factor in avoiding a slide into a second Depression.
But where the New Deal managed to pursue three R's at once -- relief, regulation, and reform -- the administration has so far made little progress toward instituting new forms of regulation or vitally needed reforms. Some such reforms are in the pipeline but they will be bitterly contested, especially now that the worst of the crisis is assumed to have passed.
As for the response of the arts to the Great Recession, that too is in the pipeline and impossible to predict, though we can be sure that the current downbeat mood, the sense of lowered expectations, will soon be reflected in indie filmmaking, darkly realistic fiction, and popular music at once cheering and keening. By economic indicators the recession may formally have ended, though there has scarcely been a full recovery, but the psychological recession will be with us for a long while.
When considering the political scene of the moment, it is difficult not to see how historical allegory plays an important role in the public spectacle known as the Tea Party movement. From the name itself, an acronym (Taxed Enough Already) that fuses current concerns to a patriotic historical moment, to the oral and written references by some of its members to Stalin and Hitler, the Tea Party appears to be steeped (sorry) in history. However, one has only to listen to a minute of ranting to know that what we really are talking about is either a deliberate misuse or a sad misunderstanding of history.
Misuse implies two things: first, that the Partiers themselves know that they are attempting to mislead, and second, that the rest of us share an understanding of what accurate history looks like. Would that this were true. Unfortunately, there is little indication that the new revolutionaries possess more than a rudimentary knowledge of American or world history, and there is even less reason to think that the wider public is any different. Such ignorance allows terms like communism, socialism, and fascism to be used interchangeably by riled-up protesters while much of the public, and, not incidentally, the media, nods with a fuzzy understanding of the negative connotations those words are supposed to convey (of course some on the left are just as guilty of too-liberally applying the “fascist” label to any policy of which they do not approve). It also allows the Tea Partiers to believe that their situation – being taxed with representation – somehow warrants use of "Don’t Tread On Me" flags and links their dissatisfaction with a popularly elected president to that of colonists chafing under monarchical rule.
While the specifics of the moment (particularly, it seems, the fact of the Obama presidency) account for some of the radical resentment, the intensity of feeling among the opposition these days seems built upon a total lack of historical perspective. Would someone who really understood the horrors of Stalin’s purges still believe that President Obama sought to emulate the Soviet leader? Or, a drier example, could you speak of a sudden government "takeover" of health care, replete with death panels, if you knew of the long and gradual approach to building the modern American welfare state? The problem, of course, is that many Americans have at best a shaky hold on the relevant historical facts and are therefore credulous when presented with distortions and fabrications. Even after college graduation, too many students lack understanding of key historical developments. And that’s just college students – let’s not forget the majority of Americans who last studied history in their high school years, perhaps in a state like Texas, where Thomas Jefferson was just erased from the past because he is now considered too radical and the word "capitalism" has been replaced by "free enterprise" to help smooth out its rough edges.
It is important to realize that ignorance about history allows falsehoods and distortions to be presented as facts, but it is also significant that Tea Partiers look to history to legitimize their endeavors. In other words, history is still seen as authoritative; the problem is that the authority is being abused. Such abuse can succeed only when the public’s collective historical memory has been allowed to atrophy.
Tea Partiers’ warnings of cataclysm are taken seriously not only because the public has, at best, a vague recollection of the pertinent facts, but also because the skill of thinking historically has not been emphasized in high school and college curriculums. Teaching students to understand that context matters and that things change over time because of particular actions taken or not taken -- what is often referred to as "critical thinking" -- gives them some perspective and helps them take the long view that can illuminate the emptiness of sky-is-falling scare tactics. The politics of our moment, focused solely on what's happening this minute and what it means for the next election (no matter how far off), cry out for a skeptical appreciation by an electorate that unfortunately does not know how to think historically.
In recent years, conservative groups like the Intercollegiate Studies Institute and the American Council of Trustees and Alumni have been the loudest critics of the low status of history in colleges in the United States. They are especially upset with the lack of American history requirements at elite universities. But this should not be solely a conservative issue, nor can it be one that professional historians ignore. As the Tea Party movement is demonstrating, there are direct political consequences if the public is unable to perceive when history is used to mislead and confuse people.
Unfortunately, as budgets are being slashed at colleges and universities nationwide, history is seen by many as impractical and unimportant. Courses that focus on “career-building” and “real-world skills” are prioritized while history departments are unable to replace retiring faculty. One reason for this is that the case for history has not been made effectively. As ACTA has reported, none of the top 50 universities requires its students to take U.S. history – and 10 require no history course at all. Some students may take a history course that fulfills a broader core requirement, but many do not. And too often these core courses are deficient in teaching historical practice. Historians, whether just entering the field or preparing to retire, have an obligation as people with special knowledge of history's significance to make the case for a greater commitment to the discipline – to students, campus administrators, legislators, and the public. Indeed, anyone concerned about education who does not want to see our contemporary political discourse sink lower should be actively interested in promoting history education.
This is an uphill battle. There is no easy-to-measure market value for teaching history, no space race to gin up patriotic sentiment, no simplistic explanation to combat the perception that studying the subject offers no reward. Yet as the Tea Party "movement" has made apparent, history continues to float in the air of our political discourse, its authority ripe for sucking into every imaginable debate. There will always be divergent interpretations of the past and disagreements about what facts to emphasize, and individual schools and teachers will construct their courses as they see fit. But most of all, we must redouble our efforts to foster historical thinking. Teaching students how historians find and use evidence to construct their arguments develops the critical skills necessary for sorting through the various and often outlandish claims available 24 hours a day on cable TV and the Internet. As long as people reference past events while staking out their positions in the present – and that is unlikely to change – a functioning democracy demands a citizenry capable of spotting historical fantasy and hyperbolic misapplication of historical precedent.
Erik Christiansen and Jeremy Sullivan
Erik Christiansen teaches history at the University of Rhode Island and at Roger Williams University. Jeremy Sullivan is a Ph.D. candidate in history at the University of Maryland at College Park.
Last year, the bicentennial of Thomas Paine’s death came and went without much ceremony. It’s always some anniversary or other; perhaps we were just all commemorated out. Besides, even if there had been some big effort to mark the memory of the American Revolution’s greatest pamphleteer and most radical ideologue, media attention would have focused on a fairly sensational topic: the strange afterlife of his physical remains.
Ten years after Paine was buried in New Rochelle, N.Y., his body was dug up by a political enemy and transported to England. (This seems like carrying argumentativeness to an extreme.) The subsequent fate of Paine's body is a complicated matter. Its owner, if that is the word one wants, died in the 1830s. Bits and pieces of Paine circulated on what seems like a fairly ghoulish black market; one story had it that some of the bones were made into buttons. Eventually a number of the remains were gathered up by the Paine National Historical Association, which returned them to New Rochelle for reburial in 1905.
There was more to it than bringing a creepy situation to a close. Even before his death, Paine had been largely forgotten in the United States. His religious skepticism and early support for the principles of the French Revolution made him the most inconvenient of Founding Fathers. But his memory remained alive in the underground of freethinkers and radicals, and giving him a proper burial was a mission for some of them. It was a step towards recovering the truth about the origins of the United States, which were never so pious as some people would have you believe.
Paine’s busy afterlife first came to my attention a few years ago – just about the time, as coincidence had it, that I was reading about what happened to Voltaire’s non-literary remains. They were dug up in 1791, at the high tide of the French Revolution, so that they could be buried alongside the body of Jean-Jacques Rousseau.
In spite of his subtitle, Kammen, a professor emeritus of American history and culture at Cornell University, does not focus exclusively on the United States. But his emphasis falls on how tightly the phenomenon of reburial is linked to the shaping of historical memory in the wake of the American Revolution.
The tradition of marking the graves of common people emerged fairly recently in history; it only became widespread in the 19th century. The act of re-commemorating an individual’s life by moving his remains is quintessentially modern -- a kind of secular resurrection of the deceased person's social importance. “Relocation and reburial,” writes Kammen, “or ‘translation’ of a body, to use the traditional, Latin-derived word, are invariably all about the resurgence of the reputation of and hence respect for someone whose lamp and visage had dimmed in some way.”
The concerns of the living mattered at least as much as the intentions of the dead. “Survivors (or members of the next generation),” Kammen writes, “frequently insisted – sometimes despite evidence to the contrary – that the deceased had really wanted to be buried in a spot other than the one initially chosen, often for reasons of convenience at the time. Arguments over where the most appropriate site might be frequently persisted for decades, and in certain cases even longer.”
The most extreme case may be Daniel Boone. The explorer and politician was born in Pennsylvania in 1734, lived for many years in Kentucky, and died in 1820 in Missouri, where he was buried. Within a couple of decades, the Kentucky legislature decided its favorite son should be brought home. Boone himself might not have liked that idea. Given certain unhappy experiences in Kentucky involving real estate, he never set foot in the state again after 1799.
And in any case, Missouri officials wanted him to stay where he had been planted.
Things got emotional, and then they got weird. With the approval of members of Boone’s family, his remains were excavated and transported to Kentucky to be reburied with great public celebration. But first the skull was, in the words of one witness, “handled by the persons present and its peculiarities commented upon.” The new resting place was good for Kentucky's tourism. It also boosted the sale of lots at the cemetery where the body was interred. But meanwhile, people in Missouri grew increasingly annoyed.
As one editorialist put it in 1888, the great man’s grave was “desecrated to gratify a spasm of Kentucky pride.” A rumor circulated that Boone’s relatives had deliberately guided Kentucky’s patriots to the wrong grave, meaning that the real body was still in Missouri. The argument raged on for decades, and it still does. Search “Daniel Boone gravesite” in 2010 and you still have your choice of locations.
The image of people standing around commenting on the shape of Daniel Boone’s skull may sound comic or disgusting, but it was not such an unusual thing to do, back then. Perhaps the most disconcerting aspect of Kammen’s book is its reminder of how much has changed about our attitudes towards close contact between the living and the dead.
One 19th century man of the cloth pointed out, Kammen says, “that the desire to scrutinize bodily decay – he called it a ‘morbid desire’ – was especially prevalent among women; some even wanted to descend into tombs, lift the coffin lid, and ‘gaze upon the mouldering bones’ of their parent or child.” No doubt there was some perfectly understandable reason why they felt this urge. Even so, I just don’t want to think about it.
And then there is this description of an exhumation in 1875, reported in a Baltimore newspaper:
“The laborers employed to perform the task, upon digging to the depth of about five feet, discovered the coffin in a good state of preservation, after having lain in place nearly 26 years. The lid was removed, and the remains curiously examined by the few present. There, before their gaze, was extended the skeleton, almost in perfect condition, and lying with the long bony hands reposing one upon the other, as they had been arranged in death. The skull bore marks of greater decay, the teeth from the upper jaw having become dislodged, but those in the lower were all in place, and some little hair was still clinging near the forehead....”
Here the guest of honor was Edgar Allan Poe. At least his peaceful condition shows he was not buried alive.
Kammen’s anthology of “translations” is, in effect, an account of what historians are always doing – digging into the past, moving the remains to a new location, engraving a new memorial. For that matter, it brings to mind the old joke about dissertation writing as a process of transporting bones from one coffin to another.
But it is also a reminder of the final context of all human activity, scholarly and otherwise. In the words of Thomas Paine, writing in 1777: “However men may differ in their ideas of grandeur or government here, the grave is nevertheless a perfect republic.” In the long run, we all end up naturalized.
The new titles that arrive from publishers each week usually come with promotional material that, apart from remembering to recycle, I carefully ignore. But over the past week -- thanks to an eagle-eyed colleague -- I have been making up for this practiced neglect by lingering over one publicist's letter in particular.
It is remarkable. It may be the most striking and provocative bit of prose concerning a scholarly book to have circulated in some while. The passage in question runs to one paragraph appearing about two-thirds of the way down the page of a note accompanying the page proofs for 1877: America’s Year of Living Violently by Michael A. Bellesiles, to be published by the New Press in August. Here it is:
“A major new work of popular history, 1877 is also notable as the comeback book for a celebrated U.S. historian. Michael Bellesiles is perhaps most famous as the target of an infamous ‘swiftboating’ campaign by the National Rifle Association, following the publication of his Bancroft Prize-winning book Arming America (Knopf, 2000) -- ‘the best kind of non-fiction,’ according to the Chicago Tribune -- which made daring claims about gun ownership in early America. In what became the history profession’s most talked-about and notorious case of the past generation, Arming America was eventually discredited after an unprecedented and controversial review called into question its sources, charges which Bellesiles and his many prominent supporters have always rejected.”
These sentences have absorbed and rewarded my attention for days on end. They are a masterpiece of evasion. The paragraph is, in its way, quite impressive. Every word of it is misleading, including “and” and “the.”
Bellesiles has a claim to fame, certainly, but not as “the target of an infamous ‘swiftboating’ campaign.” He is, and will be forever remembered as, a historian whose colleagues found him to have violated his profession's standards of scholarly integrity. Arming America won the Bancroft Prize -- the highest honor for a book on American history. But far more salient is the fact that the Bancroft committee took the unprecedented step of withdrawing the prize.
It is true that he drew the ire of the National Rifle Association, and I have no inclination to give that organization's well-funded demagogy the benefit of any doubt. But gun nuts did not force Bellesiles to do sloppy research or to falsify sources. That his scholarship was grossly incompetent on many points is not a "controversial" notion. Nor is it open to dispute that he falsified sources. That has been exhaustively documented by his peers. To pretend otherwise is itself demagogic.
If a major commercial press wants to help a disgraced figure make his comeback, that is one thing, but rewriting history is another. The New Press has published many excellent books by important authors. It is out of respect for that record that I want to invite it to make a public apology for violating the trust its readers have in it.
The saga of Michael Bellesiles (pronounced "buh-LEELS" or "buh-LAYELS," depending on who you ask) was at its height in 2001 and came to a resolution (or so one thought) the following year, when Bellesiles resigned from his position as professor of history at Emory University. As the case was unfolding, I followed it rather closely, but until seeing the New Press statement last week had managed to forget it almost entirely.
This was not just a matter of midlife memory loss. The affair was embarrassing and disgraceful, and it left Bellesiles in a position where he had little left that anyone would recognize as dignity. If you regard Charlton Heston as a role model for political activism, maybe the whole thing seems like a glorious chapter in recent history. For anyone else, to forget the whole thing was a mercy.
Matters began with an article Bellesiles published in The Journal of American History in 1996. He claimed that his research among probate records suggested a very low rate of individual gun ownership in colonial America -- and indeed well into the 19th century. What Bellesiles called a “gun culture” only really developed in the wake of the Civil War, he argued, when mass-production of firearms made them more affordable.
Expanding on his thesis in Arming America, the author presented a new way of looking at the early days of the country. Firearms had been scarce and expensive, and were not found in most households. Hunting mostly involved using traps, rather than shooting. What guns were commonly available were usually old and in bad shape. The men who took up arms for their country during the American Revolution mostly got them from depots. And those citizen-soldiers didn't shoot very well, for not many of them were accustomed to handling guns, since, again, guns were expensive and scarce.
Bellesiles cited many and diverse sources for all of these claims, but the most impressive aspect of his work -- the part he mentioned in interviews, and the part that professional historians and journalistic reviewers alike always stressed -- was the statistical evidence from his examination of probate records.
Now, people who care about no other part of the Constitution so much as the Second Amendment were incensed by Bellesiles's counternarrative of early history, which is hardly surprising. Besides conducting themselves in the usual polemical manner WHICH OFTEN INVOLVES WRITING LIKE THIS, they started to examine his notes and sources very, very closely. That is not surprising, either. Who else would have the incentive?
But the gun nuts were not the only people who had problems with Bellesiles’s work. Arming America received many favorable reviews in major journals of opinion, but fellow historians had been expressing reservations about the probate data ever since that article had appeared in the JAH a few years earlier. For one thing, there were questions about how Bellesiles had gathered his information, and where; and about whether he was counting things correctly. He treated wills as if they were a completely reliable list of the whole of someone's property, even though the experts on probate know better, and even though he cited some of those scholars in his own notes.
The statistical claims in particular were a problem. Scholars would later try -- and fail -- to duplicate the results Bellesiles reported from his number-crunching. At first, it was possible to shrug this off as evidence that he was clumsy with the calculator. But things were not that simple. The figures on Bellesiles’s statistical tables were the tip of the iceberg.
People following up his notes kept finding problems: inaccurate quotations, mischaracterized sources, failure to include evidence that ran contrary to his thesis, and so on. At first, it was easy to dismiss the complaints because they had a screed-like quality. But qualified scholars who looked into the matter came away shaking their heads. A symposium on Arming America appeared in the William and Mary Quarterly in early 2002, followed not much later by James Lindgren’s review-essay in The Yale Law Journal.
At the request of Emory University, three prominent historians, assisted by graduate students, examined the evidence about Bellesiles’s work. In particular, they looked at his claims concerning what probate and militia records showed about gun ownership in early America -- and, in what proved even more of a problem, at how he accounted for the discrepancies between what he claimed and what the archival records actually showed. The resulting “Report of the Investigative Committee in the Matter of Professor Michael Bellesiles,” released in October 2002, was devastating.
“We have interviewed Professor Bellesiles,” the committee reported, “and found him both cooperative and respectful of this process. Yet the best that can be said of his work with the probate and militia records is that he is guilty of unprofessional and misleading work. Every aspect of his work in the probate records is deeply flawed.... Subsequent to the allegations of research misconduct, his responses have been prolix, confusing, evasive, and occasionally contradictory. We are surprised and troubled that Bellesiles has not availed himself of the opportunities he has had since the notice of this investigation to examine, identify, and share his remaining research materials.”
While acknowledging that "unfamiliarity with quantitative methods or plain incompetence" possibly accounted for some of the deficiencies in Bellesiles's statistical data, the committee found that he was also in violation of the standards of scholarly integrity as defined by the American Historical Association, which (to quote its report) "includes ‘an awareness of one’s own bias and a readiness to follow sound methods and analysis wherever they may lead,’ ‘disclosure of all significant qualifications of one’s arguments,’ careful documentation of findings and the responsibility to ‘thereafter be prepared to make available to others their sources, evidence, and data,’ and the injunction that ‘historians must not misrepresent evidence or the sources of evidence.’ ”
Bellesiles was culpable on all points. “In fact,” the report noted, “Professor Bellesiles told the committee that because of criticism from other scholars, he himself had begun to doubt the quality of his probate research well before he published it in the Journal of American History.”
So much for the myth of a scholar whose greatest crime was making “daring claims” that left him vulnerable to "swiftboating." Michael Bellesiles's greatest enemy was never the NRA. It was Michael Bellesiles.
Just after reading the promotional letter accompanying Bellesiles's new book, I contacted the New Press to find out more about this campaign to rehabilitate him. The publicist offered to provide me with a copy of the chapter on Arming America from Jon Wiener’s book Historians in Trouble: Plagiarism, Fraud, and Politics in the Ivory Tower, published by the New Press in 2005.
As it happened, I had already seen the chapter, and have ended up going over it a couple of times over the past week while reading other material on l'affaire Bellesiles. Wiener portrays his subject as the victim of a witch hunt -- suggesting that his errors were few in number, limited in significance for his argument, and finally of a rather unremarkable sort. They were the result of being sloppy about record-keeping and venturing too far out of his depth in the cliometrics department. To be human is to make mistakes. Besides, everybody forgets about those parts of Arming America where there weren’t any problems.
This seems generous to a fault. Anyone trying to form an assessment of the affair needs to read the Emory report -- keeping in mind that the committee ignored numerous problems with claims and evidence. Indeed, a fairly useful pedagogical tool for students in history, law, and journalism would be a casebook on Arming America, including documentation from Bellesiles's various attempts to explain himself and the evidence that made his efforts more difficult (such as this, for example).
In any case, I finally got in touch with Marc Favreau, his editor at the New Press, to ask whether any sort of due diligence had been practiced with Bellesiles's new book, considering the author's reputation. He responded that he was "well-versed" in the scholarly disputes over Arming America and referred me to Wiener's book. "What we are concerned with, then and now," he told me, "is the extent to which the fury around Michael’s thesis was stoked by a virulent pro-gun movement."
Now, this is hardly satisfactory. A thing may be true even if Charlton Heston said it. But in any case, Favreau insisted that Bellesiles's new book 1877 will be uncontroversial as to both argument and methodology. "In our initial conversations with him we were impressed by his knowledge of and passion for his subject matter," he said. "Although trade publishers rarely, if ever, solicit peer reviews (unlike university presses), we've nevertheless been very pleased to receive wonderful advance quotes from a number of prominent scholars and historians."
It also seemed appropriate to get in touch with Bellesiles himself. He is currently an adjunct in history at Central Connecticut State University. I wrote to him to ask what he hoped the book would accomplish, given that his return to public life must necessarily include questions about his credibility. Had he taken any particular steps that would inspire confidence in someone who was acquainted with his colleagues' findings about Arming America?
"I rest my credibility on the basic standards of scholarship," he responded by e-mail, "and have done what every reputable historian does, and exactly what I did in Arming America: I cite my sources."
At this point while reading his note, I found my eyes turning away from the screen in embarrassment. Eight years ago, when reputable historians found Bellesiles's work to lack scholarly integrity, none of them claimed he had failed to cite sources. Anyone can cite sources. The pro-gun arguments of John Lott are decked out with the apparatus of scholarship, but that doesn't mean his statistical claims aren't dubious.
In any case, Bellesiles has made himself "familiar with modern technology, computer databases, and all aspects of our digital world," he told me. "All my notes to 1877 are digitized and thoroughly backed up in a number of different formats."
All things considered, this is only so reassuring.
In October, the U.S. Department of Labor announced a fine of more than $87.4 million on BP North America Inc. for "failure to correct [the] potential hazards faced by employees” that had been uncovered by the Occupational Safety and Health Administration. This set an all-time record for penalties levied by OSHA on any company -- dwarfing the previous record, from 2005, of a mere $21 million, imposed after an explosion at a BP refinery killed 15 people and injured 170 others.
Since last fall, BP has gone on to bigger things. A tone of moral indignation has been heard lately (on Capitol Hill, for instance) regarding those OSHA violations. But why the outrage? It’s just business. As long as risk to the company's workers can be translated into a calculable expense, decisions will be made on a rational basis. With an eye on the bottom line, the company can decide whether or not to install adequate equipment to protect either workers or the environment.
Or not to protect them, as the case may be. Profit is profit, and the ocean has no lawyer. Let's not pretend otherwise.
Of course, events might have unfolded very differently if the people working on the offshore rig had decided to shut production down when the company pushed them (once again) to cut corners and ignore danger signs. Every time I see a picture from the Gulf of Mexico, I wonder about that. But when politicians or people in the mass media discuss the situation, work stoppage by BP's employees is one possibility that never comes up.
The very idea seems almost unthinkable. It is easier to get mad at how flagrantly BP ignored safety warnings than to imagine labor acting outside the established framework of government regulation and corporate decision making. Maybe BP can afford this failure of the imagination -- but I doubt the planet can, at least not forever.
So it’s a good time to have a new edition of Irving Bernstein’s two studies The Lean Years (1960) and The Turbulent Years (1969). Originally published by Houghton Mifflin, they have just been reissued in paperback by Haymarket Books and offer, between them, a classic survey of how American workers fared during the 1920s and ‘30s. SPOILER ALERT: They tended to do best when they had the confidence and the willingness to challenge their employers -- and not just over wages. Bernstein, who at the time of his death in 2001 was an emeritus professor of political science at the University of California at Los Angeles, makes clear that control over working conditions was usually also at stake.
What set Bernstein's work apart from the usual run of scholarship on American labor history at mid-century was his strong interest in the life and activity of non-unionized people -- including those working in agriculture, or leaving it behind for new kinds of employment, in the case of African-Americans leaving the South. And Bernstein wrote with grace. He had a knack for the thumbnail biography of ordinary people: There are numerous miniature portraits embedded in the epic. He was sensitive to the changes in mood among workers as they faced the boom of the 1920s (which passed most of them by) and the agony of the Depression (which hit them hardest). In many cases, they blamed themselves for their misery. The possibility of joining forces with others to change anything took a while to sink in.
The new paperback editions come with introductions by Frances Fox Piven, a professor of sociology and political science at the City University of New York Graduate Center, who draws out Bernstein's argument on this point: "The train of developments that connects changes in social conditions to a changed consciousness is not simple. People ... harbor somewhere in their memories the building blocks of different and contradictory interpretations of what it is that is happening to them, of who should be blamed, and what can be done about it. Even the hangdog and ashamed unemployed worker who swings his lunch box and strides down the street so the neighbors will think he is going to a job can also have other ideas that only have to be evoked, and when they are, make it possible for him on another day to rally with others and rise up in anger at his condition."
Quoting that passage gives me pause -- for Piven, a former president of the American Sociological Association, has in recent months been the focus of intricate theories about how Barack Obama was using ACORN to impose martial law on gated communities. Or perhaps ACORN was using Barack Obama to that end. I must admit some difficulty in reading the pertinent diagram. But in short, she has been involved in some quite nefarious activity, such as encouraging poor people to vote.
No doubt this will make Piven's endorsement of Irving Bernstein's two books seem particularly worrying. Only someone in the Tea Party (a well-funded movement organized by professional lobbyists) is supposed to "rally with others and rise up in anger at his condition" -- not an unemployed person who wants work and decent health care. Furthermore, protesters ought to direct their rage strictly at the government, and never at private enterprise.
I suppose the late Irving Bernstein will end up as a box in the big flow chart of cyclothymic, pseudopopulist political discourse. It seems like a matter of time. But if you read his books, something eventually becomes clear. He thought the New Deal had saved capitalism and made it more fair. He was not fond of the Communists, who expected the Depression would work to their advantage. Before writing his labor histories, Bernstein specialized in collective bargaining. (Aside from publishing books on the subject, he served in arbitration disputes.) The Turbulent Years is dedicated to Clark Kerr -- the president of the University of California system and a major target of the radical student movement in the 1960s.
In short, when Bernstein wrote with sympathy about the strikes and street fighting of the 1930s, it was not out of an instinctive combativeness but from a sense that people do these things because they have been left no choice by "an unbalanced society" (to borrow an expression he used to describe the United States on the eve of the crash of 1929). If his book sounds almost revolutionary now, that is a sign that the ordinary frame of reference for political judgment has skewed so far to the right that reality is standing sideways.
I contacted Frances Fox Piven to ask her opinion of this assessment.
"Bernstein definitely thought of himself as a centrist, but a reformer," she told me. "He was quite contemptuous, for example, of ideologues on the Left in the 1930s. But he was never contemptuous of workers themselves, and his respect and empathy for workers forced him to pay attention, even respectful attention, to the strikes and sit-downs and demonstrations they undertook during the 1930s. One of the consequences of the rise of a turbulent and aggressive labor movement was to open up normal politics, to move the political culture to the Left. The Civil Rights movement had a similar consequence thirty years later. It is chastening to observe that in the absence of mass movements from the bottom (and the Tea Party is not a movement from the bottom) that our politics reverts to a kind of default position in which business interest groups have outsized influence."
If a sufficiently "turbulent and aggressive" spirit had prevailed among the people working for BP just a couple of months ago, there might not now be one hundred thousand barrels of crude oil (by the company's own estimate) surging into the ocean every day -- with no end in sight.
Certain research topics seem destined to inspire the question, “Seriously, you study that?” So it is with the field of Twitter scholarship. Which -- just to get this out of the way -- is not actually published in 140 characters or less. (The average “tweet” is the equivalent of two fairly terse sentences. It is like haiku, only more self-involved.)
The Library of Congress announced in April that it was acquiring the complete digital archives of the “microblogging” service, beginning with the very first tweet, from ancient times. At present, the Twitter archive consists of 5 terabytes of data. If all of the printed holdings of the LC were digitized, they would come to 10 to 20 terabytes (this figure does not include manuscripts, photographs, films, or audio recordings).
Some 50 million new messages are sent on Twitter each day, although one recent discussion at the LC suggested that the rate is much higher -- at least when the site is not shutting down from sheer traffic volume, which seems to be happening a lot lately. A new video on YouTube shows a few seconds from the "garden hose" of incoming Twitter content.
When word of this acquisition was posted to the Library of Congress news blog two months ago, it elicited comments from people who could not believe that anything so casual and hyper-ephemeral as the average tweet was worth preserving for posterity – let alone analyzing. Thanks to the Twitter archive, historians will know that someone ate a sandwich. Why would they care?
Other citizens became agitated at the thought that “private” communications posted to Twitter were being stored and made available to a vast public. Which really does seem rather unclear on the concept. I’m as prone to dire mutterings about the panopticon as anybody -- but come on, folks. The era of digital media reinforces the basic principle that privacy is at least in part a matter of impulse control. Keeping something to yourself is not compatible with posting it to a public forum. Evidently this is not as obvious as it should be. Things you send directly to friends on Twitter won't be part of the Library's holdings, but if you celebrated a hook-up by announcing it to all and sundry, it now belongs to the ages.
A working group of librarians is figuring out how to “process” this material (to adapt the lingo we used when I worked as an archival technician in the Library's manuscript division) before making the collection available to researchers. But it’s not as if scholars have been waiting around until the collection is ready. Public response to the notion of “Twitter studies” might be incredulous, but the existing literature gives you some idea of what can be done with this giant pulsing mass of random discursive particles.
A reading of the scholarship suggests that individual tweets, as such, are not the focus of very much attention. I suppose the presidential papers of Barack Obama will one day include an annotated edition of postings to his Twitter feed. But that is the exception and not the rule.
Instead, the research, so far, tends to fall into two broad categories. One body focuses on the properties of Twitter as a medium. (Or, what amounts to a variation on the same thing, as one part of an emerging new-media ecosystem.) The other approach involves analyzing gigantic masses of Twitter data to find evidence concerning public opinion or mood.
Before giving a thumbnail account of some of this work – which, as the bibliography I’ve consulted suggests, seems intrinsically interdisciplinary – it may be worth pointing out something mildly paradoxical: the very qualities that make Twitter seem unworthy of study are precisely what render it potentially quite interesting. The spontaneity and impulsiveness of expression it encourages, and the fact that millions of people use it to communicate in ways that often blur the distinction between public and private space, mean that Twitter has generated an almost real-time documentary record of ordinary existence over the past four years.
There may be some value to developing tools for understanding ordinary existence. It is, after all, where we spend most of our time.
Twitter shares properties found in numerous other new-media formats. The term “information stream” is sometimes used to characterize digital communication, of whatever sort. Inside Higher Ed “flows” at the rate of a certain number of articles per day during the workweek. An online scholarly journal, by contrast, will probably trickle. A television network’s website -- or the more manic sort of Twitter feed -- will tend to gush. But the “streaming” principle is the same in any case, and you never step into the same river twice.
A recent paper by Mor Naaman and others from the School of Communication and Information at Rutgers University uses a significant variation on this concept, the “social awareness stream,” to label Twitter and Facebook, among other formats. Social awareness streams, according to Naaman et al., “are typified by three factors distinguishing them from other communication: a) the public (or personal-public) nature of the communication and conversation; b) the brevity of posted content; and, c) a highly connected social space, where most of the information consumption is enabled and driven by articulated online contact networks.”
Understanding those “articulated online contact networks” involves, for one thing, mapping them. And such mapping efforts have been underway since well before Twitter came on the scene. What makes the Twitter “stream” particularly interesting is that – unlike Facebook and other social-network services -- the design of the service permits both reciprocal connections (person A “follows” person B, and vice versa) and one-sided (A follows B, but that’s it). This makes for both strong and weak communicative bonds within networks -- but also among them. And various conventions have emerged to allow Twitter users to signal one another or to urge attention to a particular topic or comment. Besides “retweeting” someone’s message, you can address a particular person (using the @ symbol, like so: @JohnDoe) or index a message by topic (noted with the hashtag, thusly: #topicdujour).
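To see how little machinery those conventions require, here is a purely illustrative Python sketch (my own, not any researcher's actual code) that pulls @-mentions, hashtags, and the informal retweet marker out of a raw tweet. The regular expressions are simplified assumptions, not Twitter's official parsing rules.

    # Illustrative only: simplified extraction of Twitter conventions from raw text.
    import re

    MENTION = re.compile(r"@(\w+)")
    HASHTAG = re.compile(r"#(\w+)")

    def tweet_features(text):
        # Return the mentions, hashtags, and retweet flag for a single tweet.
        return {
            "mentions": MENTION.findall(text),
            "hashtags": HASHTAG.findall(text),
            "is_retweet": text.startswith("RT @"),  # the informal retweet convention
        }

    print(tweet_features("RT @JohnDoe: remember the sandwich #topicdujour"))
    # {'mentions': ['JohnDoe'], 'hashtags': ['topicdujour'], 'is_retweet': True}

A few lines of pattern-matching, in other words, are enough to recover the signals that researchers then aggregate across millions of messages.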
All of this is, of course, familiar enough to anyone who uses Twitter. But it has important implications for just what kind of communication system Twitter fosters. To quote the title of an impressive paper by Haewoon Kwak and three other researchers from the department of computer science at the Korea Advanced Institute of Science and Technology: “What is Twitter, a Social Network or a News Media?” (No sense protesting that “media” is not a singular noun. Best to grind one’s teeth quietly.)
Analyzing almost 42 million user profiles and 106 million tweets, Kwak and colleagues find that Twitter occupies a strange niche that combines elements of both mass media and homophilous social groups. (Homophily is defined as the tendency of people to sustain more contact with those they judge to be similar to themselves than with those they perceive to be dissimilar.) "Twitter shows a low level of reciprocity," they write. "77.9 percent of user pairs with any link between them are connected one-way, and only 22.1 percent have reciprocal relationships between them.... Previous studies have reported much higher reciprocity on other social networking services: 68 percent on Flickr and 84 percent on Yahoo."
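To make the reciprocity figure concrete, here is a minimal Python sketch of the measure as I understand it from the paper's description: given a list of directed "follows" edges, it reports the share of connected user pairs whose link runs in both directions. The tiny edge list is invented for illustration, not drawn from the Kwak data.

    # Illustrative only: reciprocity of a directed "follows" graph.
    def reciprocity(edges):
        # edges: iterable of (follower, followed) pairs.
        # Returns the fraction of linked user pairs connected in both directions.
        edges = {e for e in edges if e[0] != e[1]}
        pairs = {tuple(sorted(e)) for e in edges}
        mutual = sum(1 for a, b in pairs if (a, b) in edges and (b, a) in edges)
        return mutual / len(pairs) if pairs else 0.0

    follows = [("ann", "bob"), ("bob", "ann"),   # one reciprocal pair
               ("ann", "cnn"), ("bob", "cnn")]   # two one-way links to a media account
    print(reciprocity(follows))  # 0.33... -- one of three linked pairs is mutual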
In part, this reflects the presence on Twitter of already established mass-media outlets – not to mention already-famous people who have millions of “followers” without reciprocating. But the researchers find that a system of informal but efficient “retweet trees” also functions “as communication channels of information diffusion.” Interest in a given Twitter post can rapidly spread across otherwise disconnected social networks. Kwak’s team found that any retweeted item would “reach an average of 1,000 users no matter what the number of followers is of the original tweet. Once retweeted, a tweet gets retweeted [again] almost instantly on the second, third, and fourth hops away from the source, signifying fast diffusion of information after the first retweet.”
Eventually someone will synthesize these and other analyses of Twitter’s functioning -- along with studies of other institutional and mass-media networks -- and give us some way to understand this post-McLuhanesque cultural system. In the meantime, research is being done on how to use the constant landslide of Twitter messages to gauge public attitudes and mood.
As Brendan O’Connor and his co-authors from Carnegie Mellon University note in a paper published last month, the usual method of conducting a public-opinion poll by telephone can cost tens of thousands of dollars. (Besides, lots of us hang up immediately on the suspicion that it will turn into a telemarketing call.)
Using one billion Twitter messages from 2008 and ’09 as a database, O’Connor and colleagues ran searches for keywords related to politics and the economy, then generated a “sentiment score” based on the lists of 1,600 “positive” and 1,200 “negative” words. They then compared these “text sentiment” findings to the results of more traditional public opinion polls concerning consumer confidence, the election of 2008, and the new president’s approval ratings. They found sufficiently strong correlation to be encouraging -- and noted that by the summer of 2009, when many more people were on Twitter than had been the case in 2008, the text-sentiment results proved a good predictor of consumer confidence levels.
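As a rough illustration of the kind of keyword-based scoring the paper describes (not the authors' actual pipeline), the Python sketch below counts topic-related tweets containing words from small positive and negative lists and takes the ratio. The word lists and sample tweets are tiny stand-ins for the 1,600-word and 1,200-word lexicons actually used.

    # Illustrative only: toy word lists and tweets, standing in for the real lexicons.
    POSITIVE = {"good", "confident", "hope", "better"}
    NEGATIVE = {"bad", "worried", "fear", "worse"}

    def sentiment_score(tweets, topic="economy"):
        # Ratio of positive-word tweets to negative-word tweets that mention the topic.
        pos = neg = 0
        for text in tweets:
            words = set(text.lower().split())
            if topic not in words:
                continue
            pos += bool(words & POSITIVE)
            neg += bool(words & NEGATIVE)
        return pos / neg if neg else float("inf")

    sample = ["feeling good about the economy today",
              "worried the economy is getting worse",
              "the economy looks better than expected"]
    print(sentiment_score(sample))  # 2.0 -- twice as many positive as negative mentions

Tracked day by day over a large enough stream, a ratio of this sort is what gets compared against the traditional polling numbers.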
A different methodology was used in “Modeling Public Mood and Emotion: Twitter Sentiment and Socio-Economic Phenomena” by Johan Bollen of Indiana University and two other authors. They collected all public tweets from August 1 to December 20, 2008 and harvested from them data about the content that could be plugged into “a well-established psychometric instrument, the Profile of Mood States” which “measures six individual dimensions of mood, namely Tension, Depression, Anger, Vigor, Fatigue, and Confusion.” This sounds like something from one of Woody Allen’s better movies.
The data crunching yielded “a six dimensional mood vector” covering the months in question. Which, as luck would have it, coincided with both the financial meltdown and the presidential election of 2008. The resulting graphs are intriguing.
Following the election, the negative moods (Tension, Depression, etc.) fell off. There was “a significant spike in Vigor.” Examination of samples of Twitter traffic showed “a preponderance of tweets expressing high levels of energy and positive sentiments.”
But by December 2008, as the Dow Jones Industrial Average fell to below 9000 points, the charts show a conspicuous rise in Anger -- and an even stronger one for Depression. The researchers write that this may have been an early signal of “what appears to be a populist movement in opposition to the new Obama administration.”
“Tweets may be regarded,” write Bollen and colleagues, “as microscopic instantiations of mood.” And they speculate that the microblogging system may do more than reflect shifts of public temper: “The social network of Twitter may highly affect the dynamics of public sentiment…[O]ur results are suggestive of escalating bursts of mood activity, suggesting that sentiment spreads across network ties.”
As good a reason as any to put this archive of the everyday into the time capsule. And while my perspective on this may be a little off-center, I think it is fair that the Twitter record should be stored at the Library of Congress, which also houses the papers of the American presidents up through Theodore Roosevelt.
Almost 20 years ago, I started to work there just around the corner from the bound volumes containing, among other things, the diaries of George Washington. The experience of taking a quick look at them was something like a rite of passage for people working in the manuscript division. And to judge by later conversations among colleagues, the experience was usually slightly bewildering.
You would open the volume and gaze at the very page where his hand had guided the quill. You would start to read, expecting deep thoughts, or historical-seeming ones, at any rate. And this, more or less, is what you found on every page:
"Rained today. Three goats died. Need to buy a new plow.”
He had another 85 characters to spare.
Call it a revival, of sorts. In recent years, anyone interested in contemporary European philosophy has noticed a tendency variously called the religious or theological "turn" (adapting a formulation previously used to describe the "linguistic turn" of the 1960s and '70s). Thinkers have revisited scriptural texts, for example, or traced the logic of seemingly secular concepts, such as political sovereignty, back to their moorings in theology. The list of figures involved would include Emmanuel Levinas, Jacques Derrida, Gianni Vattimo, Alain Badiou, Giorgio Agamben, Slavoj Žižek, and Jürgen Habermas -- to give a list no longer or more heterogeneous than that.
A sampling of recent work done in the wake of this turn can be found in After the Postsecular and the Postmodern: New Essays in Continental Philosophy of Religion, a collection just issued by Cambridge Scholars Publishing. One of the editors, Anthony Paul Smith, is a Ph.D. candidate at the University of Nottingham and also a research fellow at the Institute for Nature and Culture at DePaul University. The other, Daniel Whistler, is a tutor at the University of Oxford, where he just submitted a dissertation on F.W.J. Schelling's theology of language. I interviewed them about their book by e-mail. A transcript of the discussion follows.
Q: Let’s start with one word in your title -- "postsecular." What do you mean by this? People used to spend an awful lot of energy trying to determine just when modernity ended and postmodernity began. Does “postsecularity” imply any periodization?
APS: In the book we talk about the postsecular event, an obvious nod to the philosophy of Alain Badiou. For a long time in Europe and through its colonial activities our frame of discourse, the way we understood the relationship of politics and religion, was determined by the notion that there is a split between public politics and private religion. This frame of reference broke down. We can locate that break, for the sake of simplicity, in the anti-colonial struggles of the latter half of the 20th century. The most famous example is, of course, the initial thrust of the Iranian Revolution.
It took some time before the implications of this were thought through, and it is difficult to pin down when “postsecularity” came to prominence in the academy, but in the 1990s a number of Christian theologians like John Milbank and Stanley Hauerwas, along with non-Christian thinkers like Talal Asad, began to question the typical assumption of philosophy of religion: that religious traditions and religious discourses need to be mediated through a neutral secular discourse in order to make sense. Their critique was simple: the secular is not neutral. Philosophy is intrinsically biased towards the secular. If you follow people like Asad and Tomoko Masuzawa, this means it is biased toward a Christian conception of the secular, and this hinders it from appreciating the thought structures at work in particular religions.
One of the reasons the title of the book reads, “after the postsecular” is that we felt philosophy of religion had yet to take the postsecular event seriously enough; it was ignoring the intellectual importance of this political event and still clinging to old paradigms for philosophizing about religion, when they had in fact been put into question by the above critique. So, the question is: What does philosophy of religion do now, after the postsecular critique?
DJW: There are two other reasons we speak of this volume being situated after the postsecular. First, in our “Introduction” we distinguish between a genuine postsecular critique of the kind Anthony mentions and a problematic theological appropriation of this critique. The former results in a pluralization of discourses about religion, because the secular is no longer the overarching master-narrative, but one more particular tradition. The latter, however, has tried to replace the secular master-narrative with a Christian one, and so has perversely impeded this process of pluralization.
Yet it is precisely this theological move (exemplified by Radical Orthodoxy) which is more often than not associated with the postsecular. Thus, one of the aims of the volume is to move beyond (hence, “after”) this theological appropriation of the postsecular.
Second, we also conjecture in the Introduction that postsecularity has ended up throwing the baby out with the bathwater – that is, everything from the secular tradition, even what is still valuable. So, in Part One of the volume, especially, the contributors return to the modern, secular tradition to test what is of value in it and what can be reappropriated for contemporary philosophy of religion. In this sense, "after the postsecular" means a mediated return to the secular.
Q: You mentioned Radical Orthodoxy, of which the leader is John Milbank. His rereading of the history of European philosophy and social theory tries to claim a central place for Christian theology as "queen of the sciences." As an agnostic, I tend to think of this as sort of the intellectual equivalent of the Society for Creative Anachronism. But clearly it's been an agenda-setting program in some sectors of theology and philosophy of religion. In counterposing your notion of the postsecular to Radical Orthodoxy, are you implying that the latter is exhausted? Or does that mean that Radical Orthodoxy is still a force to be reckoned with?
APS: On the one hand Radical Orthodoxy, as a particular movement or tendency, is probably exhausted in terms of the creativity and energy that attracted a lot of younger scholars who were working mostly in Christian theology but also in Continental philosophy of religion.
In a way, those of us in this field know what Radical Orthodoxy is now -- whereas before its anachronism seemed to be opening genuinely interesting lines of intellectual inquiry, perhaps encouraging interesting changes in the structure of institutional religious life. Now its major figures have aligned themselves with the thought of the current Pope in his attempt at “Re-Christianizing Europe,” with its nefarious narrative of a Christian Europe needing to be defended against Islam and secularism. They are also aligned with the policies of the present-day UK Tory Party via Phillip Blond and his trendy ResPublica think-tank.
So, on the other hand, while its creative power is probably on the wane, it is still something that must be reckoned with -- precisely because of this newfound institutional power, and because we know that its research program ends in old answers to new questions. We have to move beyond mere criticism, though, to offering a better positive understanding of religion, philosophy, and politics, and this volume begins to do that. This means going far beyond addressing Radical Orthodoxy as such, though, and addressing the reactionary and obfuscatory form of thought that lies beneath Radical Orthodoxy and which persists in other thinkers who don’t identify with this particular movement.
DJW: Yes, it is something broader that troubles continental philosophy of religion now – not merely Radical Orthodoxy as such, but what we try to articulate in our Introduction as the more general tendency to theologize philosophy of religion. Many philosophers of religion – even when they see themselves as opponents of Radical Orthodoxy – ultimately treat their discipline as an extension of theology. It is quite normal to attend a keynote lecture at a Continental philosophy of religion conference and end up listening to a theology lecture! This is the reason that questions concerning the specificity of philosophy of religion (what sets it structurally apart from theology) dominate After the Postsecular and the Postmodern. Such questions are not meant solely as attacks on Radical Orthodoxy, but aim to interrogate the whole zeitgeist in which Radical Orthodoxy participates.
Q: I'm struck by how your book reflects a revival of interest in certain thinkers -- Schelling, Bergson, Rosenzweig. Or rather, perhaps, their transformation from the focus of more or less historical interest to inspiration for contemporary speculation. How much of this is a matter of following in the footsteps of Deleuze or Žižek?
DJW: Deleuze and Žižek are exemplary figures for many of the contributors to this volume. We philosophize in their shadow – and, you’re right, in particular it is their perverse readings of Bergson, Schelling, etc., which have taught us how to relate to the history of philosophy in new, heterodox ways.
“Experiment” is one of the key words in After the Postsecular and the Postmodern: all of us who contributed wanted to see what new potential could be opened up within philosophy of religion by mutating its traditions and canons through the lens of contemporary speculation. Having said that, I think both terms of your distinction (“inspiration for contemporary speculation” and “historical interest”) are important at the present moment.
Ignorance of the history of philosophy of religion is the academic norm, and our wager is that through straightforward history of philosophy one can excavate resources that have been neglected, so as to begin to see the discipline afresh. It is a matter of revitalizing our sense of what philosophy of religion can do. Therefore, while mutating the history of philosophy is crucial, so too is understanding what that history is. So little has been written about Bergson or Rosenzweig’s contributions in this regard that a relatively straight-laced understanding of them is one of the volume’s most pressing tasks.
APS: In France at the time that Deleuze was studying and writing his first books, the study of philosophy was dominated by the “three H's” (Hegel, Husserl, and Heidegger). He followed a different path in his own work, writing important studies on Hume, Bergson, and Nietzsche (amongst others). With the rise in Deleuze’s popularity these choices of figures have taken on the character of a canon, but at the time they were considered quite heretical and bold.
While the historical canon for mainstream Anglophone philosophy of religion tends to focus on Locke, Hume, and Kant, we hope our volume helps to establish an alternative canon that draws on more speculative thinkers from the modern tradition, like Spinoza, Schelling, and Bergson. We think this will not only help us to address the persistent questions of philosophy of religion but also allow us to reframe those very questions.
Q: The names of a few contributors are familiar to me from reading An und für sich and other blogs. Would you say something about how the sort of "floating seminar space" of online conversation shapes the emergence of a project like this one?
APS: Many people have noted the democratic nature of blogging, which can disrupt the usual hierarchies in the academic world. While that can lead to intensely antagonistic encounters -- especially in the early days when we were all still navigating this new social space -- it can also lead to incredible intellectual friendships. I started blogging when I was 19 in the hopes of being part of an intellectual community that I didn’t have at university. This lack of a community was partly because I was a commuter student traveling four hours round trip per day, which didn’t leave a lot of time to participate face-to-face, and partly because my own interests in religion were not shared by most of the other students in my philosophy department.
The group blogs I have been a part of, first The Weblog and then An und für sich, attracted people in similar situations -- people who existed in a liminal space between philosophy, theory, theology, and religious studies and wanted to discuss these issues, but for whatever reason couldn’t do so in their immediate communities.
I think it is safe to say that without the blogging community the volume wouldn’t have existed. It was because of the blog that Daniel first contacted me about participating in the postgraduate conference in philosophy of religion that he had set up in Oxford and it was this conference that ultimately led to the volume. We have tried to transfer the democratic spirit of blogging to the volume, so while we do have contributions from established academics in the volume, we also have included a number of graduate students, intellectuals outside the academy, and those still searching for a tenured position (if there are any!).
Even though we don’t have a “big name” like Žižek or Vattimo in the volume, we have still been able to attract interest simply on the strength of the ideas in the book, which are talked about on AUFS and other blogs. The volume has even made its way onto a syllabus already! John Caputo, formerly professor of philosophy at Villanova and now professor of religion and humanities at Syracuse, has his students reading the Editors’ Introduction for his graduate course called "The Future of Continental Philosophy of Religion," which we are really excited about.
Q: Sometimes the relationship of academic theological discourse to any creed or confession can be difficult to make out. With the philosophy of religion, obviously, such distance seems to be built right in. What are the stakes of your book – if any – for "people of faith," as the expression goes? That is, do you see this work as having consequences for what goes on at a church, synagogue, mosque, or whatever?
DJW: I tend to deploy a rather crude, form/content model on this issue: the material with which "people of faith," theologians, and philosophers of religion all deal is the same – "religion" in the broadest sense of the word. It is the operations of thought to which this material is subjected that differentiates them. What distinguishes philosophy of religion from theology or everyday religious practice is the specific kind of labor to which “religion” is here subjected. The question then becomes: Does "religion" after such transformations bear any resemblance to or (more importantly) have any relevance to the “religion” with which “people of faith” engage? And the answer is still very much open to dispute.
To take some examples: George Pattison (one of the contributors to the volume) is currently involved in a project on the phenomenology of religious life and it seems plausible that some form of this project could indeed be relevant to everyday religious practice – articulating its often implicit assumptions. On the other hand, I would be horrified if someone found a kernel of everyday relevance in my contribution on Schelling (in which I argue that names such as “Christ” or “Krishna” are literally the products of geological eruptions).
Personally (and here I am speaking very much for myself), I think there’s an element of smugness to the anti-“ivory tower” rhetoric that has emerged in the academy in the last century: the assertion that academics have something interesting or useful to say to the world imparts, in my mind, false value to what we say. In other words, I feel content to revel in the uselessness of my work.
APS: I love this answer! The militancy behind it stands against the pathetic “Theologian-Pope impulse” of so many theologians or the “Philosopher-King impulse” of so many philosophers that think the salvation for the world lies in our thought.
However, I want to nuance it somewhat, as I do think some of what lies behind what we do as academics, the reasons we take up this work, can participate in political struggles or help to deal with the very serious problems we face without our thought being directly “useful” in some crude practice of meeting targets or productivity goals. Spinoza wouldn’t have been much use as the ruler of the Netherlands, I’m sure, but when his ideas were taken up by others, and thereby mutated, they did have a real effect, and much of it positive.
The same goes for most of our great philosophers. But what Dan called the "uselessness" of our work in some sense mirrors the uselessness of religion in general. This character that religion has, identified by philosophers like Bataille, Nietzsche, and contemporaries like Goodchild, is in many ways offensive to the shape of contemporary life, where everything has its proper price, where we have to be thrifty and austere. Religion seems like a magnificent waste of time and money, unless of course it can be put to use convincing people to go to war to kill or be good little boys and girls and not harm their potential market value as workers with too much unclean living.
The same is true of this kind of academic work we do. It is useless within the parameters of contemporary society, but when contemporary society produces things like the poor and middle-class paying for massive bank bailouts and ecological disasters in the Gulf of Mexico and off the coast of Nigeria, then maybe uselessly thinking about things outside those parameters isn’t such a bad way to spend one’s life.
Q: As I've been reading your book, Republican leader Newt Gingrich and others have been arguing that the imposition of Sharia law in the United States is an urgent danger that must be fought. From one perspective, this looks like pure cynicism; the notion that it’s a real issue in American political life is laughable. But what do you make of it? How does it fit in any narrative of the postsecular condition, or any analysis of the strains and fault lines of secularity?
APS: Right, there is about as much danger of Sharia law being imposed as there is of French becoming the national language! This is an example of what we call in our introduction the “obscure postsecular” (again drawing on Badiou). Out of one side of their mouths these politicians tell us that we must defend our modern, secular values from the medieval barbarism of radical Islam, and out of the other side they are condemning secularists for not understanding the “power of religion.”
The power of this obscure postsecular, the reason it gets taken seriously, is that it latches on to a kernel of truth. Frankly, many in the public sphere don’t understand the power of religion! Hell, when it comes to Islam, many of them don’t even understand the basics, let alone that within Islam there is a cacophony of different spiritual practices and, as in most religions, an internal conflict between a law-bound Islam and an Islam of liberty. This is argued very clearly by a number of French scholars of Islam, like Henry Corbin and Christian Jambet, though it doesn’t appear to be a lesson the ruling class has learned, to judge by the recent idiotic, racist, and completely unsecular headscarf ban in France.
So, this lack of knowledge is behind both Gingrich’s call to resist Sharia law and the ruling, which Gingrich referenced, from the New Jersey judge that a Muslim man could forcibly rape his wife because it was a religious custom; I know of a number of Islamic feminists who I’m sure would like to speak with Judge Edith Payne! With both Gingrich and Payne we have an obscuring of the postsecular: they both recognize that something has changed, but they call on some transcendent identity of Islam or America that obscures any real confrontation with that change. Notice that neither one of them recognizes that there are elements within Islam -- mainstream Islam! -- that reject honor killings, abuse of women, the murder of civilians, and the like.
The situation becomes even more obscure in the UK, where I currently live. In the U.S., all our money declares “In God We Trust”; in the UK, all money bears the image of the sovereign, Queen Elizabeth II. Surely this, a divine-right monarchy, is an example of the relic of medievalism that Gingrich mentions! Yet, on the other side of the bill, depending on the denomination, you will find Charles Darwin or Adam Smith. The very figures who ushered in the forms of thought that our old narratives tell us swept away medieval superstition.
Now, to my mind this means that all our conventional narratives of secularization are inherently flawed. The classic liberal narrative of a neutral secular has been undone by the postsecular event. The liberal secular was a weapon used in the expansion of European imperialism, which tried to deny those in the colonial world resources from their varied religious traditions.
At the same time the anti-liberal narrative that secularity is to be rejected because of this complicity is also false. It has a similar political function, by creating and exacerbating divisions within a particular class but along imaginary or unimportant differences, playing into a myopic Clash of Civilizations theory that actually engenders the reality of that clash. The volume offers resources towards constructing a very different theory of the secular, of a postsecular secular, what we call a “generic secular” that goes some way towards superseding these flawed, conventional narratives.
Practically that means both a straightforward understanding of particular religions as they present themselves in their complexity, suppressing as much as possible the imperialist tendencies of the liberal secular, and deploying the same kind of bold internal, immanent critique of these particular religions that we find in the modern thinkers covered in the volume. The answer to these political problems may partially be found by experimenting with ideas from Islam and Christianity from the position of the generic secular.