History

We, the People...

For better and for worse, the American reception of contemporary French thought has often followed a script that frames everything in terms of generational shifts. Lately, that has usually meant baby-boomer narcissism -- as if the youngsters of '68 don't have enough cultural mirrors already. Someone like Bernard-Henri Lévy, the roving playboy philosopher, lends himself to such branding without reserve. Most of his thinking is adequately summed up by a thumbnail biography -- something like,  "BHL was a young Maoist radical in 1968, but then he denounced totalitarianism, and started wearing his shirts unbuttoned, and the French left has never recovered."

Nor are American academics altogether immune to such prepackaged blendings of theory and lifestyle. Hey, you -- the Foucauldian with the leather jacket that doesn't fit anymore.... Yeah, well, you're complicit too.

But there are thinkers who don't really follow the standard scripts very well, and Pierre Rosanvallon is one of them. Democracy Past and Future, the selection of his writings just published by Columbia University Press, provides a long overdue introduction to a figure who defies both sound bites and the familiar academic division of labor. Born in 1948, he spent much of the 1970s as a sort of thinker-in-residence for a major trade union, the Confédération Française Démocratique du Travail, for which he organized seminars and conferences seeking to create a non-Marxist "second left" within the Socialist Party. He emerged as a theoretical voice of the autogestion (self-management) movement. His continuing work on the problem of democracy was honored in 2001 when he became a professor at the Collège de France, where Rosanvallon lectures on the field he calls "the philosophical history of the political."

Rosanvallon has written about the welfare state. Still, he isn't really engaged in political science. He closely studies classical works in political philosophy -- but in a way that doesn't quite seem like intellectual history, since he's trying to use the ideas as much as analyze them. He has published a study of the emergence of universal suffrage that draws on social history. Yet his overall project -- that of defining the essence of democracy -- is quite distinct from that of most social historians. At the same time (and making things all the more complicated) he doesn't do the kind of normative political philosophy one now associates with John Rawls or Jürgen Habermas.

Intrigued by a short intellectual autobiography that Rosanvallon presented at a conference a few years ago, I was glad to see the Columbia volume, which offers a thoughtful cross-section of texts from the past three decades. The editor, Samuel Moyn, is an assistant professor of history at Columbia. He answered my questions on Rosanvallon by e-mail.

Q: Rosanvallon is of the same generation as BHL. They sometimes get lumped together. Is that inevitable? Is it misleading?

A: They are really figures of a different caliber and significance, though you are right to suggest that they lived through the same pivotal moment. Even when he first emerged, Bernard-Henri Lévy faced doubts that he mattered, and a suspicion that he had fabricated his own success through media savvy. One famous thinker asked whether the "new philosophy" that BHL championed was either new or philosophy; and Cornelius Castoriadis attacked BHL and others as "diversionists." Yet BHL drew on some of the same figures Rosanvallon did -- Claude Lefort for example -- in formulating his critique of Stalinist totalitarianism. But Lefort, like Castoriadis and Rosanvallon himself, regretted the trivialization that BHL's meteoric rise to prominence involved.

So the issue is what the reduction of the era to the "new philosophy" risks missing. In retrospect, there is a great tragedy in the fact that BHL and others constructed the "antitotalitarian moment" (as that pivotal era in the late 1970s is called) in a way that gave the impression that a sententious "ethics" and moral vigilance were the simple solution to the failures of utopian politics. And of course BHL managed to convince some people -- though chiefly in this country, if the reception of his recent book is any evidence -- that he incarnated the very "French intellectual" whose past excesses he often denounced.

In the process, other visions of the past and future of the left were ignored. The reception was garbled -- but it is always possible to undo old mistakes. I see the philosophy of democracy Rosanvallon is developing as neither specifically French nor of a past era. At the same time, the goal is not to substitute a true philosopher for a false guru. The point is to draw on challenging foreign thinkers in order to come to grips with homegrown difficulties.

Q: Rosanvallon's work doesn't fit very well into some of the familiar disciplinary grids. One advantage of being at the Collège de France is that you get to name your own field, which he calls "the philosophical history of the political." But where would he belong in terms of the academic terrain here?

A: You're right. It's plausible to see him as a trespasser across the various disciplinary boundaries. If that fact makes his work of potential interest to a great many people -- in philosophy, politics, sociology, and history -- it also means that readers might have to struggle to see that the protocols of their own disciplines may not exhaust all possible ways of studying their questions.

But it is not as if there have not been significant interventions in the past -- from Max Weber for example, or Michel Foucault in living memory -- that were recognized as doing something relevant to lots of different existing inquiries. In fact, that suggests it may be beside the point to try to locate such figures on disciplinary maps that are ordinarily so useful. If I had to sum up briefly what Rosanvallon is doing as an intellectual project, I would say that the tradition of which he's a part -- which includes his teacher Lefort as well as some colleagues like Marcel Gauchet and others -- is trying to replace Marxism with a convincing alternative social theory.

Most people write about Marxism as a political program, and of course any alternative to it will also have programmatic implications. But Marxism exercised such appeal because it was also an explanatory theory, one that claimed, by fusing the disciplines, to make a chaotic modern history -- and perhaps history as a whole -- intelligible. Its collapse, as Lefort's own teacher Maurice Merleau-Ponty clearly saw, threatened to leave confusion in its wake, unless some alternative to it were available. (Recall Merleau-Ponty's famous proclamation: "Marxism is not a philosophy of history; it is the philosophy of history, and to renounce it is to dig the grave of reason in history.")

Rosanvallon seems to move about the disciplines because, along with others in the same school, he has been trying to put together a total social theory that would integrate all the aspects of experience into a convincing story. They call the new overall framework they propose "the political," and Rosanvallon personally has focused on making sense of democratic modernity in all its facets. Almost no one I know about in the Anglo-American world has taken up so ambitious and forbidding a transdisciplinary task, but it is a highly important project.

Q: As the title of your collection neatly sums up, Rosanvallon's definitive preoccupation is democracy. But he's not just giving two cheers for it, or drawing up calls for more of it. Nor is his approach, so far as I can tell, either descriptive or prescriptive. So what does that leave for a philosopher to do?

A: At the core of his conception of democracy, there is a definitive problem: The new modern sovereign (the "people" who now rule) is impossible to identify or locate with any assurance. Democracy is undoubtedly a liberatory event -- a happy tale of the death of kings. But it must also face the sadly intractable problem of what it means to replace them.

Of course, the history of political theory contains many proposals for discovering the general will. Yet empirical political scientists have long insisted that "the people" do not preexist the procedures chosen for knowing their will. In other words, "the people" is not a naturally occurring object. Rosanvallon's work is, in one way or another, always about this central modern paradox: If, as the U.S. Constitution for instance says, "We the people" are now in charge, it is nevertheless true that we the people have never existed together in one place, living at one time, speaking with one voice. Who, then, is to finally say who "we" are?

The point may seem either abstract or trivial. But the power of Rosanvallon's work comes from his documentation of the ways -- sometimes blatant and sometimes subtle -- that much of the course and many of the dilemmas of modern history can be read through the lens of this paradox. For example, the large options in politics can also be understood as rival answers to the impossible quandary or permanent enigma of the new ruler's identity. Individual politicians claim special access to the popular will either because they might somehow channel what everyone wants or because they think that a rational elite possesses ways of knowing what the elusive sovereign would or should want. Democracy has also been the story, of course, of competing interpretations of what processes or devices are most likely to lead to results approximating the sovereign will.

Recently, Rosanvallon has begun to add to this central story by suggesting that there have always been -- and increasingly now are -- many ways outside electoral representation for the people to manifest their will, even as the very idea that there exists a coherent people with a single will has entered a profound crisis.

One of the more potent implications of Rosanvallon's premise that there is no right answer to the question of the people's identity is that political study has to be conceptual but also historical. Basic concepts like the people might suggest a range of possible ways for the sovereign will to be interpreted, but only historical study can uncover the rich variety of actual responses to the difficulty.

The point, Rosanvallon thinks, is especially relevant to political theorists, who often believe they can, simply by thinking hard about what democracy must mean, finally emerge with its true model, whether based on a hypothetical contract, an ideal of deliberation, or something else. But the premise also means that democracy's most basic question is not going to go away, even if there are better and worse responses.

Q: Now to consider the relationship between Rosanvallon's work and political reality "on the ground" right now. Let's start with a domestic topic: the debate over immigration. Or more accurately, the debate over the status of people who are now part of the U.S. economy, but are effectively outside the polity. I'm not asking "what would Rosanvallon do?" here, but rather wondering: Does his work shed any light on the situation? What kinds of questions or points would Rosanvallonists (assuming there are any) be likely to raise in the discussion?

A: It's fair to ask how such an approach might help in analyzing contemporary problems. But his approach always insists on restoring the burning issues of the day to a long historical perspective, and on relating them to democracy's foundational difficulties. Without pretending to guess what Rosanvallon would say about America's recent debate, let me offer a couple of suggestions about how his analysis might begin.

The controversy over immigrants is so passionate, this approach might begin by arguing, not simply because of economic and logistical concerns but also because it reopens (though it was never closed!) the question of the identity of the people in a democracy. The challenge immigrants pose, after all, is not one of inclusion simply in a cultural sense, as Samuel Huntington recently contended, but also and more deeply in a conceptual sense.

In a fascinating chapter of his longest work, on the history of suffrage, Rosanvallon takes up the history of French colonialism, including its immigrant aftermath. There he connects different historical experiences of immigrant inclusion to the conceptual question of what the criteria for exclusion are, arguing that if democracies do not come to a clear resolution about who is inside and outside their polity, they will vacillate between two unsatisfactory syndromes. One is the "liberal" response of taking mere presence on the ground as a proxy for citizenship, falsely converting a political problem into one of future social integration. The other is the "conservative" response of conceptualizing exclusion, having failed to resolve its meaning politically, in the false terms of cultural, religious, or even racial heterogeneity. Both responses avoid the real issue of the political boundaries of the people.

But Rosanvallon's more recent work allows for another way of looking at the immigration debate. In a new book coming out in French in the fall entitled "Counterdemocracy," whose findings are sketched in a preliminary and summary fashion in the fascinating postscript to the English-language collection, Rosanvallon tries to understand the proliferation of ways that popular expression occurs outside the classical parliamentary conception of representation. There, he notes that immigration is one of several issues around which historically "the people" have manifested their search for extraparliamentary voice.

For Rosanvallon, the point here is not so much to condemn populist backlash, as if it would help much simply to decry the breakdown of congressional lawmaking under pressure. Rather, one might have to begin by contemplating the historical emergence of a new form of democracy -- what he calls unpolitical democracy -- that often crystallizes today around such a hot-button topic as the status of immigrants. This reframing doesn't solve the problem but might help us see that its details turn out to be implicated in a general transformation of how democracy works.

Q: OK, now on to foreign policy. In some circles, the invasion of Iraq was justified as antitotalitarianism in action, and as the first stage of a process of building democracy. (Such are the beauty and inspiration of high ideals.) Does Rosanvallon's work lend itself to support for "regime change" via military means? Has he written anything about "nation building"?

A: This is a very important question. I write in my introduction to the collection about the contemporary uses of antitotalitarianism, and I do so mainly to criticize the recent drift in uses of that concept.

Of course, when the critique of totalitarianism activated a generation, it was the Soviet Union above all that drew their fire. But their critique was always understood to have its most salient implications for the imagination of reform at home, and especially for the renewal of the left. This is what has changed recently, in works of those "liberal hawks," like Peter Beinart and Paul Berman, who made themselves apologists for the invasion of Iraq in the name of antitotalitarian values. Not only did they eviscerate the theoretical substance on which the earlier critique of totalitarianism drew -- from the work of philosophers like Hannah Arendt and Claude Lefort among others -- but they wholly externalized the totalitarian threat so that their critique of it no longer had any connection to a democratic program. It became purely a rhetoric for the overthrow of enemies rather than a program for the creation or reform of democracies. In the updated approach, what democracy is does not count as a problem.

It is clear that this ideological development, with all of its real-world consequences, has spelled the end of the antitotalitarian coalition that came together across borders, uniting the European left (Eastern and Western) with American liberalism, thirty years ago. That the attempt to update and externalize that project had failed became obvious even before the Iraq adventure came to grief -- the project garnered too few allies internationally.

Now it is perfectly true that the dissolution of this consensus leaves open the problem of how democrats should think about foreign policy, once spreading democracy evangelistically has been unmasked as delusional or imperialistic. A few passages in the collection suggest that Rosanvallon thinks the way to democratize the world is through democratization of existing democracies -- the reinvigoration of troubled democracies is prior to the project of their externalization and duplication. Clearly this response will not satisfy anyone who believes that the main problem in the world is democracy's failure to take root everywhere, rather than its profound difficulties where it already is. But clarifying the history and present of democracy inside is of undoubted relevance to its future outside.

Q: There are some very striking passages in the book that discuss the seeming eclipse of the political now. More is involved than the withdrawal from civic participation into a privatized existence. (At the same time, that's certainly part of it.) Does Rosanvallon provide an account of how this hollowing-out of democracy has come to pass? Can it be reversed? And would its reversal necessarily be a good thing?

A: One of the most typical responses to the apparent rise of political apathy in recent decades has been nostalgia for some prior society -- classical republics or early America are often cited -- that is supposed to have featured robust civic engagement. The fashion of "republicanism" in political theory, from Arendt to Michael Sandel or Quentin Skinner, is a good example. But Rosanvallon observes that the deep explanation for what is happening is a collapse of the model of democracy based on a powerful will.

The suggestion here is that the will of the people is not simply hard to locate or identify; its very existence as the foundation of democratic politics has become hard to credit. The challenge is to respond by taking this transformation as the starting point of the analysis. And there appears to be no return to what has been lost.

But in his new work, anticipated in the postscript, Rosanvallon shows that the diagnosis may be faulty anyway. What is really happening, he suggests, is not apathy towards or retreat from politics in a simple sense, but the rise of new forms of democracy -- or counterdemocracy -- outside the familiar model of participation and involvement. New forms seeking expression have multiplied, through an explosion of devices, even if they may seem an affront to politics as it has ordinarily been conceptualized.

Rosanvallon's current theory is devoted to the project of putting the multiplication of representative mechanisms -- ones that do not fit on existing diagrams of power -- into one picture. But the goal, he says, is not just to make sense of them but also to find a way for analysis to lead to reform. As one of Rosanvallon's countrymen and predecessors, Alexis de Tocqueville, might have put it: Democracy still requires a new political science, one that can take it by the hand and help to sanctify its striving.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

For further reading: Professor Moyn is co-author (with Andrew Jainchill of the University of California at Berkeley) of an extensive analysis of the sources and inner tensions of Rosanvallon's thought on democracy, available online. And in an essay appearing on the Open Democracy Web site in 2004, Rosanvallon reflected on globalization, terrorism, and the war in Iraq.

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

The Kircher Code

The table sits at the front of the bookshop, near the door. That way it will get maximum exposure as people come and go. "If you enjoyed The Da Vinci Code," the sign over it says, "you might also like..." The store is part of a national chain, meaning there are hundreds of these tables around the country. Thousands, even.

And yet the display, however eye-catching, is by no means a triumph of mass-marketing genius. The bookseller is denying itself a chance to appeal to an enormous pool of consumer dollars. I'm referring to all the people who haven’t read Dan Brown’s globe-bestriding best-seller -- and have no intention of seeing the new movie -- yet are already sick to death of the whole phenomenon.

"If you never want to hear about The Da Vinci Code again," the sign could say, "you might like...."

The book’s historical thesis (if that is the word for it) has become the cultural equivalent of e-mail spam. You just can’t keep it out. The premise sounds more preposterous than thrilling: Leonardo da Vinci was the head of a secret society (with connections to the Knights Templar) that guarded the hidden knowledge that Mary Magdalene fled Jerusalem, carrying Jesus’s child, and settled in France....

All of this is packaged as a contribution to the revival of feminine spirituality. Which is, in itself, enough to make the jaw drop, at least for anyone with a clue about the actual roots of this little bit of esoteric hokum.

Fantasies about the divine bloodlines of certain aristocratic families are a staple of the extreme right wing in Europe. (The adherents usually also possess "secret knowledge" about Jewish bankers.) And anyone contending that the Knights Templar were a major factor behind the scenes of world history will turn out to be a simpleton, a lunatic, or some blend of the two -- unless, of course, it’s Umberto Eco goofing on the whole thing, as he did in Foucault’s Pendulum.

It's not that Dan Brown is writing crypto-fascist novels. He just has really bad taste in crackpot theories. (Unlike Eco, who has good taste in crackpot theories.)

And Leonardo doesn’t need the publicity -- whereas my man Athanasius Kircher, the brilliant and altogether improbable Jesuit polymath, does.

Everybody has heard of the Italian painter and inventor. As universal geniuses go, he is definitely on the A list. Yet we Kircher enthusiasts feel duty-bound to point out that Leonardo started a lot more projects than he ever finished -- and that some of his bright ideas wouldn’t have worked.

Sure, Leonardo studied birds in order to design a flying machine. But if you built it and jumped off the side of a mountain, they’d be scraping you off the bottom of the valley. Of course very few people could have painted "Mona Lisa." But hell, anybody can come up with a device permitting you to plunge to your death while waving your arms.

Why should he get all the press, while Athanasius Kircher remains in relative obscurity? He has just as much claim to the title of universal genius. Born in Germany in 1602, he was the son of a gentleman-scholar with an impressive library (most of it destroyed during the Thirty Years’ War). By the time Kircher entered the Jesuit order at the age of 16, he had already become as broadly informed as someone twice his age.

He joined the faculty of the Collegio Romano in 1634; his title was Professor of Mathematics. But by no means is that a good indicator of his range of scholarly accomplishments. He studied everything. Thanks to his access to the network of Jesuit scholars, Kircher kept in touch with the latest discoveries taking place in the most far-flung parts of the world. And a constant stream of learned visitors to Rome came to see his museum at the Collegio Romano, where Kircher exhibited curious items such as fossils and stuffed wildlife alongside his own inventions.

Leonardo kept most of his more interesting thoughts hidden in notebooks. By contrast, Kircher was all about voluminous publication. His work appeared in dozens of lavishly illustrated folios, the publication of which was often funded by wealthy and powerful figures. The word "generalist" is much too feeble for someone like Kircher. He prepared dictionaries, studied the effects of earthquakes, theorized about musical acoustics, and engineered various robot-like devices that startled tourists with their lifelike motions.

He was also enthusiastic about the microscope. In a book published in 1646, Kircher mentioned having discovered “wonders....in the verminous blood of those sick with fever, and numberless other facts not known or understood by a single physician.” He speculated that very small animals “with a vast number and variety of motions, colors, and almost invisible parts” might float up from “the putrid vapors” emitted by sick people or corpses.

There has long been a scholarly debate over whether or not Kircher deserves recognition as the inventor of the germ theory of disease. True, he seems not to have had a very clear notion of what was involved in experimentation (then a new idea). And he threw off his idea about the very tiny animals almost in passing, rather than developing it in a rigorous manner.  But then again, Kircher was a busy guy. He managed to stay on the good side of three popes, while some of his colleagues in the sciences had trouble keeping the good will of even one.

Among Kircher’s passions was the study of ancient Egypt. As a young man, he read an account of the hieroglyphics that presented the idea that they were decorative inscriptions -- the equivalent of stone wallpaper, perhaps. (After all, they looked like tiny pictures.) This struck him as unlikely. Kircher suspected the hieroglyphics were actually a language of some kind, setting himself the task of figuring out how to read it.

And he made great progress in this project -- albeit in the wrong direction. He decided that the symbols were somehow related to the writing system of the Chinese, which he did know how to read, more or less. (Drawing on correspondence from his missionary colleagues abroad, Kircher prepared the first book on Chinese vocabulary published in Europe.)

Only in the 19th century was Jean-François Champollion able to solve the mystery, thanks to the discovery of the Rosetta Stone. But the French scholar gave the old Jesuit his due for his pioneering (if misguided) work. In presenting his speculations, Kircher had also provided reliable transcriptions of the hieroglyphic texts. They were valuable even if his guesses about their meaning were off.

Always at the back of Kircher’s mind, I suspect, was the story from Genesis about the Tower of Babel. (It was the subject of one of his books.) As a good Jesuit, he was doubtless confident of belonging to the one true faith -- but at the same time, he noticed parallels between the Bible and religious stories from around the world. There were various trinities of deities, for example. As a gifted philologist, he noticed the similarities among different languages.

So it stood to reason that the seeming multiplicity of cultures was actually rather superficial. At most, it reflected the confusion of tongues following God’s expressed displeasure about that big architectural project. Deep down, even the pagan and barbarous peoples of the world had some rough approximation of the true faith.

That sounds ecumenical and cosmopolitan enough. It was also something like a blueprint for conquest: Missionaries would presumably use this basic similarity as a way to "correct" the beliefs of those they were proselytizing.

But I suspect there is another level of meaning to his musings. Kircher’s research pointed to the fundamental unity of the world. The various scholarly disciplines were, in effect, so many fragments of the Tower of Babel. He was trying to piece them together. (A risky venture, given the precedent.)

He was not content merely to speculate. Kircher tried to make a practical application of his theories by creating a "universal polygraphy" -- that is, a system of writing that would permit communication across linguistic barriers. It wasn’t an artificial language like Esperanto, exactly, but rather something like a very low-tech translation software. It would allow you to break a sentence in one language down to units, which were to be represented by symbols. Then someone who knew a different language could decode the message.

Both parties needed access to the key -- basically, a set of tables giving the meaning of Kircher’s "polygraphic" symbols. And the technique would place a premium on simple, clear expression. In any case, it would certainly make international communication faster and easier.

Unless (that is) the key were kept secret. Here, Kircher seems to have had a brilliant afterthought. The same tool allowing for speedy, transparent exchange could (with some minor adjustments) also be used to conceal the meaning of a message from prying eyes. He took this insight one step further -- working out a technique for embedding a secret message in what might otherwise look like a banal letter. Only the recipient -- provided he knew how to crack the code -- would be able to extract its hidden meaning.
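As a rough illustration of how such a scheme works, here is a minimal sketch in Python: a shared key table maps words to arbitrary symbols, and anyone holding the same table can encode or decode a message. The vocabulary and symbols are invented for this example; they are not Kircher's actual tables.

```python
# A toy "polygraphy": both parties hold the same key table, which maps
# words (the "units" of a message) to arbitrary symbols. The vocabulary
# and symbols here are invented for illustration, not taken from Kircher.
KEY = {
    "greetings": "I.",
    "send": "II.",
    "money": "III.",
    "soon": "IV.",
}
REVERSE = {symbol: word for word, symbol in KEY.items()}

def encode(message: str) -> str:
    """Replace each known word with its symbol; unknown words pass through."""
    return " ".join(KEY.get(word, word) for word in message.lower().split())

def decode(ciphertext: str) -> str:
    """Recover the words from the symbols, using the same key table."""
    return " ".join(REVERSE.get(token, token) for token in ciphertext.split())

if __name__ == "__main__":
    secret = encode("Greetings send money soon")
    print(secret)          # I. II. III. IV.
    print(decode(secret))  # greetings send money soon
```

Publish the table and you have an aid to communication across languages, roughly in the spirit of the polygraphy; keep it to yourselves and the same machinery becomes a cipher, which is more or less the afterthought described above.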

Even before his death in 1680, there were those who mocked Athanasius Kircher for his vanity, for his gullibility (he practiced alchemy), and for the tendency of his books to wander around their subjects in a rather garrulous and self-indulgent manner. Nor did the passing of time and fashion treat him well. By the 18th century, scholars knew that the path to exact knowledge involved specialization. The wild and woolly encyclopedism of Athanasius Kircher was definitely a thing of the past.

Some of the disdain may have been envy. Kircher was the embodiment of untamed curiosity, and it is pretty obvious that he was having a very good time. Even granting detractors all their points, it is hard not to be somewhat in awe of the man. Someone who could invent microbiology, multiculturalism, and encryption technology (and in the 17th century no less) at least deserves to be on a T-shirt.

But no! All anybody wants to talk about is da Vinci. (Or rather, a bogus story about him that is the hermeneutic equivalent of putting "The Last Supper" on black velvet.)

Well, if you can’t beat 'em.... Maybe it's time for a trashy historical thriller that will give Kircher his due. So here goes:

After reading this column, Tom Hanks rushes off to the Vatican archives and finds proof that Kircher used his "universal polygraphy" to embed secret messages in the artwork for his gorgeously illustrated books.

But that’s not all. By cracking the code, he finds a cure to the avian flu. Kircher had recognized this as a long-term menace, based on a comment by a Jesuit missionary. (We learn all this in flashbacks. I see Philip Seymour Hoffman as Athanasius Kircher.)

Well, it's a start, anyway. And fair warning to Dan Brown. Help yourself to this plot and I will see you in court. It might be a terrible idea, but clearly that's never stopped you before.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Grab Bag

A warning: This week’s column will be miscellaneous, not to say meandering. It updates earlier stories on Wikipedia, Upton Sinclair, and the Henry Louis Gates method of barbershop peer-review. It also provides a tip on where to score some bootleg Derrida.

Next week, I’ll recap some of my talk from the session on “Publicity in the Digital Age” at the annual conference of the Association of American University Presses, covered here last week. The audience consisted of publicists and other university-press staff members. But some of the points covered might be of interest to readers and writers of academic books, as well as those who publish them.

For now, though, time to link up some loose ends....

One blogger noted that the comments following my column on Wikipedia were rather less vituperative than usual. Agreed -- and an encouraging sign, I think. The problems with open-source encyclopedism are real enough. Yet so are the opportunities it creates for collaborative and public-spirited activity. It may be only a matter of time before debate over Wikipedia turns into the usual indulgence in primal-scream therapy we call "the culture wars." But for now, anyway, there’s a bit of communicative rationality taking place. (The Wikipedia entry on "communicative rationality" is pretty impressive, by the way.)

A few days after that column appeared, The New York Times ran a front-page article on Wikipedia. The reporter quoted one Wikipedian’s comment that, at first, “everything is edited mercilessly by idiots who do stupid and weird things to it.” Over time, though, each entry improves. The laissez faire attitude towards editing is slowly giving way to quality control. The Times noted that administrators are taking steps to reduce the amount of “drive-by nonsense.”

The summer issue of the Journal of American History includes a thorough and judicious paper on Wikipedia by Roy Rosenzweig, a professor of history and new media at George Mason University. Should professional historians join amateurs in contributing to Wikipedia? “My own tentative answer,” he writes, “is yes.”

Rosenzweig qualifies that judgment with all the necessary caveats. But overall, he finds that the benefits outweigh the irritations. “If Wikipedia is becoming the family encyclopedia for the twenty-first century,” he says, “historians probably have a professional obligation to make it as good as possible. And if every member of the Organization of American Historians devoted just one day to improving the entries in her or his areas of expertise, it would not only significantly raise the quality of Wikipedia, it would also enhance popular historical literacy.”

The article should be interesting and useful to scholars in other fields. It is now available online here.

This year marks the centennial of Upton Sinclair’s classic muckraking novel, The Jungle, or rather, of its appearance in book form, since it first ran as a serial in 1905. In April of last year, I interviewed Christopher Phelps, the editor of a new edition of the novel, for this column.

Most of Sinclair’s other writings have fallen by the wayside. Yet he is making a sort of comeback. Paul Thomas Anderson, the director of Boogie Nights and Magnolia, is adapting Sinclair’s novel Oil! for the screen; it should appear next year under the title There Will Be Blood. (Like The Jungle, the later novel from 1927 was a tale of corruption and radicalism, this time set in the petroleum industry.) And Al Gore has lately put one of Sinclair's pithier remarks into wide circulation in his new film: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”

That sentiment seems appropriate as a comment on a recent miniature controversy over The Jungle. As mentioned here one year ago, a small publisher called See Sharp Press  claims that the standard edition of Sinclair’s text is actually a censored version and a travesty of the author’s radical intentions. See Sharp offers what it calls an “unexpurgated” edition of the book -- the version that “Sinclair very badly wanted to be the standard edition,” as the catalog text puts it.

An article by Phelps appearing this week on the History News Network Web site takes a careful look at the available evidence on the book’s publishing history and Sinclair’s own decisions regarding the book, and it debunks the See Sharp claims beyond a reasonable doubt.

In short, Sinclair had many opportunities to reprint the serialized version of his text, which he trimmed in preparing it for book form. He never did so. He fully endorsed the version now in common use, and made no effort to reprint the "unexpurgated" text as it first appeared in the pages of a newspaper.

It is not difficult to see why. Perhaps the most telling statement on this matter comes from Anthony Arthur, a professor of English at California State University at Northridge, whose biography Radical Innocent: Upton Sinclair has just been published by Random House. While Arthur cites the “unexpurgated” edition in his notes, he doesn’t comment on the claims for its definitive status. But he does characterize the serialized version of the novel as “essentially a rough draft of the version that readers know today, 30,000 words longer and showing the haste with which it was written.”

A representative of See Sharp has accused me of lying about the merits of the so-called unexpurgated edition. Indeed, it appears that I am part of the conspiracy against it. (This is very exciting to learn.) And yet -- restraining my instinct for villainy, just for a second -- let me also point you to a statement at the See Sharp Web site explaining why the version of The Jungle that Sinclair himself published is a cruel violation of his own intentions.

Memo to the academy: Why isn’t there a variorum edition of The Jungle? There was a time when it would have been a very labor-intensive project -- one somebody might have gotten tenure for doing. Nowadays it would take a fraction of the effort. The career benefits might be commensurately modest, alas. But it seems like a worthy enterprise. What’s the hold-up?

In February 2005, I attended a conference on Jacques Derrida held at the Cardozo Law School in New York, covering it in two columns: here and here. A good bit of new material by “Jackie” (as his posse called him) has appeared in English since then, with more on the way this fall. Next month, Continuum is publishing both a biography of Derrida and a volume described as “a personal and philosophical meditation written within two months of Derrida’s death.”

Bet you didn’t know there was going to be a race, did you?

In the meantime, I’ve heard about a new translation, available online, of one of Derrida’s late-period writings. It is part of his engagement with the figure of Abraham, the founding phallogocentric patriarch of the three great monotheistic religions. The translator, Adam Kotsko, is a graduate student at the Chicago Theological Seminary. (See this item on the translation from his blog.)

The potential for “open source” translation may yet open more cans of worms than any team of intellectual-property lawyers can handle. I’ll throw this out as a request to anyone who has thoughts on the matter: If you’ve committed them to paper (or disk) please drop me a line at the address given below.

And finally, a return to the intriguing case of Emma Dunham Kelley-Hawkins -- the most important African-American writer who was not actually an African-American writer.

In a column last spring, I reported on the effort to figure out how the author of some rather dull, pious novels had become a sort of cottage industry for critical scholarship in the 1990s. After a couple of days of digging, I felt pretty confident in saying that nobody had thought to categorize Kelley-Hawkins as anything but a white, middle-class New England novelist before 1955.

That was the year a bibliographer included her in a listing of novels by African-American writers -- though without explaining why. And for a long time after that, the scholarship on Kelley-Hawkins was not exactly abundant. Indeed, it seemed that the most interesting thing you could say about her fiction was that all of the characters appeared to be white. Kelley-Hawkins did make a very few references to race, but they were perfectly typical of white prejudice at its most casually cruel.

Only after Henry Louis Gates included her work in a series of reprints by African-American women writers did critics begin noticing all the subtle -- the very, very subtle -- signs of irony and resistance and whatnot. Why, the very absence of racial difference marked the presence of cultural subversion! Or something.

So much ingenuity, in such a bad cause.... Subsequent research suggests that Kelley-Hawkins was Caucasian, by even the most stringent “one drop” standards of white racial paranoia in her day.

A recent item by Caleb McDaniel discusses the most recent work on Kelley-Hawkins. The puzzle now is how the initial re-categorization of her ever took place. Evidently that bibliography from 1955 remains the earliest indication that she might have been African-American. (A second puzzle would be how anyone ever managed to finish reading one of her novels, let alone embroider it with nuance. They can be recommended to insomniacs.)

McDaniel also quotes something I’d forgotten: the statement by Henry Louis Gates that, if he had put up a photograph of Kelley-Hawkins in his barbershop, “I guarantee the vote would be to make her a sister."

You tend to expect a famous scholar to be familiar with the concept of the sepia tone. Evidently not. Here, again, is where Wikipedia might come in handy.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

A Lesson From the Churchill Inquiry

Ward Churchill should be fired for academic misconduct -- that’s the decision made by the interim chancellor at the University of Colorado at Boulder, after receiving a report from a faculty committee concluding that Churchill is guilty of falsification, fabrication and plagiarism. That report shows that, even under difficult political conditions, it’s possible to do a good job dealing with charges of research misconduct. The Colorado report on Churchill provides a striking contrast to the flawed 2002 Emory University report on Michael Bellesiles, the historian of gun culture in America, who was found guilty of “falsification” in one table. The contrast says a lot about the ways universities deal with outside pressure demanding that particular professors be fired.

Churchill is the Native American activist and professor of ethnic studies at Colorado who famously declared that some of the people killed in the World Trade Center on 9/11 were “little Eichmanns.” In the furor that followed, the governor of Colorado demanded that the university fire Churchill; the president of the university defended his right to free speech, but then -- facing a series of controversies -- resigned. Churchill’s critics then raised charges that his writings were full of fabrications and plagiarism, and the university appointed a committee of faculty members to evaluate seven charges of specific instances of research misconduct. Their 124-page report, released on May 16, concluded that Churchill’s misconduct was serious and was not limited to a few isolated cases, but was part of a pattern. The panel divided on an appropriate penalty: one recommended revoking his tenure and dismissing him, two recommended suspension without pay for five years, while two others recommended that he be suspended without pay for two years.

One key instance of “falsification and fabrication” was Churchill’s writing about the Mandan, an Indian tribe living in what is now North Dakota, who were decimated by a smallpox epidemic in 1837. The Mandan, Churchill argues, provide one example of how American Indians were the victims of genocide. In an essay titled “An American Holocaust?," he wrote that the U.S. Army infected the Mandan with smallpox by giving them contaminated blankets in a deliberate effort to “eliminate” them. Churchill footnotes several sources as providing evidence for this claim, including UCLA anthropologist Russell Thornton’s book American Indian Holocaust and Survival. But Thornton’s book says the opposite: the Army did not intentionally give infected blankets to the Mandan. None of Churchill’s other sources provide support for his claim. Nevertheless Churchill repeated his argument in six publications over a period of ten years, during which his claims about official U.S. policy toward the Mandan “generally became more extreme.” He refused to admit to the committee that his claims were not supported by the evidence he cited. Therefore, the committee concluded, Churchill was guilty of “a pattern of deliberate academic misconduct involving falsification [and] fabrication.” The panel members came to similar conclusions regarding five other charges.

The five-member Colorado committee worked under a cloud: The only reason they were asked to look at his academic writing was that powerful political voices outside the university wanted Churchill fired for his statement about 9/11. After the university refused to fire him for statements protected by the First Amendment, his critics raised charges of research misconduct, hoping to achieve their original goal. What are the responsibilities of an investigating committee in such a highly-charged political situation?

In this respect the Ward Churchill case has some striking similarities to the case Michael Bellesiles, who was an Emory University historian when he wrote Arming America, a book that won considerable scholarly praise when it first appeared -- and that aroused a storm of outrage because of its argument that our current gun culture was not created by the Founding Fathers. Pro-gun activists demanded that Emory fire Bellesiles, raising charges of research misconduct. Historians too sharply criticized some of his research. Emory responded by appointing a committee that found “evidence of falsification;" Bellesiles then resigned his tenured position.

Although the cases have some striking similarities, starting with the political pressures that gave rise to the investigations and concluding with findings of “falsification,” the differences are significant and revealing. The Emory committee concluded that Bellesiles’ research into probate records was “unprofessional and misleading” as well as “superficial and thesis-driven,” and that his earlier explanations of errors “raise doubts about his veracity." But the panel found “evidence of falsification” only on one page: Table 1, “Percentage of probate inventories listing firearms.” They did not find that he had “fabricated data.” The “falsification” occurred when Bellesiles omitted two years from the table, which covered almost a century -- 1765 to 1859. The two years, 1774 and 1775, would have shown more guns, evidence against his thesis that Americans had few guns before the Civil War.

But the Emory committee failed to consider how significant this omission was for the book as a whole. In fact the probate research criticized by the committee was referred to in only a handful of paragraphs in Bellesiles’s 400-page book, and he cited the problematic Table 1 only a couple of times. If Bellesiles had omitted all of the probate data that the committee (and others) criticized, the book’s argument would still have been supported by a wide variety of other relevant evidence that the committee did not find to be fraudulent.

The Colorado committee, in contrast, made it a point to go beyond the narrow charges they were asked to adjudicate. They acknowledged that the misconduct they found concerned “no more than a few paragraphs” in an “extensive body of academic work.”  They explicitly raised the question of “why so much weight is being assigned to these particular pieces.” They went on to evaluate the place of the misconduct they found in Churchill’s “broader interpretive stance,” and presented evidence of  “patterns of academic misconduct” that were intentional and widespread.

The two committees also took dramatically different approaches to the all-important question of  sanctions. At Emory the committee members never said what they considered an appropriate penalty for omitting 1774 and 1775 from his Table 1. They did not indicate whether any action by Emory was justified -- or whether the harsh criticism Bellesiles received from within the profession was penalty enough.

The Colorado committee members, in contrast, devoted four single-spaced pages to “The Question of Sanctions.”  They insisted that the university “resist outside interference and pressures” when a final decision on Churchill was made. Those favoring the smallest penalty, suspension without pay for two years, declared they were “troubled by the circumstances under which these allegations have been made,” and concerned that dismissal “would have an adverse effect on the ability of other scholars to conduct their research with due freedom.” These important issues needed to be raised, and they were.

Finally, the Colorado committee explicitly discussed the political context of their work, while the Emory committee failed to do so. The Colorado report opened with a section titled simply “Context.” It said “The committee is troubled by the origins of, and skeptical concerning the motives for, the current investigation.” The key, they said, was that their investigation “was only commenced after, and perhaps in some response to, the public attack on Professor Churchill for his controversial publications.” But, they said, because the claims of academic misconduct were serious, they needed to be investigated fully and fairly.

The basic problem with the Emory report was that it accepted the terms of debate set by others, and thereby abdicated responsibility to work independently and to consider the significance of the findings. Their inquiry should have been as sweeping as the stakes were high; instead they limited their examination to a few pages in a great big book.  Colorado shows how to avoid the kind of tunnel vision that marred the Emory report. The report on Ward Churchill demonstrates that charges of research misconduct that arise in a heated political environment can be addressed with intelligence and fairness.

Author/s: 
Jon Wiener
Author's email: 
Wiener@uci.edu

Jon Wiener is professor of history at the University of California at Irvine, and author of Historians in Trouble: Plagiarism, Fraud and Politics in the Ivory Tower (The New Press, 2005).

Last Bastion of Liberal Education?

Why do narratives of decline have such perennial appeal in the liberal arts, especially in the humanities? Why is it, year after year, meeting after meeting, we hear laments about the good old days and predictions of ever worse days to come? Why is such talk especially common in elite institutions where, by many indicators, liberal education is doing quite well, thank you very much? I think I know why. The opportunity is just too ripe for the prophets of doom and gloom to pass up.

There is a certain warmth and comfort in being inside the “last bastion of the liberal arts,” as B.A. Scott characterized prestigious colleges and research universities in his collection of essays The Liberal Arts in a Time of Crisis (New York: Praeger, 1990). The weather outside may be frightful, but inside the elite institutions, if not “delightful,” it’s perfectly tolerable, and likely to remain so until retirement time.

Narratives of decline have also been very useful to philanthropy, but in a negative way. As Tyler Cowen recently noted in The New York Times, “many donors … wish to be a part of large and successful organizations -- the ‘winning team’ so to speak.” They are not eager to pour out their funds in order to fill a moat or build a wall protecting some isolated “last bastion.” Narratives of decline provide a powerful reason not to reach for the checkbook. Most of us in the foundation world, like most other people, prefer to back winners rather than losers. Since there are plenty of potential winners out there, in areas of pressing need, foundation dollars have tended to flow away from higher education in general, and from liberal education in particular.

But at the campus level there’s another reason for the appeal of the narrative of decline, a genuinely insidious one. If something goes wrong the narrative of decline of the liberal arts always provides an excuse. If course enrollments decline, well, it’s just part of the trend.  If students don’t like the course, well, the younger generation just doesn’t appreciate such material. If the department loses majors, again, how can it hope to swim upstream when the cultural currents are so strong?  Believe in a narrative of decline and you’re home free; you never have to take responsibility, individual or collective, for anything having to do with liberal education.  

There’s just one problem. The narrative of decline is about one generation out of date and applies now only in very limited circumstances. It’s true that in 1890, degrees in the liberal arts and sciences accounted for about 75 percent of all bachelor’s degrees awarded; today the number is about 39 percent, as Patricia J. Gumport and  John D. Jennings noted in “Toward the Development of Liberal Arts Indicators” (American Academy of Arts and Sciences, 2005). But most of that decline had taken place by 1956, when the liberal arts and sciences had 40 percent of the degrees. 

Since then the numbers have gone up and down, rising to 50 percent by 1970, falling to 33 percent by 1990, and then rising close to the 1956 levels by 2001, the last year for which the data have been analyzed. Anecdotal evidence, and some statistics, suggest that the numbers continue to rise, especially in  Research I universities.  

For example, in the same AAA&S report (“Tracking Changes in the Humanities”) from which these figures have been derived, Donald Summer examines the University of Washington (“Prospects for the Humanities as Public Research Universities Privatize their Finances”) and finds that majors in the humanities have been increasing over the last few years and course demand is strong.

The stability of liberal education over the past half century seems to me an amazing story, far more compelling than a narrative of decline, especially when one recognizes the astonishing changes that have taken place over that time: the vast increase in numbers of students enrolled in colleges and universities,  major demographic changes, the establishment of new institutions, the proliferation of knowledge, the emergence of important new disciplines, often in the applied sciences and engineering, and, especially in recent years, the financial pressures that have pushed many institutions into offering majors designed to prepare students for entry level jobs in parks and recreation, criminal justice, and now homeland security studies. And, underlying many of these changes, transformations of the American economy.    

The Other, Untold Story

How, given all these changes, and many others too, have the traditional disciplines of the arts and sciences done as well as they have? That would be an interesting chapter in the history of American higher education. More pressing, however, is the consideration of one important consequence of narratives of decline of the liberal arts.

This is the “last bastion” mentality, signs of which are constantly in evidence when liberal education is under discussion. If liberal education can survive only within the protective walls of elite institutions, it doesn’t really make sense to worry about other places. Graduate programs, then, will send the message that success means teaching at a well-heeled college or university, without any hint that with some creativity and determination liberal education can flourish in less prestigious places, and that teaching there can be as satisfying as it is demanding.

Here’s one example of what I mean. In 2000, as part of a larger initiative to strengthen undergraduate liberal education,  Grand Valley State University, a growing regional public institution in western Michigan, decided to establish a classics department. Through committed teaching, imaginative curriculum design, and with strong support from the administration, the department has grown to six tenured and tenure track positions with about 50 majors on the books at any given moment. Most of these are first-generation college students from blue-collar backgrounds who had no intention of majoring in classics when they arrived at Grand Valley State, but many have an interest in mythology or in ancient history that has filtered down through popular culture and high school curricula. The department taps into this interest through entry-level service courses, which are taught by regular faculty members, not part timers or graduate students.

That’s a very American story, but the story of liberal education is increasingly a global one as well.  New colleges and universities in the liberal arts are springing up in many countries, especially those of the former Soviet Union.

I don’t mean that the spread of liberal education comes easily, in the United States or elsewhere. It’s swimming upstream. Cultural values, economic anxieties, and all too often institutional practices (staffing levels, salaries, leave policies and research facilities) all exert their downward pressure. It takes determination and devotion to press ahead. And those who do rarely get the recognition or credit they deserve.

But breaking out of the protective bastion of the elite institutions is vital for the continued flourishing of liberal education. One doesn’t have to read a lot of military history to know what happens to last bastions. They get surrounded; they eventually capitulate, often because those inside the walls squabble among themselves rather than devising an effective breakout strategy. We can see that squabbling at work every time humanists treat with contempt the quantitative methods of their scientific colleagues and when scientists contend that the reason we are producing so few scientists is that too many students are majoring in other fields of the liberal arts.  

The last bastion mentality discourages breakout strategies. Even talking to colleagues in business or environmental studies can be seen as collaborating with the enemy rather than as a step toward broadening and enriching the education of students majoring in these fields. The last bastion mentality, like the widespread narratives of decline, injects the insidious language of purity into our thinking about student learning, hinting that any move  beyond the cordon sanitaire is somehow foul or polluting and likely to result in the corruption of high academic standards.   

All right, what if one takes this professed concern for high standards seriously? What standards, exactly, do we really care about and wish to see maintained? If it’s a high level of student engagement and learning, then let’s say so, and be forthright in the claim that liberal education is reaching that standard, or at least can reach that standard if given half a chance. That entails, of course, backing up the claim with some systematic form of assessment.

That provides one way to break out of the last bastion mentality. One reason that liberal education remains so vital  is that when properly presented it contributes so much to personal and cognitive growth. The subject matter of the liberal arts and sciences provides some of the best ways of helping students achieve goals such as analytical thinking, clarity of written and oral expression,  problem solving, and alertness to moral complexity, unexpected consequences and cultural difference. These goals command wide assent outside academia, not least among employers concerned about the quality of their work forces. They are, moreover, readily attainable  through liberal education provided proper attention is paid to “transference.”  “High standards” in liberal education require progress toward these cognitive capacities.

Is it not time, then, for those concerned with the vitality of liberal education to abandon the defensive strategies that derive from the last bastion mentality, and adopt a new and much more forthright stance? Liberal education cares about high standards of student engagement and learning, and it cares about them for all students regardless of their social status or the institution in which they are enrolled.

There is, of course, a corollary. Liberal education can’t just make the claim that it is committed to such standards, still less insist that others demonstrate their effectiveness in reaching them, unless those of us in the various fields of the arts and sciences are willing to put ourselves on the line. In today’s climate  we have to be prepared to back up the claim that we are meeting those standards. Ways to make such assessments are now at hand, still incomplete and imperfect, but good enough to provide an opportunity for the liberal arts and sciences to show what they can do.

That story, I am convinced, is far more compelling than any narrative of decline.

W. Robert Connor

W. Robert Connor is president of the Teagle Foundation and blogs frequently about liberal education.

The Global Exception

The way of thinking called American exceptionalism comes in two major varieties. One is more or less religious: A faith that the United States has a special place under heaven's watchful eye. Sometimes this involves a literal belief that the country has a role in the divine plan; in other cases, it's just a matter of rhetoric verging on national egomania.

The other form of American exceptionalism has a more left-wing genealogy. It emerged from debates over the peculiarities of the United States compared to other highly industrialized nation-states -- especially the lack of a labor party or a mass-based socialist movement of the kind that became standard elsewhere in the world. That, in turn, raises some interesting questions about what distinctive factors might explain the "exception." Was it slavery? The lack of an aristocracy? All those natural resources on the frontier, ripe for the plucking?

In either version, the United States stands as a nation apart -- somehow the product of forces cutting it off from the rest of the world's history. But Eric Rauchway, a professor of history at the University of California at Davis, takes a different and rather paradoxical approach to American exceptionalism in his new book, Blessed Among Nations: How the World Made America, published by Hill and Wang.

In a brief but panoramic survey of the half century following the Civil War, Rauchway shows how both foreign investment and the influx of unskilled labor helped indirectly bolster the feeling that the United States was a unique nation. "The first modern age of globalization," he writes, "gave Americans reason to believe that the rest of humankind intended to let the United States fulfill its special wish to have as little government as possible."

The more the nation's economy relied on capital and labor from abroad, the more its citizens thought of the state as "a referee charged with regulating the influence of such forces in an open and fluid system," writes Rauchway, "rather than as a machine for allocating scarce resources." The result was "an especially lucky country on the periphery of a global system" -- one that felt "that the world economy will right and regulate itself without government action."

World War I and the Great Depression did yank us out of that mentality, for a while. But it still has its temptations.

Several months ago, a friend who is an American historian praised Rauchway's earlier book Murdering McKinley: The Making of Theodore Roosevelt's America as very smart and well-written. I would not hesitate to use the same adjectives for Blessed Among Nations. While reading it, I got in touch with Rauchway, who answered a few questions about the book by e-mail.

Q: What do you make of the persistence of American exceptionalism? Doesn't it point to some rock-bottom distinctiveness about the country? Has there been any particular strand of American exceptionalism that you've had a productive dialogue or running argument with, over time?

A: The persistence of the religious variety of American exceptionalism makes me a little nervous, I confess. My ancestors (on my mother's side) were the kind of people who would banish you from the Massachusetts Bay Colony if you claimed a special revelation of God's intentions, whereupon you would probably be eaten by wolves, or at least have to live in Rhode Island.

So I would not claim a productive engagement with that tradition. But the more social-sciency variety of exceptionalism, the kind that flourished in the 1950s, troubles me in a much more useful way. I had to start wrestling with it seriously when I took a job that required me to teach American history overseas to students who were not Americans. And their first question was almost always, "Why is America so weird?"

To which I said, okay, let's read Charles and Mary Beard's Rise of American Civilization, Richard Hofstadter's American Political Tradition, Louis Hartz's Liberal Tradition in America, and David Potter's People of Plenty, and we can start to talk about why they're wrong (you'll notice I mention only safely dead people, here) and where their wrongness points us in enlightening directions.

I'm not persuaded that the persistence of exceptionalism points, all by itself, to a rock-bottom American distinctiveness -- other countries, maybe all other countries, have their own similar senses of exceptionalism -- but I would say that it's easier for Americans to indulge our exceptionalism, because recent history, and the rest of the world's people, have conspired with us in maintaining it.

Which is to say, history (and the rest of the world's people) kicked the shins of the German Sonderweg pretty hard. But Americans' sense of ourselves as a uniquely free people, able to do without so much of the machinery of government that other peoples find necessary -- that sense has been nurtured, maintained, and (I would even say) created afresh by a century and a half or so of world events.

Q: There's a tendency to treat globalization as a new phase in history. In pundit-speak, it's more or less a catch-all term for whatever started happening once the Cold War wound down. Your frame of reference is different -- but how, exactly? And are you ruling out the idea of the national economy as a frame of reference for interpreting American history altogether?

A: I'm using the word globalization to mean what a large number of historians and other scholars mean. Edward Leamer provides a useful definition here: "Globalization is the increased international mobility of goods, people, contracts (including financial claims) and thoughts (facts, ideas, and beliefs)." So we're talking not only about the permeability of national borders, but the actual, and measurable, motion of things across them.

In this sense globalization is no new thing, though it has waxed and waned in recent history. For example, there's a nice graph on page 6 of this paper by Maurice Obstfeld and my colleague Alan M. Taylor, showing their informed guess (I love the source note there, by the way) about the course of international capital mobility over the period since the Civil War -- it rises up to World War I, plummets thereafter, and then begins to rise again in the last few decades.

We could say the same about globalization more broadly -- that the late 19th century was an era of increased globalization, particularly with respect to the international movement of capital and labor (i.e., migration of people), an era that ended around World War I.

Now, there's a theory about what globalization does -- the theory says that globalization makes the world one:

"[A] constantly expanding market.... must nestle everywhere, settle everywhere, establish connexions everywhere.... We have intercourse in every direction, universal inter-dependence of nations.... And as in material, so also in intellectual production. The intellectual creations of individual nations become common property. National one-sidedness and narrow-mindedness become more and more impossible...."

That's Marx and Engels, of course, but it might as well be almost any modern booster or critic of globalization; both categories think that globalization makes each country more like the others.

This is, as far as we know, true-ish. Absent other factors, the unimpeded flow of stuff across borders makes anyplace like everyplace. You can think of it like water seeking its own level when you open a canal lock. Where it was once higher on this side of the door than on that side, when you open the door, it's the same on both sides.

The trouble is, as you know, the movement of stuff across borders isn't actually a natural process like the falling of water owing to gravity; it's a political process. The permeability of borders is a political choice. What's even more important is that the motion of stuff across borders often generates a reaction that, channeled through political institutions, affects the openness of borders.

To be less general: if immigration lowers wages, or is seen to lower wages, for a significant number of citizens in an immigrant-receiving country, you get a reaction, which if it's substantial enough, will lead to legislation restricting immigration. (I've written a little bit more on these ideas here and here.)

Q: So what's that meant for the United States?

A: Well, I'm making the argument that the influence of globalization on the U.S. did not make the U.S. more like other countries, but rather, that it reinforced American ideas of exceptionalism and, in demonstrable ways, gave enduring institutional life to those exceptionalist traditions.

The influx of international investment capital into the U.S., particularly into the West, gave the U.S. a vigorous politics of protest against international investment capital; the influx of immigration gave the U.S. a vigorous politics of protest against immigration. Both kinds of politics seriously inflected the major arguments of the day, like anti-capitalist protests, or arguments for social spending. You couldn't talk about the depredations of capitalists without also talking about the depredations specifically of foreign capitalists; you couldn't talk about the circumstances of the working class in America without also talking about the immigrant constituency among the working class in America.

Americans of that era saw themselves as affected by international factors that we would nowadays group together under the heading of globalization. And their concern with international influences on American industrial development was borne out in policies: not only in the kinds of policies Americans did not adopt but also in the kinds they did adopt.

So I'm not just arguing that, e.g., immigration kept enthusiasm for general social spending damped (though it probably did) but rather that immigration inspired certain kinds of social spending in parts of the United States where immigration had particular effects. This is not a book about the small, or weak, American state; it's a book about the peculiarly shaped American state, and about how globalization made those peculiarities.

Q: We're downstream in history from the time you have in mind -- it's now two or three world wars later, depending on how you look at these things. How much continuity is there between that moment of globalization and its effects and the present? Nowadays, someone like Pat Buchanan gripes about both foreign investment capital and immigration policy. But for the most part, it's just the latter that has much traction now. Or am I missing something?

A: Between the late 19th century and the present there's a great deal of -- I don't want to say continuity, but commonality; the forces that then drove American politics have now resumed their operation after being pent up in the middle of the 20th century.

Which is to say that our present moment, despite all that intervening history (including however many world wars you care to count) looks a lot more like the pre-New-Deal past than it looks like, say, the 1970s.  So our policy framework should reflect that.

Instead we tend to talk, in this and in other countries, about whether government should have more or fewer powers as if that question could be answered in the abstract, as if we could logically derive the correct answer from a set of axioms about human behavior. I think this is not only wrong but, potentially, fatally so. Government represents a set of specific solutions to specific problems, a set of adaptations to environment. And a set of adaptations specific to one environment might not do so well when the environment changes.

That's what we see in the era around World War I: the U.S. had adopted a set of policies based on its particular position in the world, and then it kept those policies, to its detriment, even after its position changed.  After 1918, the U.S. had become the central country in the world economy. If anyone were going to restore what John Maynard Keynes called the "economic Utopia" of the prewar years, it was going to be the U.S. But, as E. H. Carr wrote, "[i]n 1918 world leadership was offered, by almost unanimous consent, to the United States.... [and] it was then declined."

Into the 1920s, Americans kept the set of assumptions that had served them all right so far. They assumed that they had globalization on tap, and could take or leave it as they chose -- a tariff here, an immigration restriction law there, a bit of credit tightening in a pinch. These could all answer domestic needs and had nothing to do with how the rest of the world worked.

People found out, starting in 1929, how poorly these assumptions served in an environment where American policy could actually help shut down the whole world economy.

During what you might call the long New Deal, from FDR through Nixon, our old habits went into abeyance. So, I think not coincidentally, did globalization. As that graph I mentioned above shows, international capital mobility was at low ebb; so was immigration.  Now globalization is back, if in slightly different guise (trade probably matters more now than it did then); and so is the old American tradition of assuming it will continue to work for us when we want it. I think this assumption is no better now than in 1929.

Q: Not that you should play Nostradamus, exactly, but what are the long-term implications of seeing globalization and American exceptionalism as deeply connected?

A: If there are lessons for today in the book, they're these: (a) We should make policy decisions based on an accurate assessment of our position in the world, and not on assumptions or principles; and (b) we need to reassess our policy framework periodically to make sure it still suits us, because our place in the world changes.

When you hear anyone start talking about how we can take or leave world trade, how we can take or leave immigration, how the Federal Reserve has room to maneuver to regulate the flow of capital in a crisis -- you need to ask yourself, are the assumptions behind this policy, about our place in the world economy, sound?

But I won't pretend I've written a white paper for future action; the book is an argument about how we got here, why we believe what we do, why -- in the language of my former students -- America is so weird. And it's still weird in very much the same way it was. If in the middle of the twentieth century the country was on a trend toward being more like other nations, that's reversed.

I don't think we believe the world will take care of us because we're intrinsically more optimistic, or foolish, than other peoples; I think we've simply had our myths more or less ratified by the events of history for a very long time. I also think there's more than enough evidence in the historical record to suggest that mistaking the indulgence of events for the actions of a benign Providence is a recipe for disaster.

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

The High and the Low

Tuning in to C-SPAN’s weekend books coverage a few months ago, I caught the rebroadcast of a panel discussion among three or four biographers of American presidents, held in a large auditorium somewhere. All of them had done well -- all of the biographies, that is. Not all of the presidents were so lucky. But the topic of the moment, as I happened to start watching, was neither the highest office in the land nor the unique challenges facing a best-selling author. They were discussing, rather, the state of American history as a field.

The consensus appeared to be that the situation was terrible. Scholars were neglecting the lives and achievements of the truly important figures. Instead, they were studying social history, cultural history, economic history -- everything, alas, except Great Man history. One fellow on the panel was the author of books on the Founding Fathers that had won great acclaim; it was easy to imagine big bags of money being delivered to his door regularly by a grateful publisher. He proffered a simple explanation for all the scholarship on slave revolts, immigrant neighborhoods, obscure women’s organizations, and other such riff-raff. It was very simple, actually. Those historians hated America.

He offered no rational argument for this claim. Nor, indeed, would one have been possible. The assertion went unchallenged.

Now, everyone has the right to express an opinion -- and nobody is obliged to make sure it is an informed one. But there must be limits to just how much shameless nonsense the public sphere can afford to let circulate. The idea that American historians are refusing to study the illustrious dead -- let alone that they are doing so because they are "anti-American" -- is too bizarre for sane people to indulge.

If a decreasing percentage of the historical profession’s resources goes to studying, say, the Founding Fathers, as such, a couple of less feverish possibilities come to mind. One is that the number of historians interested in the U.S. grows from decade to decade -- while the population of Founding Fathers available for study remains constant. The real barrier for scholars wishing to concentrate on them comes from the need to find something new to say about them.

But that's only part of the situation. And of course there is still good work being done raising issues about the Founding Fathers. History is a pluralistic field, both at the level of the phenomena it examines and the methods it uses to study them. Pluralism does not equal either moral relativism or epistemological skepticism. (Nor will a million ranters in the blogosphere ever make it so.) But it does preclude acting as if there were a single correct approach -- one single level of historical reality worth serious attention, or one uniquely effective tool or framework for understanding the past.

Just as a matter of personal preference, let me admit to being quite interested in Benjamin Franklin. He qualifies as a dead white property-owning male, if anyone could, and he was in many respects the Founding Fathers’ Founding Father. I would much rather read a biography of Franklin than, say, a detailed study of labor contracts in 18th century Philadelphia -- or an econometric analysis of how King George’s taxation policies affected the North American paper industry.

But history is not a zero-sum enterprise. The well-being of history as a discipline demands that scholars be able to do that sort of monographic work. And it is in my long-term interest as a reader of books about Franklin that precisely such research be done. (It gives biographers access to elements of the world in which he lived.) All of this seems pretty obvious, though not the sort of point that gets made on television very often. Demagogy is so much more exciting.

A memory of that cringe-inducing moment on C-SPAN flooded back to mind a few days ago, upon news of the death of Lawrence W. Levine, a professor of history emeritus at the University of California at Berkeley. (He also served as president of the Organization of American Historians and, after retiring from Berkeley, taught in the history and cultural studies programs at George Mason University.) The headline of one obituary summed up his life and work by calling Levine a "historian and multiculturalist." Accurate enough, as far as it went. But that word "multiculturalist" is now about as stimulating to the higher centers of the brain as Pavlov’s bell. The minute they hear it, some people start to drool.

Ten years ago, Levine offered a calm and reasoned response to Allan Bloom in a book called The Opening of the American Mind: Canons, Culture, and History, published by Beacon Press. (For a sympathetic but not uncritical review that sums up his argument, scroll down the page a bit here.) Levine was an early recipient of the MacArthur Fellowship, and his more specialized books are as accessible to the general reader as serious scholarship can be. But most of his influence was on other historians.

Hearing that he was a "multiculturalist" really tells you very little about what Levine accomplished. It was not just that he looked at the diversity of cultural traditions making up American life. He also made connections between history and other fields.

His groundbreaking study Black Culture and Black Consciousness: Afro-American Folk Thought From Slavery to Freedom (Oxford University Press, 1977) looked at how songs and stories gave black Americans “the means of preventing legal slavery from becoming spiritual slavery” by creating a domain “free of control by those who ruled the earth.” His approach was, in part, a matter of using ideas from folklore and anthropology about African “survivals,” the cultural forms that had endured the Middle Passage. But Levine’s research also led him in another direction -- toward Shakespeare.

In Highbrow/Lowbrow: The Emergence of Cultural Hierarchy in America (Harvard University Press, 1988), Levine describes reading accounts of minstrel shows “to derive some more exact sense of how antebellum whites depicted black culture.” What he found, to his astonishment, was an abundance of allusions to the Bard -- jokes and parodies, for example, that implied that the audience knew the plays fairly well.

Digging deeper, he unearthed an entire lost world. Levine showed how, until sometime shortly after the Civil War, Shakespeare was part of the nation’s common culture, drawing large and rowdy audiences who had very definite opinions about how the plays should be performed, and were not shy about expressing them. (The egg, as more than one visiting British actor learned, proved a handy instrument of dramatic criticism.) Favorite scenes from Hamlet were often part of the bill at variety shows, along with trained monkeys and similar acts. In major cities, two or three different stagings of Macbeth sometimes competed for the public’s patronage, while bored residents of a mining camp might put on Richard III for fun.

Some of the adaptations sound abominable. One very popular version of King Lear, for example, had a happy ending. But the gusto was unmistakable. American Bardolatry included the belief that he was a very great writer, perhaps the very greatest. But it was also shot through with a sense that he was, deep down, a man of the people -- hence, especially to be appreciated in a democratic nation. Levine described one stage-curtain from the early 19th century showing Shakespeare climbing into the heavens atop the back of an American eagle.

By the late 19th century, though, something had happened. People began to think of Shakespeare as anything but entertainment. His verse was either sublime and uplifting (if you were the refined sort) or a bore (if you weren’t). By the 1870s, it was getting harder and harder to find a show that would offer you both some Shakespeare and a performance involving dancers and musically gifted livestock. And by the dawn of the 20th century, nobody was looking.

What happened? Well, you should read Levine’s book, which also shows how a similar transformation occurred in the public appetite for opera and classical music across the same period. Suffice it to say that deep changes in the American economy and society made for very different attitudes towards Shakespeare and Mozart. It is a short book, but also one of the great mind-opening works on U.S. history -- a strangely moving reminder of how little of the nation’s actual past survives in the contemporary memory.

“That essay on ‘William Shakespeare in America’ is worth a whole library of cultural studies work,” Michael Kazin told me recently when we discussed Levine by phone. Earlier this year, Kazin, who is a professor of history at Georgetown University, published a biography of William Jennings Bryan. Levine had studied Bryan for his own dissertation at Columbia University, later published as a book.

Levine’s analysis, which challenged the familiar image of Bryan as creationist buffoon, was an important influence on Kazin’s interpretation of the politician. Levine and Kazin were also friends, initially bonding over an interest in the films of Frank Capra. Levine read parts of Kazin’s work in manuscript, and for a while they were in a book group together.

“He had a great no-bullshit style,” said Kazin. “It was a New York Jewish working-class wit. It reminded me of my father, though Larry was younger by maybe 15 years.”

The comparison caught me by surprise. His father, the late Alfred Kazin, had published major studies of American literature such as On Native Grounds and God and the American Writer. These were works of literary scholarship of a decidedly untheoretical and pre-multicultural sort.

So I wondered if there could be more to the resemblance between Levine and Kazin Sr. than something about the way they spoke.

“On Native Grounds is about literature,” Kazin said, “but it’s also about the process of ‘Americanizing.’ My father was trying to understand the popular wellsprings of literature. Sure, Larry was the great historian of multiculturalism, but most of his work was about trying to understand the nation as a unity. The great thing about him was that he always had big questions about how that unity actually worked. How did blacks resist slavery and survive it afterwards? How did we end up with the plays of Shakespeare, the mass artist, becoming something restricted to the elite? Those should be questions in American history.”

And for anyone concerned about the neglect of Great Men, it’s worth mentioning that Levine’s last published book, written with his wife Cornelia R. Levine, was The People and the President: America’s Conversation With FDR (Beacon Press, 2002). But it wasn’t a departure from his practice of studying history from below.

“It looks at the letters people sent to FDR,” said Kazin. “It’s about how high and low interact. And what’s the point in having a democracy if you don’t try to understand that relationship?”

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Quote Unquote

Keeping a commonplace book -- a notebook for copying out the striking passages you’ve come across while reading -- was once a fairly standard practice, not just among professional scholars but for anyone who (as the expression went) “had humane letters.” Some people still do it, though the very idea of creating your own customized, hand-written anthology does seem almost self-consciously old-fashioned now. Then again, that may be looking at things the wrong way. When John Locke circulated his New Method of a Common-Place Book in the late 17th century, he wasn’t offering Martha Stewart-like tips on how to be genteel. He had come up with a system of streamlined indexing and text-retrieval -- a way to convert the commonplace book into a piece of low-tech software for smart people on the go.

There is a fairly direct line running from Locke’s efficiency-enhancing techniques to The Yale Book of Quotations, a handsome and well-indexed compilation just issued by Yale University Press. That line runs through the work of John Bartlett, the prodigious American bookworm whose recall of passages from literature made him semi-famous in Cambridge, Mass., even before he published a small collection of Familiar Quotations in 1855. He included more and more material from his own commonplace book in later editions, so that the book grew to doorstop size. I don’t know whether or not Bartlett had read Locke’s essay. But he did index the book in a manner the philosopher would have found agreeable.

Following his death in 1905, “Bartlett’s” has become almost synonymous with the genre of quotation-collection itself – a degree of modest immortality that might have surprised him. (Chances are, he expected to be remembered for the fact that his friend James Russell Lowell once published a poem about Bartlett’s skill as a fisherman.)

The new Yale collection follows Bartlett’s example, both in its indexing and in sheer heft. It is not just a compilation but a work of scholarship. The editor, Fred R. Shapiro, is an associate librarian and lecturer in legal research at the Yale Law School; and his edition of The Oxford Dictionary of American Legal Quotations is well-regarded by both lawyers and reference librarians. In The Yale Book of Quotations, he proves even more diligent than Bartlett was about finding the exact origins and wording of familiar quotations.

The classic line from Voltaire that runs “I disapprove of what you say, but I will defend to the death your right to say it” does not appear among the selections from Voltaire, for the simple reason that he never actually said it. (According to an article appearing in the November 1943 issue of Modern Language Notes, it was actually coined by one of Voltaire's biographers, S. G. Tallentyre.) Shapiro finds that the principle later known as “Murphy’s Law” was actually formulated by George Orwell in 1941. (“If there is a wrong thing to do,” wrote Orwell, “it will be done, infallibly. One has come to believe in that as if it were a law of nature.”)

In his posthumously published autobiography, Mark Twain attributed the phrase “lies, damned lies, and statistics” to Benjamin Disraeli. But the saying has long been credited to Twain himself, in the absence of any evidence that Disraeli actually said it. Thanks to the digitized editions of old newspapers, however, Shapiro finds it attributed to the former British prime minister in 1895, almost 30 years before Twain’s book was published.

It turns out that Clare Boothe Luce’s most famous quip, “No good deed goes unpunished,” first recorded in 1957, was actually attributed to Walter Winchell 15 years earlier. And as Shapiro notes, there is evidence to suggest that it had been a proverb even before that. Likewise, it was  not Liberace who coined the phrase “crying all the way to the bank” but rather, again, Winchell. (Oddly enough, the gossip columnist -- a writer as colorful as he was callous -- does not get his own entry.)

The historical notes in small type -- elaborating on sources and parallels, and sometimes cross-referencing other quotations within the volume -- make this a really useful reference work. It is also a profitable (or at least entertaining) way to procrastinate.

At the same time, it is a book that would have bewildered John Bartlett – and not simply because it places less emphasis on classic literature than commonplace-keepers once did. The editor has drawn on a much wider range of sources than any other volume of quotations I’ve come across, including film, television, popular songs, common sayings, and promotional catchphrases. Many of the choices are smart, or at least understandable. The mass media, after all, serve as the shared culture of our contemporary Global Village, as Marshall McLuhan used to say.

But many of the entries are inexplicable -- and some of them are just junk. What possible value is there to a selection of 140 advertising slogans (“There’s something about an Aqua Velva man”) or 90 television catchphrases (“This is CNN”)? The entry for Pedro Almodóvar, the Spanish director, consists entirely of the title of one of his films, Women on the Verge of a Nervous Breakdown. Why bother?

A case might be made for including the “Space, the final frontier...” soliloquy from the opening of Star Trek, as Shapiro does in the entry for Gene Roddenberry. He also cross-references it to a quotation from 1958 by the late James R. Killian, then-president of MIT, who defined space exploration as a matter of “the thrust of curiosity that leads me to try to go where no man has gone before.” So far, so good. But why also include the slightly different wordings used in the openings to The Wrath of Khan and Star Trek: The Next Generation?

The fact that quotations from Mae West run to more than one and a half pages is not a problem. They are genuinely witty and memorable. (e.g., “Between two evils, I always pick the one I’ve never tried before.”) But how is it that the juvenile lyrics of Alanis Morissette merit nearly as much space as the entry for Homer? (The one from Greece, I mean, not from Springfield.)

It is hard to know what to make of some of these editorial decisions. It’s as if Shapiro had included, on principle, a certain amount of the static and babble that fills the head of anyone tuned into the contemporary culture – “quotations” just slightly more meaningful than the prevailing media noise (and perhaps not even that).

But another sense of culture prevailed in Bartlett’s day -- one that Matthew Arnold summed up as a matter of “getting to know, on all the matters that concern us, the best which has been thought and said in the world.” That doesn’t mean excluding popular culture. The lines here from Billie Holiday, Bob Dylan, and "The Simpsons" are all worth the space they fill. But the same is not true of “Plop plop, fizz fizz, oh what a relief it is.”

All such griping aside, The Yale Book of Quotations is an absorbing reminder that all one’s best observations were originally made by someone else. And it includes a passage from Dorothy Sayers explaining how to benefit from this: “I always have a quotation for everything,” she wrote. “It saves original thinking.”

I had considered suggesting that it might make a good present for Christmas, Hanukkah, Festivus, etc. According to the publisher’s Web site, the first printing is already sold out. It is available in bookstores, however, and also from some online booksellers. Here’s hoping it goes through many editions -- so that Shapiro will get a chance to recognize that Eminem’s considerable verbal skills do not translate well into cold type.

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Eros Unbound

Valentine’s Day seems an appropriate occasion to honor the late Gershon Legman, who is said to have coined the slogan “Make love, not war.” Odd to think that saying had a particular author, rather than being spontaneously generated by the countercultural Zeitgeist in the 1960s. But I've seen the line attributed to Legman a few times over the years; and the new Yale Book of Quotations (discussed in an earlier column) is even more specific, indicating that he first said it during a speech at Ohio University in Athens, Ohio, sometime in November 1963.

Legman, who died in 1999 at the age of 81, was the rare instance of a scholar who had less of a career than a profound calling -- one that few academic institutions in his day could have accommodated. Legman was the consummate bibliographer and taxonomist of all things erotic: a tireless collector and analyst of all forms of discourse pertaining to human sexuality, including the orally transmitted literature known as folklore. He was an associate of Alfred Kinsey during the 1940s, but broke with him over questions of statistical methodology. If it hadn’t been that, it would have been something else; by all accounts, Legman was a rather prickly character.

But it is impossible to doubt his exacting standards of scholarship after reading The Horn Book: Studies in Erotic Folklore and Bibliography (University Books, 1964) -- a selection of Legman's papers reflecting years of exploration in the “restricted” collections of research libraries. (At the Library of Congress, for example, you will sometimes find a title listed as belonging to “the Delta Collection,” which was once available to a reader only after careful vetting by the authorities. The books themselves have long since been integrated into the rest of the library’s holdings, but not-yet-updated catalog listings still occasionally reveal that a volume formerly had that alluring status: forbidden yet protected.) Legman approached erotic literature and "blue" folklore with philological rigor, treating with care songs and books that only ever circulated on the sly.  

Some of Legman's work appeared from commercial publishers and reached a nonscholarly audience. He assembled two volumes of obscene limericks, organized thematically and in variorum. The title of another project, The Rationale of the Dirty Joke, only hints at its terrible sobriety and analytic earnestness. Sure, you can skim around in it for the jokes themselves. But Legman’s approach was strictly Freudian, his ear constantly turned to the frustration, anxiety, and confusion expressed in humor.

Not all of his work was quite that grim. Any scholar publishing a book called Oragenitalism: Oral Techniques in Genital Excitation may be said to have contributed something to the sum total of human happiness. The first version, devoted exclusively to cunnilingus, appeared from a small publisher in the 1940s and can only have had very limited circulation. The commercial edition published in 1969 expanded its scope -- though Legman (who in some of his writings comes across, alas, as stridently hostile to the early gay rights movement) seemed very emphatic in insisting that his knowledge of fellatio was strictly as a recipient.

Defensiveness apart, what’s particularly striking about the book is the degree to which it really is a work of scholarship. You have to see his literature review (a critical evaluation of the available publications on the matter, whether popular, professional, or pornographic, in several languages) to believe it. Thanks to Legman’s efforts, it is possible to celebrate Valentine’s Day with a proper sense of tradition.

Legman was a pioneer of cultural studies, long before anyone thought to call it that. He served as editor for several issues of Neurotica, a great underground literary magazine published between 1948 and 1952. Most of its contributors were then unknown, outside very small circles; but they included Allen Ginsberg, Anatole Broyard, Leonard Bernstein, and an English professor from Canada named Marshall McLuhan.

As the title may suggest, Neurotica reflected the growing cultural influence of Freud. But it also went against the prevalent tendency to treat psychoanalysis as a tool for adjusting misfits to society. The journal treated American popular culture itself as profoundly deranged; and in developing this idea, Legman served as something like the house theorist.

In a series of essays adapted from his pamphlet Love and Death (1948), Legman cataloged the seemingly endless sadism and misogyny found in American movies, comic books, and pulp novels. (Although Love and Death is long out of print, a representative excerpt can be found in Jeet Heer and Kent Worcester's collection Arguing Comics: Literary Masters on a Popular Medium, published by the University Press of Mississippi in 2004.)

Legman pointed out that huge profits were to be made from depicting murder, mutilation, and sordid mayhem. But any attempt at a frank depiction of erotic desire, let alone of sex itself, was forbidden. And this was no coincidence, he concluded. A taste for violence was being “installed as a substitute outlet for forbidden sexuality” by the culture industry.

Censorship and repression were warping the American psyche at its deepest levels, Legman argued. The human needs that ought to be met by a healthy sexual life came back, in distorted form, as mass-media sadism: "the sense of individuality, the desire for importance, attention, power; the pleasure in controlling objects, the impulse toward violent activity, the urge towards fulfillment to the farthest reaches of the individual’s biological possibilities.... All these are lacking in greater or lesser degree when sex is lacking, and they must be replaced in full.”

Replaced, that is, by the noir pleasures of the trashy pop culture available in the 1940s.

Here, alas, it proves difficult to accept Legman's argument in quite the terms framing it. His complaints about censorship and hypocrisy are easy enough to accept as justified. But the artifacts that filled him with contempt and rage -- Gone With the Wind, the novels of Raymond Chandler, comic books with titles like Authentic Police Cases or Rip Kirby: Mystery of the Mangler -- are more likely to fill us with nostalgia.

It's not that his theory about their perverse subtext now seems wrong. On the contrary, it often feels as if he's on to something. But while condemning the pulp fiction or movies of his day as symptomatic of a neurotic culture, Legman puts his finger right on what makes them fascinating now -- their nervous edge, the tug of war between raw lust and Puritan rage.

In any case, a certain conclusion follows from Legman’s argument -- one that we can test against contemporary experience.

If censorship of realistic depictions of sexuality intensifies the climate of erotic repression, thereby creating an audience prone to consuming pop-culture sadomasochism, then, by Legman's logic, the easing or abolition of censorship ought to yield, over time, fewer images and stories centering on violence, humiliation, and so on.

Well, we know how that experiment turned out. Erotica is now always just a few clicks away (several offers are pouring into your e-mail account as you read this sentence). And yet one of the most popular television programs in the United States is a drama whose hero is good at torture.

They may have been on to something in the pages of Neurotica, all those decades ago, but things have gotten more complicated in the meantime.

As it happens, I’ve just been reading a manuscript called “Eros Unbound: Pornography and the Internet” by Blaise Cronin, a professor of information science at Indiana University at Bloomington, and former dean of its School of Information and Library Science. His paper will appear in The Internet and American Business: An Historical Investigation, a collection edited by William Aspray and Paul Ceruzzi scheduled for publication by MIT Press in April 2008.

Contacting Cronin to ask permission to quote from his work, I asked if he had any connection with the Kinsey Institute, also in Bloomington. He doesn’t, but says he is on friendly terms with some of the researchers there. Kinsey was committed to recording and tabulating sexual activity in all its forms. Cronin admits that he cannot begin to describe all the varieties of online pornography. Then again, he doesn’t really want to try.

“I focus predominantly on the legal sex industry,” he writes in his paper, “concentrating on the output of what, for want of a better term, might be called the respectable, or at least licit, part of the pornography business. I readily acknowledge the existence of, but do not dwell upon the seamier side, unceremoniously referred to by an anonymous industry insider as the world of ‘dogs, horses, 12-year old girls, all this crazed Third-World s—.’ ”

The notion of a “respectable” pornography industry would have seemed oxymoronic when Legman published Love and Death. It’s clearly much less so at a time when half the hotel chains in the United States offer X-rated films on pay-per-view. Everyone knows that there is a huge market for online depictions of sexual behavior. But what Cronin’s study makes clear is that nobody has a clue just how big an industry it really is. Any figure you might hear cited now is, for all practical purposes, a fiction.

The truth of this seems to have dawned on Cronin following the publication, several years ago, of “E-rogenous Zones: Positioning Pornography in the Digital Marketplace,” a paper he co-authored with Elizabeth Davenport. One of the tables in their paper “estimated global sales figures for the legal sex/pornography industry,” offering a figure of around $56 billion annually. That estimate squared with information gathered from a number of trade and media organizations. But much of the raw data had originally been provided by a specific enterprise -- something called the Private Media Group, Inc., which Cronin describes as “a Barcelona-based, publicly traded adult entertainment company.”

After the paper appeared in the journal Information Society in 2001, Cronin says, he was contacted “by Private’s investor relations department wondering if I could furnish the company with growth projections and other related information for the adult entertainment industry -- I, who had sourced some of my data from their Web site.” That estimate of $56 billion per year, based on research now almost a decade old, is routinely cited as if it were authoritative and up to date.

“Many of the numbers bandied about by journalists, pundits, industry insiders and market research organizations,” he writes, “are lazily recycled, as in the case of our aforementioned table, moving effortlessly from one story and from one reporting context to the next. What seem to be original data and primary sources may actually be secondary or tertiary in character.... Some of the startling revenue estimates and growth forecasts produced over the years by reputable market research firms ... have been viewed all too often with awe rather than healthy skepticism.”

Where Legman was, so to speak, an ideologue of sex, Blaise Cronin seems more scrupulously dispassionate. His manuscript runs to some 50 pages and undertakes a very thorough review of the literature concerning online pornography. (My wife, a reference librarian whose work focuses largely on developments in digital technology and e-commerce, regards Cronin’s paper as one of the best studies of the subject around.) He doesn't treat the dissemination of pornography as either emancipatory or a sign of decadence. It's just one of the facts of life, so to speak.

His paper does contain a surprise, though. It's a commonplace now that porn is assuming an increasingly ordinary role as cultural commodity -- one generating incalculable, but certainly enormous, streams of revenue for cable companies, Internet service providers, hotel chains, and so on. But the "mainstreaming" of porn is a process that works both ways. Large sectors of the once-marginal industry are morphing into something ever more resembling corporate America.

“The sleazy strip joints, tiny sex shops, dingy backstreet video stores and other such outlets may not yet have disappeared,” writes Cronin, “but along with the Web-driven mainstreaming of pornography has come -- almost inevitably, one has to say -- full-blown corporatization and cosmeticization.... The archetypal mom and pop business is being replaced by a raft of companies with business school-trained accountants, marketing managers and investment analysts at the helm, an acceleration of a trend that began at the tail-end of the twentieth century. As the pariah industry strives to smarten itself up, the language used by some of the leading companies has become indistinguishable from that of Silicon Valley or Martha Stewart. It is a normalizing discourse designed to resonate with the industry’s largely affluent, middle class customer base.”

As an example, he quotes what sounds like a formal mission statement at one porn provider’s website: “New Frontier Media, Inc. is a technology driven content distribution company specializing in adult entertainment. Our corporate culture is built on a foundation of quality, integrity and commitment and our work environment is an extension of this…The Company offers diversity of cultures and ethnic groups. Dress is casual and holiday and summer parties are normal course. We support team and community activities.”

That’s right, they have casual Fridays down at the porn factory. Also, it sounds like, a softball team.

I doubt very much that anybody in this brave new world remembers cranky old Gershon Legman, with his index cards full of bibliographical data on Renaissance handbooks on making the beast with two backs. (Nowadays, of course, two backs might be considered conservative.) Ample opportunity now exists to watch or read about sex. Candor seems not just possible but obligatory. But that does not necessarily translate into happiness -- into satisfaction of "the urge towards fulfillment to the farthest reaches of the individual’s biological possibilities," as Legman put it.

That language is a little gray, but the meaning is more romantic than it sounds. What Legman is actually celebrating is the exchange taking place at the farthest reaches of a couple's biological possibilities: the moment when sex turns into erotic communion. And for that, broadband access is irrelevant. For that, you need to be really lucky.

Scott McLemee

Party in the Streets

During the first administration of Franklin Delano Roosevelt (or so goes a story now making the rounds of American progressives), the president met with a group of citizens who urged him to seize the moment. Surely it was time for serious reforms: The Depression made it impossible to continue with business as usual. Just what measures the visitors to the Oval Office proposed -- well, that is not clear, at least from the versions I have heard. Perhaps they wanted laws to regulate banking, or to protect the right of labor unions to organize, or to provide income help for the aged. Maybe all of the above.

The president listened with interest and evident sympathy. As the meeting drew to a close, Roosevelt thanked his guests, expressing agreement with all they had suggested. “So now,” he told them on their way out the door, “go out there and make me do it.”

This is less a historical narrative, strictly speaking, than an edifying tale. Its lesson is simple. Even with wise and trustworthy leadership holding power -- perhaps especially then -- you must be ready to apply pressure from below. (The moral here is not especially partisan, by the way. One can easily imagine conservative activists spurring one another on with more or less the same story, with Ronald Reagan assuming the star role.)

I recalled this anecdote on Saturday after meeting Michael T. Heaney, an assistant professor of political science at the University of Florida. He stopped by for a visit after spending the afternoon collecting data at the antiwar demonstration here in Washington.

For the past few years, Heaney has been collaborating with Fabio Rojas,  an assistant professor of sociology at Indiana University, on a study of the turnout at major national antiwar protests. With the help of research assistants, they have done surveys of some 3,550 randomly selected demonstrators. (That figure includes the 350 surveys gathered this weekend.) Their research has already yielded two published papers, available here and here, with more now in the works.

We’ll go over some of their findings in a moment. But a remark that Heaney made in conversation resonated with that fable about the New Deal era, and it provides a context for understanding the work he and Rojas have been doing.

“Political scientists are good at analyzing how established institutions function,” he said. “We have the tools for that, and the tools work really well. But there is very strong resistance to studying informal organizations or to recognizing them as part of the political landscape.”

In the course of thinking over their research, Rojas and Heaney have improvised a concept they call “the party in the street” -- that segment of a political party that, to borrow FDR’s (possibly apocryphal) injunction, gets out there and pushes.

Party affiliation was only one of the questions asked during the survey, which also gathered information about a demonstrator’s age, gender, ethnicity, zip code, membership in non-political organizations, and how he or she heard about the protest. (The form allowed responders to remain anonymous.)

“We attended or sent proxies to all major protests during a one-year period, from August 2004 until September 2005,” Heaney told me, “and we’ve coded all those surveys. We’ve also collected surveys at other demonstrations since then, including roughly a thousand responses just in 2007.”

The researchers attended demonstrations sponsored by each of the two major coalitions organizing them, United for Peace and Justice (UFPJ) and Act Now to Stop War and End Racism (ANSWER). The two coalitions have been at odds with one another for years, but worked together to organize the September 2005 protest in Washington before going their separate ways again. “We couldn’t have planned this,” as Heaney puts it, “but now we have data from each stage – when the two coalitions were in conflict, when they worked together, and then again after they parted.”

During the September 2005 activities, Rojas and Heaney gathered information both from those  who attended a large open-air protest and from the thousand or so people who stuck around to lobby members of Congress two days later.

Their survey data also cover demonstrations in the months before and after the midterm elections in November, though most of those results remain to be processed.

“I’ve been shocked at how few academics have paid attention to the antiwar movement,” Heaney told me. “When we first went out to do a survey at a demonstration, I sort of expected to find other political scientists doing research too. But apart from a couple of people in sociology, there doesn’t seem to be much else happening so far.”

I asked if they had met with much suspicion in the course of their research -- people refusing to take the survey for fear of being, well, surveilled.

“No,” he said, “the response rate has been very high. There hasn’t been much paranoia. The temper isn’t like it was after 9/11. People don’t feel as much like the government is out to get them. And fear on the part of the police has gone down too. Now they don’t seem as concerned that a protest is going to turn into a terrorist act.”

The survey results from demonstrations in 2004 and 2005 showed that “40% of activists within the antiwar movement describe themselves as Democrats, 39% identify as independents (i.e. they list no party affiliation), 20% claim membership in a third party, and only 2% belong to the Republican party.”

Some of their findings confirm things one might predict from a simple deduction. Protestors who identified as members of the Democratic Party were more likely to stay in town to lobby their members of Congress than those who didn’t, for example.

Likewise, the researchers found that Democratic members of Congress “are more likely to meet with antiwar lobbyists than are Republicans, other things being equal.... Members of Congress who had previously expressed high levels of support for antiwar positions were more likely to meet with lobbyists than those whose support had been weak or nonexistent.”

Other results were more interesting. Protestors who belonged to “at least one civic, community, labor, or political organization” proved to be 17 percent more likely to lobby. People who turned out for the demonstration after being contacted by an organization were 13 percent more likely to lobby – while those who found out about the event only through the mass media were 16 percent less likely to go to Capitol Hill.

The contemporary antiwar movement has a “distinctly bimodal” distribution with respect to age. In other words, there are two significant cohorts, one between the ages of 18 and 27, the other between 46 and 67, “with relatively fewer participants outside these ranges.”

Each birthday added “about 1 percent to an individual’s willingness to lobby when all other variables are held at their means or modes,” report Heaney and Rojas in a paper for the journal American Politics Research. “We did not find that sex, race, or occupational prestige make a difference in an individual’s propensity to lobby.”

In conversation, Heaney also mentioned a provisional finding that they are now double-checking. “The single strongest predictor of lobbying was whether an individual had been involved in the movement against the Vietnam War.”

It was while attending a demonstration outside the Republican National Convention in New York in 2004 that Heaney came up with an expression that has somewhat complicated the reception of this research among his colleagues. The city’s labor unions had turned out a large and obstreperous crowd to express displeasure with the president.  The crowd was overwhelmingly likely to vote for Democratic candidates, but Heaney was struck by the thought that it was a very different gathering from the one he expected would assemble before long at a Democratic national convention.

“I thought: this is more like a festival,” he told me. “It’s the Democratic Party. But it’s also the party having a party...in the street.”

This phrase – “the party in the street” – had a special overtone for Heaney as a political scientist, given one familiar schema used in analyzing American politics. In his profession, it is common to speak of a major party as having three important sectors: “the party in government,” “the party in the electorate,” and “the party as organization.”

The idea that mass movements might constitute a fourth sector of the party – with the Christian Right, for example, being a component of the Republican “party in the street” – might seem self-evident in some ways. But not so for political scientists, it seems. “We met a lot of resistance to the idea of the ‘party in the street,’” Heaney told me, “and to the idea that [it might apply] to the Republicans as well.” The paper in which Heaney and Rojas first referred to “the party in the street” ended up going to three different journals – with substantial revisions along the way – before it was accepted for publication in American Politics Research.

Speaking of the antiwar protests as manifestations of the Democratic “party in the street” will also meet resistance from many activists. (A catchphrase of the hard left is that the Democratic Party is “the graveyard of mass movements.”) And according to their own surveys, Heaney and Rojas find that just over one fifth of demonstrators see themselves as clearly outside its ranks.

But that still leaves the majority of antiwar activists as either identifying themselves as Democrats or at least willing to vote for the party. “Like it or not,” write Heaney and Rojas, “their moral and political struggles are within or against the Democratic Party; its actions and inactions construct opportunities for and barriers to the achievement of their issue-specific policy goals.” (Though Heaney and Rojas don’t quote Richard Hofstadter, their analysis implicitly accepts the historian’s famous aphorism that American third parties “are like bees: they sting once and die.”)

“We do not claim,” they take care to note, “that the party in the street has equal standing with the party in government, the party in the electorate, or the party as organization. We are not asserting that the formal party organization is coordinating these activities. The party in the street lacks the stability possessed by other parts of the party because it is not supported by enduring institutions. Furthermore, it is small relative to other parts of the party and at times may be virtually nonexistent.”

As Heaney elaborated when we met, a great deal of the organizing work of the antiwar “party” is conducted by e-mail – a situation that makes it much easier for groups with a small staff to reach a large audience. But that also makes for somewhat shallow or episodic involvement in the movement on the part of many participants. An important area for study by political scientists might be the relationship between the emerging zone of activist organizations and the informal networks of campaign consultants, lobbyists, financial contributors, and activists shaping the agenda of other sectors of political parties. “If they remain well organized and attract enthusiastic young activists,” write Rojas and Heaney, “then the mainstream political party is unable to ignore them for long.”

Studying the antiwar movement has not exhausted the attention of either scholar. Heaney is working on a book about Medicare, while Rojas is the author of From Black Power to Black Studies: How a Radical Social Movement Became an Academic Discipline, forthcoming from Johns Hopkins University Press. But now they have an abundance of data to analyze, and expect to finish four more papers over the next few months. In addition to crunching more than three years’ worth of survey data, Heaney and Rojas have been examining the antiwar movement’s publications online and observing in person how protests are organized.

I scribbled down working titles and thumbnail descriptions of the papers in progress as Heaney discussed them. So here, briefly, is an early report on some research you may hear pundits refer to knowingly some months from now....

“Mobilizing the Antiwar Movement” will analyze how organizations get people to turn out and which kinds of groups are most successful at it. “Network Dynamics of the Antiwar Movement” will consider how different groups interact at events and how those interactions have changed over time. “Leaders and Followers in the Antiwar Movement” will examine the survey data gathered at large protests, comparing and contrasting it with information about those who take part in smaller workshops and training sessions for committed activists.

Finally, “Coalition Dissolution in the Antiwar Movement” will look at tensions within the organizing efforts. “There has been some work in sociology on coalition building,” as Heaney explained, “but there’s been almost none on how they fall apart.”

It’s worth repeating that all of this work on the antiwar “party in the street” could just as well inspire research on the relationship between conservative movements and the Republican Party. Perhaps someone will eventually write a paper called “Coalition Dissolution in the Christian Right.” I say that purely in the interests of scholarship, of course, and with no gloating at the prospect whatsoever.

                                       

Scott McLemee (scott.mclemee@insidehighered.com)

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.
