The way of thinking called American exceptionalism comes in two major varieties. One is more or less religious: A faith that the United States has a special place under heaven's watchful eye. Sometimes this involves a literal belief that the country has a role in the divine plan; in other cases, it's just a matter of rhetoric verging on national egomania.
The other form of American exceptionalism has a more left-wing genealogy. It emerged from debates over the peculiarities of the United States compared to other highly industrialized nation-states -- especially the lack of a labor party or a mass-based socialist movement of the kind that became standard elsewhere in the world. That, in turn, raises some interesting questions about what distinctive factors might explain the "exception." Was it slavery? The lack of an aristocracy? All those natural resources on the frontier, ripe for the plucking?
In either version, the United States stands as a nation apart -- somehow the product of forces cutting it off from the rest of the world's history. But Eric Rauchway, a professor of history at the University of California at Davis, takes a different and rather paradoxical approach to American exceptionalism in his new book, Blessed Among Nations: How the World Made America, published by Hill and Wang.
In a brief but panoramic survey of the half century following the Civil War, Rauchway shows how both foreign investment and the influx of unskilled labor helped indirectly bolster the feeling that the United States was a unique nation. "The first modern age of globalization," he writes, "gave Americans reason to believe that the rest of humankind intended to let the United States fulfill its special wish to have as little government as possible."
The more the nation's economy relied on capital and labor from abroad, the more its citizens thought of the state as "a referee charged with regulating the influence of such forces in an open and fluid system," writes Rauchway, "rather than as a machine for allocating scarce resources." The result was "an especially lucky country on the periphery of a global system" -- one that felt "that the world economy will right and regulate itself without government action."
World War I and the Great Depression did yank us out of that mentality, for a while. But it still has its temptations.
Several months ago, a friend who is an American historian praised Rauchway's earlier book Murdering McKinley: The Making of Theodore Roosevelt's America as very smart and well-written. I would not hesitate to use the same adjectives for Blessed Among Nations. While reading it, I got in touch with Rauchway, who answered a few questions about the book by e-mail.
Q: What do you make of the persistence of American exceptionalism? Doesn't it point to some rock-bottom distinctiveness about the country? Has there been any particular strand of American exceptionalism that you've had a productive dialogue or running argument with, over time?
A: The persistence of the religious variety of American exceptionalism makes me a little nervous, I confess. My ancestors (on my mother's side) were the kind of people who would banish you from the Massachusetts Bay Colony if you claimed a special revelation of God's intentions, whereupon you would probably be eaten by wolves, or at least have to live in Rhode Island.
So I would not claim a productive engagement with that tradition. But the more social-sciency variety of exceptionalism, the kind that flourished in the 1950s, troubles me in a much more useful way. I had to start wrestling with it seriously when I took a job that required me to teach American history overseas to students who were not Americans. And their first question was almost always, "Why is America so weird?"
To which I said, okay, let's read Charles and Mary Beard's Rise of American Civilization, Richard Hofstadter's American Political Tradition, Louis Hartz's Liberal Tradition in America, and David Potter's People of Plenty, and we can start to talk about why they're wrong (you'll notice I mention only safely dead people, here) and where their wrongness points us in enlightening directions.
I'm not persuaded that the persistence of exceptionalism points, all by itself, to a rock-bottom American distinctiveness -- other countries, maybe all other countries, have their own similar senses of exceptionalism -- but I would say that it's easier for Americans to indulge our exceptionalism, because recent history, and the rest of the world's people, have conspired with us in maintaining it.
Which is to say, history (and the rest of the world's people) kicked the shins of the German Sonderweg pretty hard. But Americans' sense of ourselves as a uniquely free people, able to do without so much of the machinery of government that other peoples find necessary -- that sense has been nurtured, maintained, and (I would even say) created afresh by a century and a half or so of world events.
Q: There's a tendency to treat globalization as a new phase in history. In pundit-speak, it's more or less a catch-all term for whatever started happening once the Cold War wound down. Your frame of reference is different -- but how, exactly? And are you ruling out the idea of the national economy as a frame of reference for interpreting American history altogether?
A: I'm using the word globalization to mean what a large number of historians and other scholars mean. Edward Leamer provides a useful definition here: "Globalization is the increased international mobility of goods, people, contracts (including financial claims) and thoughts (facts, ideas, and beliefs)." So we're talking not only about the permeability of national borders, but the actual, and measurable, motion of things across them.
In this sense globalization is no new thing, though it has waxed and waned in recent history. For example, there's a nice graph on page 6 of this paper by Maurice Obstfeld and my colleague Alan M. Taylor, showing their informed guess (I love the source note there, by the way) about the course of international capital mobility over the period since the Civil War -- it rises up to World War I, plummets thereafter, and then begins to rise again in the last few decades.
We could say the same about globalization more broadly -- that the late 19th century was an era of increased globalization, particularly with respect to the international movement of capital and labor (i.e., migration of people), an era that ended around World War I.
Now, there's a theory about what globalization does -- the theory says that globalization makes the world one:
"[A] constantly expanding market.... must nestle everywhere, settle everywhere, establish connexions everywhere.... We have intercourse in every direction, universal inter-dependence of nations.... And as in material, so also in intellectual production. The intellectual creations of individual nations become common property. National one-sidedness and narrow-mindedness become more and more impossible...."
That's Marx and Engels, of course, but it might as well be almost any modern booster or critic of globalization; both categories think that globalization makes each country more like the others.
This is, as far as we know, true-ish. Absent other factors, the unimpeded flow of stuff across borders makes anyplace like everyplace. You can think of it like water seeking its own level when you open a canal lock. Where it was once higher on this side of the door than on that side, when you open the door, it's the same on both sides.
The trouble is, as you know, the movement of stuff across borders isn't actually a natural process like the falling of water owing to gravity; it's a political process. The permeability of borders is a political choice. What's even more important is that the motion of stuff across borders often generates a reaction that, channeled through political institutions, affects the openness of borders.
To be less general: if immigration lowers wages, or is seen to lower wages, for a significant number of citizens in an immigrant-receiving country, you get a reaction, which if it's substantial enough, will lead to legislation restricting immigration. (I've written a little bit more on these ideas here and here.)
Q: So what's that meant for the United States?
A: Well, I'm making the argument that the influence of globalization on the U.S. did not make the U.S. more like other countries, but rather, that it reinforced American ideas of exceptionalism and, in demonstrable ways, gave enduring institutional life to those exceptionalist traditions.
The influx of international investment capital into the U.S., particularly into the West, gave the U.S. a vigorous politics of protest against international investment capital; the influx of immigration gave the U.S. a vigorous politics of protest against immigration. Both kinds of politics seriously inflected the major arguments of the day, like anti-capitalist protests, or arguments for social spending. You couldn't talk about the depredations of capitalists without also talking about the depredations specifically of foreign capitalists; you couldn't talk about the circumstances of the working class in America without also talking about the immigrant constituency among the working class in America.
Americans of that era saw themselves as affected by international factors that we would nowadays group together under the heading of globalization. And their concern with international influences on American industrial development was borne out in policies; not only in the kinds of policies Americans did not adopt but also in the kinds they did adopt.
So I'm not just arguing that, e.g., immigration kept enthusiasm for general social spending damped (though it probably did) but rather that immigration inspired certain kinds of social spending in parts of the United States where immigration had particular effects. This is not a book about the small, or weak, American state; it's a book about the peculiarly shaped American state, and about how globalization made those peculiarities.
Q: We're downstream in history from the time you have in mind -- it's now two or three world wars later, depending on how you look at these things. How much continuity is there between that moment of globalization and its effects and the present? Nowadays, someone like Pat Buchanan gripes about both foreign investment capital and immigration policy. But for the most part, it's just the latter that has much traction now. Or am I missing something?
A: Between the late 19th century and the present there's a great deal of -- I don't want to say continuity, but commonality; the forces that then drove American politics have now resumed their operation after being pent up in the middle of the 20th century.
Which is to say that our present moment, despite all that intervening history (including however many world wars you care to count) looks a lot more like the pre-New-Deal past than it looks like, say, the 1970s. So our policy framework should reflect that.
Instead we tend to talk, in this and in other countries, about whether government should have more or fewer powers as if that question could be answered in the abstract, as if we could logically derive the correct answer from a set of axioms about human behavior. I think this is not only wrong but, potentially, fatally so. Government represents a set of specific solutions to specific problems, a set of adaptations to environment. And a set of adaptations specific to one environment might not do so well when the environment changes.
That's what we see in the era around World War I: the U.S. had adopted a set of policies based on its particular position in the world, and then it kept those policies, to its detriment, even after its position changed. After 1918, the U.S. had become the central country in the world economy. If anyone were going to restore what John Maynard Keynes called the "economic Utopia" of the prewar years, it was going to be the U.S. But, as E. H. Carr wrote, "[i]n 1918 world leadership was offered, by almost unanimous consent, to the United States.... [and] it was then declined."
Into the 1920s, Americans kept the set of assumptions that had served them all right so far. They assumed that they had globalization on tap, and could take or leave it as they chose -- a tariff here, an immigration restriction law there, a bit of credit tightening in a pinch. These could all answer domestic needs and had nothing to do with how the rest of the world worked.
People found out, starting in 1929, how poorly these assumptions served in an environment where American policy could actually help shut down the whole world economy.
During what you might call the long New Deal, from FDR through Nixon, our old habits went into abeyance. So, I think not coincidentally, did globalization. As that graph I mentioned above shows, international capital mobility was at low ebb; so was immigration. Now globalization is back, if in slightly different guise (trade probably matters more now than it did then); and so is the old American tradition of assuming it will continue to work for us when we want it. I think this assumption is no better now than in 1929.
Q: Not that you should play Nostradamus, exactly, but what are the long-term implications of seeing globalization and American exceptionalism as deeply connected?
A: If there are lessons for today in the book, they're these: (a) We should make policy decisions based on an accurate assessment of our position in the world, and not on assumptions or principles; and (b) we need to reassess our policy framework periodically to make sure it still suits us, because our place in the world changes.
When you hear anyone start talking about how we can take or leave world trade, how we can take or leave immigration, how the Federal Reserve has room to maneuver to regulate the flow of capital in a crisis -- you need to ask yourself, are the assumptions behind this policy, about our place in the world economy, sound?
But I won't pretend I've written a white paper for future action; the book is an argument about how we got here, why we believe what we do, why -- in the language of my former students -- America is so weird. And it's still weird in very much the same way it was. If in the middle of the twentieth century the country was on a trend toward being more like other nations, that's reversed.
I don't think we believe the world will take care of us because we're intrinsically more optimistic, or foolish, than other peoples; I think we've simply had our myths more or less ratified by the events of history for a very long time. I also think there's more than enough evidence in the historical record to suggest that mistaking the indulgence of events for the actions of a benign Providence is a recipe for disaster.
Tuning in to C-SPAN’s weekend books coverage a few months ago, I caught the rebroadcast of a panel discussion among three or four biographers of American presidents, held in a large auditorium somewhere. All of them had done well -- all of the biographies, that is. Not all of the presidents were so lucky. But the topic of the moment, as I happened to start watching, was neither the highest office in the land nor the unique challenges facing a best-selling author. They were discussing, rather, the state of American history as a field.
The consensus appeared to be that the situation was terrible. Scholars were neglecting the lives and achievements of the truly important figures. Instead, they were studying social history, cultural history, economic history -- everything, alas, except Great Man history. One fellow on the panel was the author of books on the Founding Fathers that had won great acclaim; it was easy to imagine big bags of money being delivered to his door regularly by a grateful publisher. He proffered a simple explanation for all the scholarship on slave revolts, immigrant neighborhoods, obscure women’s organizations, and other such riff-raff. It was very simple, actually. Those historians hated America.
He offered no rational argument for this claim. Nor, indeed, would one have been possible. The assertion went unchallenged.
Now, everyone has the right to express an opinion -- and nobody is under any obligation to make it an informed one. But there must be limits to just how much shameless nonsense the public sphere can afford to let circulate. The idea that American historians are refusing to study the illustrious dead -- let alone that they are doing so because they are "anti-American" -- is too bizarre for sane people to indulge.
If a decreasing percentage of the historical profession’s resources go to studying, say, the Founding Fathers, as such, a couple of less feverish possibilities come to mind. One is that the number of historians interested in the U.S. grows from decade to decade -- while the population of Founding Fathers available for study remains constant. The real barrier for scholars wishing to concentrate on them comes from the need to find something new to say about them.
But that's only part of the situation. And of course there is still good work being done raising issues about the Founding Fathers. History is a pluralistic field, both at the level of the phenomena it examines and the methods it uses to study them. Pluralism does not equal either moral relativism or epistemological skepticism. (Nor will a million ranters in the blogosphere ever make it so.) But it does preclude acting as if there were a single correct approach -- one single level of historical reality worth serious attention, or one uniquely effective tool or framework for understanding the past.
Just as a matter of personal preference, let me admit to being quite interested in Benjamin Franklin. He qualifies as a dead white property-owning male, if anyone could, and he was in many respects the Founding Fathers’ Founding Father. I would much rather read a biography of Franklin than, say, a detailed study of labor contracts in 18th century Philadelphia -- or an econometric analysis of how King George’s taxation policies affected the North American paper industry.
But history is not a zero-sum enterprise. The well-being of history as a discipline demands that scholars be able to do that sort of monographic work. And it is in my long-term interest as a reader of books about Franklin that precisely such research be done. (It gives biographers access to elements of the world in which he lived.) All of this seems pretty obvious, though not the sort of point that gets made on television very often. Demagogy is so much more exciting.
A memory of that cringe-inducing moment on C-SPAN flooded back to mind a few days ago, upon news of the death of Lawrence W. Levine, a professor of history emeritus at the University of California at Berkeley. (He also served as president of the Organization of American Historians and, after retiring from Berkeley, taught in the history and cultural studies programs at George Mason University.) The headline of one obituary summed up his life and work by calling Levine a "historian and multiculturalist." Accurate enough, as far as it went. But that word "multiculturalist" is now about as stimulating to the higher centers of the brain as Pavlov’s bell. The minute they hear it, some people start to drool.
Ten years ago, Levine offered a calm and reasoned response to Allan Bloom in a book called The Opening of the American Mind: Canons, Culture, and History, published by Beacon Press. (For a sympathetic but not uncritical review that sums up his argument, scroll down the page a bit here.) Levine was an early recipient of the MacArthur Fellowship, and his more specialized books are as accessible to the general reader as serious scholarship can be. But most of his influence was on other historians.
Hearing that he was a "multiculturalist" really tells you very little about what Levine accomplished. It was not just that he looked at the diversity of cultural traditions making up American life. He also made connections between history and other fields.
His groundbreaking study Black Culture and Black Consciousness: Afro-American Folk Thought From Slavery to Freedom (Oxford University Press, 1977) looked at how songs and stories gave black Americans “the means of preventing legal slavery from becoming spiritual slavery” by creating a domain “free of control by those who ruled the earth.” His approach was, in part, a matter of using ideas from folklore and anthropology about African “survivals” that had survived the Middle Passage. But Levine’s research also led him in another direction -- toward Shakespeare.
In Highbrow/Lowbrow: The Emergence of Cultural Hierarchy in America (Harvard University Press, 1988), Levine describes reading accounts of minstrel shows “to derive some more exact sense of how antebellum whites depicted black culture.” What he found, to his astonishment, was an abundance of allusions to the Bard -- jokes and parodies, for example, that implied that the audience knew the plays fairly well.
Digging deeper, he unearthed an entire lost world. Levine showed how, until sometime shortly after the Civil War, Shakespeare was part of the nation’s common culture, drawing large and rowdy audiences who had very definite opinions about how the plays should be performed, and were not shy about expressing them. (The egg, as more than one visiting British actor learned, proved a handy instrument of dramatic criticism.) Favorite scenes from Hamlet were often part of the bill at variety shows, along with trained monkeys and similar acts. In major cities, two or three different stagings of Macbeth sometimes competed for the public’s patronage, while bored residents of a mining camp might put on Richard III for fun.
Some of the adaptations sound abominable. One very popular version of King Lear, for example, had a happy ending. But the gusto was unmistakable. American Bardolatry included the belief that he was a very great writer, perhaps the very greatest. But it was also shot through with a sense that he was, deep down, a man of the people -- hence, especially to be appreciated in a democratic nation. Levine described one stage-curtain from the early 19th century showing Shakespeare climbing into the heavens atop the back of an American eagle.
By the late 19th century, though, something had happened. People began to think of Shakespeare as anything but entertainment. His verse was either sublime and uplifting (if you were the refined sort) or a bore (if you weren’t). By the 1870s, it was getting harder and harder to find a show that would offer you both some Shakespeare and a performance involving dancers and musically gifted livestock. And by the dawn of the 20th century, nobody was looking.
What happened? Well, you should read Levine’s book, which also shows how a similar transformation occurred in the public appetite for opera and classical music across the same period. Suffice it to say that deep changes in the American economy and society made for very different attitudes towards Shakespeare and Mozart. It is a short book, but also one of the great mind-opening works on U.S. history -- a strangely moving reminder of how little of the nation’s actual past survives in the contemporary memory.
“That essay on ‘William Shakespeare in America’ is worth a whole library of cultural studies work,” Michael Kazin told me recently when we discussed Levine by phone. Earlier this year, Kazin, who is a professor of history at Georgetown University, published a biography of William Jennings Bryan. Levine had studied Bryan for his own dissertation at Columbia University, later published as a book.
Levine’s analysis, which challenged the familiar image of Bryan as creationist buffoon, was an important influence on Kazin’s interpretation of the politician. Levine and Kazin were also friends, initially bonding over an interest in the films of Frank Capra. Levine read parts of Kazin’s work in manuscript, and for a while they were in a book group together.
“He had a great no-bullshit style,” said Kazin. “It was a New York Jewish working-class wit. It reminded me of my father, though Larry was younger by maybe 15 years.”
The comparison caught me by surprise. His father, the late Alfred Kazin, had published major studies of American literature such as On Native Grounds and God and the American Writer. These were works of literary scholarship of a decidedly untheoretical and pre-multicultural sort.
So I wondered if there could be more to the resemblance between Levine and Kazin Sr. than something about the way they spoke.
“On Native Grounds is about literature,” Kazin said, “but it’s also about the process of ‘Americanizing.’ My father was trying to understand the popular wellsprings of literature. Sure, Larry was the great historian of multiculturalism, but most of his work was about trying to understand the nation as a unity. The great thing about him was that he always had big questions about how that unity actually worked. How did blacks resist slavery and survive it afterwards? How did we end up with the plays of Shakespeare, the mass artist, becoming something restricted to the elite? Those should be questions in American history.”
And for anyone concerned about the neglect of Great Men, it’s worth mentioning that Levine’s last published book, written with his wife Cornelia R. Levine, was The People and the President: America’s Conversation With FDR (Beacon Press, 2002). But it wasn’t a departure from his practice of studying history from below.
“It looks at the letters people sent to FDR,” said Kazin. “It’s about how high and low interact. And what’s the point in having a democracy if you don’t try to understand that relationship?”
Keeping a commonplace book -- a notebook for copying out the striking passages you’ve come across while reading -- was once a fairly standard practice, not just among professional scholars but for anyone who (as the expression went) “had humane letters.” Some people still do it, though the very idea of creating your own customized, hand-written anthology does seem almost self-consciously old-fashioned now. Then again, that may be looking at things the wrong way. When John Locke circulated his New Method of a Common-Place Book in the late 17th century, he wasn’t offering Martha Stewart-like tips on how to be genteel. He had come up with a system of streamlined indexing and text-retrieval -- a way to convert the commonplace book into a piece of low-tech software for smart people on the go.
There is a fairly direct line running from Locke’s efficiency-enhancing techniques to The Yale Book of Quotations, a handsome and well-indexed compilation just issued by Yale University Press. That line runs through the work of John Bartlett, the prodigious American bookworm whose recall of passages from literature made him semi-famous in Cambridge, Mass. even before he published a small collection of Familiar Quotations in 1855. He included more and more material from his own commonplace book in later editions, so that the book grew to doorstop-sized. I don’t know whether or not Bartlett had read Locke’s essay. But he did index the book in a manner the philosopher would have found agreeable.
Since his death in 1905, “Bartlett’s” has become almost synonymous with the genre of quotation-collection itself – a degree of modest immortality that might have surprised him. (Chances are, he expected to be remembered for the fact that his friend James Russell Lowell once published a poem about Bartlett’s skill as a fisherman.)
The new Yale collection follows Bartlett’s example, both in its indexing and in sheer heft. It is not just a compilation but a work of scholarship. The editor, Fred R. Shapiro, is an associate librarian and lecturer in legal research at the Yale Law School; and his edition of The Oxford Dictionary of American Legal Quotations is well-regarded by both lawyers and reference librarians. In The Yale Book of Quotations, he proves even more diligent than Bartlett was about finding the exact origins and wording of familiar quotations.
The classic line from Voltaire that runs “I disapprove of what you say, but I will defend to the death your right to say it” does not appear among the selections from Voltaire, for the simple reason that he never actually said it. (According to an article appearing in the November 1943 issue of Modern Language Notes, it was actually coined by one of Voltaire's biographers, S. G. Tallentyre.) Shapiro finds that the principle later known as “Murphy’s Law” was actually formulated by George Orwell in 1941. (“If there is a wrong thing to do,” wrote Orwell, “it will be done, infallibly. One has come to believe in that as if it were a law of nature.”)
In his posthumously published autobiography, Mark Twain attributed the phrase “lies, damned lies, and statistics” to Benjamin Disraeli. But the saying has long been credited to Twain himself, in the absence of any evidence that Disraeli actually said it. Thanks to the digitized editions of old newspapers, however, Shapiro finds it attributed to the former British prime minister in 1895, almost 30 years before Twain’s book was published.
It turns out that Clare Boothe Luce’s most famous quip, “No good deed goes unpunished,” first recorded in 1957, was actually attributed to Walter Winchell 15 years earlier. And as Shapiro notes, there is evidence to suggest that it had been a proverb even before that. Likewise, it was not Liberace who coined the phrase “crying all the way to the bank” but rather, again, Winchell. (Oddly enough, the gossip columnist -- a writer as colorful as he was callous -- does not get his own entry.)
The historical notes in small type -- elaborating on sources and parallels, and sometimes cross-referencing other quotations within the volume -- make this a really useful reference work. It is also a profitable (or at least entertaining) way to procrastinate.
At the same time, it is a book that would have bewildered John Bartlett – and not simply because it places less emphasis on classic literature than commonplace-keepers once did. The editor has drawn on a much wider range of sources than any other volume of quotations I’ve come across, including film, television, popular songs, common sayings, and promotional catchphrases. Many of the choices are smart, or at least understandable. The mass media, after all, serve as the shared culture of our contemporary Global Village, as Marshall McLuhan used to say.
But many of the entries are inexplicable -- and some of them are just junk. What possible value is there to a selection of 140 advertising slogans (“There’s something about an Aqua Velva man”) or 90 television catchphrases (“This is CNN”)? The entry for Pedro Almodóvar, the Spanish director, consists entirely of the title of one of his films, Women on the Verge of a Nervous Breakdown. Why bother?
A case might be made for including the “Space, the final frontier...” soliloquy from the opening of Star Trek, as Shapiro does in the entry for Gene Roddenberry. He also cross-references it to a quotation from 1958 by the late James R. Killian, then-president of MIT, who defined space exploration as a matter of “the thrust of curiosity that leads me to try to go where no man has gone before.” So far, so good. But why also include the slightly different wordings used in the openings to The Wrath of Khan and Star Trek: The Next Generation?
The fact that quotations from Mae West run to more than one and a half pages is not a problem. They are genuinely witty and memorable. (e.g., “Between two evils, I always pick the one I’ve never tried before.”) But how is it that the juvenile lyrics of Alanis Morissette merit nearly as much space as the entry for Homer? (The one from Greece, I mean, not from Springfield.)
It is hard to know what to make of some of these editorial decisions. It’s as if Shapiro had included, on principle, a certain amount of the static and babble that fills the head of anyone tuned into the contemporary culture – “quotations” just slightly more meaningful than the prevailing media noise (and perhaps not even that).
But another sense of culture prevailed in Bartlett’s day -- one that Matthew Arnold summed up as a matter of “getting to know, on all the matters that concern us, the best which has been thought and said in the world.” That doesn’t mean excluding popular culture. The lines here from Billie Holiday, Bob Dylan, and "The Simpsons" are all worth the space they fill. But the same is not true of “Plop plop, fizz fizz, oh what a relief it is."
All such griping aside, The Yale Book of Quotations is an absorbing reminder that all one’s best observations were originally made by someone else. And it includes a passage from Dorothy L. Sayers explaining how to benefit from this: “I always have a quotation for everything,” she wrote. “It saves original thinking.”
I had considered suggesting that it might make a good present for Christmas, Hanukkah, Festivus, etc. According to the publisher’s Web site, the first printing is already sold out. It is available in bookstores, however, and also from some online booksellers. Here’s hoping it goes through many editions -- so that Shapiro will get a chance to recognize that Eminem’s considerable verbal skills do not translate well into cold type.
Valentine’s Day seems an appropriate occasion to honor the late Gershon Legman, who is said to have coined the slogan “Make love, not war.” Odd to think that the saying had a particular author, rather than being spontaneously generated by the countercultural Zeitgeist in the 1960s. But I've seen the line attributed to Legman a few times over the years; and the new Yale Book of Quotations (discussed in an earlier column) is even more specific, indicating that he first said it during a speech at Ohio University in Athens, Ohio, sometime in November 1963.
Legman, who died in 1999 at the age of 81, was the rare instance of a scholar who had less of a career than a profound calling -- one that few academic institutions in his day could have accommodated. Legman was the consummate bibliographer and taxonomist of all things erotic: a tireless collector and analyst of all forms of discourse pertaining to human sexuality, including the orally transmitted literature known as folklore. He was an associate of Alfred Kinsey during the 1940s, but broke with him over questions of statistical methodology. If it hadn’t been that, it would have been something else; by all accounts, Legman was a rather prickly character.
But it is impossible to doubt his exacting standards of scholarship after reading The Horn Book: Studies in Erotic Folklore and Bibliography (University Books, 1964) -- a selection of Legman's papers reflecting years of exploration in the “restricted” collections of research libraries. (At the Library of Congress, for example, you will sometimes find a title listed as belonging to “the Delta Collection,” which was once available to a reader only after careful vetting by the authorities. The books themselves have long since been integrated into the rest of the library’s holdings, but not-yet-updated catalog listings still occasionally reveal that a volume formerly had that alluring status: forbidden yet protected.) Legman approached erotic literature and "blue" folklore with philological rigor, treating with care songs and books that only ever circulated on the sly.
Some of Legman's work appeared from commercial publishers and reached a nonscholarly audience. He assembled two volumes of obscene limericks, organized thematically and in variorum. The title of another project, The Rationale of the Dirty Joke, only hints at its terrible sobriety and analytic earnestness. Sure, you can skim around in it for the jokes themselves. But Legman’s approach was strictly Freudian, his ear constantly turned to the frustration, anxiety, and confusion expressed in humor.
Not all of his work was quite that grim. Any scholar publishing a book called Oragenitalism: Oral Techniques in Genital Excitation may be said to have contributed something to the sum total of human happiness. The first version, devoted exclusively to cunnilingus, appeared from a small publisher in the 1940s and can only have had very limited circulation. The commercial edition published in 1969 expanded its scope -- though Legman (who in some of his writings comes across, alas, as stridently hostile to the early gay rights movement) seemed very emphatic in insisting that his knowledge of fellatio was strictly as a recipient.
Defensiveness apart, what’s particularly striking about the book is the degree to which it really is a work of scholarship. You have to see his literature review (a critical evaluation of the available publications on the matter, whether popular, professional, or pornographic, in several languages) to believe it. Thanks to Legman’s efforts, it is possible to celebrate Valentine’s Day with a proper sense of tradition.
Legman was a pioneer of cultural studies, long before anyone thought to call it that. He served as editor for several issues of Neurotica, a great underground literary magazine published between 1948 and 1952. Most of its contributors were then unknown, outside very small circles; but they included Allen Ginsberg, Anatole Broyard, Leonard Bernstein, and an English professor from Canada named Marshall McLuhan.
As the title may suggest, Neurotica reflected the growing cultural influence of Freud. But it also went against the prevalent tendency to treat psychoanalysis as a tool for adjusting misfits to society. The journal treated American popular culture itself as profoundly deranged; and in developing this idea, Legman served as something like the house theorist.
In a series of essays adapted from his pamphlet Love and Death (1948), Legman cataloged the seemingly endless sadism and misogyny found in American movies, comic books, and pulp novels. (Although Love and Death is long out of print, a representative excerpt can be found in Jeet Heer and Kent Worcester's collection Arguing Comics: Literary Masters on a Popular Medium, published by the University Press of Mississippi in 2004.)
Legman pointed out that huge profits were to be made from depicting murder, mutilation, and sordid mayhem. But any attempt at a frank depiction of erotic desire, let alone of sex itself, was forbidden. And this was no coincidence, he concluded. A taste for violence was being “installed as a substitute outlet for forbidden sexuality” by the culture industry.
Censorship and repression were warping the American psyche at its deepest levels, Legman argued. The human needs that ought to be met by a healthy sexual life came back, in distorted form, as mass-media sadism: "the sense of individuality, the desire for importance, attention, power; the pleasure in controlling objects, the impulse toward violent activity, the urge towards fulfillment to the farthest reaches of the individual’s biological possibilities.... All these are lacking in greater or lesser degree when sex is lacking, and they must be replaced in full.”
Replaced, that is, by the noir pleasures of the trashy pop culture available in the 1940s.
Here, alas, it proves difficult to accept Legman's argument in quite the terms framing it. His complaints about censorship and hypocrisy are easy to accept as justified. But the artifacts that filled him with contempt and rage -- Gone With the Wind, the novels of Raymond Chandler, comic books with titles like Authentic Police Cases or Rip Kirby: Mystery of the Mangler -- are more likely to fill us with nostalgia.
It's not that his theory about their perverse subtext now seems wrong. On the contrary, it often feels as if he's on to something. But while condemning the pulp fiction or movies of his day as symptomatic of a neurotic culture, Legman puts his finger right on what makes them fascinating now -- their nervous edge, the tug of war between raw lust and Puritan rage.
In any case, a certain conclusion follows from Legman’s argument -- one that we can test against contemporary experience.
Censorship of realistic depictions of sexuality intensifies the climate of erotic repression, thereby creating an audience prone to consuming pop-culture sadomasochism. If Legman is right, then the easing or abolition of censorship ought to yield, over time, fewer images and stories centering on violence, humiliation, and so on.
Well, we know how that experiment turned out. Erotica is now always just a few clicks away (several offers are pouring into your e-mail account as you read this sentence). And yet one of the most popular television programs in the United States is a drama whose hero is good at torture.
They may have been on to something in the pages of Neurotica, all those decades ago, but things have gotten more complicated in the meantime.
As it happens, I’ve just been reading a manuscript called “Eros Unbound: Pornography and the Internet” by Blaise Cronin, a professor of information science at Indiana University at Bloomington, and former dean of its School of Library and Information Science. His paper will appear in The Internet and American Business: An Historical Investigation, a collection edited by William Aspray and Paul Ceruzzi scheduled for publication by MIT Press in April 2008.
Contacting Cronin to ask permission to quote from his work, I asked if he had any connection with the Kinsey Institute, also in Bloomington. He doesn’t, but says he is on friendly terms with some of the researchers there. Kinsey was committed to recording and tabulating sexual activity in all its forms. Cronin admits that he cannot begin to describe all the varieties of online pornography. Then again, he doesn’t really want to try.
“I focus predominantly on the legal sex industry,” he writes in his paper, “concentrating on the output of what, for want of a better term, might be called the respectable, or at least licit, part of the pornography business. I readily acknowledge the existence of, but do not dwell upon the seamier side, unceremoniously referred to by an anonymous industry insider as the world of ‘dogs, horses, 12-year old girls, all this crazed Third-World s—.’ ”
The notion of a “respectable” pornography industry would have seemed oxymoronic when Legman published Love and Death. It’s clearly much less so at a time when half the hotel chains in the United States offer X-rated films on pay-per-view. Everyone knows that there is a huge market for online depictions of sexual behavior. But what Cronin’s study makes clear is that nobody has a clue just how big an industry it really is. Any figure you might hear cited now is, for all practical purposes, a fiction.
The truth of this seems to have dawned on Cronin following the publication, several years ago, of “E-rogenous Zones: Positioning Pornography in the Digital Marketplace,” a paper he co-authored with Elizabeth Davenport. One of the tables in their paper “estimated global sales figures for the legal sex/pornography industry,” offering a figure of around $56 billion annually. That estimate squared with information gathered from a number of trade and media organizations. But much of the raw data had originally been provided by a specific enterprise -- something called the Private Media Group, Inc., which Cronin describes as “a Barcelona-based, publicly traded adult entertainment company.”
After the paper appeared in the journal Information Society in 2001, Cronin says, he was contacted “by Private’s investor relations department wondering if I could furnish the company with growth projections and other related information for the adult entertainment industry -- I, who had sourced some of my data from their Web site.” That estimate of $56 billion per year, based on research now almost a decade old, is routinely cited as if it were authoritative and up to date.
“Many of the numbers bandied about by journalists, pundits, industry insiders and market research organizations,” he writes, “are lazily recycled, as in the case of our aforementioned table, moving effortlessly from one story and from one reporting context to the next. What seem to be original data and primary sources may actually be secondary or tertiary in character.... Some of the startling revenue estimates and growth forecasts produced over the years by reputable market research firms ... have been viewed all too often with awe rather than healthy skepticism.”
Where Legman was, so to speak, an ideologue of sex, Blaise Cronin seems more scrupulously dispassionate. His manuscript runs to some 50 pages and undertakes a very thorough review of the literature concerning online pornography. (My wife, a reference librarian whose work focuses largely on developments in digital technology and e-commerce, regards Cronin’s paper as one of the best studies of the subject around.) He doesn't treat the dissemination of pornography as either emancipatory or a sign of decadence. It's just one of the facts of life, so to speak.
His paper does contain a surprise, though. It's a commonplace now that porn is assuming an increasingly ordinary role as cultural commodity -- one generating incalculable, but certainly enormous, streams of revenue for cable companies, Internet service providers, hotel chains, and so on. But the "mainstreaming" of porn is a process that works both ways. Large sectors of the once-marginal industry are morphing into something ever more resembling corporate America.
“The sleazy strip joints, tiny sex shops, dingy backstreet video stores and other such outlets may not yet have disappeared,” writes Cronin, “but along with the Web-driven mainstreaming of pornography has come -- almost inevitably, one has to say -- full-blown corporatization and cosmeticization.... The archetypal mom and pop business is being replaced by a raft of companies with business school-trained accountants, marketing managers and investment analysts at the helm, an acceleration of a trend that began at the tail-end of the twentieth century. As the pariah industry strives to smarten itself up, the language used by some of the leading companies has become indistinguishable from that of Silicon Valley or Martha Stewart. It is a normalizing discourse designed to resonate with the industry’s largely affluent, middle class customer base.”
As an example, he quotes what sounds like a formal mission statement at one porn provider’s website: “New Frontier Media, Inc. is a technology driven content distribution company specializing in adult entertainment. Our corporate culture is built on a foundation of quality, integrity and commitment and our work environment is an extension of this…The Company offers diversity of cultures and ethnic groups. Dress is casual and holiday and summer parties are normal course. We support team and community activities.”
That’s right, they have casual Fridays down at the porn factory. Also, it sounds like, a softball team.
I doubt very much that anybody in this brave new world remembers cranky old Gershon Legman, with his index cards full of bibliographical data on Renaissance handbooks on making the beast with two backs. (Nowadays, of course, two backs might be considered conservative.) Ample opportunity now exists to watch or read about sex. Candor seems not just possible but obligatory. But that does not necessarily translate into happiness -- into satisfaction of "the urge towards fulfillment to the farthest reaches of the individual’s biological possibilities," as Legman put it.
That language is a little gray, but the meaning is more romantic than it sounds. What Legman is actually celebrating is the exchange taking place at the farthest reaches of a couple's biological possibilities: the moment when sex turns into erotic communion. And for that, broadband access is irrelevant. For that, you need to be really lucky.
During the first administration of Franklin Delano Roosevelt (or so goes a story now making the rounds of American progressives), the president met with a group of citizens who urged him to seize the moment. Surely it was time for serious reforms: The Depression made it impossible to continue with business as usual. Just what measures the visitors to the Oval Office proposed -- well, that is not clear, at least from the versions I have heard. Perhaps they wanted laws to regulate banking, or to protect the right of labor unions to organize, or to provide income help for the aged. Maybe all of the above.
The president listened with interest and evident sympathy. As the meeting drew to a close, Roosevelt thanked his guests, expressing agreement with all they had suggested. “So now,” he told them on their way out the door, “go out there and make me do it.”
This is less a historical narrative, strictly speaking, than an edifying tale. Its lesson is simple. Even with wise and trustworthy leadership holding power -- perhaps especially then -- you must be ready to apply pressure from below. (The moral here is not especially partisan, by the way. One can easily imagine conservative activists spurring one another on with more or less the same story, with Ronald Reagan assuming the star role.)
I recalled this anecdote on Saturday after meeting Michael T. Heaney, an assistant professor of political science at the University of Florida. He stopped by for a visit after spending the afternoon collecting data at the antiwar demonstration here in Washington.
For the past few years, Heaney has been collaborating with Fabio Rojas, an assistant professor of sociology at Indiana University, on a study of the turnout at major national antiwar protests. With the help of research assistants, they have done surveys of some 3,550 randomly selected demonstrators. (That figure includes the 350 surveys gathered this weekend.) Their research has already yielded two published papers, available here and here, with more now in the works.
We’ll go over some of their findings in a moment. But a remark that Heaney made in conversation resonated with that fable about the New Deal era, and it provides a context for understanding the work he and Rojas have been doing.
“Political scientists are good at analyzing how established institutions function,” he said. “We have the tools for that, and the tools work really well. But there is very strong resistance to studying informal organizations or to recognizing them as part of the political landscape.”
In the course of thinking over their research, Rojas and Heaney have improvised a concept they call “the party in the street” -- that segment of a political party that, to borrow FDR’s (possibly apocryphal) injunction, gets out there and pushes.
Party affiliation was only one of the questions asked during the survey, which also gathered information about a demonstrator’s age, gender, ethnicity, zip code, membership in non-political organizations, and how he or she heard about the protest. (The form allowed responders to remain anonymous.)
“We attended or sent proxies to all major protests during a one-year period, from August 2004 until September 2005,” Heaney told me, “and we’ve coded all those surveys. We’ve also collected surveys at other demonstrations since then, including roughly a thousand responses just in 2007.”
The researchers attended demonstrations sponsored by each of the two major coalitions organizing them, United for Peace and Justice (UFPJ) and Act Now to Stop War and End Racism (ANSWER). The two coalitions have been at odds with one another for years, but worked together to organize the September 2005 protest in Washington before going their separate ways again. “We couldn’t have planned this,” as Heaney puts it, “but now we have data from each stage – when the two coalitions were in conflict, when they worked together, and then again after they parted.”
During the September 2005 activities, Rojas and Heaney gathered information both from those who attended a large open-air protest and from the thousand or so people who stuck around to lobby members of Congress two days later.
Their survey data also cover demonstrations in the months before and after the midterm elections in November, though most of those results remain to be processed.
“I’ve been shocked at how few academics have paid attention to the antiwar movement,” Heaney told me. “When we first went out to do a survey at a demonstration, I sort of expected to find other political scientists doing research too. But apart from a couple of people in sociology, there doesn’t seem to be much else happening so far.”
I asked if they had met with much suspicion in the course of their research -- people refusing to take the survey for fear of being, well, surveilled.
“No,” he said, “the response rate has been very high. There hasn’t been much paranoia. The temper isn’t like it was after 9/11. People don’t feel as much like the government is out to get them. And fear on the part of the police has gone down too. Now they don’t seem as concerned that a protest is going to turn into a terrorist act.”
The survey results from demonstrations in 2004 and 2005 showed that “40% of activists within the antiwar movement describe themselves as Democrats, 39% identify as independents (i.e. they list no party affiliation), 20% claim membership in a third party, and only 2% belong to the Republican party.”
Some of their findings confirm things one might predict from a simple deduction. Protestors who identified as members of the Democratic Party were more likely to stay in town to lobby their members of Congress than those who didn’t, for example.
Likewise, the researchers found that Democratic members of Congress “are more likely to meet with antiwar lobbyists than are Republicans, other things being equal.... Members of Congress who had previously expressed high levels of support for antiwar positions were more likely to meet with lobbyists than those whose support had been weak or nonexistent.”
Other results were more interesting. Protestors who belonged to “at least one civic, community, labor, or political organization” proved to be 17 percent more likely to lobby. People who turned out for the demonstration after being contacted by an organization were 13 percent more likely to lobby – while those who found out about the event only through the mass media were 16 percent less likely to go to Capitol Hill.
The contemporary antiwar movement has a “distinctly bimodal” distribution with respect to age. In other words, there are two significant cohorts, one between the ages of 18 and 27, the other between 46 and 67, “with relatively fewer participants outside these ranges.”
Each birthday added “about 1 percent to an individual’s willingness to lobby when all other variables are held at their means or modes,” report Heaney and Rojas in a paper for the journal American Politics Research. “We did not find that sex, race, or occupational prestige make a difference in an individual’s propensity to lobby.”
In conversation, Heaney also mentioned a provisional finding that they are now double-checking. “The single strongest predictor of lobbying was whether an individual had been involved in the movement against the Vietnam War.”
It was while attending a demonstration outside the Republican National Convention in New York in 2004 that Heaney came up with an expression that has somewhat complicated the reception of this research among his colleagues. The city’s labor unions had turned out a large and obstreperous crowd to express displeasure with the president. The crowd was overwhelmingly likely to vote for Democratic candidates, but Heaney was struck by the thought that it was a very different gathering from the one he expected would assemble before long at a Democratic national convention.
“I thought: this is more like a festival,” he told me. “It’s the Democratic Party. But it’s also the party having a party...in the street.”
This phrase – “the party in the street” – had a special overtone for Heaney as a political scientist, given one familiar schema used in analyzing American politics. In his profession, it is common to speak of a major party as having three important sectors: “the party in government,” “the party in the electorate,” and “the party as organization.”
The idea that mass movements might constitute a fourth sector of the party – with the Christian Right, for example, being a component of the Republican “party in the street” – might seem self-evident in some ways. But not so for political scientists, it seems. “We met a lot of resistance to the idea of the ‘party in the street,’” Heaney told me, “and to the idea that [it might apply] to the Republicans as well.” The paper in which Heaney and Rojas first referred to “the party in the street” ended up going to three different journals -- with substantial revisions along the way – before it was accepted for publication in American Politics Research.
Speaking of the antiwar protests as manifestations of the Democratic “party in the street” will also meet resistance from many activists. (A catchphrase of the hard left is that the Democratic Party is “the graveyard of mass movements.”) And according to their own surveys, Heaney and Rojas find that just over one fifth of demonstrators see themselves as clearly outside its ranks.
But that still leaves the majority of antiwar activists as either identifying themselves as Democrats or at least willing to vote for the party. “Like it or not,” write Heaney and Rojas, “their moral and political struggles are within or against the Democratic Party; its actions and inactions construct opportunities for and barriers to the achievement of their issue-specific policy goals.” (Though Heaney and Rojas don’t quote Richard Hofstadter, their analysis implicitly accepts the historian’s famous aphorism that American third parties “are like bees: they sting once and die.”)
“We do not claim,” they take care to note, “that the party in the street has equal standing with the party in government, the party in the electorate, or the party as organization. We are not asserting that the formal party organization is coordinating these activities. The party in the street lacks the stability possessed by other parts of the party because it is not supported by enduring institutions. Furthermore, it is small relative to other parts of the party and at times may be virtually nonexistent.”
As Heaney elaborated when we met, a great deal of the organizing work of the antiwar “party” is conducted by e-mail – a situation that makes it much easier for groups with a small staff to reach a large audience. But that also makes for somewhat shallow or episodic involvement in the movement on the part of many participants. An important area for study by political scientists might be the relationship between the emerging zone of activist organizations and the informal networks of campaign consultants, lobbyists, financial contributors, and activists shaping the agenda of other sectors of political parties. “If they remain well organized and attract enthusiastic young activists,” write Rojas and Heaney, “then the mainstream political party is unable to ignore them for long.”
Studying the antiwar movement has not exhausted the attention of either scholar. Heaney is working on a book about Medicare, while Rojas is the author of From Black Power to Black Studies: How a Radical Social Movement Became an Academic Discipline, forthcoming from Johns Hopkins University Press. But now they have an abundance of data to analyze, and expect to finish four more papers over the next few months. In addition to crunching more than three years’ worth of survey data, Heaney and Rojas have been examining the antiwar movement’s publications online and observing in person how protests are organized.
I scribbled down working titles and thumbnail descriptions of the papers in progress as Heaney discussed them. So here, briefly, is an early report on some research you may hear pundits refer to knowingly some months from now....
“Mobilizing the Antiwar Movement” will analyze how organizations get people to turn out and which kinds of groups are most successful at it. “Network Dynamics of the Antiwar Movement” will consider how different groups interact at events and how those interactions have changed over time. “Leaders and Followers in the Antiwar Movement” will examine the survey data gathered at large protests, comparing and contrasting it with information about those who take part in smaller workshops and training exercises for committed activists.
Finally, “Coalition Dissolution in the Antiwar Movement” will look at tensions within the organizing efforts. “There has been some work in sociology on coalition building,” as Heaney explained, “but there’s been almost none on how they fall apart.”
It’s worth repeating that all of this work on the antiwar “party in the street” could just as well inspire research on the relationship between conservative movements and the Republican Party. Perhaps someone will eventually write a paper called “Coalition Dissolution in the Christian Right.” I say that purely in the interests of scholarship, of course, and with no gloating at the prospect whatsoever.
Longtime readers of Intellectual Affairs may recall that this column occasionally indulges in reference-book nerdery. So it was a pleasant and appropriate surprise when the Bodleian Library of the University of Oxford provided a copy of its new edition of the very first dictionary of the English language. It has been out of print for almost 400 years, and the Bodleian is now home to the one known copy of it to have survived.
Available now as The First English Dictionary, 1604 (distributed by the University of Chicago Press), the work was originally published under the title A Table Alphabeticall. It was compiled in the late 16th century by one Robert Cawdrey. The book did not bring him fame or fortune, but it went through at least two revised editions within a decade. That suggests there must have been a market for Cawdrey’s guide to what the title page called the “hard usuall English wordes” that readers sometimes encountered “in Scripture, Sermons, or elswhere.”
Cawdrey had the misfortune, unlike fellow lexicographer Samuel Johnson, of never meeting his Boswell. Yet he had an eventful career – enough to allow for a small field of Cawdrey studies. An interesting introduction by John Simpson, the chief editor of the Oxford English Dictionary, sums up what is known about Cawdrey and suggests ways in which his dictionary may contain echoes of his life and times.
At the risk of being overly present-minded, there’s a sense in which Cawdrey was a pioneer in dealing with the effects of his era’s information explosion. Thanks to the printing press, the English language was undergoing a kind of mutation in the 16th century.
New words began to circulate in the uncharted zone between common usage and the cosmopolitan lingo of sophisticated urbanites who traveled widely. Learned gentlemen were traveling to France and Italy and coming back “to powder their talk with over-sea language,” as Cawdrey noted. Some kinds of “academicke” language (glossed by Cawdrey as “of the sect of wise and learned men”) were gaining wider usage. And readers were encountering unfamiliar words like “crocodile” and “akekorn.” Cawdrey’s terse definitions of them as “beast” and “fruit,” respectively, suggest he probably had seen neither.
Booksellers had offered lexicons of ancient and foreign languages. And there were handbooks explaining the meaning of specialized jargon, such as that used by lawyers. But it was Cawdrey’s bright idea that you might need to be able to translate new-fangled English into a more familiar set of “plaine English words.”
Cawdrey also found himself in the position of needing to explain his operating system. “To profit by this Table,” as he informed the “gentle Reader” in a note, “thou must learn the Alphabet, to wit, the order of the Letters as they stand....and where every Letter standeth.” Furthermore, you really needed to have it down cold. A word beginning with the letters “ca,” he noted, would appear earlier than one starting with “cu.” After using the “Table” for a while, you probably got the hang of it.
Who was this orderly innovator? Cawdrey, born in the middle of England sometime in the final years of Henry VIII, seems not to have attended Oxford or Cambridge. But he was learned enough to teach and to preach, and came to enjoy the patronage of a minister to Queen Elizabeth. He married, and raised a brood of eight children. In a preface to the dictionary, Cawdrey acknowledges the assistance of “my sonne Thomas, who now is Schoolmaister in London.”
Cawdrey published volumes on religious instruction and on the proper way to run a household so that each person knew his or her proper place. He also compiled “A Treasurie or store-house of similies both pleasant, delightfull, and profitable, for all estates of men in generall.” (Such verbosity was quite typical of book titles at the time. The full title page for his dictionary runs to about two paragraphs.)
His chances for mobility and modest renown within the Elizabethan intelligentsia were severely limited, however, given his strong religious convictions. For Cawdrey was a Puritan – that is, someone convinced that too many of the old Roman Catholic ways still clung to the Church of England.
Curious whether "Puritan" (a neologism with controversial overtones) appeared in the dictionary, I looked it up. It isn’t there. But Cawdrey does have “purifie,” meaning “purge, scoure, or make cleane” -- which is soon followed by “putrifie, to waxe rotten, or corrupted as a sore.” By the 1580s, Cawdrey had both words very much in mind when he spoke from the pulpit. When he was called before church authorities, one of the complaints was that he had given a sermon in which he had “depraved the Book of Common Prayer, saying, That the same was a Vile Book and Fy upon it.” He was stripped of his position as minister.
But Cawdrey did not give up without a fight. He appealed the sentence, making almost two dozen trips to London to argue that it was invalid under church law. All to no avail. He ignored hints from well-placed friends that he might get his job back by at least seeming to go along with the authorities on some points. For that matter, he continued to sign his letters as if he were the legitimate pastor of his town.
No doubt Cawdrey retained a following within the Puritan underground, but he presumably had to go back to teaching to earn a living. Details about his final years are few. It isn’t even clear when Cawdrey died. He would have been approaching 70 when his dictionary appeared, and references in reprints of his books a few years later imply that they were revised posthumously.
In his introductory essay, John Simpson points out that the OED now lists 60,000 words that are known to have been in use in English around the year 1600. Cawdrey defines about 2,500 of them. “We should probably assume that he was unable to include as many words as he would have liked,” writes Simpson, “in order to keep his book within bounds. It was, after all, an exploratory venture.”
But that makes the selection all the more interesting. It gives you a notion of what counted as a “hard word” at the time. Most of them are familiar now from ordinary usage, though not always in quite the sense that Cawdrey indicates. He gives the meaning of “decision” as “cutting away,” for example. Tones of the preacher can be heard in his slightly puzzling definition of “curiositie” as “picked diligence, greater carefulnes, then is seemly or necessarie.”
Given his Puritan leanings, it is interesting to see that the word “libertine” has no specifically erotic overtones for Cawdrey. He defines the word as applying to those “loose in religion, one that thinks he may doe as he listeth.” One of the longest entries is for “incest,” explained as “unlawfull copulation of man and woman within the degrees of kinred, or alliance, forbidden by Gods law, whether it be in marriage or otherwise.”
It is a commonplace of much recent scholarship that, prior to the mania for categorizing varieties of sexual desire that emerged in the 19th century, the word “sodomy” covered a wide range of non-procreative acts, heterosexual as well as homosexual. Cawdrey, it seems, didn’t get the memo. He defines “sodomitrie” as “when one man lyeth filthylie with another man.” Conversely, and rather more puzzling, is his definition of “buggerie” (which one might assume to be a slang term for a rather specific act) as “conjunction with one of the same kinde, or of men with beasts.”
In a few entries, one detects references to Cawdrey’s drawn-out legal struggle of the 1580s and '90s. He explains that a "rejoinder" is “a thing added afterwards, or is when the defendant maketh answere to the replication of the plaintife.” So a rejoinder is a response, perhaps, to “sophistikation” which Cawdrey defines as “a cavilling, deceitful speech.”
Especially pointed and poignant is the entry for “temporise,” meaning “to serve the time, or to follow the fashions and behaviour of the time.” Say what you will about Puritan crankiness, but Robert Cawdrey did not “temporise.”
Particularly interesting to note are entries hinting at how the “new information infrastructure” (circa 1600) was affecting language. The expense of producing and distributing literature was going down. “Literature,” by the way, is defined by Cawdrey here as “learning.” Cawdrey includes a bit of scholarly jargon, “abstract,” which he explains means “drawne away from another: a litle booke or volume prepared out of a greater.”
Some of the words starting to drift into the ken of ordinary readers were derived from Greek, such as “democracie, a common-wealth gouerned by the people” and “monopolie, a license that none shall buy and sell a thing, but one alone.” Likewise with terms from the learned art of rhetoric such as “metaphor,” defined as "similitude, or the putting over of a word from his proper and naturall signification, to a forraine or unproper signification.”
Cawdrey’s opening address “To the Reader” is a manifesto for the Puritan plain style. Anyone seeking “to speak publiquely before the ignorant people,” he insists, should “bee admonished that they never affect any strange inkhorne termes, but labour to speake so as is commonly received, and so as the most ignorant may well understand them.”
At the same time, some of the fancier words were catching on. The purpose of the dictionary was to fill in the gap between language that “Ladies, Gentlewomen, or any other unskilfull persons” might encounter in their reading and what they could readily understand. (At this point, one would certainly like to know whether Cawdrey taught his own three daughters how to read.) Apart from its importance to the history of lexicography, this pioneering reference work remains interesting as an early effort to strike a balance between innovation and accessibility in language use.
“Some men seek so far for outlandish English,” the old Puritan divine complains, “that they forget altogether their mothers language, so that if some of their mothers were alive, they were not able to tell, or understand what they say.” Oh Robert Cawdrey, that thou shouldst be alive at this hour!
Jacques-Alain Miller has delivered unto us his thoughts on Google. In case the name does not signify, Jacques-Alain Miller is the son-in-law of the late Jacques Lacan and editor of his posthumously published works. He is not a Google enthusiast. The search engine follows “a totalitarian maxim,” he says. It is the new Big Brother. “It puts everything in its place,” Miller declares, “turning you into the sum of your clicks until the end of time.”
Powerful, then. And yet – hélas! – Google is also “stupid.” It can “scan all the books, plunder all the archives [of] cinema, television, the press, and beyond,” thereby subjecting the universe to “an omniscient gaze, traversing the world, lusting after every little last piece of information about everyone.” But it “is able to codify, but not to decode. It is the word in its brute materiality that it records.” (Read the whole thing here. And for another French complaint about Google, see this earlier column.)
When Miller pontificates, it is, verily, as a pontiff. Besides control of the enigmatic theorist’s literary estate, Miller has inherited Lacan’s mantle as leader of one international current in psychoanalysis. His influence spans several continents. Within the Lacanian movement, he is, so to speak, the analyst of the analysts’ analysts.
He was once also a student of Louis Althusser, whose seminar in Paris during the early 1960s taught apprentice Marxist philosophers not so much to analyze concepts as to “produce” them. Miller was the central figure in a moment of high drama during the era of high structuralism. During Althusser’s seminar, Miller complained that he had been busy producing something he called “metonymic causality” when another student stole it. He wanted his concept returned. (However this conflict was resolved, the real winner had to be any bemused bystander.)
Miller is, then, the past master of a certain mode of intellectual authority – one that has been deeply shaped by (and is ultimately inseparable from) tightly restricted fields of communication and exchange.
Someone once compared the Lacanian movement to a Masonic lodge. There were unpublished texts by the founder that remained more than usually esoteric: they were available in typescript editions of just a few copies, and then only to high-grade initiates.
It is hard to imagine a greater contrast to that digital flatland of relatively porous discursive borders about which Miller complains now. As well he might. (Resorting to Orwellian overkill is, in this context, probably a symptom of anxiety. There are plenty of reasons to worry and complain about Google, of course. But when you picture a cursor clicking a human face forever, it lacks something in the totalitarian-terror department.)
Yet closer examination of Miller’s pronouncement suggests another possibility. It isn’t just a document in which hierarchical intellectual authority comes to terms with the Web's numbskulled leveling. For the way Miller writes about the experience of using Google is quite revealing -- though not about the search engine itself.
“Our query is without syntax,” declares Miller, “minimal to the extreme; one click ... and bingo! It is a cascade -- the stark white of the query page is suddenly covered in words. The void flips into plenitude, concision to verbosity.... Finding the result that makes sense for you is therefore like looking for a needle in a haystack. Google would be intelligent if it could compute significations. But it can’t.”
In other words, Jacques-Alain Miller has no clue that algorithms determine the sequence of hits you get back from a search. (However intelligent Google might or might not be, the people behind it are quite clearly trying to “compute significations.”) He doesn’t grasp that you can shape a query – give it a syntax – to narrow its focus and heighten its precision. Miller’s complaints are a slightly more sophisticated version of someone typing “Whatever happened to Uncle Fred?” into Google and then feeling bewildered that the printout does not provide an answer.
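Giving a query syntax is not esoteric knowledge. Most search engines honor a few widely documented operators -- quoted phrases, a minus sign for exclusion, `site:` to restrict the domain -- and composing them is the whole trick Miller misses. A hedged sketch (this builds a query string in the generic operator syntax most engines share; it is not Google's API, and the example terms are invented):

```python
def build_query(phrase=None, all_terms=(), exclude=(), site=None):
    """Compose a search query string using common operators: quoted
    phrases, '-' for exclusion, and 'site:' to restrict the domain.
    Operator behavior varies by engine; this is an illustrative sketch."""
    parts = []
    if phrase:
        parts.append(f'"{phrase}"')
    parts.extend(all_terms)
    parts.extend(f"-{term}" for term in exclude)
    if site:
        parts.append(f"site:{site}")
    return " ".join(parts)

# The vague "Uncle Fred" query, versus one with some syntax behind it:
vague = build_query(all_terms=("uncle", "fred"))
precise = build_query(phrase="Fred Smith", all_terms=("obituary",),
                      exclude=("genealogy",), site="archive.org")
```

The second query tells the engine far more about which needle in which haystack you want -- which is to say, the syntax is there for anyone who learns the alphabet of the medium, much as Cawdrey's readers once had to.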
For an informed contrast to Jacques-Alain Miller’s befuddled indignation, you might turn to Digital History Hacks, a very smart and rewarding blog maintained by William J. Turkel, an assistant professor of history at the University of Western Ontario. (As it happens, I first read about Miller in Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution by one Sherry Turkle. The coincidence is marred by a slip of the signifier: they spell their names differently.)
The mandarin complaint about the new digital order is that it lacks history and substance, existing in a chaotic eternal present – one with no memory and precious little attention span. But a bibliographical guide that Turkel posted in January demonstrates that there is now an extensive enough literature to speak of a field of digital history.
The term has a nice ambiguity to it – one that is worth thinking about. On the one hand, it can refer to the ways historians may use new media to do things they’ve always done – prepare archives, publish historiography, and so on. Daniel J. Cohen and Roy Rosenzweig’s Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web (University of Pennsylvania Press, 2006) is the one handbook that ought to be known to scholars even outside the field of history itself. The full text of it is available for free online from the Center for History and New Media at George Mason University, which also hosts a useful selection of essays on digital history.
But as some of the material gathered there shows, digitalization itself creates opportunities for new kinds of history – and new problems, especially when documents exist in formats that have fallen out of use.
Furthermore, as various forms of information technology become more and more pervasive, it makes sense to begin thinking of another kind of digital history: the history of digitality. Impressed by the bibliography that Turkel had prepared – and by the point that it now represented a body of work one would need to master in order to do graduate-level work in digital history – I contacted him by e-mail to get more of his thoughts on the field.
“Digital history begins,” he says, “with traditional historical sources represented in digital form on a computer, and with 'born-digital' sources like e-mail, text messages, computer code, video games and digital video. Once you have the proper equipment, these digital sources can be duplicated, stored, accessed, manipulated and transmitted at almost no cost. A box of archival documents can be stored in only one location, has to be consulted in person, can be used by only a few people at a time, and suffers wear as it is used. It is relatively vulnerable to various kinds of disaster. Digital copies of those documents, once created, aren't subject to any of those limitations. For some purposes you really need the originals (e.g., a chemical analysis of ink or paper). For many or most other purposes, you can use digital representations instead. And note that once the chemical analysis is completed, it too becomes a digital representation.”
But that’s just the initial phase, or foundation level, of digital history – the scanning substratum, in effect, in which documents become more readily available. A much more complex set of questions comes up as historians face the deeper changes in their work made possible by a wholly different sort of archival space – what Roy Rosenzweig calls the "culture of abundance" created by digitality.
“He asks us to consider what it would mean to try and write history with an essentially complete archival record,” Turkel told me. “I think that his question is quite deep because up until now we haven't really emphasized the degree to which our discipline has been shaped by information costs. It costs something (in terms of time, money, resources) to learn a language, read a book, visit an archive, take some notes, track down confirming evidence, etc. Not surprisingly, historians have tended to frame projects so that they could actually be completed in a reasonable amount of time, using the availability and accessibility of sources to set limits.”
Reducing information costs in turn changes the whole economy of research – especially during the first phase, when one is framing questions and trying to figure out if they are worth pursuing.
“If you're writing about a relatively famous person,” as Turkel put it, “other historians will expect you to be familiar with what that person wrote, and probably with their correspondence. Obviously, you should also know some of the secondary literature. But if you have access to a complete archival record, you can learn things that might have been almost impossible to discover before. How did your famous person figure in people's dreams, for example? People sometimes write about their dreams in diaries and letters, or even keep dream journals. But say you wanted to know how Darwin figured in the dreams of African people in the late 19th century. You couldn't read one diary at a time, hoping someone had had a dream about him and written it down. With a complete digital archive, you could easily do a keyword search for something like "Darwin NEAR dream" and then filter your results.” As it happens, I conducted this interview a few weeks before coming across Jacques-Alain Miller’s comments on Google. It seems like synchronicity that Turkel would mention the possibility of digital historians getting involved in the interpretation of dreams (normally a psychoanalyst’s preserve). But for now, it sounds as if most historians are only slightly more savvy about digitality than the Lacanian Freemasons.
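Turkel's hypothetical "Darwin NEAR dream" search is worth pausing over, because the NEAR operator is easy to approximate once an archive is digital. A crude stand-in, sketched here over invented diary snippets (no archive engine or real source is being quoted):

```python
import re

def near(text, a, b, window=10):
    """True if words beginning with a and b occur within `window` words
    of each other -- a rough stand-in for an archive's NEAR operator.
    Prefix matching lets 'dream' catch 'dreamt' and 'dreams' too."""
    tokens = re.findall(r"[a-z']+", text.lower())
    pos_a = [i for i, t in enumerate(tokens) if t.startswith(a)]
    pos_b = [i for i, t in enumerate(tokens) if t.startswith(b)]
    return any(abs(i - j) <= window for i in pos_a for j in pos_b)

# Hypothetical diary entries -- invented for illustration only.
diaries = [
    "Last night I dreamt that Mr. Darwin came to our village.",
    "Read Darwin all morning; in the evening, mended the fence.",
]
hits = [d for d in diaries if near(d, "darwin", "dream")]
```

Run against a complete digital archive rather than two invented lines, a filter like this is exactly what replaces reading one diary at a time in hope of a lucky find.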
“All professional historians have a very clear idea about how to make use of archival and library sources,” Turkel says, “and many work with material culture, too. But I think far fewer have much sense of how search engines work or how to construct queries. Few are familiar with the range of online sources and tools. Very few are able to do things like write scrapers, parsers or spiders.”
(It pays to increase your word power. For a quick look at scraping and parsing, start here. For the role of spiders on the Web, have a look at this.)
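For a taste of what "writing a scraper" involves, here is a minimal sketch using only the Python standard library: it parses a page and collects the links, which is the step a spider performs before deciding which pages to visit next. (The sample page is canned so the sketch stays self-contained; a real spider would fetch HTML over the network, e.g. with `urllib.request`.)

```python
from html.parser import HTMLParser

class LinkScraper(HTMLParser):
    """Collect the href of every <a> tag on a page -- the parsing step
    a web spider performs before following the links it finds."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A canned page stands in for a fetched one, to keep the example offline.
page = '<p>See <a href="/archive">the archive</a> and <a href="/about">about</a>.</p>'
scraper = LinkScraper()
scraper.feed(page)
```

After `feed`, `scraper.links` holds `/archive` and `/about`; loop the process over each collected link and you have the skeleton of a spider.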
“I believe that these kind of techniques will be increasingly important,” says Turkel, “and someday will be taken for granted. I guess I would consider digital history to have arrived as a field when most departments have at least one person who can (and does) offer a course in the subject. Right now, many departments are lucky to have someone who knows how to digitize paper sources or put up web pages.”
Last week, Intellectual Affairs took up the topic of what might be called scandal-mania -- the never-ending search for shock, controversy, and gratifying indignation regarding our “master thinkers.” Unfortunately there haven’t been enough “shocking revelations” recently to keep up with the demand. So the old ones are brought out of mothballs, from time to time.
A slightly different kind of case has come up recently involving Zygmunt Bauman, who is emeritus professor of sociology at the University of Leeds and the University of Warsaw. Bauman is a prolific author with a broad range of interests in social theory, but is probably best known for a series of books and essays analyzing the emergence of the new, increasingly fluid and unstable forms of cultural and social order sometimes called “postmodernism.”
No doubt that fact alone will suffice to convince a certain part of the public that he must be guilty of something. Be that as it may, Bauman is not actually a pomo enthusiast. While rejecting various strands of communitarianism, he is quite ambivalent about the fragmentation and confusion in the postmodern condition. His book Liquid Times: Living in an Age of Uncertainty, just issued by Polity, is quite typical of his work over the past few years -- a mixture of social theory and cultural criticism, sweeping in its generalizations but also alert to the anxieties one sees reflected in the newspaper and on CNN.
In March, a paragraph concerning Bauman appeared at Sign and Sight, a Web site providing capsule summaries in English of the Feuilletons (topical cultural articles) appearing in German newspapers and magazines. It noted the recent publication in the Frankfurter Allgemeine Zeitung of an article by a Polish historian named Bogdan Musial. The piece “uncovers the Stalinist past of the world famous sociologist,” as Sign and Sight put it.
It also quoted a bit of the article. "The fact is that Bauman was deeply involved with the violent communist regime in Poland for more than 20 years,” in Musial’s words, “fighting real and supposed enemies of Stalinism with a weapon in his hand, shooting them in the back. His activities can hardly be passed off as the youthful transgressions of an intellectual seduced and led astray by communist ideology. And it is astonishing that Bauman, who so loves to point the finger, does not reflect on his own deeds."
A few weeks later, another discussion of the matter appeared in The Irish Times -- this one by Andreas Hess, a senior lecturer in sociology at the University of Dublin. The piece bore what seems, with hindsight, the almost inevitable title of “Postmodernism Made Me Do It: A World Without Blame.” (The article is not available except to subscribers, but I’ll quote from a copy passed along by a friend.)
Summing up the charges in the German article, Hess said that secret files recently declassified in Poland revealed that Bauman “participated in operations of political cleansing of alleged political opponents in Poland between 1944 and 1954. The Polish files also show Bauman was praised by his superiors for having been quite successful in completing the tasks assigned, although he seems, as at least one note suggests, not to have taken any major part in direct military operations because of his ‘Semitic background.’ However, to be promoted to the rank of major at the youthful age of 23 was quite an achievement. As the author of the article [in the German newspaper] pointed out, Bauman remained a faithful member of the party apparatus.”
Hess goes on to suggest that “Bauman’s hidden past” is the key to his work as “one of the prophets of postmodernism.” This is not really argued so much as asserted -- and in a somewhat contradictory way.
On the one hand, it is implied that Bauman has used postmodern relativism as a way to excuse his earlier Stalinist crimes. Unfortunately for this argument, Bauman is actually a critic of postmodernism. And so, on the other hand, the sociologist is also guilty of attacking Western society by denouncing postmodernity. Whether or not this is a coherent claim, it points to some of what is at issue in the drama over “Bauman’s secret Stalinism,” as it’s called.
Now, I do not read German or Polish -- a decided disadvantage in coming to any sense of how the controversy has unfolded in Europe. Throughout the former Soviet sphere of influence, a vast and agonizingly complex set of problems has emerged surrounding “lustration” -- the process of "purifying" public life by legally disqualifying those who collaborated with the old Communist regimes from serving in positions of authority.
But let’s just look at the matter purely in terms of the academic scandal we’ve been offered. I have read some of Bauman’s work, but not a lot. Under the circumstances that may be an advantage. I am not a disciple – and by no means feel committed to defending him, come what may.
If he has hidden his past, then its revelation is a necessary thing. But then, that is the real issue at stake. Everything turns on that “if.”
What did we know about Zygmunt Bauman before the opening of his files? What could be surmised about his life based on interviews, his bibliographical record, and books about him readily available at a decent university library?
One soon discovers that “Bauman’s hidden past” was very badly hidden indeed. He has never published a memoir about being a Stalinist -- nor about anything else, so far as I know -- but he has never concealed that part of his life either. The facts can be pieced together readily.
He was born in Poland in 1925 and emigrated to the Soviet Union with his family at the start of World War II. This was an altogether understandable decision, questions of ideology aside. Stalin’s regime was not averse to the occasional half-disguised outburst of anti-Semitism, but that was not the central point of its entire agenda, at least; so it is hardly surprising that a Jewish family might respond to the partition of Poland in 1939 by heading East.
Bauman studied physics and dreamed, he says, of becoming a scientist. He served as a member of the Polish equivalent of the Red Army during the war. He returned to his native country as a fervent young Communist, eager, he says, to rebuild Poland as a modern, egalitarian society – a “people’s democracy” as the Stalinist lingo had it. His wife Janina Bauman, in her memoir A Dream of Belonging: My Years in Postwar Poland (Virago, 1988) portrays him as a true believer in the late 1940s and early 1950s.
But there is no sense in overstressing his idealism. To have been a member of the Polish United Workers Party was not a matter of teaching Sunday school classes on Lenin to happy peasant children. Bauman would have participated in the usual rounds of denunciation, purge, “thought reform,” and rationalized brutality. He was also an officer in the Polish army. The recent revelations specify that he belonged to the military intelligence division -- making him, in effect, part of the secret police.
But the latter counts as a “revelation” only to someone with no sense of party/military relations in the Eastern bloc. Not every member of the military was a Communist cadre -- and an officer who was also a member of the party had a role in intelligence-gathering, more or less by definition.
But a Jewish party member was in a precarious position – again, almost by definition. In 1953, he was forced out of the army during one of the regime’s campaigns against “Zionists” and “cosmopolitans.” He enrolled in the University of Warsaw and retrained as a social scientist. He began research on the history of the British Labour Party and the development of contemporary Polish society.
One ought not to read too much dissidence into the simple fact of doing empirical sociology. Bauman himself says he wanted to reform the regime, to bring it into line with its professed egalitarian values. And yet, under the circumstances, becoming a sociologist was at least a somewhat oppositional move. He published articles on alienation, the problems of the younger generation, and the challenge of fostering innovation in a planned economy.
And so he remained loyal to the regime -- in his moderately oppositional fashion -- until another wave of official anti-Semitism in 1968 made this impossible. In her memoir, Janina Bauman recalls their final weeks in Poland as a time of threatening phone calls, hulking strangers loitering outside their apartment, and TV broadcasts that repeated her husband’s name in hateful tones. “A scholarly article appeared in a respectable magazine,” she writes. “It attacked [Zygmunt] and others for their dangerous influence on Polish youth. It was signed by a close friend.”
Bauman and his family emigrated that year, eventually settling in Leeds. (He never faced a language barrier, having for some years been editor of a Polish sociological journal published in English.) His writings continued to be critical of both the Soviet system and of capitalism, and to support the labor movement. When Solidarity emerged in 1980 to challenge the state, Bauman welcomed it as the force that would shape the future of Poland.
These facts are all part of the record -- put there, most of them, by Bauman himself. By no means is it a heroic tale. From time to time, he must have named names, and written things he didn’t believe, and forced himself to believe things that he knew, deep down, were not true.
And yet Bauman did not hide his past, either. It has always been available for anyone trying to come to some judgment of his work. He has been accused of failing to reflect upon his experience. But even that is a dubious reading of the evidence. A central point of his work on the “liquid” social structure of postmodernism is its contrast with the modernity that went before, which he says was “marked by the disciplinary power of the pastoral state.” He describes the Nazi and Stalinist regimes as the ultimate, extreme cases of that “disciplinary power.”
Let’s go out on a limb and at least consider the possibility that someone who admittedly spent years serving a social system that he now understands as issuing from the same matrix as Hitler’s regime may perhaps be telling us (in his own roundabout, sociologistic way) that he is morally culpable, no matter what his good intentions may have been.
Alas, this is not quite so exciting as “Postmodernist Conceals Sinister Past.” It doesn’t even have the satisfying denouement found in “The God That Failed,” that standard of ex-Communist disillusionment. Sorry about that.... It’s just a tale of a man getting older and – just possibly – wiser. I tend to think of that as a happy story, even so.
In the cartoons, an astonished character will at times need to grab his eyeballs as they come flying out of his head. Something like that happened to me a few months ago while going through the fall catalog of Columbia University Press. Buried deep in its pages – well behind all the exciting, glamorous titles at the bleeding edge of scholarship – was the listing for Tough Liberal: Albert Shanker and the Battles Over Schools, Unions, Race, and Democracy by Richard D. Kahlenberg. (It has just appeared in hardback.)
This was a title one might reasonably expect to see issued by a commercial publisher: Shanker, who died in 1997, was for many years the president of the American Federation of Teachers, which he helped build into one of the strongest unions in the AFL-CIO. It now has more than a million members, including about 160,000 who work in higher education; even if only one in a hundred were interested in the union’s history, that is quite a potential audience.
At the same time, it was a surprise to find the book published by a press better known for titles in cultural theory: works embodying a certain abstract radicalism, several miles in the stratosphere above the labor movement. And Shanker, besides being a union bureaucrat, was something of a hardboiled ideologue – a fierce Cold Warrior, but no less ardent a Culture Warrior, denouncing both affirmative action and multiculturalism in tones that were, let’s say, emphatic.
Such “tough liberalism,” as his biographer calls it, made the labor leader a punchline in Woody Allen’s post-apocalyptic comedy "Sleeper" (1973). A character explains that no one is quite sure how civilization ended, but historians think it all started when “a man named Albert Shanker got his hands on an atomic bomb.”
A lot has changed since the days when a new movie by Woody Allen was a major event. And in any case, no labor leader has emerged in recent decades with quite the cultural and political profile that Shanker once had. Yet his name still has the power to provoke. There are Shankerites and anti-Shankerites.
Kahlenberg, a senior fellow at the Century Foundation in Washington, DC, admires Shanker and gives him the benefit of the doubt, more often than not. That tendency comes through, I think, in the IHE podcast we recently recorded. But Kahlenberg is not totally uncritical of Shanker. As we talked following the taping, Kahlenberg mentioned the passions stirred up by the leader's memory.
Some followers remain convinced that “Al” was right about more or less everything -- including the Vietnam War, which Shanker supported. Kahlenberg also looked into charges by Shanker's opponents that he received funds as part of the American intelligence community’s activity within the labor movement.
That accusation is hardly surprising or implausible. All things considered, it would be surprising if Shanker were not connected with "the AFL-CIA” (as certain networks within the intelligence and labor communities were sometimes called). But Kahlenberg says critics haven’t offered solid evidence to back up the accusation. There is a difference between firm conviction and real proof. This is a matter some historian will eventually need to revisit, nailing things down with serious documentation.
Tough Liberal is not the first book about Shanker. But the previous volume, Dickson A. Mungazi’s Where He Stands: Albert Shanker and the American Federation of Teachers, published by Praeger in 1995, was not really a biography. Nor was it much of a contribution to labor history, given that Mungazi identifies Samuel Gompers (who died in 1924) as the first president of the CIO (established in 1935).
So Kahlenberg has made a real contribution by telling the story of this charismatic and/or megalomaniacal labor leader’s career. I say that as a reader who did not pick up the biography with any admiration for its subject – nor put it down converted to Shanker-style “toughness.” (Actually it made me think maybe Woody Allen was right.) But it’s an engaging book, and essential reading for anyone interested in the history of Cold War liberalism and its complicated legacy.
Further reading (and listening): An excerpt from Tough Liberal is available at Columbia UP's website. An early review of it appears in the latest issue of Washington Monthly. An extremely favorable treatment of the biography and of Shanker himself has recently appeared in The Wall Street Journal. For something altogether less laudatory, see the essay appearing ten years ago in the socialist journal New Politics. And by all means, lend an ear to the interview with Richard Kahlenberg, available as an IHE podcast.
Not all Islamophobes are fanatics. Most, on the contrary, are decent people who just want to live in peace. Islamophobia forms only part of their identity. They grew up fearing Islam, and they still worry about it from time to time, especially during holidays and on certain anniversaries; but many would confess to doubt about just how Islamophobic they feel deep down inside. They may find themselves wondering, for example, if the Koran is really that much more bloodthirsty than the Jewish scriptures (Joshua 6 is plenty murderous) or the Christian (Matthew 10:34 is not exactly comforting).
Unfortunately a handful of troublemakers thrive among them, parasitically. They spew out hatred through Web sites. They seek to silence their critics, and to recruit impressionable young people. Perhaps it is unfair to confuse matters by calling the moderates and the militants by the same name. It would be more fitting to say that the latter are really Islamophobofascists.
Some might find the expression offensive. That is too bad. If we don’t resist Islamophobofascism now, its intolerance can only spread. And we all know who benefits from that. One name in particular comes to mind. It belongs to a fellow who is now presumably living in a cave, drawing up long-term plans for a clash of civilizations...
Maybe I had better trim the satirical sails before going totally out to sea. As neologisms go, “Islamophobofascism” probably sounds even more stupid than the term it mocks. But there is a point to it.
“Islamofascism” is a noxious and counterproductive term -- a bludgeon disguised as an idea. Its use comes at a cost, even beyond the obvious one that goes with making people dumber. “Islamofascism” is the preferred term of those who don’t see any distinction between Al Qaeda, the Iranian mullahs, and the Baathists. Guess what? They are different, which might just have been worth understanding a few years ago. (Better late than never, maybe; but not a whole lot better.)
The more serious consequence, over the long term, is that of offering deliberate insult to those Muslims who would be put to the sword under the reign of Jihadi fundamentalists. Disgust for cheap stunts done in the name of “Islamofascism awareness” is not a matter of doubting that the jihadis mean what they say. On the contrary, it goes with taking them seriously as enemies.
It should not be necessary to qualify that last point. Somebody who wants to kill you is your enemy, whether you care to think in such terms or not; and the followers of Bin Laden, while subtle on some matters, have at least not been shy about letting us know what methods they consider permissible in pursuit of their ends. The jihadis mean it. Recognizing this is not a matter of Islamophobia; it is a matter of paying attention.
And paying attention means, in this case, recognizing that most Muslims are not our enemies. It is disgraceful to have to spell that out. But let’s be clear about something: The jihadis are not our only problem. As anyone from abroad who likes and respects Americans will probably tell you, we tend to be our own worst enemy.
There is a strain of nativism, xenophobia, and small-mindedness in American life that is always there -- often subdued, but never too far out of earshot. To call this our fascist streak would be absurdly melodramatic. Fascism proper was, above all, purposeful and orderly, while fear and loathing towards the “un-American” is often enough the woolliest form of baffled resentment: the effect of comfortable ignorance turning sour at any demand on its meager resources of attention and sympathy.
This quality can subsist for long periods in a dormant or distracted state -- expressing itself in muttering or small-scale acts of hostility, but nothing large-scale. Perhaps it is restrained by the better angels of our nature.
But it means that the unscrupulous and the obtuse have a ready supply of raw material to mold into something vile when the occasion becomes available, or if there is some profit in it. H.L. Mencken explained that a demagogue is “one who will preach doctrines he knows to be untrue to men he knows to be idiots." The problem with this definition, of course, is that it is the product of a simpler era and so not nearly cynical enough. For a demagogue now, truth and knowledge have nothing to do with it.
For the really suave expression of Islamophobofascism, however, no local sideshow can compete with an interview that the British novelist Martin Amis gave last year. At the highest stages of cosmopolitan literary influence, it seems, one may express ideas worthy of a manic loon phoning a radio talk-show and get them published in the London Times.
“There’s a definite urge -- don’t you have it? -- to say, ‘The Muslim community will have to suffer until it gets its house in order,’ ” Amis said. “What sort of suffering? Not letting them travel. Deportation -- further down the road. Curtailing of freedoms. Strip-searching people who look like they’re from the Middle East or from Pakistan.… Discriminatory stuff, until it hurts the whole community and they start getting tough with their children.”
The cultural theorist Terry Eagleton issued a response to Amis in the preface to a new edition of his book “Ideology: An Introduction” -- first published in 1991 by Verso, which reissued it a few weeks ago. It stirred up a tiny tempest in the British press, which reduced the argument to the dimensions of a clash between two “bad boys” (albeit ones grown quite long in the tooth).
Quickly mounting to impressive heights of inanity, the coverage and commentary managed somehow to ignore the actual substance of the dispute: what Amis said (his explicit call to persecute all Muslims until they acted right) and how Eagleton responded.
“Joseph Stalin seems not to be Amis’s favorite historical character,” wrote Eagleton, alluding to the novelist’s Koba the Dread, a venture into Soviet political history published a while back. “Yet there is a good dose of Stalinism in the current right-wing notion that a spot of rough stuff may be justified by the end in view. Not just roughing up actual or intending criminals, mind, but the calculated harassment of a whole population. Amis is not recommending such tactics for criminals or suspects only; he is recommending them as a way of humiliating and insulting certain kinds of men and women at random, so they will return home and teach their children to be nice to the White Man. There seems to be something mildly defective about this logic.”
Eagleton’s introduction doesn’t underestimate the virulence of the jihadists. But his remarks do at least have the good sense to acknowledge that humiliation is a weapon that will not work in the long run. (As an aside, let me note that some of us don't have the luxury of either ignoring terrorism or regarding it as something that will be abated by a more aggressive posture in the world. Life in Washington, D.C., for the past several years has meant rarely getting on the subway without wondering if this might be the day. The "surge" did not reduce the faint background radiation of dread one little bit. Funny how these things work out, or don't.)
Anybody with an ounce of brains and responsibility can tell that fostering an environment of hysteria is useful only to one side of this conflict. “The best way to preserve one’s values,” writes Eagleton, “is to practice them.” Well said; and worth keeping in mind whenever the Islamophobofascists start to rush about, trying to drum up some business.
We shouldn't regard them as just nuisances. They are something much more dangerous. Determined to turn the whole world against us, they act as sleeper cells of malice and stupidity. There are sober ways to respond to danger, and insane ways. It is the demagogue’s stock in trade to blur the distinction.