History

Quote Unquote

Keeping a commonplace book -- a notebook for copying out the striking passages you’ve come across while reading -- was once a fairly standard practice, not just among professional scholars but for anyone who (as the expression went) “had humane letters.” Some people still do it, though the very idea of creating your own customized, hand-written anthology does seem almost self-consciously old-fashioned now. Then again, that may be looking at things the wrong way. When John Locke circulated his New Method of a Common-Place Book in the late 17th century, he wasn’t offering Martha Stewart-like tips on how to be genteel. He had come up with a system of streamlined indexing and text-retrieval -- a way to convert the commonplace book into a piece of low-tech software for smart people on the go.

There is a fairly direct line running from Locke’s efficiency-enhancing techniques to The Yale Book of Quotations, a handsome and well-indexed compilation just issued by Yale University Press. That line runs through the work of John Bartlett, the prodigious American bookworm whose recall of passages from literature made him semi-famous in Cambridge, Mass., even before he published a small collection of Familiar Quotations in 1855. He included more and more material from his own commonplace book in later editions, so that the book grew to doorstop size. I don’t know whether or not Bartlett had read Locke’s essay. But he did index the book in a manner the philosopher would have found agreeable.

Since his death in 1905, “Bartlett’s” has become almost synonymous with the genre of quotation-collection itself – a degree of modest immortality that might have surprised him. (Chances are, he expected to be remembered for the fact that his friend James Russell Lowell once published a poem about his skill as a fisherman.)

The new Yale collection follows Bartlett’s example, both in its indexing and in sheer heft. It is not just a compilation but a work of scholarship. The editor, Fred R. Shapiro, is an associate librarian and lecturer in legal research at the Yale Law School; and his edition of The Oxford Dictionary of American Legal Quotations is well-regarded by both lawyers and reference librarians. In The Yale Book of Quotations, he proves even more diligent than Bartlett was about finding the exact origins and wording of familiar quotations.

The classic line from Voltaire that runs “I disapprove of what you say, but I will defend to the death your right to say it” does not appear among the selections from Voltaire, for the simple reason that he never actually said it. (According to an article appearing in the November 1943 issue of Modern Language Notes, it was actually coined by one of Voltaire's biographers, S. G. Tallentyre.) Shapiro finds that the principle later known as “Murphy’s Law” was actually formulated by George Orwell in 1941. (“If there is a wrong thing to do,” wrote Orwell, “it will be done, infallibly. One has come to believe in that as if it were a law of nature.”)

In his posthumously published autobiography, Mark Twain attributed the phrase “lies, damned lies, and statistics” to Benjamin Disraeli. But the saying has long been credited to Twain himself, in the absence of any evidence that Disraeli actually said it. Thanks to the digitized editions of old newspapers, however, Shapiro finds it attributed to the former British prime minister in 1895, almost 30 years before Twain’s book was published.

It turns out that Clare Boothe Luce’s most famous quip, “No good deed goes unpunished” -- first credited to her in 1957 -- had been attributed to Walter Winchell 15 years earlier. And as Shapiro notes, there is evidence to suggest that it had been a proverb even before that. Likewise, it was not Liberace who coined the phrase “crying all the way to the bank” but rather, again, Winchell. (Oddly enough, the gossip columnist -- a writer as colorful as he was callous -- does not get his own entry.)

The historical notes in small type -- elaborating on sources and parallels, and sometimes cross-referencing other quotations within the volume -- make this a really useful reference work. It is also a profitable (or at least entertaining) way to procrastinate.

At the same time, it is a book that would have bewildered John Bartlett – and not simply because it places less emphasis on classic literature than commonplace-keepers once did. The volume draws on a much wider range of sources than any other quotation collection I’ve come across, including film, television, popular songs, common sayings, and promotional catchphrases. Many of the choices are smart, or at least understandable. The mass media, after all, serve as the shared culture of our contemporary Global Village, as Marshall McLuhan used to say.

But many of the entries are inexplicable -- and some of them are just junk. What possible value is there to a selection of 140 advertising slogans (“There’s something about an Aqua Velva man”) or 90 television catchphrases (“This is CNN”)? The entry for Pedro Almodóvar, the Spanish director, consists entirely of the title of one of his films, Women on the Verge of a Nervous Breakdown. Why bother?

A case might be made for including the “Space, the final frontier...” soliloquy from the opening of Star Trek, as Shapiro does in the entry for Gene Roddenberry. He also cross-references it to a quotation from 1958 by the late James R. Killian, then-president of MIT, who defined space exploration as a matter of “the thrust of curiosity that leads men to try to go where no man has gone before.” So far, so good. But why also include the slightly different wordings used in the openings to The Wrath of Khan and Star Trek: The Next Generation?

The fact that quotations from Mae West run to more than one and a half pages is not a problem. They are genuinely witty and memorable. (e.g., “Between two evils, I always pick the one I’ve never tried before.”) But how is it that the juvenile lyrics of Alanis Morissette merit nearly as much space as the entry for Homer? (The one from Greece, I mean, not from Springfield.)

It is hard to know what to make of some of these editorial decisions. It’s as if Shapiro had included, on principle, a certain amount of the static and babble that fills the head of anyone tuned into the contemporary culture – “quotations” just slightly more meaningful than the prevailing media noise (and perhaps not even that).

But another sense of culture prevailed in Bartlett’s day -- one that Matthew Arnold summed up as a matter of “getting to know, on all the matters that concern us, the best which has been thought and said in the world.” That doesn’t mean excluding popular culture. The lines here from Billie Holiday, Bob Dylan, and "The Simpsons" are all worth the space they fill. But the same is not true of “Plop plop, fizz fizz, oh what a relief it is."

All such griping aside, The Yale Book of Quotations is an absorbing reminder that all one’s best observations were originally made by someone else. And it includes a passage from Dorothy Sayers explaining how to benefit from this: “I always have a quotation for everything,” she wrote. “It saves original thinking.”

I had considered suggesting that it might make a good present for Christmas, Hanukkah, Festivus, etc. According to the publisher’s Web site, the first printing is already sold out. It is available in bookstores, however, and also from some online booksellers. Here’s hoping it goes through many editions -- so that Shapiro will get a chance to recognize that Eminem’s considerable verbal skills do not translate well into cold type.


Eros Unbound

Valentine’s Day seems an appropriate occasion to honor the late Gershon Legman, who is said to have coined the slogan “Make love, not war.” Odd to think that the saying had a particular author, rather than being spontaneously generated by the countercultural Zeitgeist in the 1960s. But I've seen the line attributed to Legman a few times over the years; and the new Yale Book of Quotations (discussed in an earlier column) is even more specific, indicating that he first said it during a speech at Ohio University in Athens, Ohio, sometime in November 1963.

Legman, who died in 1999 at the age of 81, was the rare instance of a scholar who had less of a career than a profound calling -- one that few academic institutions in his day could have accommodated. Legman was the consummate bibliographer and taxonomist of all things erotic: a tireless collector and analyst of all forms of discourse pertaining to human sexuality, including the orally transmitted literature known as folklore. He was an associate of Alfred Kinsey during the 1940s, but broke with him over questions of statistical methodology. If it hadn’t been that, it would have been something else; by all accounts, Legman was a rather prickly character.

But it is impossible to doubt his exacting standards of scholarship after reading The Horn Book: Studies in Erotic Folklore and Bibliography (University Books, 1964) -- a selection of Legman's papers reflecting years of exploration in the “restricted” collections of research libraries. (At the Library of Congress, for example, you will sometimes find a title listed as belonging to “the Delta Collection,” which was once available to a reader only after careful vetting by the authorities. The books themselves have long since been integrated into the rest of the library’s holdings, but not-yet-updated catalog listings still occasionally reveal that a volume formerly had that alluring status: forbidden yet protected.) Legman approached erotic literature and "blue" folklore with philological rigor, treating with care songs and books that only ever circulated on the sly.  

Some of Legman's work appeared from commercial publishers and reached a nonscholarly audience. He assembled two volumes of obscene limericks, organized thematically and in variorum. The title of another project, The Rationale of the Dirty Joke, only hints at its terrible sobriety and analytic earnestness. Sure, you can skim around in it for the jokes themselves. But Legman’s approach was strictly Freudian, his ear constantly turned to the frustration, anxiety, and confusion expressed in humor.

Not all of his work was quite that grim. Any scholar publishing a book called Oragenitalism: Oral Techniques in Genital Excitation may be said to have contributed something to the sum total of human happiness. The first version, devoted exclusively to cunnilingus, appeared from a small publisher in the 1940s and can only have had very limited circulation. The commercial edition published in 1969 expanded its scope -- though Legman (who in some of his writings comes across, alas, as stridently hostile to the early gay rights movement) seemed very emphatic in insisting that his knowledge of fellatio was strictly as a recipient.

Defensiveness apart, what’s particularly striking about the book is the degree to which it really is a work of scholarship. You have to see his literature review (a critical evaluation of the available publications on the matter, whether popular, professional, or pornographic, in several languages) to believe it. Thanks to Legman’s efforts, it is possible to celebrate Valentine’s Day with a proper sense of tradition.

Legman was a pioneer of cultural studies, long before anyone thought to call it that. He served as editor for several issues of Neurotica, a great underground literary magazine published between 1948 and 1952. Most of its contributors were then unknown, outside very small circles; but they included Allen Ginsberg, Anatole Broyard, Leonard Bernstein, and an English professor from Canada named Marshall McLuhan.

As the title may suggest, Neurotica reflected the growing cultural influence of Freud. But it also went against the prevalent tendency to treat psychoanalysis as a tool for adjusting misfits to society. The journal treated American popular culture itself as profoundly deranged; and in developing this idea, Legman served as something like the house theorist.

In a series of essays adapted from his pamphlet Love and Death (1948), Legman cataloged the seemingly endless sadism and misogyny found in American movies, comic books, and pulp novels. (Although Love and Death is long out of print, a representative excerpt can be found in Jeet Heer and Kent Worcester's collection Arguing Comics: Literary Masters on a Popular Medium, published by the University Press of Mississippi in 2004.)

Legman pointed out that huge profits were to be made from depicting murder, mutilation, and sordid mayhem. But any attempt at a frank depiction of erotic desire, let alone of sex itself, was forbidden. And this was no coincidence, he concluded. A taste for violence was being “installed as a substitute outlet for forbidden sexuality” by the culture industry.

Censorship and repression were warping the American psyche at its deepest levels, Legman argued. The human needs that ought to be met by a healthy sexual life came back, in distorted form, as mass-media sadism: "the sense of individuality, the desire for importance, attention, power; the pleasure in controlling objects, the impulse toward violent activity, the urge towards fulfillment to the farthest reaches of the individual’s biological possibilities.... All these are lacking in greater or lesser degree when sex is lacking, and they must be replaced in full.”

Replaced, that is, by the noir pleasures of the trashy pop culture available in the 1940s.

Here, alas, it proves difficult to accept Legman's argument in quite the terms framing it. It is easy enough to grant that his complaints about censorship and hypocrisy were justified. But the artifacts that filled him with contempt and rage -- Gone With the Wind, the novels of Raymond Chandler, comic books with titles like Authentic Police Cases or Rip Kirby: Mystery of the Mangler -- are more likely to fill us with nostalgia.

It's not that his theory about their perverse subtext now seems wrong. On the contrary, it often feels as if he's on to something. But while condemning the pulp fiction or movies of his day as symptomatic of a neurotic culture, Legman puts his finger right on what makes them fascinating now -- their nervous edge, the tug of war between raw lust and Puritan rage.

In any case, a certain conclusion follows from Legman’s argument -- one that we can test against contemporary experience.

Censorship of realistic depictions of sexuality intensifies the climate of erotic repression, thereby creating an audience prone to consuming pop-culture sadomasochism. If Legman was right, then the easing or abolition of censorship ought to yield, over time, fewer images and stories centering on violence, humiliation, and so on.

Well, we know how that experiment turned out. Erotica is now always just a few clicks away (several offers are pouring into your e-mail account as you read this sentence). And yet one of the most popular television programs in the United States is a drama whose hero is good at torture.

They may have been on to something in the pages of Neurotica, all those decades ago, but things have gotten more complicated in the meantime.

As it happens, I’ve just been reading a manuscript called “Eros Unbound: Pornography and the Internet” by Blaise Cronin, a professor of information science at Indiana University at Bloomington, and former dean of its School of Information and Library Science. His paper will appear in The Internet and American Business: An Historical Investigation, a collection edited by William Aspray and Paul Ceruzzi scheduled for publication by MIT Press in April 2008.

Contacting Cronin to ask permission to quote from his work, I asked if he had any connection with the Kinsey Institute, also in Bloomington. He doesn’t, but says he is on friendly terms with some of the researchers there. Kinsey was committed to recording and tabulating sexual activity in all its forms. Cronin admits that he cannot begin to describe all the varieties of online pornography. Then again, he doesn’t really want to try.

“I focus predominantly on the legal sex industry,” he writes in his paper, “concentrating on the output of what, for want of a better term, might be called the respectable, or at least licit, part of the pornography business. I readily acknowledge the existence of, but do not dwell upon the seamier side, unceremoniously referred to by an anonymous industry insider as the world of ‘dogs, horses, 12-year old girls, all this crazed Third-World s—.’ ”

The notion of a “respectable” pornography industry would have seemed oxymoronic when Legman published Love and Death. It’s clearly much less so at a time when half the hotel chains in the United States offer X-rated films on pay-per-view. Everyone knows that there is a huge market for online depictions of sexual behavior. But what Cronin’s study makes clear is that nobody has a clue just how big an industry it really is. Any figure you might hear cited now is, for all practical purposes, a fiction.

The truth of this seems to have dawned on Cronin following the publication, several years ago, of “E-rogenous Zones: Positioning Pornography in the Digital Marketplace,” a paper he co-authored with Elizabeth Davenport. One of the tables in their paper “estimated global sales figures for the legal sex/pornography industry,” offering a figure of around $56 billion annually. That estimate squared with information gathered from a number of trade and media organizations. But much of the raw data had originally been provided by a specific enterprise -- something called the Private Media Group, Inc., which Cronin describes as “a Barcelona-based, publicly traded adult entertainment company.”

After the paper appeared in the journal Information Society in 2001, Cronin says, he was contacted “by Private’s investor relations department wondering if I could furnish the company with growth projections and other related information for the adult entertainment industry -- I, who had sourced some of my data from their Web site.” That estimate of $56 billion per year, based on research now almost a decade old, is routinely cited as if it were authoritative and up to date.

“Many of the numbers bandied about by journalists, pundits, industry insiders and market research organizations,” he writes, “are lazily recycled, as in the case of our aforementioned table, moving effortlessly from one story and from one reporting context to the next. What seem to be original data and primary sources may actually be secondary or tertiary in character.... Some of the startling revenue estimates and growth forecasts produced over the years by reputable market research firms ... have been viewed all too often with awe rather than healthy skepticism.”

Where Legman was, so to speak, an ideologue of sex, Blaise Cronin seems more scrupulously dispassionate. His manuscript runs to some 50 pages and undertakes a very thorough review of the literature concerning online pornography. (My wife, a reference librarian whose work focuses largely on developments in digital technology and e-commerce, regards Cronin’s paper as one of the best studies of the subject around.) He doesn't treat the dissemination of pornography as either emancipatory or a sign of decadence. It's just one of the facts of life, so to speak.

His paper does contain a surprise, though. It's a commonplace now that porn is assuming an increasingly ordinary role as cultural commodity -- one generating incalculable, but certainly enormous, streams of revenue for cable companies, Internet service providers, hotel chains, and so on. But the "mainstreaming" of porn is a process that works both ways. Large sectors of the once-marginal industry are morphing into something that ever more closely resembles corporate America.

“The sleazy strip joints, tiny sex shops, dingy backstreet video stores and other such outlets may not yet have disappeared,” writes Cronin, “but along with the Web-driven mainstreaming of pornography has come -- almost inevitably, one has to say -- full-blown corporatization and cosmeticization.... The archetypal mom and pop business is being replaced by a raft of companies with business school-trained accountants, marketing managers and investment analysts at the helm, an acceleration of a trend that began at the tail-end of the twentieth century. As the pariah industry strives to smarten itself up, the language used by some of the leading companies has become indistinguishable from that of Silicon Valley or Martha Stewart. It is a normalizing discourse designed to resonate with the industry’s largely affluent, middle class customer base.”

As an example, he quotes what sounds like a formal mission statement at one porn provider’s website: “New Frontier Media, Inc. is a technology driven content distribution company specializing in adult entertainment. Our corporate culture is built on a foundation of quality, integrity and commitment and our work environment is an extension of this…The Company offers diversity of cultures and ethnic groups. Dress is casual and holiday and summer parties are normal course. We support team and community activities.”

That’s right, they have casual Fridays down at the porn factory. Also, it sounds like, a softball team.

I doubt very much that anybody in this brave new world remembers cranky old Gershon Legman, with his index cards full of bibliographical data on Renaissance handbooks on making the beast with two backs. (Nowadays, of course, two backs might be considered conservative.) Ample opportunity now exists to watch or read about sex. Candor seems not just possible but obligatory. But that does not necessarily translate into happiness -- into satisfaction of "the urge towards fulfillment to the farthest reaches of the individual’s biological possibilities," as Legman put it.

That language is a little gray, but the meaning is more romantic than it sounds. What Legman is actually celebrating is the exchange taking place at the farthest reaches of a couple's biological possibilities: the moment when sex turns into erotic communion. And for that, broadband access is irrelevant. For that, you need to be really lucky.


Party in the Streets

During the first administration of Franklin Delano Roosevelt (or so goes a story now making the rounds of American progressives), the president met with a group of citizens who urged him to seize the moment. Surely it was time for serious reforms: The Depression made it impossible to continue with business as usual. Just what measures the visitors to the Oval Office proposed -- well, that is not clear, at least from the versions I have heard. Perhaps they wanted laws to regulate banking, or to protect the right of labor unions to organize, or to provide income help for the aged. Maybe all of the above.

The president listened with interest and evident sympathy. As the meeting drew to a close, Roosevelt thanked his guests, expressing agreement with all they had suggested. “So now,” he told them on their way out the door, “go out there and make me do it.”

This is less a historical narrative, strictly speaking, than an edifying tale. Its lesson is simple. Even with wise and trustworthy leadership holding power -- perhaps especially then -- you must be ready to apply pressure from below. (The moral here is not especially partisan, by the way. One can easily imagine conservative activists spurring one another on with more or less the same story, with Ronald Reagan assuming the star role.)

I recalled this anecdote on Saturday after meeting Michael T. Heaney, an assistant professor of political science at the University of Florida. He stopped by for a visit after spending the afternoon collecting data at the antiwar demonstration here in Washington.

For the past few years, Heaney has been collaborating with Fabio Rojas, an assistant professor of sociology at Indiana University, on a study of the turnout at major national antiwar protests. With the help of research assistants, they have done surveys of some 3,550 randomly selected demonstrators. (That figure includes the 350 surveys gathered this weekend.) Their research has already yielded two published papers, available here and here, with more now in the works.

We’ll go over some of their findings in a moment. But a remark that Heaney made in conversation resonated with that fable about the New Deal era, and it provides a context for understanding the work he and Rojas have been doing.

“Political scientists are good at analyzing how established institutions function,” he said. “We have the tools for that, and the tools work really well. But there is very strong resistance to studying informal organizations or to recognizing them as part of the political landscape.”

In the course of thinking over their research, Rojas and Heaney have improvised a concept they call “the party in the street” -- that segment of a political party that, to borrow FDR’s (possibly apocryphal) injunction, gets out there and pushes.

Party affiliation was only one of the questions asked during the survey, which also gathered information about a demonstrator’s age, gender, ethnicity, zip code, membership in non-political organizations, and how he or she heard about the protest. (The form allowed responders to remain anonymous.)

“We attended or sent proxies to all major protests during a one-year period, from August 2004 until September 2005,” Heaney told me, “and we’ve coded all those surveys. We’ve also collected surveys at other demonstrations since then, including roughly a thousand responses just in 2007.”

The researchers attended demonstrations sponsored by each of the two major coalitions organizing them, United for Peace and Justice (UFPJ) and Act Now to Stop War and End Racism (ANSWER). The two coalitions have been at odds with one another for years, but worked together to organize the September 2005 protest in Washington before going their separate ways again. “We couldn’t have planned this,” as Heaney puts it, “but now we have data from each stage – when the two coalitions were in conflict, when they worked together, and then again after they parted.”

During the September 2005 activities, Rojas and Heaney gathered information both from those who attended a large open-air protest and from the thousand or so people who stuck around to lobby members of Congress two days later.

Their survey data also cover demonstrations in the months before and after the midterm elections in November, though most of those results remain to be processed.

“I’ve been shocked at how few academics have paid attention to the antiwar movement,” Heaney told me. “When we first went out to do a survey at a demonstration, I sort of expected to find other political scientists doing research too. But apart from a couple of people in sociology, there doesn’t seem to be much else happening so far.”

I asked if they had met with much suspicion in the course of their research -- people refusing to take the survey for fear of being, well, surveilled.

“No,” he said, “the response rate has been very high. There hasn’t been much paranoia. The temper isn’t like it was after 9/11. People don’t feel as much like the government is out to get them. And fear on the part of the police has gone down too. Now they don’t seem as concerned that a protest is going to turn into a terrorist act.”

The survey results from demonstrations in 2004 and 2005 showed that “40% of activists within the antiwar movement describe themselves as Democrats, 39% identify as independents (i.e. they list no party affiliation), 20% claim membership in a third party, and only 2% belong to the Republican party.”

Some of their findings confirm things one might predict from a simple deduction. Protestors who identified as members of the Democratic Party were more likely to stay in town to lobby their members of Congress than those who didn’t, for example.

Likewise, the researchers found that Democratic members of Congress “are more likely to meet with antiwar lobbyists than are Republicans, other things being equal.... Members of Congress who had previously expressed high levels of support for antiwar positions were more likely to meet with lobbyists than those whose support had been weak or nonexistent.”

Other results were more interesting. Protestors who belonged to “at least one civic, community, labor, or political organization” proved to be 17 percent more likely to lobby. People who turned out for the demonstration after being contacted by an organization were 13 percent more likely to lobby – while those who found out about the event only through the mass media were 16 percent less likely to go to Capitol Hill.

The contemporary antiwar movement has a “distinctly bimodal” distribution with respect to age. In other words, there are two significant cohorts, one between the ages of 18 and 27, the other between 46 and 67, “with relatively fewer participants outside these ranges.”

Each birthday added “about 1 percent to an individual’s willingness to lobby when all other variables are held at their means or modes,” report Heaney and Rojas in a paper for the journal American Politics Research. “We did not find that sex, race, or occupational prestige make a difference in an individual’s propensity to lobby.”
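
To put the statistical language in concrete terms: a finding like “each birthday adds about 1 percent, with other variables held at their means or modes” is the sort of marginal effect produced by a logit or probit model. Here is a minimal sketch in Python, using the statsmodels library and entirely synthetic data; the variable names and coefficients are hypothetical illustrations, not the authors’ actual model.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000

    # Synthetic stand-ins for the survey variables (hypothetical).
    age = rng.uniform(18, 70, n)
    member = rng.integers(0, 2, n)  # belongs to at least one organization
    p = 1 / (1 + np.exp(-(-3.0 + 0.04 * age + 0.8 * member)))
    lobbied = rng.binomial(1, p)

    # Fit a logit model of the decision to lobby.
    X = sm.add_constant(np.column_stack([age, member]))
    fit = sm.Logit(lobbied, X).fit(disp=0)

    # Marginal effects with the other regressors held at their means --
    # the form of statement quoted from the paper above.
    print(fit.get_margeff(at="mean").summary())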

In conversation, Heaney also mentioned a provisional finding that they are now double-checking. “The single strongest predictor of lobbying was whether an individual had been involved in the movement against the Vietnam War.”

It was while attending a demonstration outside the Republican National Convention in New York in 2004 that Heaney came up with an expression that has somewhat complicated the reception of this research among his colleagues. The city’s labor unions had turned out a large and obstreperous crowd to express displeasure with the president.  The crowd was overwhelmingly likely to vote for Democratic candidates, but Heaney was struck by the thought that it was a very different gathering from the one he expected would assemble before long at a Democratic national convention.

“I thought: this is more like a festival,” he told me. “It’s the Democratic Party. But it’s also the party having a party...in the street.”

This phrase – “the party in the street” – had a special overtone for Heaney as a political scientist, given one familiar schema used in analyzing American politics. In his profession, it is common to speak of a major party as having three important sectors: “the party in government,” “the party in the electorate,” and “the party as organization.”

The idea that mass movements might constitute a fourth sector of the party – with the Christian Right, for example, being a component of the Republican “party in the street” – might seem self-evident in some ways. But not so for political scientists, it seems. “We met a lot of resistance to the idea of the ‘party in the street,’” Heaney told me, “and to the idea that [it might apply] to the Republicans as well.” The paper in which Heaney and Rojas first referred to “the party in the street” ended up going to three different journals -- with substantial revisions along the way – before it was accepted for publication in American Politics Research.

Speaking of the antiwar protests as manifestations of the Democratic “party in the street” will also meet resistance from many activists. (A catchphrase of the hard left is that the Democratic Party is “the graveyard of mass movements.”) And according to their own surveys, Heaney and Rojas find that just over one fifth of demonstrators see themselves as clearly outside its ranks.

But that still leaves the majority of antiwar activists as either identifying themselves as Democrats or at least willing to vote for the party. “Like it or not,” write Heaney and Rojas, “their moral and political struggles are within or against the Democratic Party; its actions and inactions construct opportunities for and barriers to the achievement of their issue-specific policy goals.” (Though Heaney and Rojas don’t quote Richard Hofstadter, their analysis implicitly accepts the historian’s famous aphorism that American third parties “are like bees: they sting once and die.”)

“We do not claim,” they take care to note, “that the party in the street has equal standing with the party in government, the party in the electorate, or the party as organization. We are not asserting that the formal party organization is coordinating these activities. The party in the street lacks the stability possessed by other parts of the party because it is not supported by enduring institutions. Furthermore, it is small relative to other parts of the party and at times may be virtually nonexistent.”

As Heaney elaborated when we met, a great deal of the organizing work of the antiwar “party” is conducted by e-mail – a situation that makes it much easier for groups with a small staff to reach a large audience. But that also makes for somewhat shallow or episodic involvement in the movement on the part of many participants. An important area for study by political scientists might be the relationship between the emerging zone of activist organizations and the informal networks of campaign consultants, lobbyists, financial contributors, and activists shaping the agenda of other sectors of political parties. “If they remain well organized and attract enthusiastic young activists,” write Rojas and Heaney, “then the mainstream political party is unable to ignore them for long.”

Studying the antiwar movement has not exhausted the attention of either scholar. Heaney is working on a book about Medicare, while Rojas is the author of From Black Power to Black Studies: How a Radical Social Movement Became an Academic Discipline, forthcoming from Johns Hopkins University Press. But now they have an abundance of data to analyze, and expect to finish four more papers over the next few months. In addition to crunching more than three years’ worth of survey data, Heaney and Rojas have been examining the antiwar movement’s publications online and observing in person how protests are organized.

I scribbled down working titles and thumbnail descriptions of the papers in progress as Heaney discussed them. So here, briefly, is an early report on some research you may hear pundits refer to knowingly some months from now....

“Mobilizing the Antiwar Movement” will analyze how organizations get people to turn out and which kinds of groups are most successful at it. “Network Dynamics of the Antiwar Movement” will consider how different groups interact at events and how those interactions have changed over time. “Leaders and Followers in the Antiwar Movement” will examine the survey data gathered at large protests, comparing and contrasting it with information about those who take part in smaller workshops and training exercises for committed activists.

Finally, “Coalition Dissolution in the Antiwar Movement” will look at tensions within the organizing efforts. “There has been some work in sociology on coalition building,” as Heaney explained, “but there’s been almost none on how they fall apart.”

It’s worth repeating that all of this work on the antiwar “party in the street” could just as well inspire research on the relationship between conservative movements and the Republican Party. Perhaps someone will eventually write a paper called “Coalition Dissolution in the Christian Right.” I say that purely in the interests of scholarship, of course, and with no gloating at the prospect whatsoever.


Hard Wordes in Plaine English

Longtime readers of Intellectual Affairs may recall that this column occasionally indulges in reference-book nerdery. So it was a pleasant, and entirely appropriate, surprise when the Bodleian Library of the University of Oxford provided a copy of its new edition of the very first dictionary of the English language. It has been out of print for almost 400 years, and the Bodleian is now home to the one known copy of it to have survived.

Available now as The First English Dictionary, 1604 (distributed by the University of Chicago Press), the work was originally published under the title A Table Alphabeticall. It was compiled in the late 16th century by one Robert Cawdrey. The book did not bring him fame or fortune, but it went through at least two revised editions within a decade. That suggests there must have been a market for Cawdrey’s guide to what the title page called the “hard usuall English wordes” that readers sometimes encountered “in Scripture, Sermons, or elswhere.”

Cawdrey had the misfortune, unlike fellow lexicographer Samuel Johnson, of never meeting his Boswell. Yet he had an eventful career – enough to allow for a small field of Cawdrey studies. An interesting introduction by John Simpson, the chief editor of the Oxford English Dictionary, sums up what is known about Cawdrey and suggests ways in which his dictionary may contain echoes of his life and times.

At the risk of being overly present-minded, there’s a sense in which Cawdrey was a pioneer in dealing with the effects of his era’s information explosion. Thanks to the printing press, the English language was undergoing a kind of mutation in the 16th century.

New words began to circulate in the uncharted zone between common usage and the cosmopolitan lingo of sophisticated urbanites who traveled widely. Learned gentlemen were traveling to France and Italy and coming back “to powder their talk with over-sea language,” as Cawdrey noted. Some kinds of “academicke” language (glossed by Cawdrey as “of the sect of wise and learned men”) were gaining wider usage. And readers were encountering unfamiliar words like “crocodile” and “akekorn.” Cawdrey’s terse definitions of them as “beast” and “fruit,” respectively, suggest he probably had seen neither.

Booksellers had offered lexicons of ancient and foreign languages. And there were handbooks explaining the meaning of specialized jargon, such as that used by lawyers. But it was Cawdrey’s bright idea that you might need to be able to translate new-fangled English into a more familiar set of “plaine English words.”

Cawdrey also found himself in the position of needing to explain his operating system. “To profit by this Table,” as he informed the “gentle Reader” in a note, “thou must learn the Alphabet, to wit, the order of the Letters as they stand....and where every Letter standeth.” Furthermore, you really needed to have it down cold. A word beginning with the letters “ca,” he noted, would appear earlier than one starting with “cu.” After using the “Table” for a while, you probably got the hang of it.

Who was this orderly innovator? Cawdrey, born in the middle of England sometime in the final years of Henry VIII, seems not to have attended Oxford or Cambridge. But he was learned enough to teach and to preach, and came to enjoy the patronage of a minister to Queen Elizabeth. He married, and raised a brood of eight children. In a preface to the dictionary, Cawdrey acknowledges the assistance of “my sonne Thomas, who now is Schoolmaister in London.”

Cawdrey published volumes on religious instruction and on the proper way to run a household so that each person knew his or her proper place. He also compiled “A Treasurie or store-house of similies both pleasant, delightfull, and profitable, for all estates of men in generall.” (Such verbosity was quite typical of book titles at the time. The full title page for his dictionary runs to about two paragraphs.)

His chances for mobility and modest renown within the Elizabethan intelligentsia were severely limited, however, given his strong religious convictions. For Cawdrey was a Puritan – that is, someone convinced that too many of the old Roman Catholic ways still clung to the Church of England.

Curious whether "Puritan" (a neologism with controversial overtones) appeared in the dictionary, I looked it up. It isn’t there. But Cawdrey does have “purifie,” meaning “purge, scoure, or make cleane” -- which is soon followed by “putrifie, to waxe rotten, or corrupted as a sore.” By the 1580s, Cawdrey had both words very much in mind when he spoke from the pulpit. When he was called before church authorities, one of the complaints was that he had given a sermon in which he had “depraved the Book of Common Prayer, saying, That the same was a Vile Book and Fy upon it.” He was stripped of his position as minister.

But Cawdrey did not give up without a fight. He appealed the sentence, making almost two dozen trips to London to argue that it was invalid under church law. All to no avail. He ignored hints from well-placed friends that he might get his job back by at least seeming to go along with the authorities on some points. For that matter, he continued to sign his letters as if he were the legitimate pastor of his town.

No doubt Cawdrey retained a following within the Puritan underground, but he presumably had to go back to teaching to earn a living. Details about his final years are few. It isn’t even clear when Cawdrey died. He would have been approaching 70 when his dictionary appeared, and references in reprints of his books a few years later imply that they were revised posthumously.

In his introductory essay, John Simpson points out that the OED now lists 60,000 words that are known to have been in use in English around the year 1600. Cawdrey defines about 2,500 of them. “We should probably assume that he was unable to include as many words as he would have liked,” writes Simpson, “in order to keep his book within bounds. It was, after all, an exploratory venture.”

But that makes the selection all the more interesting. It gives you a notion of what counted as a “hard word” at the time. Most of them are familiar now from ordinary usage, though not always in quite the sense that Cawdrey indicates. He gives the meaning of “decision” as “cutting away,” for example. Tones of the preacher can be heard in his slightly puzzling definition of “curiositie” as “picked diligence, greater carefulnes, then is seemly or necessarie.”

Given his Puritan leanings, it is interesting to see that the word “libertine” has no specifically erotic overtones for Cawdrey. He defines the word as applying to those “loose in religion, one that thinks he may doe as he listeth.” One of the longest entries is for “incest,” explained as “unlawfull copulation of man and woman within the degrees of kinred, or alliance, forbidden by Gods law, whether it be in marriage or otherwise.”

It is a commonplace of much recent scholarship that, prior to the mania for categorizing varieties of sexual desire that emerged in the 19th century, the word “sodomy” covered a wide range of non-procreative acts, heterosexual as well as homosexual. Cawdrey, it seems, didn’t get the memo. He defines “sodomitrie” as “when one man lyeth filthylie with another man.” Conversely, and rather more puzzling, is his definition of “buggerie” (which one might assume to be a slang term for a rather specific act) as “conjunction with one of the same kinde, or of men with beasts.”

In a few entries, one detects references to Cawdrey’s drawn-out legal struggle of the 1580s and '90s. He explains that a "rejoinder" is “a thing added afterwards, or is when the defendant maketh answere to the replication of the plaintife.” So a rejoinder is a response, perhaps, to “sophistikation” which Cawdrey defines as “a cavilling, deceitful speech.”

Especially pointed and poignant is the entry for “temporise,” meaning “to serve the time, or to follow the fashions and behaviour of the time.” Say what you will about Puritan crankiness, but Robert Cawdrey did not “temporise.”

Particularly interesting to note are entries hinting at how the “new information infrastructure” (circa 1600) was affecting language. The expense of producing and distributing literature was going down. “Literature,” by the way, is defined by Cawdrey here as “learning.” Cawdrey includes a bit of scholarly jargon, “abstract,” which he explains means “drawne away from another: a litle booke or volume prepared out of a greater.”

Some of the words starting to drift into the ken of ordinary readers were derived from Greek, such as “democracie, a common-wealth gouerned by the people” and “monopolie, a license that none shall buy and sell a thing, but one alone.” Likewise with terms from the learned art of rhetoric such as “metaphor,” defined as "similitude, or the putting over of a word from his proper and naturall signification, to a forraine or unproper signification.”

Cawdrey’s opening address “To the Reader” is a manifesto for the Puritan plain style. Anyone seeking “to speak publiquely before the ignorant people,” he insists, should “bee admonished that they never affect any strange inkhorne termes, but labour to speake so as is commonly received, and so as the most ignorant may well understand them.”

At the same time, some of the fancier words were catching on. The purpose of the dictionary was to fill in the gap between language that “Ladies, Gentlewomen, or any other unskilfull persons” might encounter in their reading and what they could readily understand. (At this point, one would certainly like to know whether Cawdrey taught his own three daughters how to read.) Apart from its importance to the history of lexicography, this pioneering reference work remains interesting as an early effort to strike a balance between innovation and accessibility in language use.

“Some men seek so far for outlandish English,” the old Puritan divine complains, “that they forget altogether their mothers language, so that if some of their mothers were alive, they were not able to tell, or understand what they say.” Oh Robert Cawdrey, that thou shouldst be alive at this hour!


Digital Masonry

Jacques-Alain Miller has delivered unto us his thoughts on Google. In case the name does not signify, Jacques-Alain Miller is the son-in-law of the late Jacques Lacan and editor of his posthumously published works. He is not a Google enthusiast. The search engine follows “a totalitarian maxim,” he says. It is the new Big Brother. “It puts everything in its place,” Miller declares, “turning you into the sum of your clicks until the end of time.”

Powerful, then. And yet – hélas! – Google is also “stupid.” It can “scan all the books, plunder all the archives [of] cinema, television, the press, and beyond,” thereby subjecting the universe to “an omniscient gaze, traversing the world, lusting after every little last piece of information about everyone.” But it “is able to codify, but not to decode. It is the word in its brute materiality that it records.” (Read the whole thing here. And for another French complaint about Google, see this earlier column.)

When Miller pontificates, it is, verily, as a pontiff. Besides control of the enigmatic theorist’s literary estate, Miller has inherited Lacan’s mantle as leader of one international current in psychoanalysis. His influence spans several continents. Within the Lacanian movement, he is, so to speak, the analyst of the analysts’ analysts.

He was once also a student of Louis Althusser, whose seminar in Paris during the early 1960s taught apprentice Marxist philosophers not so much to analyze concepts as to “produce” them. Miller was the central figure in a moment of high drama during the era of high structuralism. During Althusser’s seminar, Miller complained that he had been busy producing something he called “metonymic causality” when another student stole it. He wanted his concept returned. (However this conflict was resolved, the real winner had to be any bemused bystander.)

Miller is, then, the past master of a certain mode of intellectual authority – one that has been deeply shaped by (and is ultimately inseparable from) tightly restricted fields of communication and exchange.

Someone once compared the Lacanian movement to a Masonic lodge. There were unpublished texts by the founder that remained more than usually esoteric: they were available in typescript editions of just a few copies, and then only to high-grade initiates.

It is hard to imagine a greater contrast to that digital flatland of relatively porous discursive borders about which Miller complains now. As well he might. (Resorting to Orwellian overkill is, in this context, probably a symptom of anxiety. There are plenty of reasons to worry and complain about Google, of course. But when you picture a cursor clicking a human face forever, it lacks something in the totalitarian-terror department.)

Yet closer examination of Miller’s pronouncement suggests another possibility. It isn’t just a document in which hierarchical intellectual authority comes to terms with the Web's numbskulled leveling. For the way Miller writes about the experience of using Google is quite revealing -- though not about the search engine itself.

“Our query is without syntax,” declares Miller, “minimal to the extreme; one click ... and bingo! It is a cascade -- the stark white of the query page is suddenly covered in words. The void flips into plenitude, concision to verbosity.... Finding the result that makes sense for you is therefore like looking for a needle in a haystack. Google would be intelligent if it could compute significations. But it can’t.”

In other words, Jacques-Alain Miller has no clue that algorithms determine the sequence of hits you get back from a search. (However intelligent Google might or might not be, the people behind it are quite clearly trying to “compute significations.”) He doesn’t grasp that you can shape a query – give it a syntax – to narrow its focus and heighten its precision. Miller’s complaints are a slightly more sophisticated version of someone typing “Whatever happened to Uncle Fred?” into Google and then feeling bewildered that the printout does not provide an answer.
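
A single example makes the point. The query below combines an exact phrase, an excluded term, and a site restriction -- operators Google has long supported -- and the short Python sketch (the query itself is just an illustration) shows how such a query gets encoded into a search URL.

    from urllib.parse import quote_plus

    # A quoted phrase, a minus-sign exclusion, and a site: restriction
    # give the query a syntax that narrows and sharpens the results.
    query = '"Jacques-Alain Miller" Google -psychoanalysis site:edu'
    print("https://www.google.com/search?q=" + quote_plus(query))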

For an informed contrast to Jacques-Alain Miller’s befuddled indignation, you might turn to Digital History Hacks, a very smart and rewarding blog maintained by William J. Turkel, an assistant professor of history at the University of Western Ontario. (As it happens, I first read about Miller in Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution by one Sherry Turkle. The coincidence is marred by a slip of the signifier: they spell their names differently.)

The mandarin complaint about the new digital order is that it lacks history and substance, existing in a chaotic eternal present – one with no memory and precious little attention span. But a bibliographical guide that Turkel posted in January demonstrates that there is now a literature extensive enough to speak of a field of digital history.

The term has a nice ambiguity to it – one that is worth thinking about. On the one hand, it can refer to the ways historians may use new media to do things they’ve always done – prepare archives, publish historiography, and so on. Daniel J. Cohen and Roy Rosenzweig’s Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web (University of Pennsylvania Press, 2006) is the one handbook that ought to be known to scholars even outside the field of history itself. The full text of it is available for free online from the Center for History and New Media at George Mason University, which also hosts a useful selection of essays on digital history.

But as some of the material gathered there shows, digitalization itself creates opportunities for new kinds of history – and new problems, especially when documents exist in formats that have fallen out of use.

Furthermore, as various forms of information technology become more and more pervasive, it makes sense to begin thinking of another kind of digital history: the history of digitality.

Impressed by the bibliography that Turkel had prepared – and by the point that it now represented a body of work one would need to master in order to do graduate-level work in digital history – I contacted him by e-mail to get more of his thoughts on the field.

“Digital history begins,” he says, “with traditional historical sources represented in digital form on a computer, and with 'born-digital' sources like e-mail, text messages, computer code, video games and digital video. Once you have the proper equipment, these digital sources can be duplicated, stored, accessed, manipulated and transmitted at almost no cost. A box of archival documents can be stored in only one location, has to be consulted in person, can be used by only a few people at a time, and suffers wear as it is used. It is relatively vulnerable to various kinds of disaster. Digital copies of those documents, once created, aren't subject to any of those limitations. For some purposes you really need the originals (e.g., a chemical analysis of ink or paper). For many or most other purposes, you can use digital representations instead. And note that once the chemical analysis is completed, it too becomes a digital representation.”

But that’s just the initial phase, or foundation level, of digital history – the scanning substratum, in effect, in which documents become more readily available. A much more complex set of questions comes up as historians face the deeper changes in their work made possible by a wholly different sort of archival space – what Roy Rosenzweig calls the "culture of abundance" created by digitality.

“He asks us to consider what it would mean to try and write history with an essentially complete archival record,” Turkel told me. “I think that his question is quite deep because up until now we haven't really emphasized the degree to which our discipline has been shaped by information costs. It costs something (in terms of time, money, resources) to learn a language, read a book, visit an archive, take some notes, track down confirming evidence, etc. Not surprisingly, historians have tended to frame projects so that they could actually be completed in a reasonable amount of time, using the availability and accessibility of sources to set limits.”

Reducing information costs in turn changes the whole economy of research – especially during the first phase, when one is framing questions and trying to figure out if they are worth pursuing.

“If you're writing about a relatively famous person,” as Turkel put it, “other historians will expect you to be familiar with what that person wrote, and probably with their correspondence. Obviously, you should also know some of the secondary literature. But if you have access to a complete archival record, you can learn things that might have been almost impossible to discover before. How did your famous person figure in people's dreams, for example? People sometimes write about their dreams in diaries and letters, or even keep dream journals. But say you wanted to know how Darwin figured in the dreams of African people in the late 19th century. You couldn't read one diary at a time, hoping someone had had a dream about him and written it down. With a complete digital archive, you could easily do a keyword search for something like "Darwin NEAR dream" and then filter your results.”
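
For a sense of what that would involve, here is a minimal sketch in Python of a crude “NEAR” proximity search run over a folder of plain-text transcriptions. Everything in it is a stated assumption -- the corpus directory, the ten-word window -- and a real digital archive would use a proper search index rather than this linear scan.

    import re
    from pathlib import Path

    def near(text, a, b, window=10):
        # Yield lowercase passages in which word `a` falls within
        # `window` words of word `b`.
        words = re.findall(r"\w+", text.lower())
        spots = {a: [], b: []}
        for i, w in enumerate(words):
            if w in spots:
                spots[w].append(i)
        for i in spots[a]:
            for j in spots[b]:
                if abs(i - j) <= window:
                    yield " ".join(words[min(i, j): max(i, j) + 1])

    # Scan a (hypothetical) folder of digitized diaries for Darwin dreams.
    for path in Path("diaries").glob("*.txt"):
        for hit in near(path.read_text(errors="ignore"), "darwin", "dream"):
            print(path.name, "->", hit)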
As it happens, I conducted this interview a few weeks before coming across Jacques-Alain Miller’s comments on Google. It seems like synchronicity that Turkel would mention the possibility of digital historians getting involved in the interpretation of dreams (normally a psychoanalyst’s preserve). But for now, it sounds as if most historians are only slightly more savvy about digitality than the Lacanian Freemasons.

“All professional historians have a very clear idea about how to make use of archival and library sources,” Turkel says, “and many work with material culture, too. But I think far fewer have much sense of how search engines work or how to construct queries. Few are familiar with the range of online sources and tools. Very few are able to do things like write scrapers, parsers or spiders.”

(It pays to increase your word power. For a quick look at scraping and parsing, start here. For the role of spiders on the Web, have a look at this.)
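
Since those terms may remain unfamiliar even after the definitions, here is a minimal sketch in Python of the underlying idea: fetch a page, scrape the outbound links from its raw HTML, and you have the seed of a spider, which would queue and fetch those links in turn. The URL is a placeholder, and a polite spider would also respect robots.txt and throttle its requests.

    import re
    from urllib.request import urlopen

    # Fetch one page and scrape its links with a regular expression --
    # cruder than a real HTML parser, but it shows the shape of the thing.
    html = urlopen("https://example.com/").read().decode("utf-8", "replace")
    for link in re.findall(r'href="(http[^"]+)"', html):
        print(link)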

“I believe that these kinds of techniques will be increasingly important,” says Turkel, “and someday will be taken for granted. I guess I would consider digital history to have arrived as a field when most departments have at least one person who can (and does) offer a course in the subject. Right now, many departments are lucky to have someone who knows how to digitize paper sources or put up web pages.”

Speak, Memory

Last week, Intellectual Affairs took up the topic of what might be called scandal-mania -- the never-ending search for shock, controversy, and gratifying indignation regarding our “master thinkers.” Unfortunately there haven’t been enough “shocking revelations” recently to keep up with the demand. So the old ones are brought out of mothballs, from time to time.

A slightly different kind of case has come up recently involving Zygmunt Bauman, who is emeritus professor of sociology at the University of Leeds and the University of Warsaw. Bauman is a prolific author with a broad range of interests in social theory, but is probably best known for a series of books and essays analyzing the emergence of the new, increasingly fluid and unstable forms of cultural and social order sometimes called “postmodernism.”

No doubt that fact alone will suffice to convince a certain part of the public that he must be guilty of something. Be that as it may, Bauman is not actually a pomo enthusiast. While rejecting various strands of communitarianism, he is quite ambivalent about the fragmentation and confusion in the postmodern condition. His book Liquid Times: Living in an Age of Uncertainty, just issued by Polity, is quite typical of his work over the past few years -- a mixture of social theory and cultural criticism, sweeping in its generalizations but also alert to the anxieties one sees reflected in the newspaper and on CNN.

In March, a paragraph concerning Bauman appeared at Sign and Sight, a Web site providing capsule summaries in English of the Feuilletons (topical cultural articles) appearing in German newspapers and magazines. It noted the recent publication in the Frankfurter Allgemeine Zeitung of an article by a Polish historian named Bogdan Musial. The piece “uncovers the Stalinist past of the world famous sociologist,” as Sign and Sight put it.

It also quoted a bit of the article. "The fact is that Bauman was deeply involved with the violent communist regime in Poland for more than 20 years,” in Musial’s words, “fighting real and supposed enemies of Stalinism with a weapon in his hand, shooting them in the back. His activities can hardly be passed off as the youthful transgressions of an intellectual seduced and led astray by communist ideology. And it is astonishing that Bauman, who so loves to point the finger, does not reflect on his own deeds."

A few weeks later, another discussion of the matter appeared in The Irish Times -- this one by Andreas Hess, a senior lecturer in sociology at the University of Dublin. The piece bore what seems, with hindsight, the almost inevitable title of “Postmodernism Made Me Do It: A World Without Blame.” (The article is not available except to subscribers, but I’ll quote from a copy passed along by a friend.)

Summing up the charges in the German article, Hess said that secret files recently declassified in Poland revealed that Bauman “participated in operations of political cleansing of alleged political opponents in Poland between 1944 and 1954. The Polish files also show Bauman was praised by his superiors for having been quite successful in completing the tasks assigned, although he seems, as at least one note suggests, not to have taken any major part in direct military operations because of his ‘Semitic background.’ However, to be promoted to the rank of major at the youthful age of 23 was quite an achievement. As the author of the article [in the German newspaper] pointed out, Bauman remained a faithful member of the party apparatus.”

Hess goes on to suggest that “Bauman’s hidden past” is the key to his work as “one of the prophets of postmodernism.” This is not really argued so much as asserted -- and in a somewhat contradictory way.

On the one hand, it is implied that Bauman has used postmodern relativism as a way to excuse his earlier Stalinist crimes. Unfortunately for this argument, Bauman is actually a critic of postmodernism. And so, on the other hand, the sociologist is also guilty of attacking Western society by denouncing postmodernity. Whether or not this is a coherent claim, it points to some of what is at issue in the drama over “Bauman’s secret Stalinism,” as it’s called.

Now, I do not read German or Polish -- a decided disadvantage in coming to any sense of how the controversy has unfolded in Europe. Throughout the former Soviet sphere of influence, a vast and agonizingly complex set of problems has emerged surrounding “lustration” -- the process of "purifying" public life by legally disqualifying those who collaborated with the old Communist regimes from serving in positions of authority.

Debates over the politicized use of lustration in Poland have gone on for years. “What may look like an effort to reconcile with the Communist past,” wrote one Polish legal scholar not long ago, “is something else entirely. It is an assault on reconciliation and a generational bid for power.” There are bound to be implications to Bauman’s lustration that will be lost on those of us looking at it from a distance.

But let’s just look at the matter purely in terms of the academic scandal we’ve been offered. I have read some of Bauman’s work, but not a lot. Under the circumstances that may be an advantage. I am not a disciple – and by no means feel committed to defending him, come what may.

If he has hidden his past, then its revelation is a necessary thing. But then, that is the real issue at stake. Everything turns on that “if.”

What did we know about Zygmunt Bauman before the opening of his files? What could be surmised about his life based on interviews, his bibliographical record, and books about him readily available at a decent university library?

One soon discovers that “Bauman’s hidden past” was very badly hidden indeed. He has never published a memoir about being a Stalinist -- nor about anything else, so far as I know -- but he has never concealed that part of his life either. The facts can be pieced together readily.

He was born in Poland in 1925 and emigrated to the Soviet Union with his family at the start of World War II. This was an altogether understandable decision, questions of ideology aside. Stalin’s regime was not averse to the occasional half-disguised outburst of anti-Semitism, but that was not the central point of its entire agenda, at least; so it is hardly surprising that a Jewish family might respond to the partition of Poland in 1939 by heading East.

Bauman studied physics and dreamed, he says, of becoming a scientist. He served as a member of the Polish equivalent of the Red Army during the war. He returned to his native country as a fervent young Communist, eager, he says, to rebuild Poland as a modern, egalitarian society – a “people’s democracy,” as the Stalinist lingo had it. His wife Janina Bauman, in her memoir A Dream of Belonging: My Years in Postwar Poland (Virago, 1988), portrays him as a true believer in the late 1940s and early 1950s.

But there is no sense in overstressing his idealism. To have been a member of the Polish United Workers Party was not a matter of teaching Sunday school classes on Lenin to happy peasant children. Bauman would have participated in the usual rounds of denunciation, purge, “thought reform,” and rationalized brutality. He was also an officer in the Polish army. The recent revelations specify that he belonged to the military intelligence division -- making him, in effect, part of the secret police.

But the latter counts as a “revelation” only to someone with no sense of party/military relations in the Eastern bloc. Not every member of the military was a Communist cadre -- and an officer who was also a member of the party had a role in intelligence-gathering, more or less by definition.

But a Jewish party member was in a precarious position – again, almost by definition. In 1953, he was forced out of the army during one of the regime’s campaigns against “Zionists” and “cosmopolitans.” He enrolled in the University of Warsaw and retrained as a social scientist. He began research on the history of the British Labour Party and the development of contemporary Polish society.

One ought not to read too much dissidence into the simple fact of doing empirical sociology. Bauman himself says he wanted to reform the regime, to bring it into line with its professed egalitarian values. And yet, under the circumstances, becoming a sociologist was at least a somewhat oppositional move. He published articles on alienation, the problems of the younger generation, and the challenge of fostering innovation in a planned economy.

And so he remained loyal to the regime -- in his moderately oppositional fashion -- until another wave of official anti-Semitism in 1968 made this impossible. In her memoir, Janina Bauman recalls their final weeks in Poland as a time of threatening phone calls, hulking strangers loitering outside their apartment, and TV broadcasts that repeated her husband’s name in hateful tones. “A scholarly article appeared in a respectable magazine,” she writes. “It attacked [Zygmunt] and others for their dangerous influence on Polish youth. It was signed by a close friend.”

Bauman and his family emigrated that year, eventually settling in Leeds. (He never faced a language barrier, having for some years been editor of a Polish sociological journal published in English.) His writings continued to be critical of both the Soviet system and of capitalism, and to support the labor movement. When Solidarity emerged in 1980 to challenge the state, Bauman welcomed it as the force that would shape the future of Poland.

These facts are all part of the record -- put there, most of them, by Bauman himself. By no means is it a heroic tale. From time to time, he must have named names, and written things he didn’t believe, and forced himself to believe things that he knew, deep down, were not true.

And yet Bauman did not hide his past, either. It has always been available for anyone trying to come to some judgment of his work. He has been accused of failing to reflect upon his experience. But even that is a dubious reading of the evidence. A central point of his work on the “liquid” social structure of postmodernism is its contrast with the modernity that went before, which he says was “marked by the disciplinary power of the pastoral state.” He describes the Nazi and Stalinist regimes as the ultimate, extreme cases of that “disciplinary power.”

Let’s go out on a limb and at least consider the possibility that someone who admittedly spent years serving a social system that he now understands as issuing from the same matrix as Hitler’s regime may perhaps be telling us (in his own roundabout, sociologistic way) that he is morally culpable, no matter what his good intentions may have been.

Alas, this is not quite so exciting as “Postmodernist Conceals Sinister Past.” It doesn’t even have the satisfying denouement found in “The God That Failed,” that standard of ex-Communist disillusionment. Sorry about that.... It’s just a tale of a man getting older and – just possibly – wiser. I tend to think of that as a happy story, even so.

Tough Liberal

In the cartoons, an astonished character will at times need to grab his eyeballs as they come flying out of his head. Something like that happened to me a few months ago while going through the fall catalog of Columbia University Press. Buried deep in its pages – well behind all the exciting, glamorous titles at the bleeding edge of scholarship – was the listing for Tough Liberal: Albert Shanker and the Battles Over Schools, Unions, Race, and Democracy by Richard D. Kahlenberg. (It has just appeared in hardback.)

This was a title one might reasonably expect to see issued by a commercial publisher: Shanker, who died in 1997, was for many years the president of the American Federation of Teachers, which he helped build into one of the strongest unions in the AFL-CIO. It now has more than a million members, including about 160,000 who work in higher education; even if only one in a hundred were interested in the union’s history, that is quite a potential audience.

At the same time, it was a surprise to find the book published by a press better known for titles in cultural theory: works embodying a certain abstract radicalism, several miles in the stratosphere above the labor movement. And Shanker, besides being a union bureaucrat, was something of a hardboiled ideologue – a fierce Cold Warrior, but no less ardent a Culture Warrior, denouncing both affirmative action and multiculturalism in tones that were, let’s say, emphatic.

Such “tough liberalism,” as his biographer calls it, made the labor leader a punchline in Woody Allen’s post-apocalyptic comedy "Sleeper" (1973). A character explains that no one is quite sure how civilization ended, but historians think it all started when “a man named Albert Shanker got his hands on an atomic bomb.”

A lot has changed since the days when a new movie by Woody Allen was a major event. And in any case, no labor leader has emerged in recent decades with quite the cultural and political profile that Shanker once had. Yet his name still has the power to provoke. There are Shankerites and anti-Shankerites.

Kahlenberg, a senior fellow at the Century Foundation in Washington, DC, admires Shanker and gives him the benefit of the doubt, more often than not. That tendency comes through, I think, in the IHE podcast we recently recorded. But Kahlenberg is not totally uncritical of Shanker. As we talked following the taping, Kahlenberg mentioned the passions stirred up by the leader's memory.

Some followers remain convinced that “Al” was right about more or less everything -- including the Vietnam War, which Shanker supported. Kahlenberg also looked into charges by Shanker's opponents that he received funds as part of the American intelligence community’s activity within the labor movement.

That accusation is hardly surprising or implausible. All things considered, it would be surprising if Shanker were not connected with "the AFL-CIA” (as certain networks within the intelligence and labor communities were sometimes called). But Kahlenberg says critics haven’t offered solid evidence to back up the accusation. There is a difference between firm conviction and real proof. This is a matter some historian will eventually need to revisit, nailing things down with serious documentation.

Tough Liberal is not the first book about Shanker. But the previous volume, Dickson A. Mungazi’s Where He Stands: Albert Shanker and the American Federation of Teachers, published by Praeger in 1995, was not really a biography. Nor was it much of a contribution to labor history, given that Mungazi identifies Samuel Gompers (who died in 1924) as the first president of the CIO (established in 1935).

So Kahlenberg has made a real contribution by telling the story of this charismatic and/or megalomaniacal labor leader’s career. I say that as a reader who did not pick up the biography with any admiration for its subject – nor put it down converted to Shanker-style “toughness.” (Actually it made me think maybe Woody Allen was right.) But it’s an engaging book, and essential reading for anyone interested in the history of Cold War liberalism and its complicated legacy.

Further reading (and listening): An excerpt from Tough Liberal is available at Columbia UP's website. An early review of it appears in the latest issue of Washington Monthly. An extremely favorable treatment of the biography and of Shanker himself has recently appeared in The Wall Street Journal. For something altogether less laudatory, see the essay appearing ten years ago in the socialist journal New Politics. And by all means, lend an ear to the interview with Richard Kahlenberg, available as an IHE podcast.

Be Aware (Beware)

Not all Islamophobes are fanatics. Most, on the contrary, are decent people who just want to live in peace. Islamophobia forms only part of their identity. They grew up fearing Islam, and they still worry about it from time to time, especially during holidays and on certain anniversaries; but many would confess to doubt about just how Islamophobic they feel deep down inside. They may find themselves wondering, for example, if the Koran is really that much more bloodthirsty than the Jewish scriptures (Joshua 6 is plenty murderous) or the Christian (Matthew 10:34 is not exactly comforting).

Unfortunately a handful of troublemakers thrive among them, parasitically. They spew out hatred through Web sites. They seek to silence their critics, and to recruit impressionable young people. Perhaps it is unfair to confuse matters by calling the moderates and the militants by the same name. It would be more fitting to say that the latter are really Islamophobofascists.

Some might find the expression offensive. That is too bad. If we don’t resist Islamophobofascism now, its intolerance can only spread. And we all know who benefits from that. One name in particular comes to mind. It belongs to a fellow who is now presumably living in a cave, drawing up long-term plans for a clash of civilizations.....

Maybe I had better trim the satirical sails before going totally out to sea. As neologisms go, “Islamophobofascism” probably sounds even more stupid than the term it mocks. But there is a point to it.

“Islamofascism” is a noxious and counterproductive term -- a bludgeon disguised as an idea. Its use comes at a cost, even beyond the obvious one that goes with making people dumber. “Islamofascism” is the preferred term of those who don’t see any distinction between Al Qaeda, the Iranian mullahs, and the Baathists. Guess what? They are different, which might just have been worth understanding a few years ago. (Better late than never, maybe; but not a whole lot better.)

The more serious consequence, over the long term, is that of offering deliberate insult to those Muslims who would be put to the sword under the reign of Jihadi fundamentalists. Disgust for cheap stunts done in the name of “Islamofascism awareness” is not a matter of doubting that the jihadis mean what they say. On the contrary, it goes with taking them seriously as enemies.

It should not be necessary to qualify that last point. Somebody who wants to kill you is your enemy, whether you care to think in such terms or not; and the followers of Bin Laden, while subtle on some matters, have at least not been shy about letting us know what methods they consider permissible in pursuit of their ends. The jihadis mean it. Recognizing this is not a matter of Islamophobia; it is a matter of paying attention.

And paying attention means, in this case, recognizing that most Muslims are not our enemies. It is disgraceful to have to spell that out. But let’s be clear about something: The jihadis are not our only problem. As anyone from abroad who likes and respects Americans will probably tell you, we tend to be our own worst enemy.

There is a strain of nativism, xenophobia, and small-mindedness in American life that is always there -- often subdued, but never too far out of earshot. To call this our fascist streak would be absurdly melodramatic. Fascism proper was, above all, purposeful and orderly, while fear and loathing towards the “un-American” is often enough the woolliest form of baffled resentment: the effect of comfortable ignorance turning sour at any demand on its meager resources of attention and sympathy.

This quality can subsist for long periods in a dormant or distracted state -- expressing itself in muttering or small-scale acts of hostility, but nothing large-scale. Perhaps it is restrained by the better angels of our nature.

But it means that the unscrupulous and the obtuse have a ready supply of raw material to mold into something vile when the occasion becomes available, or if there is some profit in it. H.L. Mencken explained that a demagogue is “one who will preach doctrines he knows to be untrue to men he knows to be idiots." The problem with this definition, of course, is that it is the product of a simpler era and so not nearly cynical enough. For a demagogue now, truth and knowledge have nothing to do with it.

For the really suave expression of Islamophobofascism, however, no local sideshow can compete with an interview that the British novelist Martin Amis gave last year. At the highest stages of cosmopolitan literary influence, it seems, one may express ideas worthy of a manic loon phoning a radio talk-show and get them published in the London Times.

“There’s a definite urge -- don’t you have it? -- to say, ‘The Muslim community will have to suffer until it gets its house in order,’ ” Amis said. “What sort of suffering? Not letting them travel. Deportation -- further down the road. Curtailing of freedoms. Strip-searching people who look like they’re from the Middle East or from Pakistan.… Discriminatory stuff, until it hurts the whole community and they start getting tough with their children.”

The cultural theorist Terry Eagleton issued a response to Amis in the preface to a new edition of his book “Ideology: An Introduction” -- first published in 1991 by Verso, which reissued it a few weeks ago. It stirred up a tiny tempest in the British press, which reduced the argument to the dimensions of a clash between two “bad boys” (albeit ones grown quite long in the tooth).

Quickly mounting to impressive heights of inanity, the coverage and commentary managed somehow to ignore the actual substance of the dispute: what Amis said (his explicit call to persecute all Muslims until they acted right) and how Eagleton responded.

“Joseph Stalin seems not to be Amis’s favorite historical character,” wrote Eagleton, alluding to the novelist’s Koba the Dread, a venture into Soviet political history published a while back. “Yet there is a good dose of Stalinism in the current right-wing notion that a spot of rough stuff may be justified by the end in view. Not just roughing up actual or intending criminals, mind, but the calculated harassment of a whole population. Amis is not recommending such tactics for criminals or suspects only; he is recommending them as a way of humiliating and insulting certain kinds of men and women at random, so they will return home and teach their children to be nice to the White Man. There seems to be something mildly defective about this logic.”

Eagleton’s introduction doesn’t underestimate the virulence of the jihadists. But his remarks do at least have the good sense to acknowledge that humiliation is a weapon that will not work in the long run. (As an aside, let me note that some of us don't have the luxury of either ignoring terrorism or regarding it as something that will be abated by a more aggressive posture in the world. Life in Washington, D.C., for the past several years has meant rarely getting on the subway without wondering if this might be the day. The "surge" did not reduce the faint background radiation of dread one little bit. Funny how these things work out, or don't.)

Anybody with an ounce of brains and responsibility can tell that fostering an environment of hysteria is useful only to one side of this conflict. “The best way to preserve one’s values,” writes Eagleton, “is to practice them.” Well said; and worth keeping in mind whenever the Islamophobofascists start to rush about, trying to drum up some business.

We shouldn't regard them as just nuisances. They are something much more dangerous. Determined to turn the whole world against us, they act as sleeper cells of malice and stupidity. There are sober ways to respond to danger, and insane ways. It is the demagogue’s stock in trade to blur the distinction.

Studying the Inhumanities

The United States does not torture. We have been told as much by the president, and more than once, in terms that are clear, forceful, unqualified. Even (so one must surmise) categorical. If the United States permits an interrogation technique, then it cannot be torture. Q.E.D.!

And so it is very disagreeable to have to quote statements such as the following: "Waterboarding is a torture technique that has its history rooted in the Spanish Inquisition. In 1947, the U.S. prosecuted a Japanese military officer for carrying out a form of waterboarding on a U.S. civilian during World War II. Waterboarding inflicts on its victims the terror of imminent death. And as with all torture techniques, it is, therefore, an inherently flawed method for gaining reliable information."

The nattering nabob of negativism in this case happens to be writing in Armed Forces Journal, which might be described as something like a trade journal for the U.S. military. One notes that the comment, published recently, is framed not in moral terms, but strictly with reference to torture's failure to meet the industry's needs: "In short, it doesn’t work. That blunt truth means all U.S. leaders, present and future, should be clear on the issue."

Well, it's a little late for that now, of course. Jameel Jaffer and Amrit Singh, both of the American Civil Liberties Union, have recently edited a volume called Administration of Torture: A Documentary History from Washington to Abu Ghraib and Beyond (Columbia University Press) that collects government memoranda from 2002 through 2005. They were selected from more than 100,000 pages of material released under the Freedom of Information Act, though only after litigation.

They add up to an account of how official talk of the "New Paradigm" after 9/11 led American forces to condone acts that would be called torture if anyone else in the world did them. Perhaps we shouldn't get into semantics. But then again, even the expression "New Paradigm" seems a bit evasive. While John Milton was a bit inconsistent as a libertarian, Paradise Lost offers a plainspoken gloss on the thinking reflected in the documents gathered in the Columbia volume:

So spake the Fiend, and with necessitie,
The Tyrants plea, excus'd his devilish deeds.

Jameel Jaffer, one of the editors of Administration of Torture, answered a few questions by e-mail. A transcript of the exchange follows.

Q: Your introduction states that the prohibition of torture is "jus cogens." A puzzled layman turning to The Penguin Dictionary of International Relations discovers that this term "refers to a body of principles or norms in international law which override and supersede others" -- such that no treaty, for example, can be in violation of it. Piracy and genocide are prohibited by the same terms. But according to the same dictionary's account, "the precise application of jus cogens is not universally agreed upon."

Does international law include a clear, sharp definition of the criteria for torture? A line where it is distinguished from the kind of vigorous and disagreeable questioning of enemy combatants that is bound to happen during wartime? Or is this a matter in which "the precise application of jus cogens is not universally agreed upon"?

A: Everyone, including the Bush administration, agrees that the law prohibits torture. Torture is proscribed by the Geneva Conventions and the Convention Against Torture. It is also proscribed by the U.S. torture statute and the U.S. war crimes statute. I don't think anyone seriously argues that torture is anything other than a jus cogens norm. A problem arose in 2002, though, because the Office of Legal Counsel issued legal opinions that defined torture exceedingly narrowly -- vanishingly narrowly, in fact.

The OLC's unconscionably narrow definition of torture allowed the Bush administration to adopt interrogation methods that went far beyond those that had previously been considered acceptable. It's important to recognize, though, that the dispute was not over whether torture was illegal; it was over what kinds of methods would constitute torture. The U.S. torture statute defines torture to mean any act "specifically intended to inflict severe physical or mental pain or suffering."

What the OLC did in 2002 was to decide that methods like forcing prisoners into stress positions, waterboarding them, confining them in freezing cold cells, etc. didn't amount to torture. It concluded, absurdly, that an interrogation method would amount to torture only if it caused pain equivalent to that caused by organ failure or death. And it made the argument that even if interrogators engaged in torture, they couldn't be held criminally liable if they were acting under the president's authority as commander in chief.

It's worth pointing out that the Geneva Conventions proscribe not just torture but also cruel, inhuman, and degrading (CID) treatment. In 2002, though, the Bush administration took the position that al Qaeda and Taliban prisoners weren't protected by the Geneva Conventions -- not even by the most basic protections enshrined in "Common Article 3." Ultimately Congress enacted new statutes to make even clearer that CID was illegal. The Bush administration then did for "CID" what it had previously done for "torture": it just redefined the phrase so that the phrase wouldn't encompass the interrogation methods that it wanted to use. We're in court now arguing that the OLC's 2005 memos about CID should be released to the public.

Q: Some of the bureaucratic crosstalk among these documents can be a challenge to keep straight. There are exchanges among the Department of Defense, the Department of Justice, and the Federal Bureau of Investigation. Other material here quite unambiguously documents inhumane and even lethal treatment of prisoners. Most of that testimony seems to have been gathered in 2004 and '05, following the outcry over Abu Ghraib.

Do earlier documents show that reports of such treatment went up the chain of command? Or did things operate on a "don't ask, don't tell" basis, so to speak?

A: One of the most useful sets of records we obtained through the FOIA came from the FBI. The records I'm thinking of are e-mails and memos written by FBI agents who were stationed at Guantanamo in 2002 and 2003. The e-mails and memos document the agents' concerns with the harsh methods that were then being used by military interrogators. Towards the end of 2002, FBI agents began to express these concerns to their superiors, and on several occasions representatives of the FBI met with General Geoffrey Miller (who was then the commander of the military base at Guantanamo) to convey their concerns to him.

But the problem was not that military interrogators were exceeding the authority they had been given. The problem was that they were exercising the authority they had been given. They had been authorized to hold prisoners in stress positions, deprive them of light and auditory stimuli, strip them naked, isolate them for weeks at a time, and use military dogs to terrorize them. When military interrogators used those methods against prisoners, they weren't inventing the methods ad hoc. The methods had been approved by Defense Secretary Rumsfeld.

Q: Your Freedom of Information Act request (and subsequent litigation) led to the release of more than 100,000 pages of material. It sounds like this might be the tip of the iceberg. Two years ago the CIA destroyed videotapes of the interrogation of Al Qaeda figures, and recordings of the questioning of Jose Padilla have been "lost."

How much do you know about the kinds of material you are being denied access to? Are you aware of documentation that may already have gone down the memory hole?

A: Donald Rumsfeld famously distinguished between the "known unknowns" and the "unknown unknowns". Here, the known unknowns -- the records we know that the government is withholding -- include photographs of prisoner abuse at facilities other than Abu Ghraib; a September 2001 Presidential directive that authorized the CIA to set up a system of secret prisons in Eastern Europe and elsewhere; and an August 2002 Justice Department memorandum that advised the CIA about the legality of waterboarding and other extreme interrogation methods.

We learned several weeks ago that the Justice Department's Office of Legal Counsel is withholding three legal memos, all written in May of 2005, that narrowly construe the laws against cruel treatment and contend, wrongly, that waterboarding and other extreme methods can be used without offending those laws. When we learn of documents like the May 2005 memos, it's difficult not to wonder what else is still out there -- what unknown unknowns will come to light tomorrow.

Q: You write: "Senior administration officials, perhaps emboldened by Congress's failure to conduct any serious inquiry into past abuse, continue to violate domestic and international law." This volume reads like a dossier for a trial in The Hague. Suppose that did come to pass. Who would end up in the dock? Who is most culpable? (We're speaking hypothetically here, of course, since that outcome does seem unlikely.)

A: I think any investigation would have to look at the very highest levels of the Bush administration. White House Counsel Alberto Gonzales (who later became Attorney General) wrote legal memos that were intended to allow interrogators to use inhumane methods and to insulate interrogators -- and officials -- from war crimes charges. John Yoo, a lawyer for the Justice Department's Office of Legal Counsel, wrote legal memoranda that allowed the use of torture. Defense Secretary Rumsfeld authorized interrogators to use inhumane methods at Guantanamo, and Lieut. General Ricardo Sanchez authorized interrogators to use similar methods in Iraq. Maj. Gen. Geoffrey Miller supervised the use of inhumane methods at Guantanamo and oversaw the "Gitmo-ization" of Abu Ghraib.

And it was President Bush, of course, who directed the CIA to set up secret detention centers abroad, allowed the CIA and Defense Department to adopt methods that in some cases amounted to torture, and said that al Qaeda and Taliban prisoners should be treated humanely only to the extent consistent with "military necessity." All the available evidence suggests that principal responsibility for the abuse and torture of prisoners belongs not to small groups of "rogue soldiers" but to senior officials in the Bush administration.

Disciplinary Associations Should Start Treating Job Seekers With Respect

As has become the annual tradition, the American Historical Association is out with its report lauding the health of the academic job market in history. The report, culled exclusively from job listings in Perspectives (an AHA publication) and Ph.D. completion statistics reported by history departments, shows that there are more available positions than there are historians produced. Other disciplines issue similar reports. While the AHA report may be viewed favorably by some -- scholars in Asian history, the most underpopulated field, for example -- for others it reflects a general lack of concern from the association for the untenured and the graduate student. And the problems discussed here apply to many other disciplines as well.

As a national organization and the most powerful entity in the historical job market, the AHA has done surprisingly little to help the newest members of its profession. On the whole, historians pride themselves on their concern for social justice. In 2005, for example, the Organization of American Historians uprooted its annual conference and moved it to another city in a show of solidarity with hotel workers. When it comes to the plight of the discipline’s own working class, the unemployed job seeker, this compassion and concern are absent. In their place is an annual report from the AHA talking about how good the market is for some; for others, there isn’t much the AHA can do. I find this lack of action, especially when compared to what is normally shown for the less fortunate, disheartening.

While the AHA can do nothing to overcome the dearth of tenure-track positions (a reality that deans, trustees, and legislators control), the association has a great deal of control over two things: job market statistics and the interview process. These areas, which some might say are of secondary concern, have helped make the job market a very inhospitable place. For one, the association could conduct a statistically sound study of the job market based on an actual survey of departments and job seekers. Drawing attention only to the total number of jobs and the number of Ph.D.’s produced in the past year overlooks the fact that visiting faculty and independent scholars are also on the market. A more thorough census would provide better information to AHA members and possibly even a snapshot of many other employment concerns, including how the positions stack up in terms of pay, tenure-track status, and other key factors.

More importantly, the organization could do a number of things to reform the poorly designed hiring process that leaves applicants floating in a limbo of uncertainty throughout much of November and December. The lack of communication between search committees and job seekers is so common that it is now taken for granted along with death and taxes. Job applicants no longer expect any professional courtesy. While this results in a good bit of anxiety for anyone on the market, it can also lead to undue financial hardships that could easily be avoided. As a former editor of the H-Grad listserv and someone currently searching for a tenure-track position, I can safely say that these concerns press on the minds of most applicants.

The key to these job market reforms is the AHA itself. As the group with the greatest stake in the hiring process, it has done little to rectify some of the more egregious problems with the job market. I have compiled a short list of changes that could be adopted with one vote at a business meeting. And most of these changes would benefit not just the AHA but other disciplinary associations, especially in the humanities, where good, tenure-track jobs are not widely available. While this will not, by itself, correct the disparity between job seekers and open positions, it will go a long way toward making the process fairer and more equitable.

1. Take a more accurate census of the job-seeking population annually. There is a glut of history Ph.D.'s. Everyone knows this. Yet for the past three years, the AHA has been trumpeting the idea that the job market is improving based solely on data that have no correlation with the actual situation. The AHA, like other associations, bases its data on job applicants solely on the number of new Ph.D.’s, ignoring the fact that so many of the past few years’ new doctorates remain either unemployed or in temporary positions, off the tenure track and with low pay and benefits. By counting only the new Ph.D.’s, the figures for job seekers are significantly lower than they should be. The research being produced by the AHA needs to be more accurate so as to guide job applicants and graduate students as to their chances of finding a position. Since candidates who use the AHA Job Register at the annual meeting have to be registered as meeting attendees, the AHA should include a census form with the conference registration. Questions such as “Are you a job seeker?”, “What is your area of specialty?”, “Have you ever had a tenure-track position?” and “How many years have you been on the job market?” would give a more accurate picture of just how dire the job market truly is. A follow-up survey every April would round out the study and enable applicants to assess their position in the market for the following fall. Job seekers could then make career choices based on tangible facts, rather than hearsay and propaganda.

2. Make the Job Register service a privilege that has to be earned. The AHA has a good deal of influence on the job market but has yet to use it in any significant way. Since most tenure-track positions are advertised in the AHA's Perspectives and interviews are conducted at the AHA annual meeting, the association should mandate certain conditions that must be met before interviewing and advertising space is sold. If those conditions are not met, the AHA should deny departments the right to use its facilities and its ad space, thus adding substantial cost to the interviewing institutions. University HR departments and academic deans, often cited as the reason search committees are unable to communicate with applicants, would have to either allow the departments to comply with these provisions or foot the bill for a more expensive interview process. Lack of communication and the posting of identical positions without a hire for three or more years are two of the problems that stand out at the moment, but the conditions could be expanded in future years to address new situations as the AHA sees fit.

3. Require that search committees inform applicants of their interview status via e-mail 30 days before the annual meeting. Graduate students, visiting lecturers, and independent scholars are, on the whole, not independently wealthy. Traveling across the country to stay at an upscale hotel in a major city just after the holiday season is a lot to ask, especially if a candidate has no interviews. Applicants, though, are at the mercy of the search committees, some of whom notify interviewees a week or less before the annual meeting. Applicants are forced either to keep their rooms and plane tickets past the cancellation date in the hopes that their phone will ring or to pay higher airfares and hotel rates for last-minute bookings. Letting candidates know their interview status a month in advance would alleviate that situation and prevent the least paid members of the profession from shouldering the heaviest and most burdensome travel costs. The AHA should set guidelines requiring search committees to let all applicants know, 30 days before the annual meeting, whether they will have an interview at the AHA -- or face a Job Register sanction, up to suspension from its benefits for a set period of time as determined by the AHA.

4. Establish a general listserv for search committees and job seekers. Search committees are notorious for their lack of communication. Job seekers have pooled their resources into a number of academic career wikis, but these can be misused and are dependent on the truthfulness of the poster. The AHA can alleviate this uncertainty by creating a listserv and mandating that those who use the Job Register agree to notify the AHA by e-mail at important phases of the job search process. Which steps those are would be open for negotiation, but everyone, committees and candidates alike, would know what those benchmarks are ahead of time. The AHA, and this is the critical step, would aggregate these notifications and send them out via a daily listserv to all job applicants who choose to subscribe. Under this system, for example, all who applied for the position in Pre-Modern China at Boise Valley State would know that the search committee had extended AHA interview invitations, had made invitations for on-campus interviews, or that Dr. Damon Berryhill had accepted the position. Job applicants, who usually have no idea how searches are progressing, would be more informed when fielding other offers and would no longer need to contact each institution directly for updates. Participation would also be in the hiring institution’s best interest, as it would reduce the need to communicate one on one with job candidates (a very time-consuming task for search committee members) while still creating a much more open system of communication for job seekers.

It is frustrating to me when scholars who have spent years examining the forces of reform and progress take no action to better the lives of their fellow historians. Individuals who have studied the great reformers and crusaders of the past will simply throw up their hands and exclaim “it’s just the market!” when confronted with horror stories of graduate students and visiting faculty on the hunt for tenure-track jobs. These people do nothing, as if, by some sort of divine incantation, the injustices of the hiring process were set in stone and beyond human control. This is the attitude that needs to change the most.

It is worth mentioning that graduate students make up the largest constituency group of the AHA membership. As the H-Grad listserv and the academic careers wiki continue to gain popularity, it will not be much longer before job seekers figure out how to organize themselves and make their voices heard one way or another. One anonymous poster on the job wiki for American historians has already suggested that all job seekers flood this year’s business meeting and vote no on every provision until the AHA takes up job market reform. The leadership of the AHA should adopt these reforms, or at the very least make a reasonable effort to study them, in order to make the job market a more tolerable place for the profession’s newest members and to take the first steps toward a more equitable and open hiring process.

Michael Bowen is assistant director of the Bob Graham Center for Public Service and a visiting lecturer in the history department at the University of Florida.
